Why we have an emotional connection to robots | Kate Darling

131,451 views ใƒป 2018-11-06

TED


์•„๋ž˜ ์˜๋ฌธ์ž๋ง‰์„ ๋”๋ธ”ํด๋ฆญํ•˜์‹œ๋ฉด ์˜์ƒ์ด ์žฌ์ƒ๋ฉ๋‹ˆ๋‹ค.

00:00
Translator: Joseph Geni Reviewer: Krystian Aparta
00:13
There was a day, about 10 years ago,
00:15
when I asked a friend to hold a baby dinosaur robot upside down.
00:21
It was this toy called a Pleo that I had ordered,
00:25
and I was really excited about it because I've always loved robots.
00:29
And this one had really cool technical features.
00:32
It had motors and touch sensors
00:34
and it had an infrared camera.
00:36
And one of the things it had was a tilt sensor,
00:39
so it knew what direction it was facing.
00:42
And when you held it upside down,
00:44
it would start to cry.
00:46
And I thought this was super cool, so I was showing it off to my friend,
00:50
and I said, "Oh, hold it up by the tail. See what it does."
00:55
So we're watching the theatrics of this robot
00:58
struggle and cry out.
01:02
And after a few seconds,
01:04
it starts to bother me a little,
01:07
and I said, "OK, that's enough now.
01:11
Let's put him back down."
01:14
And then I pet the robot to make it stop crying.
01:18
And that was kind of a weird experience for me.
01:22
For one thing, I wasn't the most maternal person at the time.
01:26
Although since then I've become a mother, nine months ago,
01:29
and I've learned that babies also squirm when you hold them upside down.
01:32
(Laughter)
01:35
But my response to this robot was also interesting
01:37
because I knew exactly how this machine worked,
01:41
and yet I still felt compelled to be kind to it.
01:46
And that observation sparked a curiosity
01:49
that I've spent the past decade pursuing.
01:52
Why did I comfort this robot?
01:56
And one of the things I discovered was that my treatment of this machine
01:59
was more than just an awkward moment in my living room,
02:03
that in a world where we're increasingly integrating robots into our lives,
02:09
an instinct like that might actually have consequences,
02:13
because the first thing that I discovered is that it's not just me.
02:19
In 2007, the Washington Post reported that the United States military
02:24
was testing this robot that defused land mines.
02:27
And the way it worked was it was shaped like a stick insect
02:30
and it would walk around a minefield on its legs,
02:32
and every time it stepped on a mine, one of the legs would blow up,
02:36
and it would continue on the other legs to blow up more mines.
02:39
And the colonel who was in charge of this testing exercise
02:43
ends up calling it off,
02:45
because, he says, it's too inhumane
02:47
to watch this damaged robot drag itself along the minefield.
02:54
Now, what would cause a hardened military officer
02:58
and someone like myself
03:00
to have this response to robots?
03:03
Well, of course, we're primed by science fiction and pop culture
03:06
to really want to personify these things,
03:09
but it goes a little bit deeper than that.
03:12
It turns out that we're biologically hardwired to project intent and life
03:17
onto any movement in our physical space that seems autonomous to us.
03:23
So people will treat all sorts of robots like they're alive.
03:26
These bomb-disposal units get names.
03:29
They get medals of honor.
03:31
They've had funerals for them with gun salutes.
03:34
And research shows that we do this even with very simple household robots,
03:38
like the Roomba vacuum cleaner.
03:40
(Laughter)
03:41
It's just a disc that roams around your floor to clean it,
03:44
but just the fact it's moving around on its own
03:47
will cause people to name the Roomba
03:49
and feel bad for the Roomba when it gets stuck under the couch.
03:52
(Laughter)
03:54
And we can design robots specifically to evoke this response,
03:57
using eyes and faces or movements
04:01
that people automatically, subconsciously associate
04:04
with states of mind.
04:06
And there's an entire body of research called human-robot interaction
04:09
that really shows how well this works.
04:11
So for example, researchers at Stanford University found out
04:14
that it makes people really uncomfortable
04:16
when you ask them to touch a robot's private parts.
04:19
(Laughter)
04:21
So from this, but from many other studies,
04:23
we know that people respond to the cues given to them
04:27
by these lifelike machines,
04:29
even if they know that they're not real.
04:33
Now, we're headed towards a world where robots are everywhere.
04:37
Robotic technology is moving out from behind factory walls.
04:40
It's entering workplaces, households.
04:43
And as these machines that can sense and make autonomous decisions and learn
04:50
enter into these shared spaces,
04:52
I think that maybe the best analogy we have for this
04:55
is our relationship with animals.
04:57
Thousands of years ago, we started to domesticate animals,
05:01
and we trained them for work and weaponry and companionship.
05:05
And throughout history, we've treated some animals like tools or like products,
05:10
and other animals, we've treated with kindness
05:12
and we've given a place in society as our companions.
05:15
I think it's plausible we might start to integrate robots in similar ways.
05:21
And sure, animals are alive.
05:24
Robots are not.
05:27
And I can tell you, from working with roboticists,
05:30
that we're pretty far away from developing robots that can feel anything.
05:35
But we feel for them,
05:37
and that matters,
05:39
because if we're trying to integrate robots into these shared spaces,
05:42
we need to understand that people will treat them differently than other devices,
05:47
and that in some cases,
05:49
for example, the case of a soldier who becomes emotionally attached
05:52
to the robot that they work with,
05:54
that can be anything from inefficient to dangerous.
05:58
But in other cases, it can actually be useful
06:00
to foster this emotional connection to robots.
06:04
We're already seeing some great use cases,
06:06
for example, robots working with autistic children
06:08
to engage them in ways that we haven't seen previously,
06:12
or robots working with teachers to engage kids in learning with new results.
06:17
And it's not just for kids.
06:19
Early studies show that robots can help doctors and patients
06:22
in health care settings.
06:25
This is the PARO baby seal robot.
06:27
It's used in nursing homes and with dementia patients.
06:30
It's been around for a while.
06:32
And I remember, years ago, being at a party
06:35
and telling someone about this robot,
06:38
and her response was,
06:40
"Oh my gosh.
06:42
That's horrible.
06:45
I can't believe we're giving people robots instead of human care."
06:50
And this is a really common response,
06:52
and I think it's absolutely correct,
06:54
because that would be terrible.
06:57
But in this case, it's not what this robot replaces.
07:00
What this robot replaces is animal therapy
07:04
in contexts where we can't use real animals
07:07
but we can use robots,
07:08
because people will consistently treat them more like an animal than a device.
07:15
Acknowledging this emotional connection to robots
07:17
can also help us anticipate challenges
07:19
as these devices move into more intimate areas of people's lives.
07:24
For example, is it OK if your child's teddy bear robot
07:27
records private conversations?
07:29
Is it OK if your sex robot has compelling in-app purchases?
07:33
(Laughter)
07:35
Because robots plus capitalism
07:37
equals questions around consumer protection and privacy.
07:42
And those aren't the only reasons
07:44
that our behavior around these machines could matter.
07:48
A few years after that first experience I had
07:52
with this baby dinosaur robot,
07:54
I did a workshop with my friend Hannes Gassert.
07:56
And we took five of these baby dinosaur robots
07:59
and we gave them to five teams of people.
08:02
And we had them name them
08:04
and play with them and interact with them for about an hour.
08:08
And then we unveiled a hammer and a hatchet
08:10
and we told them to torture and kill the robots.
08:13
(Laughter)
08:16
And this turned out to be a little more dramatic
08:19
than we expected it to be,
08:20
because none of the participants would even so much as strike
08:23
these baby dinosaur robots,
08:24
so we had to improvise a little, and at some point, we said,
08:30
"OK, you can save your team's robot if you destroy another team's robot."
08:34
(Laughter)
08:36
And even that didn't work. They couldn't do it.
08:39
So finally, we said,
08:40
"We're going to destroy all of the robots
08:42
unless someone takes a hatchet to one of them."
08:45
And this guy stood up, and he took the hatchet,
08:49
and the whole room winced as he brought the hatchet down
08:51
on the robot's neck,
08:53
and there was this half-joking, half-serious moment of silence in the room
09:00
for this fallen robot.
09:01
(Laughter)
09:03
So that was a really interesting experience.
09:06
Now, it wasn't a controlled study, obviously,
09:09
but it did lead to some later research that I did at MIT
09:12
with Palash Nandy and Cynthia Breazeal,
09:14
where we had people come into the lab and smash these HEXBUGs
09:18
that move around in a really lifelike way, like insects.
09:21
So instead of choosing something cute that people are drawn to,
09:24
we chose something more basic,
09:26
and what we found was that high-empathy people
09:30
would hesitate more to hit the HEXBUGs.
09:33
Now this is just a little study,
09:35
but it's part of a larger body of research
09:37
that is starting to indicate that there may be a connection
09:40
between people's tendencies for empathy
09:42
and their behavior around robots.
09:45
But my question for the coming era of human-robot interaction
09:49
is not: "Do we empathize with robots?"
09:53
It's: "Can robots change people's empathy?"
09:57
Is there reason to, for example,
09:59
prevent your child from kicking a robotic dog,
10:03
not just out of respect for property,
10:06
but because the child might be more likely to kick a real dog?
10:10
And again, it's not just kids.
10:13
This is the violent video games question, but it's on a completely new level
10:17
because of this visceral physicality that we respond more intensely to
10:22
than to images on a screen.
10:25
When we behave violently towards robots,
10:28
specifically robots that are designed to mimic life,
10:31
is that a healthy outlet for violent behavior
10:35
or is that training our cruelty muscles?
10:39
We don't know ...
10:42
But the answer to this question has the potential to impact human behavior,
10:46
it has the potential to impact social norms,
10:49
it has the potential to inspire rules around what we can and can't do
10:53
with certain robots,
10:54
similar to our animal cruelty laws.
10:57
Because even if robots can't feel,
11:00
our behavior towards them might matter for us.
11:04
And regardless of whether we end up changing our rules,
11:08
robots might be able to help us come to a new understanding of ourselves.
11:14
Most of what I've learned over the past 10 years
11:16
has not been about technology at all.
11:18
It's been about human psychology
11:21
and empathy and how we relate to others.
11:25
Because when a child is kind to a Roomba,
11:29
when a soldier tries to save a robot on the battlefield,
11:33
or when a group of people refuses to harm a robotic baby dinosaur,
11:38
those robots aren't just motors and gears and algorithms.
11:42
They're reflections of our own humanity.
11:45
Thank you.
11:46
(Applause)
์ด ์›น์‚ฌ์ดํŠธ ์ •๋ณด

์ด ์‚ฌ์ดํŠธ๋Š” ์˜์–ด ํ•™์Šต์— ์œ ์šฉํ•œ YouTube ๋™์˜์ƒ์„ ์†Œ๊ฐœํ•ฉ๋‹ˆ๋‹ค. ์ „ ์„ธ๊ณ„ ์ตœ๊ณ ์˜ ์„ ์ƒ๋‹˜๋“ค์ด ๊ฐ€๋ฅด์น˜๋Š” ์˜์–ด ์ˆ˜์—…์„ ๋ณด๊ฒŒ ๋  ๊ฒƒ์ž…๋‹ˆ๋‹ค. ๊ฐ ๋™์˜์ƒ ํŽ˜์ด์ง€์— ํ‘œ์‹œ๋˜๋Š” ์˜์–ด ์ž๋ง‰์„ ๋”๋ธ” ํด๋ฆญํ•˜๋ฉด ๊ทธ๊ณณ์—์„œ ๋™์˜์ƒ์ด ์žฌ์ƒ๋ฉ๋‹ˆ๋‹ค. ๋น„๋””์˜ค ์žฌ์ƒ์— ๋งž์ถฐ ์ž๋ง‰์ด ์Šคํฌ๋กค๋ฉ๋‹ˆ๋‹ค. ์˜๊ฒฌ์ด๋‚˜ ์š”์ฒญ์ด ์žˆ๋Š” ๊ฒฝ์šฐ ์ด ๋ฌธ์˜ ์–‘์‹์„ ์‚ฌ์šฉํ•˜์—ฌ ๋ฌธ์˜ํ•˜์‹ญ์‹œ์˜ค.

https://forms.gle/WvT1wiN1qDtmnspy7