Why we have an emotional connection to robots | Kate Darling

138,301 views · 2018-11-06

TED


00:00
Translator: Joseph Geni · Reviewer: Krystian Aparta
00:13
There was a day, about 10 years ago,
00:15
when I asked a friend to hold a baby dinosaur robot upside down.
00:21
It was this toy called a Pleo that I had ordered,
00:25
and I was really excited about it because I've always loved robots.
00:29
And this one has really cool technical features.
00:32
It had motors and touch sensors
00:34
and it had an infrared camera.
00:36
And one of the things it had was a tilt sensor,
00:39
so it knew what direction it was facing.
00:42
And when you held it upside down,
00:44
it would start to cry.
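A minimal sketch, in Python, of how a toy like this might map a tilt-sensor reading to the crying behavior described above; the sensor model, threshold, and function names are illustrative assumptions, not Pleo's actual firmware.

```python
# Hypothetical sketch of the behavior described in the talk: the toy "cries"
# when its tilt sensor says it is upside down, and calms down when petted.
# All names, thresholds, and APIs here are assumptions for illustration only.
from dataclasses import dataclass


@dataclass
class TiltReading:
    pitch_degrees: float  # 0 = upright, 180 = fully upside down


def is_upside_down(reading: TiltReading, threshold_degrees: float = 150.0) -> bool:
    """Treat the toy as upside down once its pitch passes the threshold."""
    return abs(reading.pitch_degrees) >= threshold_degrees


def choose_behavior(reading: TiltReading, being_petted: bool) -> str:
    """Pick a behavior: cry while inverted, calm down when petted, otherwise idle."""
    if is_upside_down(reading):
        return "cry_and_squirm"
    if being_petted:
        return "calm_down"
    return "idle"


if __name__ == "__main__":
    # Held upside down by the tail -> distress; set back down and petted -> calm.
    print(choose_behavior(TiltReading(pitch_degrees=170.0), being_petted=False))  # cry_and_squirm
    print(choose_behavior(TiltReading(pitch_degrees=5.0), being_petted=True))     # calm_down
```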
00:46
And I thought this was super cool, so I was showing it off to my friend,
00:50
and I said, "Oh, hold it up by the tail. See what it does."
00:55
So we're watching the theatrics of this robot
00:58
struggle and cry out.
01:02
And after a few seconds,
01:04
it starts to bother me a little,
01:07
and I said, "OK, that's enough now.
01:11
Let's put him back down."
01:14
And then I pet the robot to make it stop crying.
01:18
And that was kind of a weird experience for me.
01:22
For one thing, I wasn't the most maternal person at the time.
01:26
Although since then I've become a mother, nine months ago,
01:29
and I've learned that babies also squirm when you hold them upside down.
01:32
(Laughter)
01:35
But my response to this robot was also interesting
01:37
because I knew exactly how this machine worked,
01:41
and yet I still felt compelled to be kind to it.
01:46
And that observation sparked a curiosity
01:49
that I've spent the past decade pursuing.
01:52
Why did I comfort this robot?
01:56
And one of the things I discovered was that my treatment of this machine
01:59
was more than just an awkward moment in my living room,
02:03
that in a world where we're increasingly integrating robots into our lives,
02:09
an instinct like that might actually have consequences,
02:13
because the first thing that I discovered is that it's not just me.
02:19
In 2007, the Washington Post reported that the United States military
02:24
was testing this robot that defused land mines.
02:27
And the way it worked was it was shaped like a stick insect
02:30
and it would walk around a minefield on its legs,
02:32
and every time it stepped on a mine, one of the legs would blow up,
02:36
and it would continue on the other legs to blow up more mines.
02:39
And the colonel who was in charge of this testing exercise
02:43
ends up calling it off,
02:45
because, he says, it's too inhumane
02:47
to watch this damaged robot drag itself along the minefield.
02:54
Now, what would cause a hardened military officer
02:58
and someone like myself
03:00
to have this response to robots?
03:03
Well, of course, we're primed by science fiction and pop culture
03:06
to really want to personify these things,
03:09
but it goes a little bit deeper than that.
03:12
It turns out that we're biologically hardwired to project intent and life
03:17
onto any movement in our physical space that seems autonomous to us.
03:23
So people will treat all sorts of robots like they're alive.
03:26
These bomb-disposal units get names.
03:29
They get medals of honor.
03:31
They've had funerals for them with gun salutes.
03:34
And research shows that we do this even with very simple household robots,
03:38
like the Roomba vacuum cleaner.
03:40
(Laughter)
03:41
It's just a disc that roams around your floor to clean it,
03:44
but just the fact it's moving around on its own
03:47
will cause people to name the Roomba
03:49
and feel bad for the Roomba when it gets stuck under the couch.
03:52
(Laughter)
03:54
And we can design robots specifically to evoke this response,
03:57
using eyes and faces or movements
04:01
that people automatically, subconsciously associate
04:04
with states of mind.
04:06
And there's an entire body of research called human-robot interaction
04:09
that really shows how well this works.
04:11
So for example, researchers at Stanford University found out
04:14
that it makes people really uncomfortable
04:16
when you ask them to touch a robot's private parts.
04:19
(Laughter)
04:21
So from this, but from many other studies,
04:23
we know, we know that people respond to the cues given to them
04:27
by these lifelike machines,
04:29
even if they know that they're not real.
04:33
Now, we're headed towards a world where robots are everywhere.
04:37
Robotic technology is moving out from behind factory walls.
04:40
It's entering workplaces, households.
04:43
And as these machines that can sense and make autonomous decisions and learn
04:50
enter into these shared spaces,
04:52
I think that maybe the best analogy we have for this
04:55
is our relationship with animals.
04:57
Thousands of years ago, we started to domesticate animals,
05:01
and we trained them for work and weaponry and companionship.
05:05
And throughout history, we've treated some animals like tools or like products,
05:10
and other animals, we've treated with kindness
05:12
and we've given a place in society as our companions.
05:15
I think it's plausible we might start to integrate robots in similar ways.
05:21
And sure, animals are alive.
05:24
Robots are not.
05:27
And I can tell you, from working with roboticists,
05:30
that we're pretty far away from developing robots that can feel anything.
05:35
But we feel for them,
05:37
and that matters,
05:39
because if we're trying to integrate robots into these shared spaces,
05:42
we need to understand that people will treat them differently than other devices,
05:47
and that in some cases,
05:49
for example, the case of a soldier who becomes emotionally attached
05:52
to the robot that they work with,
05:54
that can be anything from inefficient to dangerous.
05:58
But in other cases, it can actually be useful
06:00
to foster this emotional connection to robots.
06:04
We're already seeing some great use cases,
06:06
for example, robots working with autistic children
06:08
to engage them in ways that we haven't seen previously,
06:12
or robots working with teachers to engage kids in learning with new results.
06:17
And it's not just for kids.
06:19
Early studies show that robots can help doctors and patients
06:22
in health care settings.
06:25
This is the PARO baby seal robot.
06:27
It's used in nursing homes and with dementia patients.
06:30
It's been around for a while.
06:32
And I remember, years ago, being at a party
06:35
and telling someone about this robot,
06:38
and her response was,
06:40
"Oh my gosh.
06:42
That's horrible.
06:45
I can't believe we're giving people robots instead of human care."
06:50
And this is a really common response,
06:52
and I think it's absolutely correct,
06:54
because that would be terrible.
06:57
But in this case, it's not what this robot replaces.
07:00
What this robot replaces is animal therapy
07:04
in contexts where we can't use real animals
07:07
but we can use robots,
07:08
because people will consistently treat them more like an animal than a device.
07:15
Acknowledging this emotional connection to robots
07:17
can also help us anticipate challenges
07:19
as these devices move into more intimate areas of people's lives.
07:24
For example, is it OK if your child's teddy bear robot
07:27
records private conversations?
07:29
Is it OK if your sex robot has compelling in-app purchases?
07:33
(Laughter)
07:35
Because robots plus capitalism
07:37
equals questions around consumer protection and privacy.
07:42
And those aren't the only reasons
07:44
that our behavior around these machines could matter.
07:48
A few years after that first initial experience I had
07:52
with this baby dinosaur robot,
07:54
I did a workshop with my friend Hannes Gassert.
07:56
And we took five of these baby dinosaur robots
07:59
and we gave them to five teams of people.
08:02
And we had them name them
08:04
and play with them and interact with them for about an hour.
08:08
And then we unveiled a hammer and a hatchet
08:10
and we told them to torture and kill the robots.
08:13
(Laughter)
08:16
And this turned out to be a little more dramatic
08:19
than we expected it to be,
08:20
because none of the participants would even so much as strike
08:23
these baby dinosaur robots,
08:24
so we had to improvise a little, and at some point, we said,
08:30
"OK, you can save your team's robot if you destroy another team's robot."
08:34
(Laughter)
08:36
And even that didn't work. They couldn't do it.
08:39
So finally, we said,
08:40
"We're going to destroy all of the robots
08:42
unless someone takes a hatchet to one of them."
08:45
And this guy stood up, and he took the hatchet,
08:49
and the whole room winced as he brought the hatchet down
08:51
on the robot's neck,
08:53
and there was this half-joking, half-serious moment of silence in the room
09:00
for this fallen robot.
09:01
(Laughter)
09:03
So that was a really interesting experience.
09:06
Now, it wasn't a controlled study, obviously,
09:09
but it did lead to some later research that I did at MIT
09:12
with Palash Nandy and Cynthia Breazeal,
09:14
where we had people come into the lab and smash these HEXBUGs
09:18
that move around in a really lifelike way, like insects.
09:21
So instead of choosing something cute that people are drawn to,
09:24
we chose something more basic,
09:26
and what we found was that high-empathy people
09:30
would hesitate more to hit the HEXBUGS.
09:33
Now this is just a little study,
09:35
but it's part of a larger body of research
09:37
that is starting to indicate that there may be a connection
09:40
between people's tendencies for empathy
09:42
and their behavior around robots.
09:45
But my question for the coming era of human-robot interaction
09:49
is not: "Do we empathize with robots?"
09:53
It's: "Can robots change people's empathy?"
09:57
Is there reason to, for example,
09:59
prevent your child from kicking a robotic dog,
10:03
not just out of respect for property,
10:06
but because the child might be more likely to kick a real dog?
10:10
And again, it's not just kids.
10:13
This is the violent video games question, but it's on a completely new level
10:17
because of this visceral physicality that we respond more intensely to
10:22
than to images on a screen.
10:25
When we behave violently towards robots,
10:28
specifically robots that are designed to mimic life,
10:31
is that a healthy outlet for violent behavior
10:35
or is that training our cruelty muscles?
10:39
We don't know ...
10:42
But the answer to this question has the potential to impact human behavior,
10:46
it has the potential to impact social norms,
10:49
it has the potential to inspire rules around what we can and can't do
10:53
with certain robots,
10:54
similar to our animal cruelty laws.
10:57
Because even if robots can't feel,
11:00
our behavior towards them might matter for us.
11:04
And regardless of whether we end up changing our rules,
11:08
robots might be able to help us come to a new understanding of ourselves.
11:14
Most of what I've learned over the past 10 years
11:16
has not been about technology at all.
11:18
It's been about human psychology
11:21
and empathy and how we relate to others.
11:25
Because when a child is kind to a Roomba,
11:29
when a soldier tries to save a robot on the battlefield,
11:33
or when a group of people refuses to harm a robotic baby dinosaur,
11:38
those robots aren't just motors and gears and algorithms.
11:42
They're reflections of our own humanity.
11:45
Thank you.
11:46
(Applause)
ืขืœ ืืชืจ ื–ื”

ืืชืจ ื–ื” ื™ืฆื™ื’ ื‘ืคื ื™ื›ื ืกืจื˜ื•ื ื™ YouTube ื”ืžื•ืขื™ืœื™ื ืœืœื™ืžื•ื“ ืื ื’ืœื™ืช. ืชื•ื›ืœื• ืœืจืื•ืช ืฉื™ืขื•ืจื™ ืื ื’ืœื™ืช ื”ืžื•ืขื‘ืจื™ื ืขืœ ื™ื“ื™ ืžื•ืจื™ื ืžื”ืฉื•ืจื” ื”ืจืืฉื•ื ื” ืžืจื—ื‘ื™ ื”ืขื•ืœื. ืœื—ืฅ ืคืขืžื™ื™ื ืขืœ ื”ื›ืชื•ื‘ื™ื•ืช ื‘ืื ื’ืœื™ืช ื”ืžื•ืฆื’ื•ืช ื‘ื›ืœ ื“ืฃ ื•ื™ื“ืื• ื›ื“ื™ ืœื”ืคืขื™ืœ ืืช ื”ืกืจื˜ื•ืŸ ืžืฉื. ื”ื›ืชื•ื‘ื™ื•ืช ื’ื•ืœืœื•ืช ื‘ืกื ื›ืจื•ืŸ ืขื ื”ืคืขืœืช ื”ื•ื•ื™ื“ืื•. ืื ื™ืฉ ืœืš ื”ืขืจื•ืช ืื• ื‘ืงืฉื•ืช, ืื ื ืฆื•ืจ ืื™ืชื ื• ืงืฉืจ ื‘ืืžืฆืขื•ืช ื˜ื•ืคืก ื™ืฆื™ืจืช ืงืฉืจ ื–ื”.

https://forms.gle/WvT1wiN1qDtmnspy7