Why we have an emotional connection to robots | Kate Darling

TED ใƒป 2018-11-06

00:00
Translator: Joseph Geni   Reviewer: Krystian Aparta

00:13
There was a day, about 10 years ago, when I asked a friend to hold a baby dinosaur robot upside down. It was this toy called a Pleo that I had ordered, and I was really excited about it because I've always loved robots. And this one has really cool technical features. It had motors and touch sensors and it had an infrared camera. And one of the things it had was a tilt sensor, so it knew what direction it was facing. And when you held it upside down, it would start to cry.

00:46
And I thought this was super cool, so I was showing it off to my friend, and I said, "Oh, hold it up by the tail. See what it does."

00:55
So we're watching the theatrics of this robot struggle and cry out. And after a few seconds, it starts to bother me a little, and I said, "OK, that's enough now. Let's put him back down." And then I pet the robot to make it stop crying.

01:18
And that was kind of a weird experience for me. For one thing, I wasn't the most maternal person at the time. Although since then I've become a mother, nine months ago, and I've learned that babies also squirm when you hold them upside down.

(Laughter)

01:35
But my response to this robot was also interesting because I knew exactly how this machine worked, and yet I still felt compelled to be kind to it. And that observation sparked a curiosity that I've spent the past decade pursuing. Why did I comfort this robot?

01:56
And one of the things I discovered was that my treatment of this machine was more than just an awkward moment in my living room, that in a world where we're increasingly integrating robots into our lives, an instinct like that might actually have consequences, because the first thing that I discovered is that it's not just me.

02:19
In 2007, the Washington Post reported that the United States military was testing this robot that defused land mines. And the way it worked was it was shaped like a stick insect and it would walk around a minefield on its legs, and every time it stepped on a mine, one of the legs would blow up, and it would continue on the other legs to blow up more mines. And the colonel who was in charge of this testing exercise ends up calling it off, because, he says, it's too inhumane to watch this damaged robot drag itself along the minefield.

02:54
Now, what would cause a hardened military officer and someone like myself to have this response to robots?

03:03
Well, of course, we're primed by science fiction and pop culture to really want to personify these things, but it goes a little bit deeper than that. It turns out that we're biologically hardwired to project intent and life onto any movement in our physical space that seems autonomous to us.

03:23
So people will treat all sorts of robots like they're alive. These bomb-disposal units get names. They get medals of honor. They've had funerals for them with gun salutes. And research shows that we do this even with very simple household robots, like the Roomba vacuum cleaner.

(Laughter)

03:41
It's just a disc that roams around your floor to clean it, but just the fact it's moving around on its own will cause people to name the Roomba and feel bad for the Roomba when it gets stuck under the couch.

(Laughter)

03:54
And we can design robots specifically to evoke this response, using eyes and faces or movements that people automatically, subconsciously associate with states of mind. And there's an entire body of research called human-robot interaction that really shows how well this works. So for example, researchers at Stanford University found out that it makes people really uncomfortable when you ask them to touch a robot's private parts.

(Laughter)

04:21
So from this, and from many other studies, we know that people respond to the cues given to them by these lifelike machines, even if they know that they're not real.

04:33
Now, we're headed towards a world where robots are everywhere. Robotic technology is moving out from behind factory walls. It's entering workplaces, households. And as these machines that can sense and make autonomous decisions and learn enter into these shared spaces, I think that maybe the best analogy we have for this is our relationship with animals.

04:57
Thousands of years ago, we started to domesticate animals, and we trained them for work and weaponry and companionship. And throughout history, we've treated some animals like tools or like products, and other animals, we've treated with kindness and we've given them a place in society as our companions. I think it's plausible we might start to integrate robots in similar ways.

05:21
And sure, animals are alive. Robots are not. And I can tell you, from working with roboticists, that we're pretty far away from developing robots that can feel anything.

05:35
But we feel for them, and that matters, because if we're trying to integrate robots into these shared spaces, we need to understand that people will treat them differently than other devices, and that in some cases, for example, the case of a soldier who becomes emotionally attached to the robot that they work with, that can be anything from inefficient to dangerous.

05:58
But in other cases, it can actually be useful to foster this emotional connection to robots. We're already seeing some great use cases, for example, robots working with autistic children to engage them in ways that we haven't seen previously, or robots working with teachers to engage kids in learning with new results.

06:17
And it's not just for kids. Early studies show that robots can help doctors and patients in health care settings.

06:25
This is the PARO baby seal robot. It's used in nursing homes and with dementia patients. It's been around for a while. And I remember, years ago, being at a party and telling someone about this robot, and her response was, "Oh my gosh. That's horrible. I can't believe we're giving people robots instead of human care."

06:50
And this is a really common response, and I think it's absolutely correct, because that would be terrible. But in this case, it's not what this robot replaces. What this robot replaces is animal therapy in contexts where we can't use real animals but we can use robots, because people will consistently treat them more like an animal than a device.

07:15
Acknowledging this emotional connection to robots can also help us anticipate challenges as these devices move into more intimate areas of people's lives. For example, is it OK if your child's teddy bear robot records private conversations? Is it OK if your sex robot has compelling in-app purchases?

(Laughter)

07:35
Because robots plus capitalism equals questions around consumer protection and privacy.

07:42
And those aren't the only reasons that our behavior around these machines could matter. A few years after that first initial experience I had with this baby dinosaur robot, I did a workshop with my friend Hannes Gassert. And we took five of these baby dinosaur robots and we gave them to five teams of people. And we had them name them and play with them and interact with them for about an hour. And then we unveiled a hammer and a hatchet and we told them to torture and kill the robots.

(Laughter)

08:16
And this turned out to be a little more dramatic than we expected it to be, because none of the participants would even so much as strike these baby dinosaur robots, so we had to improvise a little, and at some point, we said, "OK, you can save your team's robot if you destroy another team's robot."

(Laughter)

08:36
And even that didn't work. They couldn't do it. So finally, we said, "We're going to destroy all of the robots unless someone takes a hatchet to one of them." And this guy stood up, and he took the hatchet, and the whole room winced as he brought the hatchet down on the robot's neck, and there was this half-joking, half-serious moment of silence in the room for this fallen robot.

(Laughter)

09:03
So that was a really interesting experience.

09:06
Now, it wasn't a controlled study, obviously, but it did lead to some later research that I did at MIT with Palash Nandy and Cynthia Breazeal, where we had people come into the lab and smash these HEXBUGs that move around in a really lifelike way, like insects. So instead of choosing something cute that people are drawn to, we chose something more basic, and what we found was that high-empathy people would hesitate more to hit the HEXBUGs.

09:33
Now this is just a little study, but it's part of a larger body of research that is starting to indicate that there may be a connection between people's tendencies for empathy and their behavior around robots.

09:45
But my question for the coming era of human-robot interaction is not: "Do we empathize with robots?" It's: "Can robots change people's empathy?"

09:57
Is there reason to, for example, prevent your child from kicking a robotic dog, not just out of respect for property, but because the child might be more likely to kick a real dog? And again, it's not just kids.

10:13
This is the violent video games question, but it's on a completely new level because of this visceral physicality that we respond more intensely to than to images on a screen. When we behave violently towards robots, specifically robots that are designed to mimic life, is that a healthy outlet for violent behavior or is that training our cruelty muscles?

10:39
We don't know ...

10:42
But the answer to this question has the potential to impact human behavior, it has the potential to impact social norms, it has the potential to inspire rules around what we can and can't do with certain robots, similar to our animal cruelty laws.

10:57
Because even if robots can't feel, our behavior towards them might matter for us. And regardless of whether we end up changing our rules, robots might be able to help us come to a new understanding of ourselves.

11:14
Most of what I've learned over the past 10 years has not been about technology at all. It's been about human psychology and empathy and how we relate to others.

11:25
Because when a child is kind to a Roomba, when a soldier tries to save a robot on the battlefield, or when a group of people refuses to harm a robotic baby dinosaur, those robots aren't just motors and gears and algorithms. They're reflections of our own humanity.

11:45
Thank you.

(Applause)
ใ“ใฎใ‚ฆใ‚งใƒ–ใ‚ตใ‚คใƒˆใซใคใ„ใฆ

ใ“ใฎใ‚ตใ‚คใƒˆใงใฏ่‹ฑ่ชžๅญฆ็ฟ’ใซๅฝน็ซ‹ใคYouTubeๅ‹•็”ปใ‚’็ดนไป‹ใ—ใพใ™ใ€‚ไธ–็•Œไธญใฎไธ€ๆต่ฌ›ๅธซใซใ‚ˆใ‚‹่‹ฑ่ชžใƒฌใƒƒใ‚นใƒณใ‚’่ฆ‹ใ‚‹ใ“ใจใŒใงใใพใ™ใ€‚ๅ„ใƒ“ใƒ‡ใ‚ชใฎใƒšใƒผใ‚ธใซ่กจ็คบใ•ใ‚Œใ‚‹่‹ฑ่ชžๅญ—ๅน•ใ‚’ใƒ€ใƒ–ใƒซใ‚ฏใƒชใƒƒใ‚ฏใ™ใ‚‹ใจใ€ใใ“ใ‹ใ‚‰ใƒ“ใƒ‡ใ‚ชใ‚’ๅ†็”Ÿใ™ใ‚‹ใ“ใจใŒใงใใพใ™ใ€‚ๅญ—ๅน•ใฏใƒ“ใƒ‡ใ‚ชใฎๅ†็”ŸใจๅŒๆœŸใ—ใฆใ‚นใ‚ฏใƒญใƒผใƒซใ—ใพใ™ใ€‚ใ”ๆ„่ฆ‹ใƒปใ”่ฆๆœ›ใŒใ”ใ–ใ„ใพใ—ใŸใ‚‰ใ€ใ“ใกใ‚‰ใฎใŠๅ•ใ„ๅˆใ‚ใ›ใƒ•ใ‚ฉใƒผใƒ ใ‚ˆใ‚Šใ”้€ฃ็ตกใใ ใ•ใ„ใ€‚

https://forms.gle/WvT1wiN1qDtmnspy7