What's it like to be a robot? | Leila Takayama

TED · 2018-02-16
00:12
You only get one chance to make a first impression, and that's true if you're a robot as well as if you're a person. The first time that I met one of these robots was at a place called Willow Garage in 2008. When I went to visit there, my host walked me into the building and we met this little guy. He was rolling into the hallway, came up to me, sat there, stared blankly past me, did nothing for a while, rapidly spun his head around 180 degrees and then ran away. And that was not a great first impression. The thing that I learned about robots that day is that they kind of do their own thing, and they're not totally aware of us.
00:49
And I think as we're experimenting with these possible robot futures, we actually end up learning a lot more about ourselves, as opposed to just these machines. And what I learned that day was that I had pretty high expectations for this little dude.
01:01
He was not only supposed to be able to navigate the physical world, but also be able to navigate my social world -- he's in my space; it's a personal robot. Why didn't it understand me?
01:11
My host explained to me, "Well, the robot is trying to get from point A to point B, and you were an obstacle in his way, so he had to replan his path, figure out where to go, and then get there some other way," which was actually not a very efficient thing to do. If that robot had figured out that I was a person, not a chair, and that I was willing to get out of its way if it was trying to get somewhere, then it actually would have been more efficient at getting its job done if it had bothered to notice that I was a human and that I have different affordances than things like chairs and walls do.
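To make that inefficiency concrete, here is a minimal sketch -- not Willow Garage's actual planner; the grid, the symbols, and the BFS routine are all invented for illustration. It contrasts a planner that treats a detected person as just another wall with one that assumes the person will step out of the way:

```python
from collections import deque

def plan(grid, start, goal, blocked):
    """Shortest path by BFS on a 4-connected grid, avoiding any cell
    whose symbol appears in `blocked`."""
    queue, came_from = deque([start]), {start: None}
    while queue:
        cell = queue.popleft()
        if cell == goal:  # walk the parent links back to the start
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if (0 <= nr < len(grid) and 0 <= nc < len(grid[0])
                    and grid[nr][nc] not in blocked
                    and nxt not in came_from):
                came_from[nxt] = cell
                queue.append(nxt)
    return None  # no route at all

# 'S' start, 'G' goal, 'P' a person standing on the direct route.
hallway = ["S.P.G",
           "....."]
start, goal = (0, 0), (0, 4)

# Planner that treats the person like furniture: it must detour.
detour = plan(hallway, start, goal, blocked="#P")
# Planner that knows people can yield: the direct route stays open.
direct = plan(hallway, start, goal, blocked="#")

print(len(detour) - 1, "steps around the person")     # 6
print(len(direct) - 1, "steps if the person yields")  # 4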
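```

The second planner does better only because it knows something non-geometric about the obstacle: a person, unlike a chair, affords being asked to move.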
01:41
You know, we tend to think of these robots as being from outer space and from the future and from science fiction, and while that could be true, I'd actually like to argue that robots are here today, and they live and work amongst us right now.
01:54
These are two robots that live in my home. They vacuum the floors and they cut the grass every single day, which is more than I would do if I actually had time to do these tasks, and they probably do it better than I would, too. This one actually takes care of my kitty. Every single time he uses the box, it cleans it, which is not something I'm willing to do, and it actually makes his life better as well as mine.
02:16
And while we call these robot products -- it's a "robot vacuum cleaner," it's a "robot lawnmower," it's a "robot litter box" -- I think there's actually a bunch of other robots hiding in plain sight that have just become so darn useful and so darn mundane that we call them things like "dishwasher," right? They get new names. They don't get called robot anymore because they actually serve a purpose in our lives. Similarly, a thermostat, right?
02:39
I know my robotics friends out there are probably cringing at me calling this a robot, but it has a goal. Its goal is to make my house 66 degrees Fahrenheit, and it senses the world. It knows it's a little bit cold, it makes a plan and then it acts on the physical world. It's robotics.
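Her sense-plan-act description maps directly onto the classic control loop. A minimal sketch of a thermostat in that frame, assuming stand-in sensor and actuator functions -- only the 66-degree setpoint comes from the talk; everything else is illustrative:

```python
import random

SETPOINT_F = 66.0  # the target temperature from the talk
DEADBAND_F = 1.0   # illustrative hysteresis so the heater doesn't chatter

def sense():
    """Stand-in for a real temperature sensor."""
    return 64.0 + random.uniform(-2.0, 2.0)

def plan(temp_f, heater_on):
    """Decide whether the heater should run."""
    if temp_f < SETPOINT_F - DEADBAND_F:
        return True    # too cold: heat
    if temp_f > SETPOINT_F + DEADBAND_F:
        return False   # warm enough: rest
    return heater_on   # inside the deadband: keep the current state

def act(heater_on):
    """Stand-in for switching the physical heater's relay."""
    print("heater", "on" if heater_on else "off")

heater_on = False
for _ in range(5):  # a real thermostat would loop forever
    heater_on = plan(sense(), heater_on)
    act(heater_on)
```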
02:55
Even if it might not look like Rosie the Robot, it's doing something that's really useful in my life so I don't have to take care of turning the temperature up and down myself.
03:04
And I think these systems live and work amongst us now, and not only are these systems living amongst us but you are probably a robot operator, too.
03:13
When you drive your car, it feels like you are operating machinery. You are also going from point A to point B, but your car probably has power steering, it probably has automatic braking systems, it might have an automatic transmission and maybe even adaptive cruise control. And while it might not be a fully autonomous car, it has bits of autonomy, and they're so useful and they make us drive safer, and we just sort of feel like they're invisible-in-use, right?
03:39
So when you're driving your car, you should just feel like you're going from one place to another. It doesn't feel like it's this big thing that you have to deal with and operate and use these controls, because we spent so long learning how to drive that they've become extensions of ourselves. When you park that car in that tight little garage space, you know where your corners are. And when you drive a rental car that maybe you haven't driven before, it takes some time to get used to your new robot body.
04:05
And this is also true for people who operate other types of robots, so I'd like to share with you a few stories about that, dealing with the problem of remote collaboration. So, at Willow Garage I had a coworker named Dallas, and Dallas looked like this. He worked from his home in Indiana in our company in California.
04:22
He was a voice in a box on the table in most of our meetings, which was kind of OK except that, you know, if we had a really heated debate and we didn't like what he was saying, we might just hang up on him. (Laughter) Then we might have a meeting after that meeting and actually make the decisions in the hallway afterwards when he wasn't there anymore. So that wasn't so great for him.
04:41
And as a robotics company at Willow, we had some extra robot body parts laying around, so Dallas and his buddy Curt put together this thing, which looks kind of like Skype on a stick on wheels, which seems like a techy, silly toy, but really it's probably one of the most powerful tools that I've seen ever made for remote collaboration.
04:59
So now, if I didn't answer Dallas' email question, he could literally roll into my office, block my doorway and ask me the question again -- (Laughter) until I answered it. And I'm not going to turn him off, right? That's kind of rude.
05:12
Not only was it good for these one-on-one communications, but also for just showing up at the company all-hands meeting. Getting your butt in that chair and showing people that you're present and committed to your project is a big deal and can help remote collaboration a ton. We saw this over the period of months and then years, not only at our company but at others, too.
05:32
The best thing that can happen with these systems is that it starts to feel like you're just there. It's just you, it's just your body, and so people actually start to give these things personal space. So when you're having a stand-up meeting, people will stand around the space just as they would if you were there in person. That's great until there's breakdowns and it's not.
05:50
People, when they first see these robots, are like, "Wow, where's the components? There must be a camera over there," and they start poking your face. "You're talking too softly, I'm going to turn up your volume," which is like having a coworker walk up to you and say, "You're speaking too softly, I'm going to turn up your face." That's awkward and not OK, and so we end up having to build these new social norms around using these systems.
06:12
Similarly, as you start feeling like it's your body, you start noticing things like, "Oh, my robot is kind of short." Dallas would say things to me -- he was six foot tall -- and we would take him via robot to cocktail parties and things like that, as you do, and the robot was about five foot tall, which is close to my height. And he would tell me, "You know, people are not really looking at me. I feel like I'm just looking at this sea of shoulders, and it's just -- we need a taller robot." And I told him, "Um, no. You get to walk in my shoes for today. You get to see what it's like to be on the shorter end of the spectrum."
06:47
And he actually ended up building a lot of empathy for that experience, which was kind of great. So when he'd come visit in person, he no longer stood over me as he was talking to me, he would sit down and talk to me eye to eye, which was kind of a beautiful thing.
06:59
So we actually decided to look at this in the laboratory and see what other kinds of differences things like robot height would make. And so half of the people in our study used a shorter robot, half of the people in our study used a taller robot, and we actually found that the exact same person, who has the exact same body and says the exact same things as someone, is more persuasive and perceived as being more credible if they're in a taller robot form. It makes no rational sense, but that's why we study psychology.
07:25
And really, you know, the way that Cliff Nass would put this is that we're having to deal with these new technologies despite the fact that we have very old brains. Human psychology is not changing at the same speed that tech is, and so we're always playing catch-up, trying to make sense of this world where these autonomous things are running around. Usually, things that talk are people, not machines, right? And so we breathe a lot of meaning into things like just the height of a machine, not a person, and attribute that to the person using the system.
07:55
You know, this, I think, is really important when you're thinking about robotics. It's not so much about reinventing humans, it's more about figuring out how we extend ourselves, right? And we end up using things in ways that are sort of surprising.
08:07
So these guys can't play pool because the robots don't have arms, but they can heckle the guys who are playing pool, and that can be an important thing for team bonding, which is kind of neat. People who get really good at operating these systems will even do things like make up new games, like robot soccer in the middle of the night, pushing the trash cans around.
08:26
But not everyone's good. A lot of people have trouble operating these systems. This is actually a guy who logged into the robot and his eyeball was turned 90 degrees to the left. He didn't know that, so he ended up just bashing around the office, running into people's desks, getting super embarrassed, laughing about it -- his volume was way too high. And this guy here in the image is telling me, "We need a robot mute button."
08:48
And by that what he really meant was we don't want it to be so disruptive. So as a robotics company, we added some obstacle avoidance to the system. It got a little laser range finder that could see the obstacles, and if I as a robot operator try to, say, run into a chair, it wouldn't let me, it would just plan a path around, which seems like a good idea.
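The talk describes the full behavior (replanning a path around the obstacle); the simplest form of the same safety idea is a guard between the operator's command and the motors. A hypothetical sketch, with the threshold and function name invented for illustration:

```python
STOP_DISTANCE_M = 0.5  # illustrative threshold, not the real system's value

def filter_command(forward_speed, range_ahead_m):
    """Pass the operator's drive command through, unless it aims the
    robot at a nearby obstacle, in which case veto it.

    `range_ahead_m` is the closest laser-range-finder return in the
    direction of travel.
    """
    if forward_speed > 0 and range_ahead_m < STOP_DISTANCE_M:
        return 0.0  # the system overrides the operator
    return forward_speed

print(filter_command(0.4, 0.3))  # 0.0 -- chair ahead, command vetoed
print(filter_command(0.4, 2.0))  # 0.4 -- clear, command passes through
```

It is exactly this kind of veto that some operators end up fighting, as the next paragraphs describe.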
09:06
People did hit fewer obstacles using that system, obviously, but actually, for some of the people, it took them a lot longer to get through our obstacle course, and we wanted to know why.
09:17
It turns out that there's this important human dimension -- a personality dimension called locus of control -- and people who have a strong internal locus of control, they need to be the masters of their own destiny -- really don't like giving up control to an autonomous system -- so much so that they will fight the autonomy: "If I want to hit that chair, I'm going to hit that chair." And so they would actually suffer from having that autonomous assistance, which is an important thing for us to know as we're building increasingly autonomous, say, cars, right? How are different people going to grapple with that loss of control?
09:50
It's going to be different depending on human dimensions. We can't treat humans as if we're just one monolithic thing. We vary by personality, by culture, we even vary by emotional state moment to moment, and being able to design these systems, these human-robot interaction systems, we need to take into account the human dimensions, not just the technological ones.
10:11
Along with a sense of control also comes a sense of responsibility. And if you were a robot operator using one of these systems, this is what the interface would look like. It looks a little bit like a video game, which can be good because that's very familiar to people, but it can also be bad because it makes people feel like it's a video game.
10:29
We had a bunch of kids over at Stanford play with the system and drive the robot around our office in Menlo Park, and the kids started saying things like, "10 points if you hit that guy over there. 20 points for that one." And they would chase them down the hallway. (Laughter) I told them, "Um, those are real people. They're actually going to bleed and feel pain if you hit them." And they'd be like, "OK, got it." But five minutes later, they would be like, "20 points for that guy over there, he just looks like he needs to get hit." It's a little bit like "Ender's Game," right?
10:58
There is a real world on that other side, and I think it's our responsibility as people designing these interfaces to help people remember that there's real consequences to their actions, and to feel a sense of responsibility when they're operating these increasingly autonomous things.
11:13
These are kind of a great example of experimenting with one possible robotic future, and I think it's pretty cool that we can extend ourselves, and learn about the ways that we extend ourselves into these machines, while at the same time being able to express our humanity and our personality. We also build empathy for others in terms of being shorter, taller, faster, slower, and maybe even armless, which is kind of neat.
11:38
We also build empathy for the robots themselves. This is one of my favorite robots. It's called the Tweenbot. And this guy has a little flag that says, "I'm trying to get to this intersection in Manhattan," and it's cute and rolls forward, that's it. It doesn't know how to build a map, it doesn't know how to see the world, it just asks for help. The nice thing about people is that it can actually depend upon the kindness of strangers. It did make it across the park to the other side of Manhattan -- which is pretty great -- just because people would pick it up and point it in the right direction. (Laughter)
12:10
And that's great, right? We're trying to build this human-robot world in which we can coexist and collaborate with one another, and we don't need to be fully autonomous and just do things on our own. We actually do things together.
12:22
And to make that happen, we actually need help from people like the artists and the designers, the policy makers, the legal scholars, psychologists, sociologists, anthropologists -- we need more perspectives in the room if we're going to do the thing that Stu Card says we should do, which is invent the future that we actually want to live in.
12:40
And I think we can continue to experiment with these different robotic futures together, and in doing so, we will end up learning a lot more about ourselves.
12:50
Thank you.

12:51
(Applause)