00:12
What happens when technology knows more about us than we do?
00:17
A computer now can detect our slightest facial microexpressions
00:21
and be able to tell the difference between a real smile and a fake one.
00:25
That's only the beginning.
00:27
Technology has become incredibly intelligent
00:30
and already knows a lot about our internal states.
00:34
And whether we like it or not,
00:36
we already are sharing parts of our inner lives
00:39
that's out of our control.
00:43
That seems like a problem,
00:44
because a lot of us like to keep what's going on inside
00:48
from what people actually see.
00:50
We want to have agency over what we share and what we don't.
00:55
We all like to have a poker face.
00:59
But I'm here to tell you that I think that's a thing of the past.
01:03
And while that might sound scary, it's not necessarily a bad thing.
01:09
I've spent a lot of time studying the circuits in the brain
01:11
that create the unique perceptual realities that we each have.
01:16
And now I bring that together
01:17
with the capabilities of current technology
01:19
to create new technology that does make us better,
01:22
feel more, connect more.
01:24
And I believe to do that,
01:26
we have to be OK losing some of our agency.
01:30
With some animals, it's really amazing,
01:32
and we get to see into their internal experiences.
01:36
We get this upfront look at the mechanistic interaction
01:40
between how they respond to the world around them
01:43
and the state of their biological systems.
01:45
This is where evolutionary pressures like eating, mating
01:49
and making sure we don't get eaten
01:50
drive deterministic behavioral responses to information in the world.
01:55
And we get to see into this window,
01:58
into their internal states and their biological experiences.
02:02
It's really pretty cool.
02:03
Now, stay with me for a moment -- I'm a violinist, not a singer.
02:08
But the spider's already given me a critical review.
02:16
(Video) (Singing in a low pitch)
02:19
(Singing in a middle pitch)
02:23
(Singing in a high pitch)
02:27
(Singing in a low pitch)
02:29
(Singing in a middle pitch)
02:31
(Singing in a high pitch)
02:33
(Laughter)
02:36
Poppy Crum: It turns out, some spiders tune their webs like violins
02:39
to resonate with certain sounds.
02:41
And likely, the harmonics of my voice as it went higher
02:44
coupled with how loud I was singing
02:46
recreated either the predatory call of an echolocating bat or a bird,
02:50
and the spider did what it should.
02:53
It predictively told me to bug off.
02:56
I love this.
02:58
The spider's responding to its external world
03:01
in a way that we get to see and know what's happening to its internal world.
03:07
Biology is controlling the spider's response;
03:09
it's wearing its internal state on its sleeve.
03:13
But us, humans --
03:16
we're different.
03:17
We like to think we have cognitive control over what people see, know and understand
03:23
about our internal states --
03:25
our emotions, our insecurities, our bluffs, our trials and tribulations --
03:29
and how we respond.
03:31
We get to have our poker face.
03:35
Or maybe we don't.
03:37
Try this with me.
03:38
Your eye responds to how hard your brain is working.
03:42
The response you're about to see is driven entirely by mental effort
03:45
and has nothing to do with changes in lighting.
03:48
We know this from neuroscience.
03:49
I promise, your eyes are doing the same thing as the subject in our lab,
03:54
whether you want them to or not.
03:56
At first, you'll hear some voices.
03:58
Try and understand them and keep watching the eye in front of you.
04:01
It's going to be hard at first,
04:03
then one voice should drop out, and it should get really easy.
04:05
You're going to see the change in effort in the diameter of the pupil.
04:10
(Video) (Two overlapping voices talking)
04:12
(Single voice) Intelligent technology depends on personal data.
04:15
(Two overlapping voices talking)
04:18
(Single voice) Intelligent technology depends on personal data.
04:21
PC: Your pupil doesn't lie.
04:23
Your eye gives away your poker face.
04:25
When your brain's having to work harder,
04:27
your autonomic nervous system drives your pupil to dilate.
04:30
When it's not, it contracts.
04:32
When I take away one of the voices,
04:34
the cognitive effort to understand the talkers
04:36
gets a lot easier.
04:37
I could have put the two voices in different spatial locations,
04:40
I could have made one louder.
04:42
You would have seen the same thing.
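The task-evoked pupillometry comparison described above can be sketched in a few lines. This is an illustrative Python example, not the lab's actual pipeline; the baseline, diameters, and condition windows are made up.

```python
# Illustrative sketch (hypothetical data): comparing pupil dilation between a
# "two talkers" and a "one talker" listening condition, in millimetres.
import numpy as np

def mean_dilation(pupil_mm: np.ndarray, baseline_mm: float) -> float:
    """Average pupil dilation relative to a resting baseline."""
    return float(np.mean(pupil_mm - baseline_mm))

baseline = 3.1                                      # resting diameter, mm (assumed)
two_talkers = np.array([3.6, 3.7, 3.8, 3.7, 3.9])   # overlapping voices (made up)
one_talker = np.array([3.2, 3.3, 3.2, 3.3, 3.2])    # single voice (made up)

print(f"dilation, two talkers: {mean_dilation(two_talkers, baseline):.2f} mm")
print(f"dilation, one talker:  {mean_dilation(one_talker, baseline):.2f} mm")
# Larger task-evoked dilation is commonly read as greater cognitive effort.
```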
04:45
We might think we have more agency over the reveal of our internal state
04:49
than that spider,
04:51
but maybe we don't.
04:53
Today's technology is starting to make it really easy
04:56
to see the signals and tells that give us away.
04:59
The amalgamation of sensors paired with machine learning
05:02
on us, around us and in our environments,
05:04
is a lot more than cameras and microphones tracking our external actions.
05:12
Our bodies radiate our stories
05:15
from changes in the temperature of our physiology.
05:18
We can look at these as infrared thermal images
05:20
showing up behind me,
05:22
where reds are hotter and blues are cooler.
05:24
The dynamic signature of our thermal response
05:27
gives away our changes in stress,
05:30
how hard our brain is working,
05:32
whether we're paying attention
05:34
and engaged in the conversation we might be having
05:37
and even whether we're experiencing a picture of fire as if it were real.
05:41
We can actually see people give off heat on their cheeks
05:44
in response to an image of flame.
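As a rough sketch of that idea, and not the speaker's actual setup, the following Python tracks the mean temperature of a hypothetical cheek region across thermal-camera frames; the frames, region coordinates, and warming value are invented for illustration.

```python
# Illustrative sketch (assumed data): detect cheek warming between two thermal frames.
import numpy as np

def cheek_temperature(frame: np.ndarray, region: tuple) -> float:
    """Mean temperature (deg C) inside a rectangular cheek region."""
    (r0, r1), (c0, c1) = region
    return float(frame[r0:r1, c0:c1].mean())

rng = np.random.default_rng(0)
baseline_frame = 33.0 + 0.1 * rng.standard_normal((64, 64))  # fake skin-temperature frame
response_frame = baseline_frame.copy()
cheek = ((20, 32), (10, 22))                 # hypothetical cheek region (rows, cols)
response_frame[20:32, 10:22] += 0.4          # small cheek warming, as described for the flame image

delta = cheek_temperature(response_frame, cheek) - cheek_temperature(baseline_frame, cheek)
print(f"cheek warming: {delta:+.2f} deg C")  # a positive shift flags a response
```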
05:48
But aside from giving away our poker bluffs,
05:50
what if dimensions of data from someone's thermal response
05:55
gave away a glow of interpersonal interest?
05:58
Tracking the honesty of feelings in someone's thermal image
06:02
might be a new part of how we fall in love and see attraction.
06:06
Our technology can listen, develop insights and make predictions
06:09
about our mental and physical health
06:12
just by analyzing the timing dynamics of our speech and language
06:16
picked up by microphones.
06:18
Groups have shown that changes in the statistics of our language
06:21
paired with machine learning
06:23
can predict the likelihood someone will develop psychosis.
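A toy version of that kind of pipeline, with invented features, texts, and labels rather than the published studies, might look like this in Python:

```python
# Illustrative sketch (hypothetical features and data): simple language
# statistics fed to a classifier that outputs a risk score.
import numpy as np
from sklearn.linear_model import LogisticRegression

def language_features(text: str) -> list:
    """Toy features: mean sentence length and type-token ratio."""
    sentences = [s for s in text.replace("?", ".").split(".") if s.strip()]
    words = text.lower().split()
    mean_len = len(words) / max(len(sentences), 1)
    ttr = len(set(words)) / max(len(words), 1)
    return [mean_len, ttr]

# Tiny made-up training set: 1 = later developed psychosis, 0 = did not.
texts = [
    "I went to the store. I bought bread. It was good.",
    "The ideas keep moving and the words go places I cannot always follow them.",
    "We met friends for dinner. The food was great. We talked for hours.",
    "Everything connects and nothing connects and the meaning slides away again.",
]
labels = [0, 1, 0, 1]

model = LogisticRegression().fit(np.array([language_features(t) for t in texts]), labels)

new_sample = "The thoughts scatter and the sentences do not land where I aim them."
risk = model.predict_proba([language_features(new_sample)])[0, 1]
print(f"estimated risk score: {risk:.2f}")
```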
06:27
I'm going to take it a step further
06:29
and look at linguistic changes and changes in our voice
06:31
that show up with a lot of different conditions.
06:34
Dementia, diabetes can alter the spectral coloration of our voice.
06:39
Changes in our language associated with Alzheimer's
06:42
can sometimes show up more than 10 years before clinical diagnosis.
06:47
What we say and how we say it tells a much richer story
06:51
than we used to think.
06:53
And devices we already have in our homes could, if we let them,
06:57
give us invaluable insight back.
06:59
The chemical composition of our breath
07:03
gives away our feelings.
07:06
There's a dynamic mixture of acetone, isoprene and carbon dioxide
07:10
that changes when our heart speeds up, when our muscles tense,
07:14
and all without any obvious change in our behaviors.
07:18
Alright, I want you to watch this clip with me.
07:21
Some things might be going on on the side screens,
07:24
but try and focus on the image in the front
07:27
and the man at the window.
07:31
(Eerie music)
07:39
(Woman screams)
07:50
PC: Sorry about that. I needed to get a reaction.
07:53
(Laughter)
07:55
I'm actually tracking the carbon dioxide you exhale in the room right now.
08:01
We've installed tubes throughout the theater,
08:05
lower to the ground, because CO2 is heavier than air.
08:07
But they're connected to a device in the back
08:10
that lets us measure, in real time, with high precision,
08:13
the continuous differential concentration of CO2.
08:17
The clouds on the sides are actually the real-time data visualization
08:22
of the density of our CO2.
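A minimal sketch of how such a differential CO2 signal could drive a visualization, assuming a simple running-baseline scheme rather than the actual instrument and readings:

```python
# Illustrative sketch (assumed data): turn a stream of CO2 readings (ppm) into a
# differential signal against a running baseline, the kind of quantity a
# "growing cloud" visualization could be driven by.
from collections import deque

def differential_co2(readings, window: int = 5):
    """Yield (reading, deviation from the mean of the last `window` readings)."""
    history = deque(maxlen=window)
    for ppm in readings:
        baseline = sum(history) / len(history) if history else ppm
        yield ppm, ppm - baseline
        history.append(ppm)

# Hypothetical audience exhalation: a collective gasp shows up as a CO2 bump.
stream = [450, 452, 451, 453, 470, 495, 480, 460, 455, 452]
for ppm, delta in differential_co2(stream):
    bar = "#" * max(int(delta), 0)          # crude stand-in for the cloud size
    print(f"{ppm:4d} ppm  {delta:+6.1f}  {bar}")
```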
08:25
You might still see a patch of red on the screen,
08:29
because we're showing increases with larger colored clouds,
08:32
larger colored areas of red.
08:35
And that's the point where a lot of us jumped.
08:38
It's our collective suspense driving a change in carbon dioxide.
08:43
Alright, now, watch this with me one more time.
08:46
(Cheerful music)
08:54
(Woman laughs)
09:05
PC: You knew it was coming.
09:06
But it's a lot different when we changed the creator's intent.
09:10
Changing the music and the sound effects
09:13
completely alters the emotional impact of that scene.
09:17
And we can see it in our breath.
09:20
Suspense, fear, joy
09:22
all show up as reproducible, visually identifiable moments.
09:27
We broadcast a chemical signature of our emotions.
09:35
It is the end of the poker face.
09:38
Our spaces, our technology will know what we're feeling.
09:42
We will know more about each other than we ever have.
09:45
We get a chance to reach in and connect to the experience and sentiments
09:50
that are fundamental to us as humans
09:52
in our senses, emotionally and socially.
09:55
I believe it is the era of the empath.
09:58
And we are enabling the capabilities that true technological partners can bring
10:03
to how we connect with each other and with our technology.
10:06
If we recognize the power of becoming technological empaths,
10:09
we get this opportunity
10:11
where technology can help us bridge the emotional and cognitive divide.
10:16
And in that way, we get to change how we tell our stories.
10:19
We can enable a better future for technologies like augmented reality
10:23
to extend our own agency and connect us at a much deeper level.
10:27
Imagine a high school counselor being able to realize
10:30
that an outwardly cheery student really was having a deeply hard time,
10:34
where reaching out can make a crucial, positive difference.
10:37
Or authorities, being able to know the difference
10:41
between someone having a mental health crisis
10:43
and a different type of aggression,
10:45
and responding accordingly.
10:47
Or an artist, knowing the direct impact of their work.
10:52
Leo Tolstoy defined his perspective of art
10:54
by whether what the creator intended
10:56
was experienced by the person on the other end.
10:59
Today's artists can know what we're feeling.
11:02
But regardless of whether it's art or human connection,
11:06
today's technologies will know and can know
11:09
what we're experiencing on the other side,
11:11
and this means we can be closer and more authentic.
11:14
But I realize a lot of us have a really hard time
11:18
with the idea of sharing our data,
11:21
and especially the idea that people know things about us
11:24
that we didn't actively choose to share.
11:28
Anytime we talk to someone,
11:31
look at someone
11:33
or choose not to look,
11:35
data is exchanged, given away,
11:38
that people use to learn,
11:40
make decisions about their lives and about ours.
11:45
I'm not looking to create a world where our inner lives are ripped open
11:49
and our personal data and our privacy given away
11:51
to people and entities where we don't want to see it go.
11:55
But I am looking to create a world
11:57
where we can care about each other more effectively,
12:01
we can know more about when someone is feeling something
12:04
that we ought to pay attention to.
12:06
And we can have richer experiences from our technology.
12:10
Any technology can be used for good or bad.
12:13
Transparency to engagement and effective regulation
12:15
are absolutely critical to building the trust for any of this.
12:20
But the benefits that "empathetic technology" can bring to our lives
12:24
are worth solving the problems that make us uncomfortable.
12:29
And if we don't, there are too many opportunities and feelings
12:33
we're going to be missing out on.
12:35
Thank you.
12:36
(Applause)