When technology can read minds, how will we protect our privacy? | Nita Farahany

167,640 views ・ 2018-12-18

TED


Translator: Yizhuo He  Reviewer: Jiahao Zhang
00:13
In the months following the 2009 presidential election in Iran, protests erupted across the country. The Iranian government violently suppressed what came to be known as the Iranian Green Movement, even blocking mobile signals to cut off communication between the protesters.

00:34
My parents, who emigrated to the United States in the late 1960s, spend substantial time there, where all of my large, extended family live. When I would call my family in Tehran during some of the most violent crackdowns of the protest, none of them dared discuss with me what was happening. They or I knew to quickly steer the conversation to other topics. All of us understood what the consequences could be of a perceived dissident action. But I still wish I could have known what they were thinking or what they were feeling.

01:12
What if I could have? Or more frighteningly, what if the Iranian government could have? Would they have arrested them based on what their brains revealed? That day may be closer than you think.
01:26
With our growing capabilities in neuroscience, artificial intelligence and machine learning, we may soon know a lot more of what's happening in the human brain.

01:37
As a bioethicist, a lawyer, a philosopher and an Iranian-American, I'm deeply concerned about what this means for our freedoms and what kinds of protections we need. I believe we need a right to cognitive liberty, as a human right that needs to be protected. If not, our freedom of thought, access and control over our own brains and our mental privacy will be threatened.
02:05
Consider this: the average person thinks thousands of thoughts each day. As a thought takes form, like a math calculation or a number, a word, neurons are interacting in the brain, creating a minuscule electrical discharge. When you have a dominant mental state, like relaxation, hundreds and thousands of neurons are firing in the brain, creating concurrent electrical discharges in characteristic patterns that can be measured with electroencephalography, or EEG.
02:41
In fact, that's what you're seeing right now. You're seeing my brain activity that was recorded in real time with a simple device that was worn on my head. What you're seeing is my brain activity when I was relaxed and curious.
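To make the "characteristic patterns" concrete: relaxation is commonly associated with elevated alpha-band (8–12 Hz) power in EEG. The following is a purely illustrative sketch, not the speaker's setup — the signals are synthetic and the sampling rate is an assumption typical of consumer headsets:

```python
import numpy as np

def relative_band_power(signal, fs, lo=8.0, hi=12.0):
    """Fraction of total spectral power falling in the [lo, hi] Hz band."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    power = np.abs(np.fft.rfft(signal)) ** 2
    band = (freqs >= lo) & (freqs <= hi)
    return power[band].sum() / power[1:].sum()  # skip the DC term

fs = 256                        # assumed sampling rate (Hz)
t = np.arange(0, 4, 1.0 / fs)   # 4 seconds of data
rng = np.random.default_rng(0)

# Synthetic "relaxed" trace: strong 10 Hz alpha rhythm plus noise.
relaxed = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(len(t))
# Synthetic "alert" trace: broadband noise, no dominant alpha rhythm.
alert = rng.standard_normal(len(t))

print(relative_band_power(relaxed, fs) > relative_band_power(alert, fs))  # True
```

A single band-power ratio like this is roughly what early consumer devices exposed as a "relaxation" score; real pipelines filter artifacts and use many channels.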
02:58
To share this information with you, I wore one of the early consumer-based EEG devices like this one, which recorded the electrical activity in my brain in real time. It's not unlike the fitness trackers that some of you may be wearing to measure your heart rate or the steps that you've taken, or even your sleep activity.

03:19
It's hardly the most sophisticated neuroimaging technique on the market. But it's already the most portable and the most likely to impact our everyday lives.

03:29
This is extraordinary. Through a simple, wearable device, we can literally see inside the human brain and learn aspects of our mental landscape without ever uttering a word.
03:44
While we can't reliably decode complex thoughts just yet, we can already gauge a person's mood, and with the help of artificial intelligence, we can even decode some single-digit numbers or shapes or simple words that a person is thinking or hearing, or seeing.
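Gauging mood from such recordings is, at bottom, pattern classification. Here is a toy nearest-centroid sketch with made-up band-power features — real decoders train on labeled EEG recordings with far richer features, and these numbers are invented for illustration:

```python
import numpy as np

# Hypothetical training features per EEG window: [alpha power, beta power].
# The values are fabricated; real systems learn them from labeled recordings.
train = {
    "relaxed": np.array([[0.9, 0.2], [0.8, 0.3], [0.85, 0.25]]),
    "stressed": np.array([[0.2, 0.8], [0.3, 0.9], [0.25, 0.7]]),
}
centroids = {label: feats.mean(axis=0) for label, feats in train.items()}

def classify(features):
    """Guess the mood label whose centroid is nearest to the feature vector."""
    return min(centroids, key=lambda lbl: np.linalg.norm(features - centroids[lbl]))

print(classify(np.array([0.75, 0.3])))  # relaxed
```

The same template — extract features, compare against learned prototypes — underlies the digit-, shape-, and word-decoding results the talk mentions, just with vastly more data and more powerful models.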
04:06
Despite some inherent limitations in EEG, I think it's safe to say that with our advances in technology, more and more of what's happening in the human brain can and will be decoded over time.

04:21
Already, using one of these devices, an epileptic can know they're going to have an epileptic seizure before it happens. A paraplegic can type on a computer with their thoughts alone.
04:34
A US-based company has developed a technology to embed these sensors into the headrest of automobiles so they can track driver concentration, distraction and cognitive load while driving. Nissan, insurance companies and AAA have all taken note.

04:51
You could even watch this choose-your-own-adventure movie "The Moment," which, with an EEG headset, changes the movie based on your brain-based reactions, giving you a different ending every time your attention wanes.
05:11
This may all sound great, and as a bioethicist, I am a huge proponent of empowering people to take charge of their own health and well-being by giving them access to information about themselves, including this incredible new brain-decoding technology.

05:29
But I worry. I worry that we will voluntarily or involuntarily give up our last bastion of freedom, our mental privacy. That we will trade our brain activity for rebates or discounts on insurance, or free access to social-media accounts ... or even to keep our jobs.
05:54
In fact, in China, the train drivers on the Beijing-Shanghai high-speed rail, the busiest of its kind in the world, are required to wear EEG devices to monitor their brain activity while driving. According to some news sources, in government-run factories in China, the workers are required to wear EEG sensors to monitor their productivity and their emotional state at work.

06:25
Workers are even sent home if their brains show less-than-stellar concentration on their jobs, or emotional agitation.
06:35
It's not going to happen tomorrow, but we're headed to a world of brain transparency. And I don't think people understand that that could change everything. Everything from our definitions of data privacy to our laws, to our ideas about freedom.

06:50
In fact, in my lab at Duke University, we recently conducted a nationwide study in the United States to see if people appreciated the sensitivity of their brain information.
07:02
We asked people to rate their perceived sensitivity of 33 different kinds of information, from their social security numbers to the content of their phone conversations, their relationship history, their emotions, their anxiety, the mental images in their mind and the thoughts in their mind.

07:21
Shockingly, people rated their social security number as far more sensitive than any other kind of information, including their brain data.
07:32
I think this is because people don't yet understand or believe the implications of this new brain-decoding technology. After all, if we can know the inner workings of the human brain, our social security numbers are the least of our worries.

(Laughter)

07:47
Think about it. In a world of total brain transparency, who would dare have a politically dissident thought? Or a creative one?
07:57
I worry that people will self-censor in fear of being ostracized by society, or that people will lose their jobs because of their waning attention or emotional instability, or because they're contemplating collective action against their employers.

08:14
That coming out will no longer be an option, because people's brains will long ago have revealed their sexual orientation, their political ideology or their religious preferences, well before they were ready to consciously share that information with other people.

08:31
I worry about the ability of our laws to keep up with technological change.
08:36
Take the First Amendment of the US Constitution, which protects freedom of speech. Does it also protect freedom of thought? And if so, does that mean that we're free to alter our thoughts however we want? Or can the government or society tell us what we can do with our own brains?

08:53
Can the NSA spy on our brains using these new mobile devices? Can the companies that collect the brain data through their applications sell this information to third parties? Right now, no laws prevent them from doing so.
09:09
It could be even more problematic in countries that don't share the same freedoms enjoyed by people in the United States.

09:16
What would've happened during the Iranian Green Movement if the government had been monitoring my family's brain activity, and had believed them to be sympathetic to the protesters?

09:30
Is it so far-fetched to imagine a society in which people are arrested based on their thoughts of committing a crime, like in the science-fiction dystopian society in "Minority Report"?
09:42
Already, in the United States, in Indiana, an 18-year-old was charged with attempting to intimidate his school by posting a video of himself shooting people in the hallways ... Except the people were zombies and the video was of him playing an augmented-reality video game, all interpreted to be a mental projection of his subjective intent.

10:10
This is exactly why our brains need special protection. If our brains are just as subject to data tracking and aggregation as our financial records and transactions, if our brains can be hacked and tracked like our online activities, our mobile phones and applications, then we're on the brink of a dangerous threat to our collective humanity.
10:33
Before you panic, I believe that there are solutions to these concerns, but we have to start by focusing on the right things.

10:42
When it comes to privacy protections in general, I think we're fighting a losing battle by trying to restrict the flow of information. Instead, we should be focusing on securing rights and remedies against the misuse of our information.

10:57
If people had the right to decide how their information was shared, and more importantly, had legal redress if their information was misused against them, say to discriminate against them in an employment setting or in health care or education, this would go a long way to build trust.
11:14
In fact, in some instances, we want to be sharing more of our personal information. Studying aggregated information can tell us so much about our health and our well-being, but to be able to safely share our information, we need special protections for mental privacy.

11:33
This is why we need a right to cognitive liberty. This right would secure for us our freedom of thought and rumination, our freedom of self-determination, and it would ensure that we have the right to consent to or refuse access and alteration of our brains by others.
11:51
This right could be recognized as part of the Universal Declaration of Human Rights, which has established mechanisms for the enforcement of these kinds of social rights.

12:03
During the Iranian Green Movement, the protesters used the internet and good old-fashioned word of mouth to coordinate their marches. And some of the most oppressive restrictions in Iran were lifted as a result.

12:20
But what if the Iranian government had used brain surveillance to detect and prevent the protest? Would the world have ever heard the protesters' cries?
12:33
The time has come for us to call for a cognitive liberty revolution. To make sure that we responsibly advance technology that could enable us to embrace the future while fiercely protecting all of us from any person, company or government that attempts to unlawfully access or alter our innermost lives.

12:58
Thank you.

(Applause)