Your Right to Mental Privacy in the Age of Brain-Sensing Tech | Nita Farahany | TED
61,286 views ・ 2023-05-30
Translator: Yip Yan Yeung
Reviewer: Yanyan Hong
00:04
Today, we know and track virtually nothing
that’s happening in our own brains.
00:11
But in a future that is coming
much faster than you realize,
00:15
all of that is about to change.
00:18
We're now familiar with sensors
in our smart watches to our rings,
00:22
that track everything from our heartbeats
to our footsteps, breaths,
00:27
body temperature, even our sleep.
00:29
Now, consumer neurotech devices
are being sold worldwide
00:34
to enable us to track
our own brain activity.
00:38
As companies from Meta to Microsoft,
00:41
Snap and even Apple
00:43
begin to embed brain sensors
in our everyday devices
00:47
like our earbuds, headphones, headbands,
watches and even wearable tattoos,
00:53
we're reaching an inflection point
in brain transparency.
00:58
And those are just some
of the company names we're familiar with.
01:01
There are so many more.
01:02
Consumer neurotech devices
are moving from niche products
01:05
with limited applications
01:07
to becoming the way in which we'll learn
about our own brain activity,
01:11
our controller for virtual reality
and augmented reality.
01:14
And one of the primary ways we'll interact
01:17
with all of the rest of our technology.
01:20
Even conservative estimates
of the neurotech industry
01:23
put it at more than
38 billion dollars by 2032.
01:28
This new category of technology
presents unprecedented possibility,
01:33
both good and bad.
01:36
Consider how our physical health
and well-being are increasing
01:40
while neurological disease
and suffering continue to rise.
01:44
55 million people around the world
are struggling with dementia,
01:48
with more than 60 to 70 percent of them
suffering from Alzheimer's disease.
01:53
Nearly a billion people struggle
with mental health
01:57
and drug use disorders.
01:59
Depression affects more than 300 million.
02:02
Consumer neurotech devices
could finally enable us
02:05
to treat our brain health and wellness
02:08
as seriously as we treat
the rest of our physical well-being.
02:13
But making our brains
transparent to others
02:16
also introduces extraordinary risks.
02:20
Which is why, before
it's too late to do so,
02:23
we must change the basic terms
of service for neurotechnology
02:28
in favour of individual rights.
02:31
I say this not just as a law professor
who believes in the power of law,
02:35
nor just a philosopher
trying to flesh out norms,
02:39
but as a mother who's been personally
and profoundly impacted
02:45
by the use of neurotechnology
in my own life.
02:50
On Mother's Day in 2017,
02:53
as my daughter Calista
lay cradled in my arms,
02:58
she took one last beautiful breath.
03:03
After a prolonged hospitalization,
03:06
complications following infections
claimed her life.
03:11
The harrowing trauma that she endured
and we witnessed stretched into weeks.
03:17
And I was left with lasting trauma
03:20
that progressed into
post-traumatic stress disorder.
03:24
Sleep escaped me for years.
03:27
As each time I closed my eyes,
I relived everything,
03:31
from the first moments that I was pushed
out of the emergency room
03:36
to her gut-wrenching cries.
03:39
Ultimately, it was the help
of a talented psychologist,
03:43
using exposure therapy,
03:46
and my use of neurofeedback
03:48
that enabled me to sleep
through the night.
03:51
For others who are suffering
from traumatic memories,
03:55
an innovative new approach
using decoded neurofeedback,
03:58
or DecNef, may offer reprieve.
04:01
This groundbreaking approach
uses machine-learning algorithms
04:05
to identify specific
brain-activity patterns,
04:08
including those associated
with traumatic memories.
04:12
Participants then play a game
04:14
that enables them
to retrain their brain activity
04:17
on positive associations instead.
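In outline, the decoded-neurofeedback loop described above has two parts: a decoder is first trained to recognize a target brain-activity pattern, and then the game rewards the participant whenever that pattern is decoded from their current activity. The sketch below is only an illustration of that loop, not the actual DecNef clinical protocol; the features, labels, decoder, and game score are placeholder assumptions.

```python
# Illustrative sketch of a decoded-neurofeedback (DecNef) style loop.
# All data sources and the feedback channel are hypothetical placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression

# --- Offline step: learn to recognize a target brain-activity pattern ---
# X: feature vectors extracted from recorded brain activity (placeholder here),
# y: 1 where the target pattern was present, 0 otherwise.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 16))                 # placeholder features
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)  # placeholder labels
decoder = LogisticRegression(max_iter=1000).fit(X, y)

# --- Online step: turn the decoder's output into game feedback ---
def feedback_score(current_features: np.ndarray) -> float:
    """Map the decoded probability of the target pattern to a 0-100 game score."""
    p = decoder.predict_proba(current_features.reshape(1, -1))[0, 1]
    return float(100 * p)

# Each new sample of brain activity is scored, and the game rewards the
# participant in proportion to how strongly the target (positive) pattern
# is decoded, nudging activity toward it without explicit recall.
print(f"feedback: {feedback_score(rng.normal(size=16)):.1f}")
```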
04:21
If I had had DecNef
available to me at the time,
04:24
I might have overcome my PTSD more quickly
04:28
without having to relive every sound,
terror and smell in order to do so.
04:35
I'm not the only one.
04:36
Sarah described herself as being
at the end of her life,
04:39
no longer in a life worth living,
04:42
because of her severe
and intractable depression.
04:45
Then, using implanted brain sensors
04:48
that reset her brain activity
like a pacemaker for the brain,
04:53
Sarah reclaimed her will to live.
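The "pacemaker for the brain" idea can be pictured as a simple closed loop: sense activity, decode a symptom biomarker, and deliver a brief stimulation burst only when the biomarker crosses a threshold. The sketch below is purely illustrative; the biomarker, threshold, and stimulate() stub are assumptions, not the implanted device's actual algorithm.

```python
# Minimal sketch of a closed-loop "sense, decode, stimulate" controller.
# The biomarker, threshold, and stimulation stub are illustrative only.
import numpy as np

rng = np.random.default_rng(2)
BIOMARKER_THRESHOLD = 0.7   # assumed value; device- and patient-specific in reality

def decode_biomarker(window: np.ndarray) -> float:
    """Stand-in biomarker: normalized power of the sensed signal window."""
    return float(np.clip(np.mean(window ** 2), 0.0, 1.0))

def stimulate() -> None:
    """Stand-in for commanding a short, pre-programmed stimulation burst."""
    print("stimulation burst delivered")

# Closed loop: sense, decode, and stimulate only when the biomarker is elevated.
for step in range(5):
    sensed = rng.normal(scale=0.9, size=250)   # placeholder neural recording
    if decode_biomarker(sensed) >= BIOMARKER_THRESHOLD:
        stimulate()
```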
04:56
While implanted neurotechnology advances
have been extraordinary,
05:01
it's the everyday brain sensors
05:03
that are embedded
in our ordinary technology
05:06
that I believe will impact
the majority of our lives.
05:09
Like the one third of adults
05:12
and nearly one quarter of children
who are living with epilepsy
05:16
for whom conventional
anti-seizure medications fail.
05:20
Now, researchers from Israel to Spain
have developed brain sensors
05:25
using the power of AI
in pattern recognition
05:28
and consumer electroencephalography
05:31
to enable the detection
of epileptic seizures minutes
05:35
to up to an hour before they occur,
05:37
sending potentially life-saving alerts
to a mobile device.
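In outline, a seizure-alert pipeline of this kind scores features from a sliding window of EEG with a pattern-recognition model and pushes a phone notification when predicted risk crosses a threshold. The sketch below is a hedged approximation, not the published systems: the window features, the random-forest model, the risk threshold, and send_alert() are all placeholders.

```python
# Illustrative sketch of a consumer-EEG seizure-alert pipeline:
# score each sliding window of EEG and alert when predicted risk is high.
# All data, the model choice, and send_alert() are placeholder assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)

def window_features(window: np.ndarray) -> np.ndarray:
    """Crude per-channel features for one EEG window: mean power and line length."""
    power = np.mean(window ** 2, axis=1)
    line_length = np.sum(np.abs(np.diff(window, axis=1)), axis=1)
    return np.concatenate([power, line_length])

# Offline: fit the model on labelled historical windows (placeholder data).
train_windows = rng.normal(size=(300, 4, 256))  # 300 windows, 4 channels, 256 samples
train_labels = rng.integers(0, 2, size=300)     # 1 = pre-seizure, 0 = normal
model = RandomForestClassifier(n_estimators=50, random_state=0)
model.fit(np.array([window_features(w) for w in train_windows]), train_labels)

def send_alert(risk: float) -> None:
    """Stand-in for pushing a notification to the wearer's phone."""
    print(f"ALERT: elevated seizure risk ({risk:.0%})")

# Online: score each incoming window and alert when risk crosses the threshold.
RISK_THRESHOLD = 0.8
incoming = rng.normal(size=(4, 256))            # placeholder live EEG window
risk = model.predict_proba(window_features(incoming).reshape(1, -1))[0, 1]
if risk >= RISK_THRESHOLD:
    send_alert(risk)
```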
05:42
Regular use of brain sensors
could even enable us
05:44
to detect the earliest stages
05:46
of the most aggressive forms
of brain tumors, like glioblastoma,
05:51
where early detection
is crucial to saving lives.
05:54
The same could hold true
for Parkinson's disease, to Alzheimer's,
05:58
traumatic brain injury, ADHD,
and even depression.
06:03
We may even change our brains
for the better.
06:06
The brain training game industry,
06:08
worth a staggering
6.5 billion dollars in 2021,
06:13
was for years met with controversy
06:15
because of unsupported scientific
claims about their efficacy.
06:19
But now some brain-training platforms
like Cognizant have proven powerful
06:24
in improving brain processing speeds,
memory, reasoning
06:28
and even executive functioning
when played repeatedly over time.
06:33
When paired with neurofeedback devices
for learning reinforcement,
06:38
this could revolutionize
how we learn and adapt to change.
06:42
Other breakthroughs
could be transformational
06:45
for the human experience.
06:47
Today, most human brain studies
06:48
are based on a very small
number of participants
06:51
engaged in very specific tasks
in a controlled laboratory environment.
06:56
With widespread use of brain sensors,
06:58
the data we could have to learn
about the human brain
07:01
would exponentially increase.
07:03
With sufficiently large datasets
of long-term, real-world data
07:07
from people engaged in everyday activity,
07:10
we just might address everything
from neurological disease and suffering
07:14
to creating transformational possibilities
for the human experience.
07:19
But all of this will only be possible
07:22
if people can confidently
share their brain data
07:25
without fear that it will be
misused against them.
07:29
You see, the brain data
that will be collected
07:32
and generated by these devices
07:33
won't be collected in traditional
laboratory environments
07:37
or in clinical research studies
run by physicians and scientists.
07:42
Instead, it will be the sellers
of these new devices,
07:47
the very companies who've been
commodifying our personal data for years.
07:52
Which is why we can't go into this new era
naive about the risks
07:56
or complacent about the challenges
07:58
that the collection and sharing
of our brain data will pose.
08:02
Scientific hurdles can
and will be addressed in time,
08:05
but the social hurdles
will be the most challenging.
08:10
Unlike the technologies of the past
that track and hack the human brain,
08:14
brain sensors provide direct access
to the part of ourselves
08:18
that we hold back,
08:19
that we don't express
through our words and our actions.
08:22
Brain data in many instances
will be more sensitive
08:25
than the personal data of the past,
08:27
because it reflects our feelings,
our mental states, our emotions,
08:32
our preferences, our desires,
even our very thoughts.
08:36
I would never have wanted
the data that was collected
08:40
as I worked through the trauma
of my personal loss
08:44
to have been commodified,
08:45
shared and analyzed by others.
08:48
These aren't just hypothetical risks.
08:51
Take Entertek, a Hangzhou-based company,
08:53
who has collected millions of instances
of brain activity data
08:57
as people have engaged
in mind-controlled car racing,
09:01
sleeping, working,
09:02
even using neurofeedback
with their devices.
09:06
They've already entered
into partnerships with other companies
09:09
to share and analyze that data.
09:12
Unless people have individual control
over their brain data,
09:16
it will be used
for microtargeting or worse,
09:19
instead of treating dementia.
09:21
Like the employees worldwide
09:24
who've already been subject
to brain surveillance in the workplace
09:28
to track their attention and fatigue,
09:30
to governments,
developing brain biometrics,
09:33
to authenticate people at borders,
09:35
to interrogate criminal suspects' brains
09:40
and even weapons that are being crafted
09:43
to disable and disorient the human brain.
09:46
Brain wearables will have not only read
but write capabilities,
09:50
creating risks that our brains
can be hacked, manipulated,
09:53
and even subject to targeted attacks.
09:56
We must act quickly to safeguard
against the very real
10:01
and terrifying risks
to our innermost selves.
10:04
Recognizing a human right
to cognitive liberty
10:08
would offer those safeguards.
10:10
Cognitive liberty is a right
from interference by others,
10:14
but it is also a right
to self-determination
10:17
over our brains and mental experiences
to enable human flourishing.
10:21
To achieve this,
10:23
we need to recognize
three interrelated human rights
10:27
and update our understanding of them
10:29
to secure to us a right to mental privacy,
10:32
to safeguard us from interference
with our automatic reactions,
10:35
our emotions and our thoughts.
10:38
Freedom of thought
as an absolute human right
10:40
to protect us from interception,
10:43
manipulation and punishment
of our thoughts.
10:46
And self-determination
10:47
to secure self-ownership over our brains
and mental experiences,
10:52
to access and change them
if we want to do so.
10:56
There are important efforts
already underway
10:58
from the UN to UNESCO,
in nations worldwide,
11:01
over rights and regulations
around neurotechnologies.
11:05
But those rights need to be better aligned
11:07
with a broader set of digital rights.
11:11
Cognitive liberty is an update
to liberty in the digital age
11:16
as an umbrella concept
of human flourishing
11:19
across digital technologies.
11:21
Because the right way forward
isn't through metaverse rights
11:24
or AI rights
11:25
or neurotech rights and the like.
11:27
It's to recognize that these technologies
don't exist in silos,
11:32
but in combination, affecting
our brains and mental experiences.
11:38
We are literally at a moment before.
11:41
And I mean a moment.
11:44
Consumer brain wearables
have already arrived,
11:46
and the commodification
of our brains has already begun.
11:50
It's now just a question of scale.
11:54
We haven't yet passed the inflection point
11:57
where most of our brains can be directly
accessed and changed by others.
12:01
But it is about to happen,
12:04
giving us a final moment to make a change
12:07
so that we don't look back
in a few years' time
12:09
and lament the world we've left behind.
12:13
We can and should be hopeful
and deliberate
12:17
about the choices we make now
12:19
to secure a right to self-determination
over our brains and mental experiences.
12:25
The possibilities, if we do so,
12:28
are limited only by our imagination.
12:31
Thank you.
12:32
(Applause)