How I'm using biological data to tell better stories -- and spark social change | Heidi Boisvert

52,810 views ・ 2020-01-02

TED


Translator: Jingdan Niu / Reviewer: Wanting Zhong
00:13
For the past 15 years I've been trying to change your mind. In my work I harness pop culture and emerging technology to shift cultural norms. I've made video games to promote human rights, I've made animations to raise awareness about unfair immigration laws, and I've even made location-based augmented reality apps to change perceptions around homelessness, well before Pokémon Go.

(Laughter)
00:42
But then I began to wonder whether a game or an app can really change attitudes and behaviors, and if so, can I measure that change? What's the science behind that process? So I shifted my focus from making media and technology to measuring their neurobiological effects.
01:03
Here's what I discovered. The web, mobile devices, virtual and augmented reality were rescripting our nervous systems. And they were literally changing the structure of our brain. The very technologies I had been using to positively influence hearts and minds were actually eroding functions in the brain necessary for empathy and decision-making. In fact, our dependence upon the web and mobile devices might be taking over our cognitive and affective faculties, rendering us socially and emotionally incompetent, and I felt complicit in this dehumanization.

01:43
I realized that before I could continue making media about social issues, I needed to reverse engineer the harmful effects of technology.
01:52
To tackle this, I asked myself: "How can I translate the mechanisms of empathy, the cognitive, affective and motivational aspects, into an engine that simulates the narrative ingredients that move us to act?" To answer this, I had to build a machine.

(Laughter)
02:14
I've been developing an open-source biometric lab, an AI system which I call the Limbic Lab. The lab not only captures the brain and body's unconscious response to media and technology but also uses machine learning to adapt content based on these biological responses.
02:32
My goal is to find out what combination of narrative ingredients are the most appealing and galvanizing to specific target audiences, to enable social justice, cultural and educational organizations to create more effective media.

02:47
The Limbic Lab consists of two components: a narrative engine and a media machine.
02:54
While a subject is viewing or interacting with media content, the narrative engine takes in and syncs real-time data from brain waves; biophysical data like heart rate, blood flow, body temperature and muscle contraction; as well as eye-tracking and facial expressions.

03:12
Data is captured at key places where critical plot points, character interaction or unusual camera angles occur, like the final scene of the Red Wedding in "Game of Thrones," when, shockingly, everybody dies.

(Laughter)
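The talk stays at the conceptual level, but the data path just described (several biometric streams synced onto one media timeline, with samples flagged at key story moments) can be sketched in a few lines of Python. This is a minimal, hypothetical illustration; the stream names, sampling step and plot-point annotations are assumptions, not details of the actual Limbic Lab.

```python
# Hypothetical sketch of the "narrative engine" data path: several biometric
# streams are resampled onto the media timeline, and timeline bins that fall
# inside annotated plot points are flagged for analysis. All names and
# numbers are illustrative assumptions.

from dataclasses import dataclass
from bisect import bisect_right


@dataclass
class Sample:
    t: float      # seconds into the media clip
    value: float  # sensor reading (e.g., heart rate in bpm)


def resample(stream: list[Sample], step: float, duration: float) -> list[float]:
    """Hold the most recent reading at each step of the media timeline."""
    times = [s.t for s in stream]
    out, t = [], 0.0
    while t <= duration:
        i = bisect_right(times, t) - 1
        out.append(stream[i].value if i >= 0 else float("nan"))
        t += step
    return out


def sync_streams(streams: dict[str, list[Sample]], step: float, duration: float):
    """Align every stream to one shared timeline, as the narrative engine must."""
    return {name: resample(data, step, duration) for name, data in streams.items()}


def flag_plot_points(step: float, n: int, plot_points: list[tuple[float, float]]):
    """Mark timeline bins overlapping critical plot points (e.g., the Red Wedding)."""
    return [any(start <= i * step < end for start, end in plot_points) for i in range(n)]


if __name__ == "__main__":
    duration, step = 10.0, 1.0
    streams = {
        "heart_rate": [Sample(0.0, 72), Sample(4.0, 88), Sample(8.0, 95)],
        "skin_temp": [Sample(0.0, 33.1), Sample(6.0, 33.4)],
    }
    aligned = sync_streams(streams, step, duration)
    flags = flag_plot_points(step, len(aligned["heart_rate"]), [(4.0, 7.0)])
    for name, series in aligned.items():
        print(name, [round(v, 1) for v in series])
    print("key-scene bins:", flags)
```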
03:29
Survey data on that person's political beliefs, along with their psychographic and demographic data, are integrated into the system to gain a deeper understanding of the individual.

03:40
Let me give you an example. Matching people's TV preferences with their views on social justice issues reveals that Americans who rank immigration among their top three concerns are more likely to be fans of "The Walking Dead," and they often watch for the adrenaline boost, which is measurable.
04:01
A person's biological signature and their survey responses combine in a database to create their unique media imprint. Then our predictive model finds patterns between media imprints and tells me which narrative ingredients are more likely to lead to engagement in altruistic behavior rather than distress and apathy.

04:23
The more imprints added to the database, across mediums from episodic television to games, the better the predictive models become.
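As described, each viewer's biometric signature and survey answers are fused into a "media imprint," and a predictive step links imprints and narrative ingredients to altruistic engagement rather than distress or apathy. The sketch below is a hypothetical Python illustration of that idea; the field names, toy data and the simple nearest-neighbor scoring are assumptions, not the Limbic Lab's actual model.

```python
# Hypothetical sketch of a "media imprint": biometric summaries and survey
# answers fused into one record per person, plus a tiny predictive step that
# ranks narrative ingredients by how often similar imprints responded with
# altruistic engagement. All names and numbers are illustrative assumptions.

from dataclasses import dataclass, field
from statistics import mean


@dataclass
class MediaImprint:
    person_id: str
    biometrics: dict[str, float]   # e.g., mean arousal while viewing
    survey: dict[str, float]       # e.g., concern about immigration, 0..1
    # narrative ingredient -> 1.0 if it led to altruistic action, 0.0 otherwise
    ingredient_outcomes: dict[str, float] = field(default_factory=dict)


def similarity(a: MediaImprint, b: MediaImprint) -> float:
    """Crude similarity: negative distance over shared biometric + survey keys."""
    keys = (a.biometrics.keys() & b.biometrics.keys()) | (a.survey.keys() & b.survey.keys())
    def val(imp, k): return imp.biometrics.get(k, imp.survey.get(k, 0.0))
    return -sum(abs(val(a, k) - val(b, k)) for k in keys)


def rank_ingredients(target: MediaImprint, database: list[MediaImprint], k: int = 3):
    """Average outcomes among the k most similar imprints, per narrative ingredient."""
    neighbors = sorted(database, key=lambda imp: similarity(target, imp), reverse=True)[:k]
    ingredients = {ing for imp in neighbors for ing in imp.ingredient_outcomes}
    scores = {
        ing: mean(imp.ingredient_outcomes[ing] for imp in neighbors if ing in imp.ingredient_outcomes)
        for ing in ingredients
    }
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)


if __name__ == "__main__":
    db = [
        MediaImprint("p1", {"arousal": 0.8}, {"immigration_concern": 1.0},
                     {"protagonist_sacrifice": 1.0, "graphic_threat": 0.0}),
        MediaImprint("p2", {"arousal": 0.7}, {"immigration_concern": 0.9},
                     {"protagonist_sacrifice": 1.0, "humor_beat": 1.0}),
        MediaImprint("p3", {"arousal": 0.2}, {"immigration_concern": 0.1},
                     {"graphic_threat": 1.0}),
    ]
    new_viewer = MediaImprint("p4", {"arousal": 0.75}, {"immigration_concern": 0.95})
    print(rank_ingredients(new_viewer, db, k=2))
```

The nearest-neighbor grouping here is only a stand-in for whatever predictive model the lab uses; the point it illustrates is that more imprints in the database give the model more patterns to draw on.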
04:32
In short, I am mapping the first media genome.

(Applause and cheers)

04:44
Whereas the human genome identifies all genes involved in sequencing human DNA, the growing database of media imprints will eventually allow me to determine the media DNA for a specific person.
04:58
Already the Limbic Lab's narrative engine helps content creators refine their storytelling, so that it resonates with their target audiences on an individual level.
05:11
The Limbic Lab's other component, the media machine, will assess how media elicits an emotional and physiological response, then pull scenes from a content library targeted to person-specific media DNA.

05:26
Applying artificial intelligence to biometric data creates a truly personalized experience, one that adapts content based on real-time unconscious responses.
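The media machine, as described, matches scenes from a content library to a person-specific media DNA and keeps adapting as real-time unconscious responses arrive. The loop below is a hypothetical Python sketch of that feedback idea; the scene tags, the engagement threshold and the read_biometrics stub are assumptions standing in for the real sensors and models.

```python
# Hypothetical sketch of the "media machine" feedback loop: score candidate
# scenes against a person-specific media-DNA profile, play the best match,
# and swap in a different scene when real-time responses suggest
# disengagement. Tags, thresholds and the biometric stub are assumptions.

from dataclasses import dataclass
import random


@dataclass
class Scene:
    title: str
    tags: dict[str, float]   # narrative ingredients, weighted 0..1


def match_score(scene: Scene, media_dna: dict[str, float]) -> float:
    """Dot product between a scene's ingredient weights and the viewer's media DNA."""
    return sum(weight * media_dna.get(tag, 0.0) for tag, weight in scene.tags.items())


def read_biometrics() -> float:
    """Stub for the real-time unconscious response; returns an engagement score 0..1."""
    return random.random()


def play_adaptively(library: list[Scene], media_dna: dict[str, float],
                    steps: int = 5, threshold: float = 0.4) -> None:
    """Pick the best-matching scene, then re-select whenever engagement drops."""
    ranked = sorted(library, key=lambda s: match_score(s, media_dna), reverse=True)
    current = 0
    for step in range(steps):
        scene = ranked[current % len(ranked)]
        engagement = read_biometrics()
        print(f"t={step}: showing '{scene.title}' (engagement={engagement:.2f})")
        if engagement < threshold:   # unconscious response says: adapt the content
            current += 1


if __name__ == "__main__":
    library = [
        Scene("community rebuilds shelter", {"protagonist_sacrifice": 0.9, "hope": 0.8}),
        Scene("zombie chase", {"graphic_threat": 1.0, "adrenaline": 0.9}),
        Scene("quiet reunion", {"hope": 0.6, "humor_beat": 0.4}),
    ]
    media_dna = {"adrenaline": 0.9, "hope": 0.3}   # e.g., a "Walking Dead" fan profile
    random.seed(7)
    play_adaptively(library, media_dna)
```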
05:38
Imagine if nonprofits and media makers were able to measure how audiences feel as they experience it and alter content on the fly. I believe this is the future of media.
05:53
To date, most media and social-change strategies have attempted to appeal to mass audiences, but the future is media customized for each person.
06:03
As real-time measurement of media consumption and automated media production become the norm, we will soon be consuming media tailored directly to our cravings using a blend of psychographics, biometrics and AI. It's like personalized medicine based on our DNA. I call it "biomedia."
06:25
I am currently testing the Limbic Lab in a pilot study with the Norman Lear Center, which looks at the top 50 episodic television shows.
06:34
But I am grappling with an ethical dilemma. If I design a tool that can be turned into a weapon, should I build it? By open-sourcing the lab to encourage access and inclusivity, I also run the risk of enabling powerful governments and profit-driven companies to appropriate the platform for fake news, marketing or other forms of mass persuasion. For me, therefore, it is critical to make my research as transparent to lay audiences as GMO labels.
07:07
However, this is not enough. As creative technologists, we have a responsibility not only to reflect upon how present technology shapes our cultural values and social behavior, but also to actively challenge the trajectory of future technology. It is my hope that we make an ethical commitment to harvesting the body's intelligence for the creation of authentic and just stories that transform media and technology from harmful weapons into narrative medicine.

07:42
Thank you.

(Applause and cheers)