How I'm using biological data to tell better stories -- and spark social change | Heidi Boisvert

52,810 views ・ 2020-01-02

TED


Translator: Helen Chang  Reviewer: Bruce Sung
00:13
For the past 15 years I've been trying to change your mind.
00:17
In my work I harness pop culture and emerging technology
00:21
to shift cultural norms.
00:23
I've made video games to promote human rights,
00:27
I've made animations to raise awareness about unfair immigration laws
00:32
and I've even made location-based augmented reality apps
00:36
to change perceptions around homelessness
00:39
well before Pokémon Go.
00:41
(Laughter)
00:42
But then I began to wonder whether a game or an app
00:46
can really change attitudes and behaviors,
00:49
and if so, can I measure that change?
00:52
What's the science behind that process?
00:55
So I shifted my focus from making media and technology
00:59
to measuring their neurobiological effects.
01:03
Here's what I discovered.
01:05
The web, mobile devices, virtual and augmented reality
01:09
were rescripting our nervous systems.
01:11
And they were literally changing the structure of our brain.
01:15
The very technologies I had been using to positively influence hearts and minds
01:20
were actually eroding functions in the brain necessary for empathy
01:25
and decision-making.
01:27
In fact, our dependence upon the web and mobile devices
01:31
might be taking over our cognitive and affective faculties,
01:35
rendering us socially and emotionally incompetent,
01:39
and I felt complicit in this dehumanization.
01:43
I realized that before I could continue making media about social issues,
01:47
I needed to reverse engineer the harmful effects of technology.
01:52
To tackle this I asked myself,
01:55
"How can I translate the mechanisms of empathy,
01:59
the cognitive, affective and motivational aspects,
02:02
into an engine that simulates the narrative ingredients
02:06
that move us to act?"
02:08
To answer this, I had to build a machine.
02:12
(Laughter)
02:14
I've been developing an open-source biometric lab,
02:17
an AI system which I call the Limbic Lab.
02:20
The lab not only captures
02:22
the brain and body's unconscious response to media and technology
02:26
but also uses machine learning to adapt content
02:29
based on these biological responses.
02:32
My goal is to find out what combination of narrative ingredients
02:36
is the most appealing and galvanizing
02:38
to specific target audiences
02:40
to enable social justice, cultural and educational organizations
02:45
to create more effective media.
02:47
The Limbic Lab consists of two components:
02:50
a narrative engine and a media machine.
02:54
While a subject is viewing or interacting with media content,
02:58
the narrative engine takes in and syncs real-time data from brain waves,
03:02
biophysical data like heart rate, blood flow, body temperature
03:06
and muscle contraction,
03:08
as well as eye-tracking and facial expressions.
03:12
Data is captured at key places where critical plot points,
03:15
character interaction or unusual camera angles occur.
03:20
Like the final scene of the "Red Wedding" in "Game of Thrones,"
03:23
when shockingly,
03:25
everybody dies.
03:26
(Laughter)
03:29
Survey data on that person's political beliefs,
03:32
along with their psychographic and demographic data,
03:35
are integrated into the system
03:37
to gain a deeper understanding of the individual.
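
That pipeline can be pictured as a simple data structure. The sketch below is illustrative only, not the Limbic Lab's actual code; every class and field name is a hypothetical stand-in for the signals described above.

from dataclasses import dataclass, field

@dataclass
class BiometricSample:
    t: float            # seconds into the media clip
    eeg: list           # brain-wave channel readings
    heart_rate: float   # beats per minute
    temperature: float  # body temperature
    gaze: tuple         # (x, y) eye-tracking coordinates
    expression: str     # coded facial expression

@dataclass
class PlotPoint:
    t: float            # timestamp of a critical plot point, character interaction or unusual camera angle
    kind: str           # e.g. "plot_twist", "character_interaction", "camera_angle"

@dataclass
class ViewingSession:
    survey: dict = field(default_factory=dict)       # political, psychographic and demographic answers
    samples: list = field(default_factory=list)      # synced real-time biometric data
    plot_points: list = field(default_factory=list)  # key story moments to analyze around

    def responses_near(self, point, window=2.0):
        """Samples recorded within `window` seconds of a key moment."""
        return [s for s in self.samples if abs(s.t - point.t) <= window]
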
03:40
Let me give you an example.
03:43
Matching people's TV preferences with their views on social justice issues
03:48
reveals that Americans who rank immigration among their top three concerns
03:52
are more likely to be fans of "The Walking Dead,"
03:56
and they often watch for the adrenaline boost,
03:59
which is measurable.
04:01
A person's biological signature and their survey response
04:05
combine in a database to create their unique media imprint.
04:10
Then our predictive model finds patterns between media imprints
04:14
and tells me which narrative ingredients
04:16
are more likely to lead to engagement in altruistic behavior
04:20
rather than distress and apathy.
04:23
The more imprints added to the database
04:25
across mediums from episodic television to games,
04:28
the better the predictive models become.
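
A toy version of that pattern-finding step might look like the following. It is a hedged sketch, not the speaker's model: the ingredient names, the synthetic data and the simple correlation are assumptions made purely for illustration.

import numpy as np

rng = np.random.default_rng(0)
ingredients = ["plot_twist", "character_bond", "threat", "humor"]

# One row per media imprint: how strongly each narrative ingredient was present (0-1),
# plus whether that viewer went on to act altruistically (1) or shut down (0).
X = rng.random((200, len(ingredients)))
y = (X[:, 1] + 0.2 * rng.standard_normal(200) > 0.5).astype(float)

# Naive pattern-finding: correlate each ingredient with the altruism outcome.
for name, column in zip(ingredients, X.T):
    r = np.corrcoef(column, y)[0, 1]
    print(f"{name}: correlation with altruistic action = {r:+.2f}")
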
04:32
In short, I am mapping the first media genome.
04:36
(Applause and cheers)
04:44
Whereas the human genome identifies all genes involved
04:47
in sequencing human DNA,
04:49
the growing database of media imprints will eventually allow me
04:53
to determine the media DNA for a specific person.
04:58
Already the Limbic Lab's narrative engine
05:02
helps content creators refine their storytelling,
05:04
so that it resonates with their target audiences on an individual level.
05:11
The Limbic Lab's other component,
05:13
the media machine,
05:15
will assess how media elicits an emotional and physiological response,
05:19
then pulls scenes from a content library
05:22
targeted to person-specific media DNA.
05:26
Applying artificial intelligence to biometric data
05:29
creates a truly personalized experience.
05:32
One that adapts content based on real-time unconscious responses.
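
Reduced to a few lines, that adaptation loop could be sketched like this. It is an illustration under assumptions: the sensor feed, scene library and "media DNA" threshold are invented stand-ins, not the media machine's real interface.

import random

scene_library = {
    "hope":    ["community_garden", "reunion"],
    "tension": ["chase", "standoff"],
}

def read_biometrics():
    # Stand-in for a real-time, unconscious physiological signal.
    return {"arousal": random.random()}

def pick_scene(media_dna, response):
    # Choose the next scene for this person's media DNA, given their current state.
    mood = "tension" if response["arousal"] < media_dna["preferred_arousal"] else "hope"
    return random.choice(scene_library[mood])

viewer_dna = {"preferred_arousal": 0.6}   # learned from this viewer's media imprints
for _ in range(3):
    print("next scene:", pick_scene(viewer_dna, read_biometrics()))
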
05:38
Imagine if nonprofits and media makers were able to measure how audiences feel
05:44
as they experience it
05:46
and alter content on the fly.
05:49
I believe this is the future of media.
05:53
To date, most media and social-change strategies
05:55
have attempted to appeal to mass audiences,
05:58
but the future is media customized for each person.
06:03
As real-time measurement of media consumption
06:06
and automated media production becomes the norm,
06:09
we will soon be consuming media tailored directly to our cravings
06:13
using a blend of psychographics, biometrics and AI.
06:18
It's like personalized medicine based on our DNA.
06:21
I call it "biomedia."
06:25
I am currently testing the Limbic Lab in a pilot study
06:28
with the Norman Lear Center,
06:30
which looks at the top 50 episodic television shows.
06:34
But I am grappling with an ethical dilemma.
06:37
If I design a tool that can be turned into a weapon,
06:41
should I build it?
06:43
By open-sourcing the lab to encourage access and inclusivity,
06:47
I also run the risk of enabling powerful governments
06:50
and profit-driven companies to appropriate the platform
06:53
for fake news, marketing or other forms of mass persuasion.
06:59
For me, therefore, it is critical to make my research
07:02
as transparent to lay audiences as GMO labels.
07:07
However, this is not enough.
07:11
As creative technologists,
07:12
we have a responsibility
07:15
not only to reflect upon how present technology shapes our cultural values
07:19
and social behavior,
07:21
but also to actively challenge the trajectory of future technology.
07:26
It is my hope that we make an ethical commitment
07:31
to harvesting the body's intelligence
07:33
for the creation of authentic and just stories
07:36
that transform media and technology
07:39
from harmful weapons into narrative medicine.
07:42
Thank you.
07:44
(Applause and cheers)