How I'm using biological data to tell better stories -- and spark social change | Heidi Boisvert

52,136 views

2020-01-02 · TED



Translation: Kelly An · Review: Jungmin Hwang
00:13
For the past 15 years I've been trying to change your mind. In my work I harness pop culture and emerging technology to shift cultural norms. I've made video games to promote human rights, I've made animations to raise awareness about unfair immigration laws, and I've even made location-based augmented reality apps to change perceptions around homelessness, well before Pokémon Go.

00:41
(Laughter)

00:42
But then I began to wonder whether a game or an app can really change attitudes and behaviors, and if so, can I measure that change? What's the science behind that process?

00:55
So I shifted my focus from making media and technology to measuring their neurobiological effects. Here's what I discovered.
01:05
The web, mobile devices, virtual and augmented reality were rescripting our nervous systems. And they were literally changing the structure of our brain. The very technologies I had been using to positively influence hearts and minds were actually eroding functions in the brain necessary for empathy and decision-making. In fact, our dependence upon the web and mobile devices might be taking over our cognitive and affective faculties, rendering us socially and emotionally incompetent, and I felt complicit in this dehumanization.

01:43
I realized that before I could continue making media about social issues, I needed to reverse engineer the harmful effects of technology.
01:52
To tackle this I asked myself, "How can I translate the mechanisms of empathy, the cognitive, affective and motivational aspects, into an engine that simulates the narrative ingredients that move us to act?" To answer this, I had to build a machine.

02:12
(Laughter)
02:14
I've been developing an open-source biometric lab, an AI system which I call the Limbic Lab. The lab not only captures the brain and body's unconscious response to media and technology but also uses machine learning to adapt content based on these biological responses.
02:32
My goal is to find out what combination of narrative ingredients is the most appealing and galvanizing to specific target audiences, to enable social justice, cultural and educational organizations to create more effective media.

02:47
The Limbic Lab consists of two components: a narrative engine and a media machine.
02:54
While a subject is viewing or interacting with media content, the narrative engine takes in and syncs real-time data from brain waves; biophysical data like heart rate, blood flow, body temperature and muscle contraction; as well as eye-tracking and facial expressions.
03:12
Data is captured at key places where critical plot points, character interaction or unusual camera angles occur, like the final scene of the "Red Wedding" episode of "Game of Thrones," when, shockingly, everybody dies.

03:26
(Laughter)
03:29
Survey data on that person's political beliefs, along with their psychographic and demographic data, are integrated into the system to gain a deeper understanding of the individual.

03:40
Let me give you an example.
03:43
Matching people's TV preferences with their views on social justice issues reveals that Americans who rank immigration among their top three concerns are more likely to be fans of "The Walking Dead," and they often watch for the adrenaline boost, which is measurable.
04:01
A person's biological signature and their survey response combine in a database to create their unique media imprint. Then our predictive model finds patterns between media imprints and tells me which narrative ingredients are more likely to lead to engagement in altruistic behavior rather than distress and apathy.
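The talk leaves the imprint format and the predictive model unspecified. As a rough illustration of the idea, an imprint can be pictured as biometric summary statistics and survey answers flattened into one feature vector, with a toy nearest-centroid classifier standing in for the real model. Every feature name, number, and label here is invented.

```python
# Hypothetical sketch of a "media imprint" plus a toy predictive model.
# Feature names, values, and labels are invented for illustration only.

def make_imprint(bio, survey):
    """Flatten biometric summary stats and survey responses into one vector."""
    return [bio["mean_heart_rate"], bio["skin_conductance"],
            survey["immigration_rank"], survey["empathy_score"]]

def centroid(vectors):
    """Component-wise mean of a list of equal-length vectors."""
    return [sum(col) / len(col) for col in zip(*vectors)]

def predict(imprint, centroids):
    """Label an imprint by its closest class centroid (squared distance)."""
    def d2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: d2(imprint, centroids[label]))

# Toy training imprints labeled by the response the content produced.
imprints = {
    "altruistic": [[68, 0.2, 1, 8], [70, 0.3, 2, 9]],
    "distress":   [[95, 0.9, 1, 3], [92, 0.8, 3, 2]],
}
centroids = {label: centroid(vs) for label, vs in imprints.items()}
new = make_imprint({"mean_heart_rate": 69, "skin_conductance": 0.25},
                   {"immigration_rank": 1, "empathy_score": 7})
print(predict(new, centroids))  # nearest centroid is "altruistic"
```

A production system would use a far richer feature set and a learned model rather than centroids, but the pattern-finding step has this general shape: label past imprints by outcome, then classify new ones.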
04:23
The more imprints added to the database across mediums, from episodic television to games, the better the predictive models become.

04:32
In short, I am mapping the first media genome.

04:36
(Applause and cheers)
04:44
Whereas the human genome identifies all genes involved in sequencing human DNA, the growing database of media imprints will eventually allow me to determine the media DNA for a specific person.

04:58
Already the Limbic Lab's narrative engine helps content creators refine their storytelling, so that it resonates with their target audiences on an individual level.
05:11
The Limbic Lab's other component, the media machine, will assess how media elicits an emotional and physiological response, then pulls scenes from a content library targeted to person-specific media DNA.
05:26
Applying artificial intelligence to biometric data creates a truly personalized experience, one that adapts content based on real-time unconscious responses.

05:38
Imagine if nonprofits and media makers were able to measure how audiences feel as they experience it and alter content on the fly. I believe this is the future of media.
05:53
To date, most media and social-change strategies have attempted to appeal to mass audiences, but the future is media customized for each person.

06:03
As real-time measurement of media consumption and automated media production become the norm, we will soon be consuming media tailored directly to our cravings using a blend of psychographics, biometrics and AI. It's like personalized medicine based on our DNA. I call it "biomedia."
06:25
I am currently testing the Limbic Lab in a pilot study with the Norman Lear Center, which looks at the top 50 episodic television shows.

06:34
But I am grappling with an ethical dilemma. If I design a tool that can be turned into a weapon, should I build it?
06:43
By open-sourcing the lab to encourage access and inclusivity, I also run the risk of enabling powerful governments and profit-driven companies to appropriate the platform for fake news, marketing or other forms of mass persuasion.

06:59
For me, therefore, it is critical to make my research as transparent to lay audiences as GMO labels. However, this is not enough.
07:11
As creative technologists, we have a responsibility not only to reflect upon how present technology shapes our cultural values and social behavior, but also to actively challenge the trajectory of future technology.
07:26
It is my hope that we make an ethical commitment to harvesting the body's intelligence for the creation of authentic and just stories that transform media and technology from harmful weapons into narrative medicine.

07:42
Thank you.

07:44
(Applause and cheers)
์ด ์›น์‚ฌ์ดํŠธ ์ •๋ณด

์ด ์‚ฌ์ดํŠธ๋Š” ์˜์–ด ํ•™์Šต์— ์œ ์šฉํ•œ YouTube ๋™์˜์ƒ์„ ์†Œ๊ฐœํ•ฉ๋‹ˆ๋‹ค. ์ „ ์„ธ๊ณ„ ์ตœ๊ณ ์˜ ์„ ์ƒ๋‹˜๋“ค์ด ๊ฐ€๋ฅด์น˜๋Š” ์˜์–ด ์ˆ˜์—…์„ ๋ณด๊ฒŒ ๋  ๊ฒƒ์ž…๋‹ˆ๋‹ค. ๊ฐ ๋™์˜์ƒ ํŽ˜์ด์ง€์— ํ‘œ์‹œ๋˜๋Š” ์˜์–ด ์ž๋ง‰์„ ๋”๋ธ” ํด๋ฆญํ•˜๋ฉด ๊ทธ๊ณณ์—์„œ ๋™์˜์ƒ์ด ์žฌ์ƒ๋ฉ๋‹ˆ๋‹ค. ๋น„๋””์˜ค ์žฌ์ƒ์— ๋งž์ถฐ ์ž๋ง‰์ด ์Šคํฌ๋กค๋ฉ๋‹ˆ๋‹ค. ์˜๊ฒฌ์ด๋‚˜ ์š”์ฒญ์ด ์žˆ๋Š” ๊ฒฝ์šฐ ์ด ๋ฌธ์˜ ์–‘์‹์„ ์‚ฌ์šฉํ•˜์—ฌ ๋ฌธ์˜ํ•˜์‹ญ์‹œ์˜ค.

https://forms.gle/WvT1wiN1qDtmnspy7