Dear Facebook, this is how you're breaking democracy | Yael Eisenstat

113,619 views · 2020-09-24

TED

Translation: Seungmin Lee / Review: Jihyeon J. Kim
00:13
Around five years ago, it struck me that I was losing the ability to engage with people who aren't like-minded. The idea of discussing hot-button issues with my fellow Americans was starting to give me more heartburn than the times that I engaged with suspected extremists overseas. It was starting to leave me feeling more embittered and frustrated. And so just like that, I shifted my entire focus from global national security threats to trying to understand what was causing this push towards extreme polarization at home. As a former CIA officer and diplomat who spent years working on counterextremism issues, I started to fear that this was becoming a far greater threat to our democracy than any foreign adversary. And so I started digging in, and I started speaking out, which eventually led me to being hired at Facebook and ultimately brought me here today to continue warning you about how these platforms are manipulating and radicalizing so many of us and to talk about how to reclaim our public square.
01:19
I was a foreign service officer in Kenya just a few years after the September 11 attacks, and I led what some call "hearts and minds" campaigns along the Somalia border. A big part of my job was to build trust with communities deemed the most susceptible to extremist messaging. I spent hours drinking tea with outspoken anti-Western clerics and even dialogued with some suspected terrorists, and while many of these engagements began with mutual suspicion, I don't recall any of them resulting in shouting or insults, and in some cases we even worked together on areas of mutual interest.
01:56
The most powerful tools we had were to simply listen, learn and build empathy. This is the essence of hearts and minds work, because what I found again and again is that what most people wanted was to feel heard, validated and respected. And I believe that's what most of us want.
02:14
So what I see happening online today is especially heartbreaking and a much harder problem to tackle. We are being manipulated by the current information ecosystem entrenching so many of us so far into absolutism that compromise has become a dirty word. Because right now, social media companies like Facebook profit off of segmenting us and feeding us personalized content that both validates and exploits our biases. Their bottom line depends on provoking a strong emotion to keep us engaged, often incentivizing the most inflammatory and polarizing voices, to the point where finding common ground no longer feels possible.
02:56
And despite a growing chorus of people crying out for the platforms to change, it's clear they will not do enough on their own. So governments must define the responsibility for the real-world harms being caused by these business models and impose real costs on the damaging effects they're having on our public health, our public square and our democracy. But unfortunately, this won't happen in time for the US presidential election, so I am continuing to raise this alarm, because even if one day we do have strong rules in place, it will take all of us to fix this.
03:33
When I started shifting my focus from threats abroad to the breakdown in civil discourse at home, I wondered if we could repurpose some of these hearts and minds campaigns to help heal our divides. Our more than 200-year experiment with democracy works in large part because we are able to openly and passionately debate our ideas for the best solutions. But while I still deeply believe in the power of face-to-face civil discourse, it just cannot compete with the polarizing effects and scale of social media right now. The people who are sucked down these rabbit holes of social media outrage often feel far harder to break out of their ideological mindsets than those vulnerable communities I worked with ever were.
04:15
So when Facebook called me in 2018 and offered me this role heading its elections integrity operations for political advertising, I felt I had to say yes. I had no illusions that I would fix it all, but when offered the opportunity to help steer the ship in a better direction, I had to at least try.
04:34
I didn't work directly on polarization, but I did look at which issues were the most divisive in our society and therefore the most exploitable in elections interference efforts, which was Russia's tactic ahead of 2016. So I started by asking questions. I wanted to understand the underlying systemic issues that were allowing all of this to happen, in order to figure out how to fix it.
04:59
Now I still do believe in the power of the internet to bring more voices to the table, but despite their stated goal of building community, the largest social media companies as currently constructed are antithetical to the concept of reasoned discourse. There's no way to reward listening, to encourage civil debate and to protect people who sincerely want to ask questions in a business where optimizing engagement and user growth are the two most important metrics for success. There's no incentive to help people slow down, to build in enough friction that people have to stop, recognize their emotional reaction to something, and question their own assumptions before engaging.
05:42
The unfortunate reality is: lies are more engaging online than truth, and salaciousness beats out wonky, fact-based reasoning in a world optimized for frictionless virality. As long as algorithms' goals are to keep us engaged, they will continue to feed us the poison that plays to our worst instincts and human weaknesses.
06:04
And yes, anger, mistrust, the culture of fear, hatred: none of this is new in America. But in recent years, social media has harnessed all of that and, as I see it, dramatically tipped the scales. And Facebook knows it. A recent "Wall Street Journal" article exposed an internal Facebook presentation from 2018 that specifically points to the company's own algorithms for growing extremist groups' presence on their platform and for polarizing their users. But keeping us engaged is how they make their money.
06:42
The modern information environment is crystallized around profiling us and then segmenting us into more and more narrow categories to perfect this personalization process. We're then bombarded with information confirming our views, reinforcing our biases, and making us feel like we belong to something. These are the same tactics we would see terrorist recruiters using on vulnerable youth, albeit in smaller, more localized ways before social media, with the ultimate goal of persuading their behavior.
07:15
Unfortunately, I was never empowered by Facebook to have an actual impact. In fact, on my second day, my title and job description were changed and I was cut out of decision-making meetings. My biggest efforts, trying to build plans to combat disinformation and voter suppression in political ads, were rejected. And so I lasted just shy of six months.
07:38
But here is my biggest takeaway from my time there. There are thousands of people at Facebook who are passionately working on a product that they truly believe makes the world a better place, but as long as the company continues to merely tinker around the margins of content policy and moderation, as opposed to considering how the entire machine is designed and monetized, they will never truly address how the platform is contributing to hatred, division and radicalization. And that's the one conversation I never heard happen during my time there, because that would require fundamentally accepting that the thing you built might not be the best thing for society and agreeing to alter the entire product and profit model.
08:24
So what can we do about this? I'm not saying that social media bears the sole responsibility for the state that we're in today. Clearly, we have deep-seated societal issues that we need to solve. But Facebook's response, that it is just a mirror to society, is a convenient attempt to deflect any responsibility from the way their platform is amplifying harmful content and pushing some users towards extreme views.
08:53
And Facebook could, if they wanted to, fix some of this. They could stop amplifying and recommending the conspiracy theorists, the hate groups, the purveyors of disinformation and, yes, in some cases even our president. They could stop using the same personalization techniques to deliver political rhetoric that they use to sell us sneakers. They could retrain their algorithms to focus on a metric other than engagement, and they could build in guardrails to stop certain content from going viral before being reviewed. And they could do all of this without becoming what they call the arbiters of truth.
09:33
But they've made it clear that they will not go far enough to do the right thing without being forced to, and, to be frank, why should they? The markets keep rewarding them, and they're not breaking the law. Because as it stands, there are no US laws compelling Facebook, or any social media company, to protect our public square, our democracy and even our elections. We have ceded the decision-making on what rules to write and what to enforce to the CEOs of for-profit internet companies.
10:05
Is this what we want? A post-truth world where toxicity and tribalism trump bridge-building and consensus-seeking?
10:14
I do remain optimistic that we still have more in common with each other than the current media and online environment portray, and I do believe that having more perspectives surface makes for a more robust and inclusive democracy. But not the way it's happening right now. And it bears emphasizing, I do not want to kill off these companies. I just want them held to a certain level of accountability, just like the rest of society.
10:42
It is time for our governments to step up and do their jobs of protecting our citizenry. And while there isn't one magical piece of legislation that will fix this all, I do believe that governments can and must find the balance between protecting free speech and holding these platforms accountable for their effects on society. And they could do so in part by insisting on actual transparency around how these recommendation engines are working, around how the curation, amplification and targeting are happening.
11:16
You see, I want these companies held accountable not for if an individual posts misinformation or extreme rhetoric, but for how their recommendation engines spread it, how their algorithms are steering people towards it, and how their tools are used to target people with it.
11:35
I tried to make change from within Facebook and failed, and so I've been using my voice again for the past few years to continue sounding this alarm and hopefully inspire more people to demand this accountability.
11:48
My message to you is simple: pressure your government representatives to step up and stop ceding our public square to for-profit interests. Help educate your friends and family about how they're being manipulated online. Push yourselves to engage with people who aren't like-minded. Make this issue a priority. We need a whole-society approach to fix this.
12:15
And my message to the leaders of my former employer Facebook is this: right now, people are using your tools exactly as they were designed to sow hatred, division and distrust, and you're not just allowing it, you are enabling it. And yes, there are lots of great stories of positive things happening on your platform around the globe, but that doesn't make any of this OK. And it's only getting worse as we're heading into our election, and, even more concerning, as we face our biggest potential crisis yet, if the results aren't trusted, and if violence breaks out.
12:54
So when in 2021 you once again say, "We know we have to do better," I want you to remember this moment, because it's no longer just a few outlier voices. Civil rights leaders, academics, journalists, advertisers, your own employees are shouting from the rooftops that your policies and your business practices are harming people and democracy. You own your decisions, but you can no longer say that you couldn't have seen it coming.
13:26
Thank you.