How you can help transform the internet into a place of trust | Claire Wardle

TED · 2019-11-15

Translator: Jungmin Hwang    Reviewer: Whayoung Cha

00:13
No matter who you are or where you live, I'm guessing that you have at least one relative that likes to forward those emails. You know the ones I'm talking about -- the ones with dubious claims or conspiracy videos.

00:26
And you've probably already muted them on Facebook for sharing social posts like this one. It's an image of a banana with a strange red cross running through the center. And the text around it is warning people not to eat fruits that look like this, suggesting they've been injected with blood contaminated with the HIV virus. And the social share message above it simply says, "Please forward to save lives."

00:49
Now, fact-checkers have been debunking this one for years, but it's one of those rumors that just won't die. A zombie rumor.

00:57
And, of course, it's entirely false.

01:00
It might be tempting to laugh at an example like this, to say, "Well, who would believe this, anyway?" But the reason it's a zombie rumor is because it taps into people's deepest fears about their own safety and that of the people they love. And if you spend as much time as I have looking at misinformation, you know that this is just one example of many that taps into people's deepest fears and vulnerabilities.

01:23
Every day, across the world, we see scores of new memes on Instagram
27
83214
4369
๋งค์ผ ์ „ ์„ธ๊ณ„์—์„œ ์ˆ˜์‹ญ ๊ฐœ์˜ ์ธ์Šคํƒ€๊ทธ๋žจ ๋ฐˆ(meme)์ด ์˜ฌ๋ผ์˜ค๋Š”๋ฐ์š”,
01:27
encouraging parents not to vaccinate their children.
28
87607
3039
์ž๋…€๋“ค ๋ฐฑ์‹ ์ ‘์ข…์„ ์‹œํ‚ค์ง€ ๋ง๋ผ๊ณ  ๋ถ€๋ชจ๋“ค์„ ๋ถ€์ถ”๊ธฐ๋Š” ๋‚ด์šฉ์ด์ฃ .
01:30
We see new videos on YouTube explaining that climate change is a hoax.
29
90670
4532
๊ธฐํ›„๋ณ€ํ™”๊ฐ€ ๊ฑฐ์ง“์ด๋ผ๋Š” ๋‚ด์šฉ์˜ ์œ ํŠœ๋ธŒ ์˜์ƒ๋„ ๋งค์ผ ์—…๋กœ๋“œ๋ฉ๋‹ˆ๋‹ค.
01:35
And across all platforms, we see endless posts designed to demonize others
30
95226
4302
๋˜, ๋ชจ๋“  ์†Œ์…œ ๋ฏธ๋””์–ด ํ”Œ๋žซํผ์—์„œ ํƒ€์ธ์„ ์ธ์ข…, ์ข…๊ต, ์„ฑ ์ •์ฒด์„ฑ์— ๋”ฐ๋ผ
01:39
on the basis of their race, religion or sexuality.
31
99552
3501
์•…๋งˆ์ทจ๊ธ‰ํ•˜๊ณ ์ž ์ž‘์„ฑ๋œ ๊ธ€์ด ๋๋„ ์—†์ด ์˜ฌ๋ผ์˜ต๋‹ˆ๋‹ค.
01:44
Welcome to one of the central challenges of our time.
32
104314
3030
์šฐ๋ฆฌ ์‹œ๋Œ€์— ๊ฐ€์žฅ ํ•ต์‹ฌ์ ์ธ ๋ฌธ์ œ ์ค‘ ํ•˜๋‚˜๊ฐ€ ์ด๊ฒƒ์ž…๋‹ˆ๋‹ค.
01:47
How can we maintain an internet with freedom of expression at the core,
33
107647
4025
์–ด๋–ป๊ฒŒ ํ•˜๋ฉด ํ‘œํ˜„์˜ ์ž์œ ์— ์ค‘์ ์„ ๋‘๋ฉด์„œ๋„
01:51
while also ensuring that the content that's being disseminated
34
111696
3303
์˜จ๋ผ์ธ์ƒ์—์„œ ํผ์ ธ๋‚˜๊ฐ€๋Š” ๋‚ด์šฉ๋“ค์ด ์šฐ๋ฆฌ์˜ ๋ฏผ์ฃผ์ฃผ์˜์™€ ์ปค๋ฎค๋‹ˆํ‹ฐ,
01:55
doesn't cause irreparable harms to our democracies, our communities
35
115023
3886
์šฐ๋ฆฌ์˜ ์‹ ์ฒด์ , ์ •์‹ ์  ๊ฑด๊ฐ•์˜ ์•ˆ๋…•์— ๋Œ์ดํ‚ฌ ์ˆ˜ ์—†๋Š” ํ•ด๋ฅผ ๋ผ์น˜์ง€ ํ•˜๋Š”
01:58
and to our physical and mental well-being?
36
118933
2238
์ธํ„ฐ๋„ท ํ™˜๊ฒฝ์„ ์œ ์ง€ํ•  ์ˆ˜ ์žˆ์„๊นŒ์š”?
02:01
Because we live in the information age,
37
121998
2087
์šฐ๋ฆฌ๋Š” ์ •๋ณดํ™” ์‹œ๋Œ€์— ์‚ด๊ณ  ์žˆ์ง€๋งŒ,
02:04
yet the central currency upon which we all depend -- information --
38
124109
3547
์šฐ๋ฆฌ ๋ชจ๋‘๊ฐ€ ์˜์กดํ•˜๋Š” ์ค‘์•™ํ†ตํ™”, ์ฆ‰ ์ •๋ณด๋Š”,
02:07
is no longer deemed entirely trustworthy
39
127680
2357
๋” ์ด์ƒ ์ „์ ์œผ๋กœ ์‹ ๋ขฐํ•  ๋งŒํ•˜๋‹ค๊ณ  ์—ฌ๊ฒจ์ง€์ง€ ์•Š์Šต๋‹ˆ๋‹ค.
02:10
and, at times, can appear downright dangerous.
40
130061
2328
์–ด์ฉ” ๋•Œ๋Š” ์™„์ „ ์œ„ํ—˜ํ•ด๋ณด์ด๊ธฐ๊นŒ์ง€ ํ•˜์ฃ .
02:12
This is thanks in part to the runaway growth of social sharing platforms
41
132811
3937
์ด๋Š” ๊ฑท์žก์„ ์ˆ˜ ์—†์ด ์„ฑ์žฅํ•œ ์‚ฌํšŒ๊ณต์œ ํ”Œ๋žซํผ์— ์ผ๋ถ€ ์›์ธ์ด ์žˆ๋Š”๋ฐ์š”.
02:16
that allow us to scroll through,
42
136772
1642
์Šคํฌ๋กค์„ ํ†ตํ•ด ์ด๊ฒƒ ์ €๊ฒƒ ๋ณผ ์ˆ˜ ์žˆ๊ฒŒ ๋˜์—ˆ์ง€๋งŒ,
02:18
where lies and facts sit side by side,
43
138438
2222
์ด ๊ณณ์—” ๊ฑฐ์ง“๊ณผ ์‚ฌ์‹ค์ด ๋‚˜๋ž€ํžˆ ๊ณต์กดํ•˜๋Š” ๋ฐ˜๋ฉด,
02:20
but with none of the traditional signals of trustworthiness.
44
140684
3071
์ „ํ†ต์ ์ธ ์‹ ๋ขฐ์˜ ํ”์ ์€ ์ฐพ๊ธฐ ํž˜๋“ค์ฃ .
02:24
And goodness -- our language around this is horribly muddled.
45
144268
3619
๊ทธ๋ฆฌ๊ณ  ์šฐ๋ฆฌ์˜ ์–ธ์–ด๋Š” ์ด ํ˜„์ƒ์„ ์ •ํ™•ํ•˜๊ฒŒ ์„ค๋ช…ํ•ด ์ฃผ์ง€ ๋ชปํ•ฉ๋‹ˆ๋‹ค.
02:27
People are still obsessed with the phrase "fake news,"
46
147911
3103
์‚ฌ๋žŒ๋“ค์€ ์•„์ง๋„ "๊ฐ€์งœ ๋‰ด์Šค"๋ผ๋Š” ๋ง์— ์ง‘์ฐฉํ•˜๊ณ  ์žˆ์Šต๋‹ˆ๋‹ค.
02:31
despite the fact that it's extraordinarily unhelpful
47
151038
2531
์ด ๋ง์ด ์ƒํ™ฉ ์„ค๋ช…์— ํŠน๋ณ„ํžˆ ๋„์›€์ด ๋˜์ง€ ์•Š์„ ๋ฟ๋”๋Ÿฌ,
02:33
and used to describe a number of things that are actually very different:
48
153593
3460
์‹ค์ œ๋ก  ์„œ๋กœ ๋‹ค๋ฅธ ๊ฒƒ๋“ค์„ ์„ค๋ช…ํ•˜๋Š”๋ฐ ์“ฐ์ด๊ณ  ์žˆ์Œ์—๋„ ๋ง์ด์ฃ .
02:37
lies, rumors, hoaxes, conspiracies, propaganda.
49
157077
3386
๊ฑฐ์ง“๋ง, ๋ฃจ๋จธ, ๋‚ ์กฐ, ์Œ๋ชจ, ํ”„๋กœํŒŒ๊ฐ„๋‹ค์™€ ๊ฐ™์€ ๊ฐœ๋…๋“ค์ด์š”.
02:40
And I really wish we could stop using a phrase
50
160911
2912
์ €๋Š” ์šฐ๋ฆฌ๊ฐ€ ์ „ ์„ธ๊ณ„ ์ขŒํŒŒ, ์šฐํŒŒ ์ •์น˜์ธ๋“ค์ด
02:43
that's been co-opted by politicians right around the world,
51
163847
2862
์ž์œ  ๋…๋ฆฝ ์–ธ๋ก ์„ ๊ณต๊ฒฉํ•˜๊ธฐ ์œ„ํ•ด
02:46
from the left and the right,
52
166733
1471
์‚ฌ์šฉํ•˜๊ธฐ ์‹œ์ž‘ํ•œ ์ด ๋ฌธ๊ตฌ๋ฅผ
02:48
used as a weapon to attack a free and independent press.
53
168228
3222
๊ทธ๋งŒ ์‚ฌ์šฉํ–ˆ์œผ๋ฉด ์ข‹๊ฒ ์Šต๋‹ˆ๋‹ค.
02:52
(Applause)
54
172307
4702
(๋ฐ•์ˆ˜)
02:57
Because we need our professional news media now more than ever.
55
177033
3462
์™œ๋ƒํ•˜๋ฉด ์ง€๊ธˆ ์šฐ๋ฆฌ์—๊ฒ ๊ทธ ์–ด๋Š ๋•Œ๋ณด๋‹ค ์ „๋ฌธ ๋‰ด์Šค ๋ฏธ๋””์–ด๊ฐ€ ํ•„์š”ํ•˜๊ธฐ ๋•Œ๋ฌธ์ž…๋‹ˆ๋‹ค.
03:00
And besides, most of this content doesn't even masquerade as news.
56
180882
3373
๊ฒŒ๋‹ค๊ฐ€, ์ด๋Ÿฐ ๋‚ด์šฉ๋“ค์€ ๋‰ด์Šค๋กœ ๋‘”๊ฐ‘ํ•˜์ง€๋„ ์•Š์€ ์ฑ„ ๋Œ์•„๋‹ค๋‹™๋‹ˆ๋‹ค.
03:04
It's memes, videos, social posts.
57
184279
2642
๋ฐˆ(meme), ๋น„๋””์˜ค, ์‡ผ์…œ๋ฏธ๋””์–ด์— ์˜ฌ๋ผ์˜ค๋Š” ๊ฒŒ์‹œ๋ฌผ ๊ฐ™์€ ๊ฒƒ๋“ค์ธ๋ฐ์š”.
03:06
And most of it is not fake; it's misleading.
58
186945
3453
๋Œ€๋ถ€๋ถ„์€ ๊ฐ€์งœ๊ฐ€ ์•„๋‹ˆ๋ผ, ์‚ฌ๋žŒ๋“ค์„ ํ˜ธ๋„ํ•˜๋Š” ๊ฒƒ๋“ค์ž…๋‹ˆ๋‹ค.
03:10
We tend to fixate on what's true or false.
59
190422
3015
์šฐ๋ฆฌ๋Š” ์ฃผ๋กœ ์–ด๋–ค ๊ธ€์˜ ์ง„์œ„์—ฌ๋ถ€๋ฅผ ๊ฐ€๋ ค๋‚ด๋ ค๊ณ  ํ•ฉ๋‹ˆ๋‹ค.
03:13
But the biggest concern is actually the weaponization of context.
60
193461
4032
ํ•˜์ง€๋งŒ ๊ฐ€์žฅ ํฐ ๋ฌธ์ œ๋Š” ๋‚ด์šฉ์ด ๋ฌด๊ธฐํ™”๋œ๋‹ค๋Š” ๊ฒƒ์ž…๋‹ˆ๋‹ค.
03:18
Because the most effective disinformation
61
198855
1968
์™œ๋ƒํ•˜๋ฉด ๊ฐ€์žฅ ํšจ๊ณผ์ ์ธ ๊ฑฐ์ง“ ์ •๋ณด๋“ค์€
03:20
has always been that which has a kernel of truth to it.
62
200847
3048
ํ•ญ์ƒ ์กฐ๊ธˆ์˜ ์‚ฌ์‹ค์„ ํฌํ•จํ•˜๊ณ  ์žˆ๊ธฐ ๋งˆ๋ จ์ด์ฃ .
03:23
Let's take this example from London, from March 2017,
63
203919
3476
์ด ์˜ˆ๋ฅผ ํ•œ๋ฒˆ ๋“ค์–ด ๋ณด์ฃ . 2017๋…„ 3์›”, ๋Ÿฐ๋˜์—์„œ
03:27
a tweet that circulated widely
64
207419
1540
ํ•œ ํŠธ์œ—์ด ๋„๋ฆฌ ์œ ํฌ๋๋Š”๋ฐ์š”,
03:28
in the aftermath of a terrorist incident on Westminster Bridge.
65
208983
3587
์›จ์ŠคํŠธ๋ฏผ์Šคํ„ฐ ๋‹ค๋ฆฌ์—์„œ ์ผ์–ด๋‚œ ํ…Œ๋Ÿฌ์‚ฌ๊ฑด์˜ ์—ฌํŒŒ์˜€์ฃ .
03:32
This is a genuine image, not fake.
66
212594
2428
์ด ์‚ฌ์ง„์€ ์ง„์งœ ์‚ฌ์ง„์ž…๋‹ˆ๋‹ค. ๊ฐ€์งœ๊ฐ€ ์•„๋‹ˆ๋ผ์š”.
03:35
The woman who appears in the photograph was interviewed afterwards,
67
215046
3169
์ด ์‚ฌ์ง„์— ๋“ฑ์žฅํ•œ ์—ฌ์„ฑ์€ ์ดํ›„์— ์ธํ„ฐ๋ทฐ์— ์‘ํ–ˆ๋Š”๋ฐ์š”,
03:38
and she explained that she was utterly traumatized.
68
218239
2409
์™„์ „ํžˆ ์ •์‹ ์  ์ถฉ๊ฒฉ์„ ๋ฐ›์•˜๋‹ค๊ณ  ์„ค๋ช…์„ ํ•˜๋”๊ตฐ์š”.
03:40
She was on the phone to a loved one,
69
220672
1738
๊ทธ๋…€๋Š” ์• ์ธ๊ณผ ํ†ตํ™”ํ•˜๋Š” ์ค‘์ด์—ˆ๊ณ ,
03:42
and she wasn't looking at the victim out of respect.
70
222434
2618
ํ”ผํ•ด์ž๋ฅผ ์ œ๋Œ€๋กœ ๋ณด๊ณ  ์žˆ์ง€ ์•Š์•˜์Šต๋‹ˆ๋‹ค.
03:45
But it still was circulated widely with this Islamophobic framing,
71
225076
3960
ํ•˜์ง€๋งŒ ์ด ์‚ฌ์ง„์€ ์ด์Šฌ๋žŒ ํ˜์˜ค ํ”„๋ ˆ์ž„์ด ์”Œ์›Œ์ง„ ์ฑ„ ์œ ํฌ๋˜์—ˆ์Šต๋‹ˆ๋‹ค.
03:49
with multiple hashtags, including: #BanIslam.
72
229060
3046
#BanIslam(์ด์Šฌ๋žŒ๊ธˆ์ง€) ๋“ฑ ์—ฌ๋Ÿฌ ํ•ด์‹œํƒœ๊ทธ๊ฐ€ ๋‹ฌ๋ฆฐ ์ฑ„ ๋ง์ด์ฃ .
03:52
Now, if you worked at Twitter, what would you do?
73
232425
2398
์—ฌ๋Ÿฌ๋ถ„์ด ํŠธ์œ„ํ„ฐ ์ง์›์ด๋ผ๋ฉด, ์–ด๋–ค ์กฐ์น˜๋ฅผ ์ทจํ•˜์‹ค ๊ฑด๊ฐ€์š”?
03:54
Would you take that down, or would you leave it up?
74
234847
2562
์ด ์‚ฌ์ง„์„ ๋‚ด๋ฆฌ์‹ค ๊ฑด๊ฐ€์š”, ์•„๋‹ˆ๋ฉด ๋‚ด๋ฒ„๋ ค๋‘์‹ค ๊ฒƒ์ธ๊ฐ€์š”?
03:58
My gut reaction, my emotional reaction, is to take this down.
75
238553
3429
์ œ ๋ณธ๋Šฅ์  ๋ฐ˜์‘, ๊ฐ์ •์  ๋ฐ˜์‘์€ ์‚ฌ์ง„์„ ๋‚ด๋ฆฐ๋‹ค๋Š” ๊ฒƒ์ž…๋‹ˆ๋‹ค.
04:02
I hate the framing of this image.
76
242006
2142
์ €๋Š” ์ด ์‚ฌ์ง„์— ์”Œ์›Œ์ง„ ํ”„๋ ˆ์ž„์ด ์‹ซ์Šต๋‹ˆ๋‹ค.
04:04
But freedom of expression is a human right,
77
244585
2388
ํ•˜์ง€๋งŒ ํ‘œํ˜„์˜ ์ž์œ ๋Š” ์ธ๊ถŒ์ด๋ฉฐ,
04:06
and if we start taking down speech that makes us feel uncomfortable,
78
246997
3225
๋ถˆํŽธํ•˜๋‹ค๋Š” ์ด์œ ๋กœ ๊ฒŒ์‹œ๊ธ€์„ ๋‚ด๋ฆฌ๊ธฐ ์‹œ์ž‘ํ•œ๋‹ค๋ฉด
04:10
we're in trouble.
79
250246
1230
์ด ๋˜ํ•œ ๋ฌธ์ œ๊ฐ€ ๋  ์ˆ˜ ์žˆ์Šต๋‹ˆ๋‹ค.
04:11
And this might look like a clear-cut case,
80
251500
2294
์ด ์˜ˆ์‹œ์˜ ๊ฒฝ์šฐ ๋ช…๋ฐฑํ•ด ๋ณด์ด์ง€๋งŒ,
04:13
but, actually, most speech isn't.
81
253818
1698
๋Œ€๋ถ€๋ถ„์˜ ๊ฒŒ์‹œ๋ฌผ์€ ๊ทธ๋ ‡์ง€ ์•Š์Šต๋‹ˆ๋‹ค.
04:15
These lines are incredibly difficult to draw.
82
255540
2436
์˜ณ๊ณ  ๊ทธ๋ฆ„์˜ ์„ ์„ ๊ธ‹๊ธฐ ๋งค์šฐ ์–ด๋ ต์Šต๋‹ˆ๋‹ค.
04:18
What's a well-meaning decision by one person
83
258000
2281
์–ด๋–ค ์‚ฌ๋žŒ์ด ์ข‹์€ ์˜๋„๋กœ ๋‚ด๋ ค์ง„ ๊ฒฐ์ •์ด
04:20
is outright censorship to the next.
84
260305
2077
๋‹ค๋ฅธ ์‚ฌ๋žŒ์—๊ฒŒ๋Š” ์™„์ „ํ•œ ๊ฒ€์—ด๋กœ ๋  ์ˆ˜ ์žˆ์Šต๋‹ˆ๋‹ค.
04:22
What we now know is that this account, Texas Lone Star,
85
262759
2929
์ด์ œ ์šฐ๋ฆฌ๋Š” ์ด ๊ณ„์ •, Texas Lone Star๊ฐ€
04:25
was part of a wider Russian disinformation campaign,
86
265712
3230
๋Ÿฌ์‹œ์•„์˜ ๊ฑฐ์ง“ ์ •๋ณด ์บ ํŽ˜์ธ์˜ ์ผ๋ถ€์˜€๊ณ ,
04:28
one that has since been taken down.
87
268966
2151
๊ทธ ๋•Œ ์ดํ›„ ๋‚ด๋ ค์ง„ ์ƒํƒœ๋ผ๋Š” ๊ฑธ ์••๋‹ˆ๋‹ค.
04:31
Would that change your view?
88
271141
1563
์ด๊ฒƒ์ด ์—ฌ๋Ÿฌ๋ถ„์˜ ๊ด€์ ์„ ๋ฐ”๊ฟ€ ์ˆ˜ ์žˆ์„๊นŒ์š”?
04:33
It would mine,
89
273322
1159
์ œ ๊ด€์ ์€ ๋ณ€ํ™”์‹œํ‚ฌ ๋“ฏ ํ•ฉ๋‹ˆ๋‹ค.
04:34
because now it's a case of a coordinated campaign
90
274505
2301
์™œ๋ƒํ•˜๋ฉด ์ด์ œ ์ด ๊ธ€์€ ๋ถˆํ™”๋ฅผ ์ดˆ๋ž˜ํ•˜๊ธฐ ์œ„ํ•ด์„œ
04:36
to sow discord.
91
276830
1215
์„ค๊ณ„๋œ ์บ ํŽ˜์ธ์˜ ์ผ๋ถ€์ด๊ธฐ ๋•Œ๋ฌธ์ž…๋‹ˆ๋‹ค.
04:38
And for those of you who'd like to think
92
278069
1961
์—ฌ๋Ÿฌ๋ถ„ ์ค‘์—์„œ ํ˜น์‹œ
04:40
that artificial intelligence will solve all of our problems,
93
280054
2831
์ธ๊ณต์ง€๋Šฅ์ด ์šฐ๋ฆฌ์˜ ๋ฌธ์ œ๋ฅผ ํ•ด๊ฒฐํ•ด ์ค„ ์ˆ˜ ์žˆ๋‹ค๊ณ  ๋ฏฟ์œผ์‹œ๋Š” ๋ถ„์ด ์žˆ๋‹ค๋ฉด,
04:42
I think we can agree that we're a long way away
94
282909
2225
์•„๋งˆ ์ด๋Ÿฐ ์‚ฌ๊ฑด์„ ํ•ด๊ฒฐํ•  ์ˆ˜ ์žˆ๋Š” ์ธ๊ณต์ง€๋Šฅ์ด ๋“ฑ์žฅํ•˜๊ธฐ ์ „๊นŒ์ง€๋Š”
04:45
from AI that's able to make sense of posts like this.
95
285158
2587
์˜ค๋žœ ์‹œ๊ฐ„์ด ๊ฑธ๋ฆด ๊ฒƒ์ด๋ผ๊ณ  ๋™์˜ํ•˜์‹ค ๊ฒ๋‹ˆ๋‹ค.
04:48
So I'd like to explain three interlocking issues
96
288856
2507
๊ทธ๋ž˜์„œ ์ €๋Š” ์ด ๋ฌธ์ œ๋ฅผ ๋ณต์žกํ•˜๊ฒŒ ๋งŒ๋“œ๋Š”
04:51
that make this so complex
97
291387
2373
์„ธ ๊ฐ€์ง€ ์„œ๋กœ ๋งž๋ฌผ๋ฆฐ ์ด์Šˆ์— ๋Œ€ํ•ด ์„ค๋ช…ํ•˜๊ณ ์ž ํ•ฉ๋‹ˆ๋‹ค.
04:53
and then think about some ways we can consider these challenges.
98
293784
3122
๊ทธ๋ฆฌ๊ณ  ์šฐ๋ฆฌ๊ฐ€ ์ด ๋ฌธ์ œ์˜ ํ•ด๊ฒฐ ๋ฐฉ๋ฒ•์„ ์ƒ๊ฐํ•ด๋ณด๋„๋ก ํ•˜์ฃ .
04:57
First, we just don't have a rational relationship to information,
99
297348
3890
์ฒซ์งธ๋กœ, ์šฐ๋ฆฌ๋Š” ์ •๋ณด๋ฅผ ์ด์„ฑ์ ์œผ๋กœ ๋ฐ›์•„๋“ค์ด์ง€ ์•Š์Šต๋‹ˆ๋‹ค.
05:01
we have an emotional one.
100
301262
1468
๊ฐ์ •์œผ๋กœ ๋ฐ›์•„๋“ค์ด์ฃ .
05:02
It's just not true that more facts will make everything OK,
101
302754
3794
์‚ฌ์‹ค์ด ๋” ๋งŽ์•„์ง€๋ฉด ๋ชจ๋“  ๊ฒƒ์ด ๊ดœ์ฐฎ์•„์งˆ ๊ฒƒ์ด๋ผ๋Š” ๊ฑด ์‚ฌ์‹ค์ด ์•„๋‹™๋‹ˆ๋‹ค.
05:06
because the algorithms that determine what content we see,
102
306572
3100
์™œ๋ƒํ•˜๋ฉด ์šฐ๋ฆฌ๊ฐ€ ์–ด๋–ค ์ฝ˜ํ…์ธ ๋ฅผ ๋ณผ์ง€๋ฅผ ์ •ํ•˜๋Š” ์•Œ๊ณ ๋ฆฌ์ฆ˜์€
05:09
well, they're designed to reward our emotional responses.
103
309696
3127
์šฐ๋ฆฌ์˜ ๊ฐ์ •์ ์ธ ๋ฐ˜์‘์„ ์šฐ์„ ์‹œํ•˜๊ฒŒ ๋˜์–ด ์žˆ์Šต๋‹ˆ๋‹ค.
05:12
And when we're fearful,
104
312847
1381
๊ทธ๋ฆฌ๊ณ  ์šฐ๋ฆฌ๊ฐ€ ๊ณตํฌ๋ฅผ ๋Š๋‚„ ๋•Œ,
05:14
oversimplified narratives, conspiratorial explanations
105
314252
3174
์ง€๋‚˜์น˜๊ฒŒ ์ถ•์†Œ๋œ ์ด์•ผ๊ธฐ๋“ค, ์Œ๋ชจ๋ก ์ ์ธ ์„ค๋ช…๋“ค,
05:17
and language that demonizes others is far more effective.
106
317450
3418
๊ทธ๋ฆฌ๊ณ  ๋‹ค๋ฅธ ์‚ฌ๋žŒ์„ ํƒ“ํ•˜๋Š” ์–ธ์–ด๊ฐ€ ํ›จ์”ฌ ํšจ๊ณผ์ ์ž…๋‹ˆ๋‹ค.
05:21
And besides, many of these companies,
107
321538
1874
๊ฒŒ๋‹ค๊ฐ€, ๋งŽ์€ ์ด๋Ÿฐ ํšŒ์‚ฌ๋“ค์˜
05:23
their business model is attached to attention,
108
323436
2546
๋น„์ฆˆ๋‹ˆ์Šค ๋ชจ๋ธ์€ ๊ด€์‹ฌ์„ ๋ฐ›๋Š” ์ •๋„์— ๋”ฐ๋ผ ๊ฒฐ์ •๋ฉ๋‹ˆ๋‹ค.
05:26
which means these algorithms will always be skewed towards emotion.
109
326006
3690
๋”ฐ๋ผ์„œ ์ด ์•Œ๊ณ ๋ฆฌ์ฆ˜๋“ค์€ ๊ฐ์ • ๋ณ€ํ™”์— ์น˜์šฐ์ณ์„œ ๋ฐ˜์‘ํ•˜๊ฒŒ ๋ฉ๋‹ˆ๋‹ค.
05:30
Second, most of the speech I'm talking about here is legal. It would be a different matter if I was talking about child sexual abuse imagery or content that incites violence. It can be perfectly legal to post an outright lie.

05:45
But people keep talking about taking down "problematic" or "harmful" content, but with no clear definition of what they mean by that, including Mark Zuckerberg, who recently called for global regulation to moderate speech. And my concern is that we're seeing governments right around the world rolling out hasty policy decisions that might actually trigger much more serious consequences when it comes to our speech.

06:08
And even if we could decide which speech to take up or take down, we've never had so much speech. Every second, millions of pieces of content are uploaded by people right around the world in different languages, drawing on thousands of different cultural contexts. We've simply never had effective mechanisms to moderate speech at this scale, whether powered by humans or by technology.

06:30
And third, these companies -- Google, Twitter, Facebook, WhatsApp -- they're part of a wider information ecosystem. We like to lay all the blame at their feet, but the truth is, the mass media and elected officials can also play an equal role in amplifying rumors and conspiracies when they want to. As can we, when we mindlessly forward divisive or misleading content without trying. We're adding to the pollution.

06:57
I know we're all looking for an easy fix. But there just isn't one. Any solution will have to be rolled out at a massive scale, internet scale, and yes, the platforms, they're used to operating at that level. But can and should we allow them to fix these problems? They're certainly trying. But most of us would agree that, actually, we don't want global corporations to be the guardians of truth and fairness online. And I also think the platforms would agree with that.

07:24
And at the moment, they're marking their own homework. They like to tell us that the interventions they're rolling out are working, but because they write their own transparency reports, there's no way for us to independently verify what's actually happening.

07:38
(Applause)

07:41
And let's also be clear that most of the changes we see only happen after journalists undertake an investigation and find evidence of bias or content that breaks their community guidelines. So yes, these companies have to play a really important role in this process, but they can't control it.

07:59
So what about governments? Many people believe that global regulation is our last hope in terms of cleaning up our information ecosystem. But what I see are lawmakers who are struggling to keep up to date with the rapid changes in technology. And worse, they're working in the dark, because they don't have access to data to understand what's happening on these platforms. And anyway, which governments would we trust to do this? We need a global response, not a national one.

08:27
So the missing link is us. It's those people who use these technologies every day.

08:33
Can we design a new infrastructure to support quality information? Well, I believe we can, and I've got a few ideas about what we might be able to actually do.

08:43
So firstly, if we're serious about bringing the public into this, can we take some inspiration from Wikipedia? They've shown us what's possible. Yes, it's not perfect, but they've demonstrated that with the right structures, with a global outlook and lots and lots of transparency, you can build something that will earn the trust of most people. Because we have to find a way to tap into the collective wisdom and experience of all users.

09:05
This is particularly the case for women, people of color and underrepresented groups. Because guess what? They are experts when it comes to hate and disinformation, because they have been the targets of these campaigns for so long. And over the years, they've been raising flags, and they haven't been listened to. This has got to change.

09:22
So could we build a Wikipedia for trust? Could we find a way that users can actually provide insights? They could offer insights around difficult content-moderation decisions. They could provide feedback when platforms decide they want to roll out new changes.

09:40
Second, people's experience with information is personalized. My Facebook news feed is very different to yours. Your YouTube recommendations are very different to mine. That makes it impossible for us to actually examine what information people are seeing.

09:54
So could we imagine developing some kind of centralized open repository for anonymized data, with privacy and ethical concerns built in?

10:04
Because imagine what we would learn if we built out a global network of concerned citizens who wanted to donate their social data to science. Because we actually know very little about the long-term consequences of hate and disinformation on people's attitudes and behaviors. And what we do know, most of that has been carried out in the US, despite the fact that this is a global problem. We need to work on that, too.

10:28
And third, can we find a way to connect the dots? No one sector, let alone nonprofit, start-up or government, is going to solve this. But there are very smart people right around the world working on these challenges, from newsrooms, civil society, academia, activist groups. And you can see some of them here.

10:46
Some are building out indicators of content credibility. Others are fact-checking, so that false claims, videos and images can be down-ranked by the platforms.

10:53
A nonprofit I helped to found, First Draft, is working with normally competitive newsrooms around the world to help them build out investigative, collaborative programs.

11:03
And Danny Hillis, a software architect, is designing a new system called The Underlay, which will be a record of all public statements of fact connected to their sources, so that people and algorithms can better judge what is credible.

11:16
And educators around the world are testing different techniques for finding ways to make people critical of the content they consume.

11:24
All of these efforts are wonderful, but they're working in silos, and many of them are woefully underfunded. There are also hundreds of very smart people working inside these companies, but again, these efforts can feel disjointed, because they're actually developing different solutions to the same problems.

11:41
How can we find a way to bring people together in one physical location for days or weeks at a time, so they can actually tackle these problems together but from their different perspectives?

11:51
So can we do this? Can we build out a coordinated, ambitious response, one that matches the scale and the complexity of the problem? I really think we can.

12:01
Together, let's rebuild our information commons.

12:04
Thank you.

12:06
(Applause)
์ด ์›น์‚ฌ์ดํŠธ ์ •๋ณด

์ด ์‚ฌ์ดํŠธ๋Š” ์˜์–ด ํ•™์Šต์— ์œ ์šฉํ•œ YouTube ๋™์˜์ƒ์„ ์†Œ๊ฐœํ•ฉ๋‹ˆ๋‹ค. ์ „ ์„ธ๊ณ„ ์ตœ๊ณ ์˜ ์„ ์ƒ๋‹˜๋“ค์ด ๊ฐ€๋ฅด์น˜๋Š” ์˜์–ด ์ˆ˜์—…์„ ๋ณด๊ฒŒ ๋  ๊ฒƒ์ž…๋‹ˆ๋‹ค. ๊ฐ ๋™์˜์ƒ ํŽ˜์ด์ง€์— ํ‘œ์‹œ๋˜๋Š” ์˜์–ด ์ž๋ง‰์„ ๋”๋ธ” ํด๋ฆญํ•˜๋ฉด ๊ทธ๊ณณ์—์„œ ๋™์˜์ƒ์ด ์žฌ์ƒ๋ฉ๋‹ˆ๋‹ค. ๋น„๋””์˜ค ์žฌ์ƒ์— ๋งž์ถฐ ์ž๋ง‰์ด ์Šคํฌ๋กค๋ฉ๋‹ˆ๋‹ค. ์˜๊ฒฌ์ด๋‚˜ ์š”์ฒญ์ด ์žˆ๋Š” ๊ฒฝ์šฐ ์ด ๋ฌธ์˜ ์–‘์‹์„ ์‚ฌ์šฉํ•˜์—ฌ ๋ฌธ์˜ํ•˜์‹ญ์‹œ์˜ค.

https://forms.gle/WvT1wiN1qDtmnspy7