How we can protect truth in the age of misinformation | Sinan Aral

216,072 views · 2020-01-16

TED


์•„๋ž˜ ์˜๋ฌธ์ž๋ง‰์„ ๋”๋ธ”ํด๋ฆญํ•˜์‹œ๋ฉด ์˜์ƒ์ด ์žฌ์ƒ๋ฉ๋‹ˆ๋‹ค.

00:00
Translator: Ivana Korom
Reviewer: Krystian Aparta

00:13
So, on April 23 of 2013, the Associated Press put out the following tweet on Twitter. It said, "Breaking news: Two explosions at the White House and Barack Obama has been injured." This tweet was retweeted 4,000 times in less than five minutes, and it went viral thereafter.

00:40
Now, this tweet wasn't real news put out by the Associated Press. In fact it was false news, or fake news, that was propagated by Syrian hackers who had infiltrated the Associated Press Twitter handle.

00:56
Their purpose was to disrupt society, but they disrupted much more, because automated trading algorithms immediately seized on the sentiment of this tweet and began trading based on the potential that the president of the United States had been injured or killed in this explosion. And as they started tweeting, they immediately sent the stock market crashing, wiping out 140 billion dollars in equity value in a single day.
01:25
Robert Mueller, special counsel prosecutor in the United States, issued indictments against three Russian companies and 13 Russian individuals on a conspiracy to defraud the United States by meddling in the 2016 presidential election.

01:43
And what this indictment tells as a story is the story of the Internet Research Agency, the shadowy arm of the Kremlin on social media. During the presidential election alone, the Internet Research Agency's efforts reached 126 million people on Facebook in the United States, issued three million individual tweets and 43 hours' worth of YouTube content. All of which was fake -- misinformation designed to sow discord in the US presidential election.
02:20
A recent study by Oxford University showed that in the recent Swedish elections, one third of all of the information spreading on social media about the election was fake or misinformation.

02:35
In addition, these types of social-media misinformation campaigns can spread what has been called "genocidal propaganda," for instance against the Rohingya in Burma, triggering mob killings in India.
02:49
We studied fake news and began studying it before it was a popular term. And we recently published the largest-ever longitudinal study of the spread of fake news online on the cover of "Science" in March of this year.

03:06
We studied all of the verified true and false news stories that ever spread on Twitter, from its inception in 2006 to 2017. And when we studied this information, we studied verified news stories that were verified by six independent fact-checking organizations. So we knew which stories were true and which stories were false. We can measure their diffusion, the speed of their diffusion, the depth and breadth of their diffusion, how many people become entangled in this information cascade and so on.
03:40
And what we did in this paper was we compared the spread of true news to the spread of false news. And here's what we found. We found that false news diffused further, faster, deeper and more broadly than the truth in every category of information that we studied, sometimes by an order of magnitude. And in fact, false political news was the most viral. It diffused further, faster, deeper and more broadly than any other type of false news.
04:09
When we saw this, we were at once worried but also curious. Why? Why does false news travel so much further, faster, deeper and more broadly than the truth?

04:20
The first hypothesis that we came up with was, "Well, maybe people who spread false news have more followers or follow more people, or tweet more often, or maybe they're more often 'verified' users of Twitter, with more credibility, or maybe they've been on Twitter longer." So we checked each one of these in turn. And what we found was exactly the opposite. False-news spreaders had fewer followers, followed fewer people, were less active, less often "verified" and had been on Twitter for a shorter period of time. And yet, false news was 70 percent more likely to be retweeted than the truth, controlling for all of these and many other factors.
05:00
So we had to come up with other explanations. And we devised what we called a "novelty hypothesis." So if you read the literature, it is well known that human attention is drawn to novelty, things that are new in the environment. And if you read the sociology literature, you know that we like to share novel information. It makes us seem like we have access to inside information, and we gain in status by spreading this kind of information.

05:29
So what we did was we measured the novelty of an incoming true or false tweet, compared to the corpus of what that individual had seen in the 60 days prior on Twitter.
05:43
But that wasn't enough, because we thought to ourselves, "Well, maybe false news is more novel in an information-theoretic sense, but maybe people don't perceive it as more novel." So to understand people's perceptions of false news, we looked at the information and the sentiment contained in the replies to true and false tweets.

06:06
And what we found was that across a bunch of different measures of sentiment -- surprise, disgust, fear, sadness, anticipation, joy and trust -- false news exhibited significantly more surprise and disgust in the replies to false tweets. And true news exhibited significantly more anticipation, joy and trust in reply to true tweets. The surprise corroborates our novelty hypothesis. This is new and surprising, and so we're more likely to share it.
06:43
At the same time, there was congressional testimony in front of both houses of Congress in the United States, looking at the role of bots in the spread of misinformation. So we looked at this too -- we used multiple sophisticated bot-detection algorithms to find the bots in our data and to pull them out. So we pulled them out, we put them back in and we compared what happens to our measurement. And what we found was that, yes indeed, bots were accelerating the spread of false news online, but they were accelerating the spread of true news at approximately the same rate. Which means bots are not responsible for the differential diffusion of truth and falsity online. We can't abdicate that responsibility, because we, humans, are responsible for that spread.
07:34
Now, everything that I have told you so far, unfortunately for all of us, is the good news. The reason is because it's about to get a whole lot worse. And two specific technologies are going to make it worse.

07:52
We are going to see the rise of a tremendous wave of synthetic media. Fake video, fake audio that is very convincing to the human eye. And this will be powered by two technologies.
08:06
The first of these is known as "generative adversarial networks." This is a machine-learning model with two networks: a discriminator, whose job it is to determine whether something is true or false, and a generator, whose job it is to generate synthetic media. So the synthetic generator generates synthetic video or audio, and the discriminator tries to tell, "Is this real or is this fake?" And in fact, it is the job of the generator to maximize the likelihood that it will fool the discriminator into thinking the synthetic video and audio that it is creating is actually true. Imagine a machine in a hyperloop, trying to get better and better at fooling us.
08:51
This, combined with the second technology, which is essentially the democratization of artificial intelligence to the people, the ability for anyone, without any background in artificial intelligence or machine learning, to deploy these kinds of algorithms to generate synthetic media, makes it ultimately so much easier to create videos.

09:14
The White House issued a false, doctored video of a journalist interacting with an intern who was trying to take his microphone. They removed frames from this video in order to make his actions seem more punchy. And when videographers and stuntmen and women were interviewed about this type of technique, they said, "Yes, we use this in the movies all the time to make our punches and kicks look more choppy and more aggressive." They then put out this video and partly used it as justification to revoke Jim Acosta, the reporter's, press pass from the White House. And CNN had to sue to have that press pass reinstated.
10:00
There are about five different paths that I can think of that we can follow to try and address some of these very difficult problems today. Each one of them has promise, but each one of them has its own challenges.

10:15
The first one is labeling. Think about it this way: when you go to the grocery store to buy food to consume, it's extensively labeled. You know how many calories it has, how much fat it contains -- and yet when we consume information, we have no labels whatsoever. What is contained in this information? Is the source credible? Where is this information gathered from? We have none of that information when we are consuming information.
10:42
That is a potential avenue, but it comes with its challenges. For instance, who gets to decide, in society, what's true and what's false? Is it the governments? Is it Facebook? Is it an independent consortium of fact-checkers? And who's checking the fact-checkers?

11:02
Another potential avenue is incentives. We know that during the US presidential election there was a wave of misinformation that came from Macedonia that didn't have any political motive but instead had an economic motive. And this economic motive existed, because false news travels so much farther, faster and more deeply than the truth, and you can earn advertising dollars as you garner eyeballs and attention with this type of information. But if we can depress the spread of this information, perhaps it would reduce the economic incentive to produce it at all in the first place.
11:40
Third, we can think about regulation, and certainly, we should think about this option. In the United States, currently, we are exploring what might happen if Facebook and others are regulated. While we should consider things like regulating political speech, labeling the fact that it's political speech, making sure foreign actors can't fund political speech, it also has its own dangers. For instance, Malaysia just instituted a six-year prison sentence for anyone found spreading misinformation. And in authoritarian regimes, these kinds of policies can be used to suppress minority opinions and to continue to extend repression.
12:24
The fourth possible option is transparency. We want to know how Facebook's algorithms work. How does the data combine with the algorithms to produce the outcomes that we see? We want them to open the kimono and show us exactly the inner workings of how Facebook is working. And if we want to know social media's effect on society, we need scientists, researchers and others to have access to this kind of information. But at the same time, we are asking Facebook to lock everything down, to keep all of the data secure. So, Facebook and the other social media platforms are facing what I call a transparency paradox. We are asking them, at the same time, to be open and transparent and, simultaneously, secure. This is a very difficult needle to thread, but they will need to thread this needle if we are to achieve the promise of social technologies while avoiding their peril.
13:24
The final thing that we could think about is algorithms and machine learning -- technology devised to root out and understand fake news, how it spreads, and to try and dampen its flow. Humans have to be in the loop of this technology, because we can never escape the fact that underlying any technological solution or approach is a fundamental ethical and philosophical question about how we define truth and falsity, to whom we give the power to define truth and falsity, which opinions are legitimate, which type of speech should be allowed and so on. Technology is not a solution for that. Ethics and philosophy is a solution for that.
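For flavor, the simplest version of such technology is a supervised classifier trained on stories that fact-checkers have already labeled. A toy sketch with invented headlines follows; note that it only automates whoever produced the labels, which is exactly the point about keeping humans in the loop.

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    # Toy labeled headlines (1 = flagged false by fact-checkers, 0 = verified true).
    headlines = [
        "shocking cure doctors don't want you to know",
        "celebrity secretly arrested, media silent",
        "senate passes budget bill after long debate",
        "local school opens new science wing",
    ]
    labels = [1, 1, 0, 0]

    clf = make_pipeline(TfidfVectorizer(), LogisticRegression())
    clf.fit(headlines, labels)
    # Probability that an unseen headline is false, under this tiny model.
    print(clf.predict_proba(["you won't believe this one weird trick"])[:, 1])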
14:10
Nearly every theory of human decision making, human cooperation and human coordination has some sense of the truth at its core. But with the rise of fake news, the rise of fake video, the rise of fake audio, we are teetering on the brink of the end of reality, where we cannot tell what is real from what is fake. And that's potentially incredibly dangerous.

14:38
We have to be vigilant in defending the truth against misinformation -- with our technologies, with our policies and, perhaps most importantly, with our own individual responsibilities, decisions, behaviors and actions.

14:57
Thank you very much.

14:59
(Applause)
์ด ์›น์‚ฌ์ดํŠธ ์ •๋ณด

์ด ์‚ฌ์ดํŠธ๋Š” ์˜์–ด ํ•™์Šต์— ์œ ์šฉํ•œ YouTube ๋™์˜์ƒ์„ ์†Œ๊ฐœํ•ฉ๋‹ˆ๋‹ค. ์ „ ์„ธ๊ณ„ ์ตœ๊ณ ์˜ ์„ ์ƒ๋‹˜๋“ค์ด ๊ฐ€๋ฅด์น˜๋Š” ์˜์–ด ์ˆ˜์—…์„ ๋ณด๊ฒŒ ๋  ๊ฒƒ์ž…๋‹ˆ๋‹ค. ๊ฐ ๋™์˜์ƒ ํŽ˜์ด์ง€์— ํ‘œ์‹œ๋˜๋Š” ์˜์–ด ์ž๋ง‰์„ ๋”๋ธ” ํด๋ฆญํ•˜๋ฉด ๊ทธ๊ณณ์—์„œ ๋™์˜์ƒ์ด ์žฌ์ƒ๋ฉ๋‹ˆ๋‹ค. ๋น„๋””์˜ค ์žฌ์ƒ์— ๋งž์ถฐ ์ž๋ง‰์ด ์Šคํฌ๋กค๋ฉ๋‹ˆ๋‹ค. ์˜๊ฒฌ์ด๋‚˜ ์š”์ฒญ์ด ์žˆ๋Š” ๊ฒฝ์šฐ ์ด ๋ฌธ์˜ ์–‘์‹์„ ์‚ฌ์šฉํ•˜์—ฌ ๋ฌธ์˜ํ•˜์‹ญ์‹œ์˜ค.

https://forms.gle/WvT1wiN1qDtmnspy7