How we can protect truth in the age of misinformation | Sinan Aral

238,833 views · 2020-01-16

TED



00:00
Translator: Ivana Korom
Reviewer: Krystian Aparta
Hebrew translation: Zeeva Livshitz (editing: Allon Sasson)

00:13
So, on April 23 of 2013, the Associated Press put out the following tweet on Twitter. It said, "Breaking news: Two explosions at the White House and Barack Obama has been injured." This tweet was retweeted 4,000 times in less than five minutes, and it went viral thereafter.

00:40
Now, this tweet wasn't real news put out by the Associated Press. In fact it was false news, or fake news, that was propagated by Syrian hackers that had infiltrated the Associated Press Twitter handle. Their purpose was to disrupt society, but they disrupted much more. Because automated trading algorithms immediately seized on the sentiment on this tweet, and began trading based on the potential that the president of the United States had been injured or killed in this explosion. And as they started tweeting, they immediately sent the stock market crashing, wiping out 140 billion dollars in equity value in a single day.
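
As a hedged illustration of the mechanism just described, sentiment-reading trading algorithms reacting to a single tweet, here is a toy Python sketch. It is not how any real trading system works; the keyword list, the "trusted source" flag and the decision rule are all illustrative assumptions.

```python
# Toy sketch of a sentiment-triggered trading rule (not any real system).
# Keyword list, trust flag and thresholds are illustrative assumptions.
NEGATIVE_TERMS = {"explosion", "explosions", "injured", "attack", "crash"}

def sentiment_score(text):
    words = {w.strip('".,:').lower() for w in text.split()}
    return -1.0 if words & NEGATIVE_TERMS else 0.0

def trade_signal(tweet, source_trusted):
    # Only act on sources the algorithm treats as credible, e.g. a news wire.
    if source_trusted and sentiment_score(tweet) < 0:
        return "SELL"
    return "HOLD"

tweet = "Breaking news: Two explosions at the White House and Barack Obama has been injured."
print(trade_signal(tweet, source_trusted=True))  # SELL -- and markets actually dropped
```
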
01:25
Robert Mueller, special counsel prosecutor in the United States, issued indictments against three Russian companies and 13 Russian individuals on a conspiracy to defraud the United States by meddling in the 2016 presidential election. And what this indictment tells as a story is the story of the Internet Research Agency, the shadowy arm of the Kremlin on social media. During the presidential election alone, the Internet Agency's efforts reached 126 million people on Facebook in the United States, issued three million individual tweets and 43 hours' worth of YouTube content. All of which was fake -- misinformation designed to sow discord in the US presidential election.

02:20
A recent study by Oxford University showed that in the recent Swedish elections, one third of all of the information spreading on social media about the election was fake or misinformation. In addition, these types of social-media misinformation campaigns can spread what has been called "genocidal propaganda," for instance against the Rohingya in Burma, triggering mob killings in India.

02:49
We studied fake news and began studying it before it was a popular term. And we recently published the largest-ever longitudinal study of the spread of fake news online on the cover of "Science" in March of this year.

03:06
We studied all of the verified true and false news stories that ever spread on Twitter, from its inception in 2006 to 2017. And when we studied this information, we studied verified news stories that were verified by six independent fact-checking organizations. So we knew which stories were true and which stories were false. We can measure their diffusion, the speed of their diffusion, the depth and breadth of their diffusion, how many people become entangled in this information cascade and so on.
03:40
And what we did in this paper was we compared the spread of true news to the spread of false news. And here's what we found. We found that false news diffused further, faster, deeper and more broadly than the truth in every category of information that we studied, sometimes by an order of magnitude. And in fact, false political news was the most viral. It diffused further, faster, deeper and more broadly than any other type of false news.

04:09
When we saw this, we were at once worried but also curious. Why? Why does false news travel so much further, faster, deeper and more broadly than the truth?

04:20
The first hypothesis that we came up with was, "Well, maybe people who spread false news have more followers or follow more people, or tweet more often, or maybe they're more often 'verified' users of Twitter, with more credibility, or maybe they've been on Twitter longer." So we checked each one of these in turn. And what we found was exactly the opposite. False-news spreaders had fewer followers, followed fewer people, were less active, less often "verified" and had been on Twitter for a shorter period of time. And yet, false news was 70 percent more likely to be retweeted than the truth, controlling for all of these and many other factors.
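
"Controlling for all of these factors" refers to a regression that holds the user characteristics fixed while estimating the effect of falsity on retweeting. Below is a minimal sketch of that kind of model on simulated data using statsmodels; the column names, simulated effect sizes and specification are illustrative assumptions, not the study's actual model.

```python
# Minimal sketch of "controlling for" user factors on simulated data
# (not the study's actual model or data).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 5000
df = pd.DataFrame({
    "is_false":    rng.integers(0, 2, n),
    "followers":   rng.lognormal(6, 1, n),
    "followees":   rng.lognormal(5, 1, n),
    "verified":    rng.integers(0, 2, n),
    "account_age": rng.uniform(0.1, 10, n),   # years on the platform
})
# Simulated outcome: false content gets an extra log-odds bump (odds ratio ~1.7).
logit_p = -1 + 0.53 * df.is_false + 0.0001 * df.followers + 0.2 * df.verified
df["retweeted"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

model = smf.logit(
    "retweeted ~ is_false + np.log(followers) + np.log(followees)"
    " + verified + account_age",
    data=df,
).fit(disp=0)

# exp(coef) on is_false is the odds ratio for false vs. true content with the
# other covariates held fixed; a value near 1.7 corresponds to roughly
# "70 percent more likely to be retweeted".
print(np.exp(model.params["is_false"]))
```
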
05:00
So we had to come up with other explanations. And we devised what we called a "novelty hypothesis." So if you read the literature, it is well known that human attention is drawn to novelty, things that are new in the environment. And if you read the sociology literature, you know that we like to share novel information. It makes us seem like we have access to inside information, and we gain in status by spreading this kind of information.

05:29
So what we did was we measured the novelty of an incoming true or false tweet, compared to the corpus of what that individual had seen in the 60 days prior on Twitter.
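
One simple way to operationalize that comparison is a distance between the incoming tweet and the user's prior 60-day corpus. The sketch below uses TF-IDF vectors and cosine distance as an illustrative stand-in; the study's actual novelty measures may differ.

```python
# Illustrative stand-in for a novelty score (not necessarily the paper's
# measure): cosine distance between an incoming tweet and everything the
# user saw in the prior 60 days, using TF-IDF vectors.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def novelty(incoming_tweet, prior_tweets):
    """1.0 = completely unlike the prior corpus, 0.0 = identical to it."""
    corpus = [" ".join(prior_tweets)] + [incoming_tweet]
    tfidf = TfidfVectorizer().fit_transform(corpus)
    return 1.0 - float(cosine_similarity(tfidf[0], tfidf[1])[0, 0])

seen = ["the senate votes on the budget today",
        "budget negotiations continue in the senate"]
print(novelty("budget vote delayed again in the senate", seen))   # low novelty
print(novelty("aliens endorse candidate in shock twist", seen))   # high novelty
```
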
05:43
But that wasn't enough, because we thought to ourselves, "Well, maybe false news is more novel in an information-theoretic sense, but maybe people don't perceive it as more novel." So to understand people's perceptions of false news, we looked at the information and the sentiment contained in the replies to true and false tweets.

06:06
And what we found was that across a bunch of different measures of sentiment -- surprise, disgust, fear, sadness, anticipation, joy and trust -- false news exhibited significantly more surprise and disgust in the replies to false tweets. And true news exhibited significantly more anticipation, joy and trust in reply to true tweets. The surprise corroborates our novelty hypothesis. This is new and surprising, and so we're more likely to share it.
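
A common way to score replies for emotions such as surprise, disgust, anticipation, joy and trust is a lexicon lookup. The sketch below shows the idea with a tiny made-up word list; a real analysis would use a full emotion lexicon (for example NRC), and the words and data here are purely illustrative.

```python
# Minimal sketch of lexicon-based emotion scoring of replies.
# The tiny lexicon is made up for illustration only.
from collections import Counter

EMOTION_LEXICON = {
    "shocking": "surprise", "unbelievable": "surprise", "wow": "surprise",
    "gross": "disgust", "disgusting": "disgust",
    "hope": "anticipation", "finally": "anticipation",
    "great": "joy", "love": "joy",
    "reliable": "trust", "confirmed": "trust",
}

def emotion_profile(replies):
    counts = Counter()
    for reply in replies:
        for word in reply.lower().split():
            emotion = EMOTION_LEXICON.get(word.strip(".,!?"))
            if emotion:
                counts[emotion] += 1
    total = sum(counts.values()) or 1
    return {emotion: n / total for emotion, n in counts.items()}

false_replies = ["Wow this is shocking!", "Unbelievable. Just gross."]
true_replies = ["Finally, confirmed by a reliable source.", "Great news, love it"]
print(emotion_profile(false_replies))  # dominated by surprise / disgust
print(emotion_profile(true_replies))   # dominated by anticipation / joy / trust
```
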
06:43
At the same time, there was congressional testimony in front of both houses of Congress in the United States, looking at the role of bots in the spread of misinformation. So we looked at this too -- we used multiple sophisticated bot-detection algorithms to find the bots in our data and to pull them out. So we pulled them out, we put them back in and we compared what happens to our measurement.
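
The "pull the bots out, put them back in" comparison amounts to recomputing the diffusion statistics on data with and without accounts flagged by a bot detector. Here is a minimal pandas sketch of that comparison; the bot scores, threshold and data are illustrative assumptions.

```python
# Sketch of recomputing a diffusion statistic with and without likely bots.
# The bot_score field is assumed to come from some external bot-detection
# model; the threshold and toy data are illustrative.
import pandas as pd

retweets = pd.DataFrame({
    "story_id":  [1, 1, 1, 2, 2, 2, 2],
    "is_false":  [True, True, True, False, False, False, False],
    "user":      ["a", "b", "c", "d", "e", "f", "g"],
    "bot_score": [0.9, 0.1, 0.2, 0.85, 0.3, 0.1, 0.05],
})

def mean_cascade_size(df):
    return df.groupby(["story_id", "is_false"]).size().groupby(level="is_false").mean()

with_bots = mean_cascade_size(retweets)
without_bots = mean_cascade_size(retweets[retweets.bot_score < 0.5])

# If the gap between false and true stories barely changes, bots are not
# driving the differential spread -- the talk's conclusion.
print(with_bots, without_bots, sep="\n\n")
```
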
07:07
And what we found was that, yes indeed, bots were accelerating the spread of false news online, but they were accelerating the spread of true news at approximately the same rate. Which means bots are not responsible for the differential diffusion of truth and falsity online. We can't abdicate that responsibility, because we, humans, are responsible for that spread.

07:34
Now, everything that I have told you so far, unfortunately for all of us, is the good news. The reason is because it's about to get a whole lot worse. And two specific technologies are going to make it worse.

07:52
We are going to see the rise of a tremendous wave of synthetic media. Fake video, fake audio that is very convincing to the human eye. And this will be powered by two technologies. The first of these is known as "generative adversarial networks." This is a machine-learning model with two networks: a discriminator, whose job it is to determine whether something is true or false, and a generator, whose job it is to generate synthetic media. So the synthetic generator generates synthetic video or audio, and the discriminator tries to tell, "Is this real or is this fake?" And in fact, it is the job of the generator to maximize the likelihood that it will fool the discriminator into thinking the synthetic video and audio that it is creating is actually true. Imagine a machine in a hyperloop, trying to get better and better at fooling us.
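
Here is a minimal PyTorch sketch of that adversarial loop: a generator learns to produce samples that a discriminator cannot distinguish from real data. The "media" here is just a one-dimensional Gaussian, purely to show the training dynamic the talk describes; real synthetic-media models are vastly larger.

```python
# Minimal GAN sketch: generator vs. discriminator on toy 1-D data.
import torch
import torch.nn as nn

torch.manual_seed(0)

generator = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 1))
discriminator = nn.Sequential(nn.Linear(1, 32), nn.ReLU(),
                              nn.Linear(32, 1), nn.Sigmoid())

g_opt = torch.optim.Adam(generator.parameters(), lr=1e-3)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=1e-3)
bce = nn.BCELoss()

for step in range(2000):
    real = torch.randn(64, 1) * 0.5 + 3.0      # "real" samples ~ N(3, 0.5)
    fake = generator(torch.randn(64, 8))        # synthetic samples

    # Discriminator: label real as 1, fake as 0.
    d_loss = bce(discriminator(real), torch.ones(64, 1)) + \
             bce(discriminator(fake.detach()), torch.zeros(64, 1))
    d_opt.zero_grad(); d_loss.backward(); d_opt.step()

    # Generator: try to make the discriminator call fakes "real".
    g_loss = bce(discriminator(fake), torch.ones(64, 1))
    g_opt.zero_grad(); g_loss.backward(); g_opt.step()

# The generator's output distribution should drift toward the "real" mean (~3.0).
print(generator(torch.randn(1000, 8)).mean().item())
```
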
08:51
This, combined with the second technology, which is essentially the democratization of artificial intelligence to the people, the ability for anyone, without any background in artificial intelligence or machine learning, to deploy these kinds of algorithms to generate synthetic media makes it ultimately so much easier to create videos.

09:14
The White House issued a false, doctored video of a journalist interacting with an intern who was trying to take his microphone. They removed frames from this video in order to make his actions seem more punchy. And when videographers and stuntmen and women were interviewed about this type of technique, they said, "Yes, we use this in the movies all the time to make our punches and kicks look more choppy and more aggressive." They then put out this video and partly used it as justification to revoke Jim Acosta, the reporter's, press pass from the White House. And CNN had to sue to have that press pass reinstated.

10:00
There are about five different paths that I can think of that we can follow to try and address some of these very difficult problems today. Each one of them has promise, but each one of them has its own challenges.

10:15
The first one is labeling. Think about it this way: when you go to the grocery store to buy food to consume, it's extensively labeled. You know how many calories it has, how much fat it contains -- and yet when we consume information, we have no labels whatsoever. What is contained in this information? Is the source credible? Where is this information gathered from? We have none of that information when we are consuming information.
10:42
That is a potential avenue, but it comes with its challenges. For instance, who gets to decide, in society, what's true and what's false? Is it the governments? Is it Facebook? Is it an independent consortium of fact-checkers? And who's checking the fact-checkers?

11:02
Another potential avenue is incentives. We know that during the US presidential election there was a wave of misinformation that came from Macedonia that didn't have any political motive but instead had an economic motive. And this economic motive existed, because false news travels so much farther, faster and more deeply than the truth, and you can earn advertising dollars as you garner eyeballs and attention with this type of information. But if we can depress the spread of this information, perhaps it would reduce the economic incentive to produce it at all in the first place.

11:40
Third, we can think about regulation, and certainly, we should think about this option. In the United States, currently, we are exploring what might happen if Facebook and others are regulated. While we should consider things like regulating political speech, labeling the fact that it's political speech, making sure foreign actors can't fund political speech, it also has its own dangers. For instance, Malaysia just instituted a six-year prison sentence for anyone found spreading misinformation. And in authoritarian regimes, these kinds of policies can be used to suppress minority opinions and to continue to extend repression.

12:24
The fourth possible option is transparency. We want to know how do Facebook's algorithms work. How does the data combine with the algorithms to produce the outcomes that we see? We want them to open the kimono and show us exactly the inner workings of how Facebook is working. And if we want to know social media's effect on society, we need scientists, researchers and others to have access to this kind of information. But at the same time, we are asking Facebook to lock everything down, to keep all of the data secure. So, Facebook and the other social media platforms are facing what I call a transparency paradox. We are asking them, at the same time, to be open and transparent and, simultaneously, secure. This is a very difficult needle to thread, but they will need to thread this needle if we are to achieve the promise of social technologies while avoiding their peril.

13:24
The final thing that we could think about is algorithms and machine learning. Technology devised to root out and understand fake news, how it spreads, and to try and dampen its flow. Humans have to be in the loop of this technology, because we can never escape that underlying any technological solution or approach is a fundamental ethical and philosophical question about how we define truth and falsity, to whom we give the power to define truth and falsity and which opinions are legitimate, which type of speech should be allowed and so on. Technology is not a solution for that. Ethics and philosophy is a solution for that.
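
As one hedged illustration of the "humans in the loop" idea above, the sketch below lets an automated falsity score act only on clear-cut cases and routes everything uncertain to human reviewers. The scoring function and thresholds are illustrative assumptions, not a real moderation system.

```python
# Minimal sketch of human-in-the-loop triage: the model decides only the
# clear-cut cases; uncertain content goes to people. All values illustrative.
def falsity_score(post):
    # Stand-in for a trained classifier; here, a trivial keyword heuristic.
    return 0.9 if "miracle cure" in post.lower() else 0.2

def triage(post, downrank_above=0.85, review_between=(0.5, 0.85)):
    score = falsity_score(post)
    if score >= downrank_above:
        return "downrank"                     # dampen its flow automatically
    if review_between[0] <= score < review_between[1]:
        return "send to human reviewer"       # a person decides, not the model
    return "leave alone"

print(triage("Doctors hate this miracle cure"))
print(triage("City council meets on Tuesday"))
```
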
14:10
Nearly every theory of human decision making, human cooperation and human coordination has some sense of the truth at its core. But with the rise of fake news, the rise of fake video, the rise of fake audio, we are teetering on the brink of the end of reality, where we cannot tell what is real from what is fake. And that's potentially incredibly dangerous.

14:38
We have to be vigilant in defending the truth against misinformation. With our technologies, with our policies and, perhaps most importantly, with our own individual responsibilities, decisions, behaviors and actions.

14:57
Thank you very much.

14:59
(Applause)
ืขืœ ืืชืจ ื–ื”

ืืชืจ ื–ื” ื™ืฆื™ื’ ื‘ืคื ื™ื›ื ืกืจื˜ื•ื ื™ YouTube ื”ืžื•ืขื™ืœื™ื ืœืœื™ืžื•ื“ ืื ื’ืœื™ืช. ืชื•ื›ืœื• ืœืจืื•ืช ืฉื™ืขื•ืจื™ ืื ื’ืœื™ืช ื”ืžื•ืขื‘ืจื™ื ืขืœ ื™ื“ื™ ืžื•ืจื™ื ืžื”ืฉื•ืจื” ื”ืจืืฉื•ื ื” ืžืจื—ื‘ื™ ื”ืขื•ืœื. ืœื—ืฅ ืคืขืžื™ื™ื ืขืœ ื”ื›ืชื•ื‘ื™ื•ืช ื‘ืื ื’ืœื™ืช ื”ืžื•ืฆื’ื•ืช ื‘ื›ืœ ื“ืฃ ื•ื™ื“ืื• ื›ื“ื™ ืœื”ืคืขื™ืœ ืืช ื”ืกืจื˜ื•ืŸ ืžืฉื. ื”ื›ืชื•ื‘ื™ื•ืช ื’ื•ืœืœื•ืช ื‘ืกื ื›ืจื•ืŸ ืขื ื”ืคืขืœืช ื”ื•ื•ื™ื“ืื•. ืื ื™ืฉ ืœืš ื”ืขืจื•ืช ืื• ื‘ืงืฉื•ืช, ืื ื ืฆื•ืจ ืื™ืชื ื• ืงืฉืจ ื‘ืืžืฆืขื•ืช ื˜ื•ืคืก ื™ืฆื™ืจืช ืงืฉืจ ื–ื”.

https://forms.gle/WvT1wiN1qDtmnspy7