Inside the fight against Russia's fake news empire | Olga Yurkova

95,334 views Β· 2018-06-29

TED


μ•„λž˜ μ˜λ¬Έμžλ§‰μ„ λ”λΈ”ν΄λ¦­ν•˜μ‹œλ©΄ μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€.

λ²ˆμ—­: μˆ˜μ—° κΉ€ κ²€ν† : Sojeong KIM
00:13
2014, July 5, the Ukrainian army entered Sloviansk city in eastern Ukraine. They gathered all the locals in Lenin Square. Then, they organized the public crucifixion of the son of a pro-Russia militant. He was only three years old. Refugee Galina Pyshnyak told this story to Russia's First TV channel.

00:42
In fact, this incident never happened. I visited Sloviansk. There is no Lenin Square. In reality, Galina's husband was an active pro-Russia militant in Donbass. This is just one of many examples.

01:01
Ukraine has been suffering from Russian propaganda and fake news for four years now, but Russia is not the only player in this space. Fake news is happening all around the world. We all know about fake news. We see it and read it all the time. But the thing about fake news is that we don't always know what is fake and what is real, but we base our decisions on facts we get from the press and social media. When facts are false, decisions are wrong.

01:38
A lot of people stop believing anyone at all, and this is even more dangerous. They easily become prey to populists in elections, or even take up arms. Fake news is not only bad for journalism. It's a threat for democracy and society.

01:58
Four years ago, unmarked soldiers entered the Crimean Peninsula, and at the same time, Russian media was going crazy with fake news about Ukraine.

02:12
So a group of journalists, including me, started a website to investigate this fake news. We called it StopFake. The idea was simple: take a piece of news, check it with verifiable proof like photos, videos and other strong evidence. If it turns out to be fake, we put it on our website.

02:36
Now, StopFake is an informational hub which analyzes propaganda in all its phases. We have 11 language versions, we have millions of views. We have taught more than 10,000 people how to distinguish true from false. And we teach fact checkers all around the world.

03:00
StopFake has uncovered more than 1,000 fakes about Ukraine. We've identified 18 narratives created using this fake news, such as Ukraine is a fascist state, a failed state, a state run by a junta who came to power as a result of a coup d'Γ©tat. We proved that it's not bad journalism; it's a deliberate act of misinformation.

03:29
Fake news is a powerful weapon in information warfare, but there is something we can do about it.

03:37
We all have smartphones. When we see something interesting, it's often automatic. We just click and pass it along. But how can you not be a part of fake news?

03:48
First, if it's too dramatic, too emotional, too clickbait, then it's very likely that it isn't true. The truth is boring sometimes.

04:02
(Laughter)

04:04
Manipulations are always sexy. They are designed to captivate you.

04:10
Do your research. This is the second point, very simple. Look at other sites. Check out alternative news sources. Google names, addresses, license plates, experts and authors. Don't just believe, check. It's the only way to stop this culture of fake news.

04:31
This information warfare is not only about fake news. Our society depends on trust: trust in our institutions, in science, trust in our leaders, trust in our news outlets. And it's on us to find a way to rebuild trust, because fake news destroys it.

04:55
So ask yourself, what have you lost your faith in? Where has trust been ruined for you? And what are you going to do about it?

05:08
Thank you.

05:09
(Applause)
이 μ›Ήμ‚¬μ΄νŠΈ 정보

이 μ‚¬μ΄νŠΈλŠ” μ˜μ–΄ ν•™μŠ΅μ— μœ μš©ν•œ YouTube λ™μ˜μƒμ„ μ†Œκ°œν•©λ‹ˆλ‹€. μ „ 세계 졜고의 μ„ μƒλ‹˜λ“€μ΄ κ°€λ₯΄μΉ˜λŠ” μ˜μ–΄ μˆ˜μ—…μ„ 보게 될 κ²ƒμž…λ‹ˆλ‹€. 각 λ™μ˜μƒ νŽ˜μ΄μ§€μ— ν‘œμ‹œλ˜λŠ” μ˜μ–΄ μžλ§‰μ„ 더블 ν΄λ¦­ν•˜λ©΄ κ·Έκ³³μ—μ„œ λ™μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€. λΉ„λ””μ˜€ μž¬μƒμ— 맞좰 μžλ§‰μ΄ μŠ€ν¬λ‘€λ©λ‹ˆλ‹€. μ˜κ²¬μ΄λ‚˜ μš”μ²­μ΄ μžˆλŠ” 경우 이 문의 양식을 μ‚¬μš©ν•˜μ—¬ λ¬Έμ˜ν•˜μ‹­μ‹œμ˜€.

https://forms.gle/WvT1wiN1qDtmnspy7