How deepfakes undermine truth and threaten democracy | Danielle Citron

87,364 views

2019-10-07 ・ TED



Translator: Jungmin Hwang    Reviewer: Jihyeon J. Kim
00:12
[This talk contains mature content]
00:17
Rana Ayyub is a journalist in India
00:20
whose work has exposed government corruption
00:24
and human rights violations.
00:26
And over the years,
00:28
she's gotten used to vitriol and controversy around her work.
00:32
But none of it could have prepared her for what she faced in April 2018.
00:38
She was sitting in a cafΓ© with a friend when she first saw it:
00:41
a two-minute, 20-second video of her engaged in a sex act.
00:47
And she couldn't believe her eyes.
00:49
She had never made a sex video.
00:52
But unfortunately, thousands upon thousands of people
00:55
would believe it was her.
00:58
I interviewed Ms. Ayyub about three months ago,
01:01
in connection with my book on sexual privacy.
01:04
I'm a law professor, lawyer and civil rights advocate.
01:08
So it's incredibly frustrating knowing that right now,
01:12
law could do very little to help her.
01:15
And as we talked,
01:17
she explained that she should have seen the fake sex video coming.
01:22
She said, "After all, sex is so often used to demean and to shame women,
01:27
especially minority women,
01:30
and especially minority women who dare to challenge powerful men,"
01:34
as she had in her work.
01:37
The fake sex video went viral in 48 hours.
01:42
All of her online accounts were flooded with screenshots of the video,
01:47
with graphic rape and death threats
01:50
and with slurs about her Muslim faith.
01:53
Online posts suggested that she was "available" for sex.
01:58
And she was doxed,
01:59
which means that her home address and her cell phone number
02:02
were spread across the internet.
02:04
The video was shared more than 40,000 times.
02:09
Now, when someone is targeted with this kind of cybermob attack,
02:13
the harm is profound.
02:16
Rana Ayyub's life was turned upside down.
02:20
For weeks, she could hardly eat or speak.
02:23
She stopped writing and closed all of her social media accounts,
02:27
which is, you know, a tough thing to do when you're a journalist.
02:31
And she was afraid to go outside her family's home.
02:34
What if the posters made good on their threats?
02:38
The UN Council on Human Rights confirmed that she wasn't being crazy.
02:42
It issued a public statement saying that they were worried about her safety.
02:48
What Rana Ayyub faced was a deepfake:
02:53
machine-learning technology
02:55
that manipulates or fabricates audio and video recordings
02:59
to show people doing and saying things
03:02
that they never did or said.
03:04
Deepfakes appear authentic and realistic, but they're not;
03:08
they're total falsehoods.
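For readers who want a concrete picture of the machine-learning technique being described: the early desktop face-swap tools were built around a simple autoencoder with one shared encoder and a separate decoder per identity. The PyTorch sketch below illustrates that idea under assumed layer sizes and names; it is not code from the talk or from any particular tool.

```python
# Minimal sketch of the classic face-swap autoencoder: one shared encoder,
# one decoder per identity. All sizes and names here are illustrative assumptions.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Compresses a 64x64 RGB face crop into a latent code shared by both identities."""
    def __init__(self, latent_dim: int = 256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),    # 64 -> 32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),   # 32 -> 16
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),  # 16 -> 8
            nn.Flatten(),
            nn.Linear(128 * 8 * 8, latent_dim),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Reconstructs a face of ONE identity from the shared latent code."""
    def __init__(self, latent_dim: int = 256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 128 * 8 * 8)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),   # 8 -> 16
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),    # 16 -> 32
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),  # 32 -> 64
        )

    def forward(self, z):
        return self.net(self.fc(z).view(-1, 128, 8, 8))

encoder = Encoder()
decoder_a = Decoder()  # trained only on face crops of person A
decoder_b = Decoder()  # trained only on face crops of person B
loss_fn = nn.L1Loss()
opt = torch.optim.Adam(
    list(encoder.parameters()) + list(decoder_a.parameters()) + list(decoder_b.parameters()),
    lr=1e-4,
)

def train_step(faces_a: torch.Tensor, faces_b: torch.Tensor) -> float:
    """One optimization step: each decoder learns to reconstruct its own identity."""
    opt.zero_grad()
    loss = loss_fn(decoder_a(encoder(faces_a)), faces_a) + \
           loss_fn(decoder_b(encoder(faces_b)), faces_b)
    loss.backward()
    opt.step()
    return loss.item()

# The swap: encode a frame of person A, then decode it with person B's decoder,
# producing B's appearance with A's pose and expression.
with torch.no_grad():
    frame_a = torch.rand(1, 3, 64, 64)  # stand-in for a real face crop
    fake_frame = decoder_b(encoder(frame_a))
```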
03:11
Although the technology is still developing in its sophistication,
03:15
it is widely available.
03:17
Now, the most recent attention to deepfakes arose,
03:20
as so many things do online,
03:22
with pornography.
03:24
In early 2018,
03:26
someone posted a tool on Reddit
03:29
to allow users to insert faces into porn videos.
03:33
And what followed was a cascade of fake porn videos
03:37
featuring people's favorite female celebrities.
03:40
And today, you can go on YouTube and pull up countless tutorials
03:44
with step-by-step instructions
03:46
on how to make a deepfake on your desktop application.
03:50
And soon we may be even able to make them on our cell phones.
03:55
Now, it's the interaction of some of our most basic human frailties
04:00
and network tools
04:02
that can turn deepfakes into weapons.
04:04
So let me explain.
04:06
As human beings, we have a visceral reaction to audio and video.
04:11
We believe they're true,
04:13
on the notion that of course you can believe
04:15
what your eyes and ears are telling you.
04:18
And it's that mechanism
04:20
that might undermine our shared sense of reality.
04:23
Although we believe deepfakes to be true, they're not.
04:27
And we're attracted to the salacious, the provocative.
04:32
We tend to believe and to share information
04:35
that's negative and novel.
04:37
And researchers have found that online hoaxes spread 10 times faster
04:42
than accurate stories.
04:46
Now, we're also drawn to information
04:50
that aligns with our viewpoints.
04:52
Psychologists call that tendency "confirmation bias."
04:57
And social media platforms supercharge that tendency,
05:01
by allowing us to instantly and widely share information
05:05
that accords with our viewpoints.
05:08
Now, deepfakes have the potential to cause grave individual and societal harm.
05:15
So, imagine a deepfake
05:17
that shows American soldiers in Afghanistan burning a Koran.
05:22
You can imagine that that deepfake would provoke violence
05:25
against those soldiers.
05:27
And what if the very next day
05:30
there's another deepfake that drops,
05:33
that shows a well-known imam based in London
05:36
praising the attack on those soldiers?
05:39
We might see violence and civil unrest,
05:42
not only in Afghanistan and the United Kingdom,
05:46
but across the globe.
05:48
And you might say to me,
05:49
"Come on, Danielle, that's far-fetched."
05:51
But it's not.
05:53
We've seen falsehoods spread
05:55
on WhatsApp and other online message services
05:58
lead to violence against ethnic minorities.
06:01
And that was just text --
06:02
imagine if it were video.
06:06
Now, deepfakes have the potential to corrode the trust that we have
06:11
in democratic institutions.
06:15
So, imagine the night before an election.
06:17
There's a deepfake showing one of the major party candidates
06:21
gravely sick.
06:23
The deepfake could tip the election
06:25
and shake our sense that elections are legitimate.
06:30
Imagine if the night before an initial public offering
06:33
of a major global bank,
06:36
there was a deepfake showing the bank's CEO
06:39
drunkenly spouting conspiracy theories.
06:42
The deepfake could tank the IPO,
06:45
and worse, shake our sense that financial markets are stable.
06:51
So deepfakes can exploit and magnify the deep distrust that we already have
06:58
in politicians, business leaders and other influential leaders.
07:02
They find an audience primed to believe them.
07:07
And the pursuit of truth is on the line as well.
07:11
Technologists expect that with advances in AI,
07:14
soon it may be difficult if not impossible
07:18
to tell the difference between a real video and a fake one.
07:23
So how can the truth emerge in a deepfake-ridden marketplace of ideas?
07:28
Will we just proceed along the path of least resistance
07:32
and believe what we want to believe,
07:34
truth be damned?
07:36
And not only might we believe the fakery,
07:40
we might start disbelieving the truth.
07:43
We've already seen people invoke the phenomenon of deepfakes
07:47
to cast doubt on real evidence of their wrongdoing.
07:51
We've heard politicians say of audio of their disturbing comments,
07:57
"Come on, that's fake news.
07:59
You can't believe what your eyes and ears are telling you."
08:04
And it's that risk
08:06
that professor Robert Chesney and I call the "liar's dividend":
08:11
the risk that liars will invoke deepfakes
08:14
to escape accountability for their wrongdoing.
08:18
So we've got our work cut out for us, there's no doubt about it.
08:22
And we're going to need a proactive solution
08:25
from tech companies, from lawmakers,
08:29
law enforcers and the media.
08:32
And we're going to need a healthy dose of societal resilience.
08:37
So now, we're right now engaged in a very public conversation
08:41
about the responsibility of tech companies.
08:44
And my advice to social media platforms
08:47
has been to change their terms of service and community guidelines
08:51
to ban deepfakes that cause harm.
08:54
That determination, that's going to require human judgment,
08:58
and it's expensive.
09:00
But we need human beings
09:02
to look at the content and context of a deepfake
09:06
to figure out if it is a harmful impersonation
09:10
or instead, if it's valuable satire, art or education.
09:16
So now, what about the law?
09:18
Law is our educator.
09:21
It teaches us about what's harmful and what's wrong.
09:25
And it shapes behavior it deters by punishing perpetrators
09:30
and securing remedies for victims.
09:33
Right now, law is not up to the challenge of deepfakes.
09:38
Across the globe,
09:39
we lack well-tailored laws
09:41
that would be designed to tackle digital impersonations
09:45
that invade sexual privacy,
09:47
that damage reputations
09:49
and that cause emotional distress.
09:51
What happened to Rana Ayyub is increasingly commonplace.
09:56
Yet, when she went to law enforcement in Delhi,
09:58
she was told nothing could be done.
10:01
And the sad truth is that the same would be true
10:04
in the United States and in Europe.
10:07
So we have a legal vacuum that needs to be filled.
10:12
My colleague Dr. Mary Anne Franks and I are working with US lawmakers
10:16
to devise legislation that would ban harmful digital impersonations
10:21
that are tantamount to identity theft.
10:24
And we've seen similar moves
10:26
in Iceland, the UK and Australia.
10:30
But of course, that's just a small piece of the regulatory puzzle.
10:34
Now, I know law is not a cure-all. Right?
10:38
It's a blunt instrument.
10:40
And we've got to use it wisely.
10:42
It also has some practical impediments.
10:45
You can't leverage law against people you can't identify and find.
10:51
And if a perpetrator lives outside the country
10:54
where a victim lives,
10:56
then you may not be able to insist
10:58
that the perpetrator come into local courts
11:00
to face justice.
11:02
And so we're going to need a coordinated international response.
11:07
Education has to be part of our response as well.
11:11
Law enforcers are not going to enforce laws
11:15
they don't know about
11:17
and proffer problems they don't understand.
11:20
In my research on cyberstalking,
11:22
I found that law enforcement lacked the training
11:26
to understand the laws available to them
11:28
and the problem of online abuse.
11:31
And so often they told victims,
11:33
"Just turn your computer off. Ignore it. It'll go away."
11:38
And we saw that in Rana Ayyub's case.
11:41
She was told, "Come on, you're making such a big deal about this.
11:44
It's boys being boys."
11:47
And so we need to pair new legislation with efforts at training.
11:54
And education has to be aimed on the media as well.
11:58
Journalists need educating about the phenomenon of deepfakes
12:02
so they don't amplify and spread them.
12:06
And this is the part where we're all involved.
12:08
Each and every one of us needs educating.
12:13
We click, we share, we like, and we don't even think about it.
12:17
We need to do better.
12:19
We need far better radar for fakery.
12:25
So as we're working through these solutions,
12:29
there's going to be a lot of suffering to go around.
12:33
Rana Ayyub is still wrestling with the fallout.
12:36
She still doesn't feel free to express herself on- and offline.
12:41
And as she told me,
12:42
she still feels like there are thousands of eyes on her naked body,
12:48
even though, intellectually, she knows it wasn't her body.
12:52
And she has frequent panic attacks,
12:54
especially when someone she doesn't know tries to take her picture.
12:58
"What if they're going to make another deepfake?" she thinks to herself.
13:03
And so for the sake of individuals like Rana Ayyub
13:07
and the sake of our democracy,
13:09
we need to do something right now.
13:11
Thank you.
13:12
(Applause)
이 μ›Ήμ‚¬μ΄νŠΈ 정보

이 μ‚¬μ΄νŠΈλŠ” μ˜μ–΄ ν•™μŠ΅μ— μœ μš©ν•œ YouTube λ™μ˜μƒμ„ μ†Œκ°œν•©λ‹ˆλ‹€. μ „ 세계 졜고의 μ„ μƒλ‹˜λ“€μ΄ κ°€λ₯΄μΉ˜λŠ” μ˜μ–΄ μˆ˜μ—…μ„ 보게 될 κ²ƒμž…λ‹ˆλ‹€. 각 λ™μ˜μƒ νŽ˜μ΄μ§€μ— ν‘œμ‹œλ˜λŠ” μ˜μ–΄ μžλ§‰μ„ 더블 ν΄λ¦­ν•˜λ©΄ κ·Έκ³³μ—μ„œ λ™μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€. λΉ„λ””μ˜€ μž¬μƒμ— 맞좰 μžλ§‰μ΄ μŠ€ν¬λ‘€λ©λ‹ˆλ‹€. μ˜κ²¬μ΄λ‚˜ μš”μ²­μ΄ μžˆλŠ” 경우 이 문의 양식을 μ‚¬μš©ν•˜μ—¬ λ¬Έμ˜ν•˜μ‹­μ‹œμ˜€.

https://forms.gle/WvT1wiN1qDtmnspy7