When AI Can Fake Reality, Who Can You Trust? | Sam Gregory | TED

136,884 views · 2023-12-26

TED


์•„๋ž˜ ์˜๋ฌธ์ž๋ง‰์„ ๋”๋ธ”ํด๋ฆญํ•˜์‹œ๋ฉด ์˜์ƒ์ด ์žฌ์ƒ๋ฉ๋‹ˆ๋‹ค.

Translation: JISU BANG · Review: Hyunjin Lee
00:03
It's getting harder, isn't it, to spot real from fake, AI-generated from human-generated. With generative AI, along with other advances in deep fakery, it doesn't take many seconds of your voice, many images of your face, to fake you, and the realism keeps increasing.
00:21
I first started working on deepfakes in 2017, when the threat to our trust in information was overhyped, and the big harm, in reality, was falsified sexual images. Now that problem keeps growing, harming women and girls worldwide.
00:38
But also, with advances in generative AI, we're now also approaching a world where it's broadly easier to make fake reality, but also to dismiss reality as possibly faked.
00:50
Now, deceptive and malicious audiovisual AI is not the root of our societal problems, but it's likely to contribute to them. Audio clones are proliferating in a range of electoral contexts. “Is it, isn't it” claims cloud human-rights evidence from war zones, sexual deepfakes target women in public and in private, and synthetic avatars impersonate news anchors.
01:16
I lead WITNESS. We're a human-rights group that helps people use video and technology to protect and defend their rights. And for the last five years, we've coordinated a global effort, "Prepare, Don't Panic," around these new ways to manipulate and synthesize reality, and on how to fortify the truth of critical frontline journalists and human-rights defenders.
01:37
Now, one element in that is a deepfakes rapid-response task force, made up of media-forensics experts and companies who donate their time and skills to debunk deepfakes and claims of deepfakes.
01:50
The task force recently received three audio clips, from Sudan, West Africa and India. People were claiming that the clips were deepfaked, not real.
02:01
In the Sudan case, experts used a machine-learning algorithm trained on over a million examples of synthetic speech to prove, almost without a shadow of a doubt, that it was authentic.
02:11
In the West Africa case, they couldn't reach a definitive conclusion because of the challenges of analyzing audio from Twitter, and with background noise.
02:20
The third clip was leaked audio of a politician from India. Nilesh Christopher of “Rest of World” brought the case to the task force. The experts used almost an hour of samples to develop a personalized model of the politician's authentic voice. Despite his loud and fast claims that it was all falsified with AI, experts concluded that it at least was partially real, not AI.
02:44
As you can see, even experts cannot rapidly and conclusively separate true from false, and the ease of calling "that's deepfaked" on something real is increasing.
02:57
The future is full of profound challenges, both in protecting the real and detecting the fake.
03:03
We're already seeing the warning signs of this challenge of discerning fact from fiction. Audio and video deepfakes have targeted politicians, major political leaders in the EU, Turkey and Mexico, and US mayoral candidates. Political ads are incorporating footage of events that never happened, and people are sharing AI-generated imagery from crisis zones, claiming it to be real.
03:27
Now, again, this problem is not entirely new. The human-rights defenders and journalists I work with are used to having their stories dismissed, and they're used to widespread, deceptive, shallow fakes, videos and images taken from one context or time or place and claimed as if they're in another, used to share confusion and spread disinformation. And of course, we live in a world that is full of partisanship and plentiful confirmation bias.
03:57
Given all that, the last thing we need is a diminishing baseline of the shared, trustworthy information upon which democracies thrive, where the specter of AI is used to plausibly believe things you want to believe, and plausibly deny things you want to ignore.
04:15
But I think there's a way we can prevent that future, if we act now; that if we "Prepare, Don't Panic," we'll kind of make our way through this somehow.
04:25
Panic won't serve us well. [It] plays into the hands of governments and corporations who will abuse our fears, and into the hands of people who want a fog of confusion and will use AI as an excuse.
04:40
How many people were taken in, just for a minute, by the Pope in his dripped-out puffer jacket? You can admit it.
(Laughter)
04:47
More seriously, how many of you know someone who's been scammed by an audio that sounds like their kid?
04:54
And for those of you who are thinking "I wasn't taken in, I know how to spot a deepfake," any tip you know now is already outdated.
05:02
Deepfakes didn't blink, they do now. Six-fingered hands were more common in deepfake land than real life -- not so much.
05:11
Technical advances erase those visible and audible clues that we so desperately want to hang on to as proof we can discern real from fake. But it also really shouldn't be on us to make that guess without any help.
05:24
Between real deepfakes and claimed deepfakes, we need big-picture, structural solutions. We need robust foundations that enable us to discern authentic from simulated, tools to fortify the credibility of critical voices and images, and powerful detection technology that doesn't raise more doubts than it fixes.
05:45
There are three steps we need to take to get to that future. Step one is to ensure that the detection skills and tools are in the hands of the people who need them.
05:54
I've talked to hundreds of journalists, community leaders and human-rights defenders, and they're in the same boat as you and me and us. They're listening to the audio, trying to think, "Can I spot a glitch?" Looking at the image, saying, "Oh, does that look right or not?" Or maybe they're going online to find a detector.
06:12
And the detector they find, they don't know whether they're getting a false positive, a false negative, or a reliable result.
06:18
Here's an example. I used a detector, which got the Pope in the puffer jacket right. But then, when I put in the Easter bunny image that I made for my kids, it said that it was human-generated.
06:30
This is because of some big challenges in deepfake detection. Detection tools often only work on one single way to make a deepfake, so you need multiple tools, and they don't work well on low-quality social media content. Confidence score, 0.76-0.87, how do you know whether that's reliable, if you don't know if the underlying technology is reliable, or whether it works on the manipulation that is being used? And tools to spot an AI manipulation don't spot a manual edit.
07:00
These tools also won't be available to everyone. There's a trade-off between security and access, which means if we make them available to anyone, they become useless to everybody, because the people designing the new deception techniques will test them on the publicly available detectors and evade them.
07:20
But we do need to make sure these are available to the journalists, the community leaders, the election officials, globally, who are our first line of defense, thought through with attention to real-world accessibility and use.
07:32
Though at the best circumstances, detection tools will be 85 to 95 percent effective, they have to be in the hands of that first line of defense, and they're not, right now.
07:43
So for step one, I've been talking about detection after the fact. Step two -- AI is going to be everywhere in our communication, creating, changing, editing. It's not going to be a simple binary of "yes, it's AI" or "phew, it's not." AI is part of all of our communication, so we need to better understand the recipe of what we're consuming.
08:06
Some people call this content provenance and disclosure. Technologists have been building ways to add invisible watermarking to AI-generated media. They've also been designing ways -- and I've been part of these efforts -- within a standard called the C2PA, to add cryptographically signed metadata to files.
08:24
This means data that provides details about the content, cryptographically signed in a way that reinforces our trust in that information. It's an updating record of how AI was used to create or edit it, where humans and other technologies were involved, and how it was distributed. It's basically a recipe and serving instructions for the mix of AI and human that's in what you're seeing and hearing. And it's a critical part of a new AI-infused media literacy.
08:57
And this actually shouldn't sound that crazy. Our communication is moving in this direction already. If you're like me -- you can admit it -- you browse your TikTok “For You” page, and you're used to seeing videos that have an audio source, an AI filter, a green screen, a background, a stitch with another edit.
09:19
in some of the major platforms we use today.
185
559055
2377
์ฃผ์š” ํ”Œ๋žซํผ๋“ค์ด ๊ฐ€์ง„ ํˆฌ๋ช…์„ฑ์˜ ์•ŒํŒŒ ๋ฒ„์ „์ด๋ผ๊ณ  ํ•  ์ˆ˜ ์žˆ์Šต๋‹ˆ๋‹ค.
09:21
It's just that it does not yet travel across the internet,
186
561474
2753
๋‹จ์ง€ ์•„์ง ์ธํ„ฐ๋„ท์„ ํ†ตํ•ด ์ „์†ก๋˜์ง€ ์•Š์•˜๋‹ค๋Š” ๊ฒƒ ๋ฟ์ž…๋‹ˆ๋‹ค.
09:24
itโ€™s not reliable, updatable, and itโ€™s not secure.
187
564268
3128
์‹ ๋ขฐํ•  ์ˆ˜๋„, ์—…๋ฐ์ดํŠธ ํ•  ์ˆ˜๋„ ์—†๊ณ  ์•ˆ์ „ํ•˜์ง€๋„ ์•Š์Šต๋‹ˆ๋‹ค.
09:27
Now, there are also big challenges in this type of infrastructure for authenticity. As we create these durable signs of how AI and human were mixed, that carry across the trajectory of how media is made, we need to ensure they don't compromise privacy or backfire globally. We have to get this right.
09:48
We can't oblige a citizen journalist filming in a repressive context or a satirical maker using novel gen-AI tools to parody the powerful ... to have to disclose their identity or personally identifiable information in order to use their camera or ChatGPT. Because it's important they be able to retain their ability to have anonymity, at the same time as the tool to create is transparent. This needs to be about the how of AI-human media making, not the who.
10:22
This brings me to the final step. None of this works without a pipeline of responsibility that runs from the foundation models and the open-source projects through to the way that is deployed into systems, APIs and apps, to the platforms where we consume media and communicate.
10:43
I've spent much of the last 15 years fighting, essentially, a rearguard action, like so many of my colleagues in the human rights world, against the failures of social media. We can't make those mistakes again in this next generation of technology.
10:59
What this means is that governments need to ensure that within this pipeline of responsibility for AI, there is transparency, accountability and liability.
11:10
Without these three steps -- detection for the people who need it most, provenance that is rights-respecting, and that pipeline of responsibility -- we're going to get stuck looking in vain for the six-fingered hand, or the eyes that don't blink.
11:26
We need to take these steps. Otherwise, we risk a world where it gets easier and easier to both fake reality and dismiss reality as potentially faked.
11:36
And that is a world that the political philosopher Hannah Arendt described in these terms: "A people that no longer can believe anything cannot make up its own mind. It is deprived not only of its capacity to act but also of its capacity to think and to judge. And with such a people you can then do what you please."
11:56
That's a world I know none of us want, that I think we can prevent.
12:00
Thanks.
12:02
(Cheers and applause)
์ด ์›น์‚ฌ์ดํŠธ ์ •๋ณด

์ด ์‚ฌ์ดํŠธ๋Š” ์˜์–ด ํ•™์Šต์— ์œ ์šฉํ•œ YouTube ๋™์˜์ƒ์„ ์†Œ๊ฐœํ•ฉ๋‹ˆ๋‹ค. ์ „ ์„ธ๊ณ„ ์ตœ๊ณ ์˜ ์„ ์ƒ๋‹˜๋“ค์ด ๊ฐ€๋ฅด์น˜๋Š” ์˜์–ด ์ˆ˜์—…์„ ๋ณด๊ฒŒ ๋  ๊ฒƒ์ž…๋‹ˆ๋‹ค. ๊ฐ ๋™์˜์ƒ ํŽ˜์ด์ง€์— ํ‘œ์‹œ๋˜๋Š” ์˜์–ด ์ž๋ง‰์„ ๋”๋ธ” ํด๋ฆญํ•˜๋ฉด ๊ทธ๊ณณ์—์„œ ๋™์˜์ƒ์ด ์žฌ์ƒ๋ฉ๋‹ˆ๋‹ค. ๋น„๋””์˜ค ์žฌ์ƒ์— ๋งž์ถฐ ์ž๋ง‰์ด ์Šคํฌ๋กค๋ฉ๋‹ˆ๋‹ค. ์˜๊ฒฌ์ด๋‚˜ ์š”์ฒญ์ด ์žˆ๋Š” ๊ฒฝ์šฐ ์ด ๋ฌธ์˜ ์–‘์‹์„ ์‚ฌ์šฉํ•˜์—ฌ ๋ฌธ์˜ํ•˜์‹ญ์‹œ์˜ค.

https://forms.gle/WvT1wiN1qDtmnspy7