When AI Can Fake Reality, Who Can You Trust? | Sam Gregory | TED

132,775 views · 2023-12-26

TED



ืชืจื’ื•ื: zeeva livshitz ืขืจื™ื›ื”: aknv tso
00:03
It's getting harder, isn't it, to spot real from fake, AI-generated from human-generated.

00:10
With generative AI, along with other advances in deep fakery, it doesn't take many seconds of your voice, many images of your face, to fake you, and the realism keeps increasing.

00:21
I first started working on deepfakes in 2017, when the threat to our trust in information was overhyped, and the big harm, in reality, was falsified sexual images. Now that problem keeps growing, harming women and girls worldwide.

00:38
But also, with advances in generative AI, we're now also approaching a world where it's broadly easier to make fake reality, but also to dismiss reality as possibly faked.

00:50
Now, deceptive and malicious audiovisual AI is not the root of our societal problems, but it's likely to contribute to them.
00:59
Audio clones are proliferating in a range of electoral contexts. "Is it, isn't it" claims cloud human-rights evidence from war zones, sexual deepfakes target women in public and in private, and synthetic avatars impersonate news anchors.

01:16
I lead WITNESS. We're a human-rights group that helps people use video and technology to protect and defend their rights. And for the last five years, we've coordinated a global effort, "Prepare, Don't Panic," around these new ways to manipulate and synthesize reality, and on how to fortify the truth of critical frontline journalists and human-rights defenders.

01:37
Now, one element in that is a deepfakes rapid-response task force, made up of media-forensics experts and companies who donate their time and skills to debunk deepfakes and claims of deepfakes.

01:50
The task force recently received three audio clips, from Sudan, West Africa and India. People were claiming that the clips were deepfaked, not real.

02:01
In the Sudan case, experts used a machine-learning algorithm trained on over a million examples of synthetic speech to prove, almost without a shadow of a doubt, that it was authentic.
02:11
In the West Africa case, they couldn't reach a definitive conclusion because of the challenges of analyzing audio from Twitter, and with background noise.

02:20
The third clip was leaked audio of a politician from India. Nilesh Christopher of "Rest of World" brought the case to the task force. The experts used almost an hour of samples to develop a personalized model of the politician's authentic voice. Despite his loud and fast claims that it was all falsified with AI, experts concluded that it at least was partially real, not AI.

02:44
As you can see, even experts cannot rapidly and conclusively separate true from false, and the ease of calling "that's deepfaked" on something real is increasing.

02:57
The future is full of profound challenges, both in protecting the real and detecting the fake.

03:03
We're already seeing the warning signs of this challenge of discerning fact from fiction. Audio and video deepfakes have targeted politicians, major political leaders in the EU, Turkey and Mexico, and US mayoral candidates. Political ads are incorporating footage of events that never happened, and people are sharing AI-generated imagery from crisis zones, claiming it to be real.

03:27
Now, again, this problem is not entirely new. The human-rights defenders and journalists I work with are used to having their stories dismissed, and they're used to widespread, deceptive, shallow fakes, videos and images taken from one context or time or place and claimed as if they're in another, used to share confusion and spread disinformation. And of course, we live in a world that is full of partisanship and plentiful confirmation bias.
03:57
Given all that, the last thing we need is a diminishing baseline of the shared, trustworthy information upon which democracies thrive, where the specter of AI is used to plausibly believe things you want to believe, and plausibly deny things you want to ignore.

04:15
But I think there's a way we can prevent that future, if we act now; that if we "Prepare, Don't Panic," we'll kind of make our way through this somehow.

04:25
Panic won't serve us well. [It] plays into the hands of governments and corporations who will abuse our fears, and into the hands of people who want a fog of confusion and will use AI as an excuse.

04:40
How many people were taken in, just for a minute, by the Pope in his dripped-out puffer jacket? You can admit it.

04:46
(Laughter)

04:47
More seriously, how many of you know someone who's been scammed by an audio that sounds like their kid?

04:54
And for those of you who are thinking "I wasn't taken in, I know how to spot a deepfake," any tip you know now is already outdated. Deepfakes didn't blink, they do now. Six-fingered hands were more common in deepfake land than real life -- not so much.

05:11
Technical advances erase those visible and audible clues that we so desperately want to hang on to as proof we can discern real from fake.

05:20
But it also really shouldn't be on us to make that guess without any help. Between real deepfakes and claimed deepfakes, we need big-picture, structural solutions. We need robust foundations that enable us to discern authentic from simulated, tools to fortify the credibility of critical voices and images, and powerful detection technology that doesn't raise more doubts than it fixes.
05:45
There are three steps we need to take to get to that future.

05:48
Step one is to ensure that the detection skills and tools are in the hands of the people who need them. I've talked to hundreds of journalists, community leaders and human-rights defenders, and they're in the same boat as you and me and us. They're listening to the audio, trying to think, "Can I spot a glitch?" Looking at the image, saying, "Oh, does that look right or not?" Or maybe they're going online to find a detector. And the detector they find, they don't know whether they're getting a false positive, a false negative, or a reliable result.

06:18
Here's an example. I used a detector, which got the Pope in the puffer jacket right. But then, when I put in the Easter bunny image that I made for my kids, it said that it was human-generated.

06:30
This is because of some big challenges in deepfake detection. Detection tools often only work on one single way to make a deepfake, so you need multiple tools, and they don't work well on low-quality social media content. Confidence score, 0.76-0.87: how do you know whether that's reliable, if you don't know if the underlying technology is reliable, or whether it works on the manipulation that is being used? And tools to spot an AI manipulation don't spot a manual edit.
07:00
These tools also won't be available to everyone. There's a trade-off between security and access, which means if we make them available to anyone, they become useless to everybody, because the people designing the new deception techniques will test them on the publicly available detectors and evade them.

07:20
But we do need to make sure these are available to the journalists, the community leaders, the election officials, globally, who are our first line of defense, thought through with attention to real-world accessibility and use. Though in the best circumstances detection tools will be 85 to 95 percent effective, they have to be in the hands of that first line of defense, and they're not, right now.
07:43
So for step one, I've been talking about detection after the fact. Step two -- AI is going to be everywhere in our communication, creating, changing, editing. It's not going to be a simple binary of "yes, it's AI" or "phew, it's not." AI is part of all of our communication, so we need to better understand the recipe of what we're consuming.

08:06
Some people call this content provenance and disclosure. Technologists have been building ways to add invisible watermarking to AI-generated media.
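As a toy illustration of the watermarking idea only, not of any production scheme (real AI watermarks are statistical signals designed to survive compression and editing), a short identifier can be hidden in the least-significant bits of an image's pixels with Pillow:

# Toy least-significant-bit watermark (illustrative, easily destroyed by
# re-encoding; production watermarks are far more robust).
from PIL import Image

def embed(in_path, out_path, message):
    bits = "".join(f"{byte:08b}" for byte in message.encode()) + "00000000"
    img = Image.open(in_path).convert("RGB")
    flat = [channel for pixel in img.getdata() for channel in pixel]
    assert len(bits) <= len(flat), "image too small for this message"
    for i, bit in enumerate(bits):
        flat[i] = (flat[i] & ~1) | int(bit)           # overwrite the lowest bit
    img.putdata([tuple(flat[i:i + 3]) for i in range(0, len(flat), 3)])
    img.save(out_path, "PNG")                          # lossless, so the bits survive

def extract(path):
    flat = [channel for pixel in Image.open(path).convert("RGB").getdata() for channel in pixel]
    message = bytearray()
    for i in range(0, len(flat) - 7, 8):
        byte = int("".join(str(channel & 1) for channel in flat[i:i + 8]), 2)
        if byte == 0:
            break
        message.append(byte)
    return message.decode()

embed("generated.png", "marked.png", "ai-generated:model-x")   # hypothetical files and label
print(extract("marked.png"))                                   # -> ai-generated:model-x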
08:15
They've also been designing ways -- and I've been part of these efforts -- within a standard called the C2PA, to add cryptographically signed metadata to files.

08:24
This means data that provides details about the content, cryptographically signed in a way that reinforces our trust in that information. It's an updating record of how AI was used to create or edit it, where humans and other technologies were involved, and how it was distributed. It's basically a recipe and serving instructions for the mix of AI and human that's in what you're seeing and hearing. And it's a critical part of a new AI-infused media literacy.
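The C2PA specification itself is considerably more involved, but the core idea, a machine-readable "recipe" bound to the file by a digital signature, can be sketched in a few lines (hypothetical manifest fields, using an Ed25519 key from the cryptography library):

# Sketch of a signed provenance manifest. The real C2PA standard embeds signed,
# chained assertions inside the media file; this only illustrates the principle.
import hashlib
import json
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

media_bytes = open("clip.mp4", "rb").read()   # hypothetical media file
manifest = {
    "asset_sha256": hashlib.sha256(media_bytes).hexdigest(),
    "captured_with": "camera",
    "ai_edits": [{"tool": "speech-enhancer", "action": "denoise"}],
    "human_edits": [{"action": "trim"}],
    "distributed_via": ["platform-x"],
}

signing_key = Ed25519PrivateKey.generate()
payload = json.dumps(manifest, sort_keys=True).encode()
signature = signing_key.sign(payload)

# Anyone holding the matching public key can check that neither the recipe nor
# the file it points to has been quietly altered.
signing_key.public_key().verify(signature, payload)   # raises InvalidSignature on tampering
print("manifest verified")

Note that the manifest describes the how of the media's making; nothing in it needs to identify the who.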
08:57
And this actually shouldn't sound that crazy. Our communication is moving in this direction already. If you're like me -- you can admit it -- you browse your TikTok "For You" page, and you're used to seeing videos that have an audio source, an AI filter, a green screen, a background, a stitch with another edit. This, in some sense, is the alpha version of this transparency in some of the major platforms we use today. It's just that it does not yet travel across the internet, it's not reliable, updatable, and it's not secure.

09:27
Now, there are also big challenges in this type of infrastructure for authenticity. As we create these durable signs of how AI and human were mixed, that carry across the trajectory of how media is made, we need to ensure they don't compromise privacy or backfire globally.

09:46
We have to get this right. We can't oblige a citizen journalist filming in a repressive context or a satirical maker using novel gen-AI tools to parody the powerful ... to have to disclose their identity or personally identifiable information in order to use their camera or ChatGPT. Because it's important they be able to retain their ability to have anonymity, at the same time as the tool to create is transparent. This needs to be about the how of AI-human media making, not the who.

10:22
This brings me to the final step. None of this works without a pipeline of responsibility that runs from the foundation models and the open-source projects through to the way that is deployed into systems, APIs and apps, to the platforms where we consume media and communicate.

10:43
I've spent much of the last 15 years fighting, essentially, a rearguard action, like so many of my colleagues in the human rights world, against the failures of social media. We can't make those mistakes again in this next generation of technology.

10:59
What this means is that governments need to ensure that within this pipeline of responsibility for AI, there is transparency, accountability and liability.

11:10
Without these three steps -- detection for the people who need it most, provenance that is rights-respecting and that pipeline of responsibility -- we're going to get stuck looking in vain for the six-fingered hand, or the eyes that don't blink.

11:26
We need to take these steps. Otherwise, we risk a world where it gets easier and easier to both fake reality and dismiss reality as potentially faked.

11:36
And that is a world that the political philosopher Hannah Arendt described in these terms: "A people that no longer can believe anything cannot make up its own mind. It is deprived not only of its capacity to act but also of its capacity to think and to judge. And with such a people you can then do what you please."

11:56
That's a world I know none of us want, that I think we can prevent.

12:00
Thanks.

12:02
(Cheers and applause)
ืขืœ ืืชืจ ื–ื”

ืืชืจ ื–ื” ื™ืฆื™ื’ ื‘ืคื ื™ื›ื ืกืจื˜ื•ื ื™ YouTube ื”ืžื•ืขื™ืœื™ื ืœืœื™ืžื•ื“ ืื ื’ืœื™ืช. ืชื•ื›ืœื• ืœืจืื•ืช ืฉื™ืขื•ืจื™ ืื ื’ืœื™ืช ื”ืžื•ืขื‘ืจื™ื ืขืœ ื™ื“ื™ ืžื•ืจื™ื ืžื”ืฉื•ืจื” ื”ืจืืฉื•ื ื” ืžืจื—ื‘ื™ ื”ืขื•ืœื. ืœื—ืฅ ืคืขืžื™ื™ื ืขืœ ื”ื›ืชื•ื‘ื™ื•ืช ื‘ืื ื’ืœื™ืช ื”ืžื•ืฆื’ื•ืช ื‘ื›ืœ ื“ืฃ ื•ื™ื“ืื• ื›ื“ื™ ืœื”ืคืขื™ืœ ืืช ื”ืกืจื˜ื•ืŸ ืžืฉื. ื”ื›ืชื•ื‘ื™ื•ืช ื’ื•ืœืœื•ืช ื‘ืกื ื›ืจื•ืŸ ืขื ื”ืคืขืœืช ื”ื•ื•ื™ื“ืื•. ืื ื™ืฉ ืœืš ื”ืขืจื•ืช ืื• ื‘ืงืฉื•ืช, ืื ื ืฆื•ืจ ืื™ืชื ื• ืงืฉืจ ื‘ืืžืฆืขื•ืช ื˜ื•ืคืก ื™ืฆื™ืจืช ืงืฉืจ ื–ื”.

https://forms.gle/WvT1wiN1qDtmnspy7