AI Is Dangerous, but Not for the Reasons You Think | Sasha Luccioni | TED

1,211,342 views · 2023-11-06

TED



ืชืจื’ื•ื: zeeva livshitz ืขืจื™ื›ื”: aknv tso

00:04
So I've been an AI researcher for over a decade. And a couple of months ago, I got the weirdest email of my career. A random stranger wrote to me saying that my work in AI is going to end humanity.

00:18
Now I get it, AI, it's so hot right now.

00:22
(Laughter)

00:24
It's in the headlines pretty much every day, sometimes because of really cool things like discovering new molecules for medicine or that dope Pope in the white puffer coat. But other times the headlines have been really dark, like that chatbot telling that guy that he should divorce his wife, or that AI meal planner app proposing a crowd-pleasing recipe featuring chlorine gas. And in the background, we've heard a lot of talk about doomsday scenarios, existential risk and the singularity, with letters being written and events being organized to make sure that doesn't happen.

00:57
Now I'm a researcher who studies AI's impacts on society, and I don't know what's going to happen in 10 or 20 years, and nobody really does. But what I do know is that there's some pretty nasty things going on right now, because AI doesn't exist in a vacuum. It is part of society, and it has impacts on people and the planet. AI models can contribute to climate change. Their training data uses art and books created by artists and authors without their consent. And its deployment can discriminate against entire communities.

01:32
But we need to start tracking its impacts. We need to start being transparent and disclosing them and creating tools so that people understand AI better, so that hopefully future generations of AI models are going to be more trustworthy, sustainable, maybe less likely to kill us, if that's what you're into.

01:50
But let's start with sustainability, because that cloud that AI models live on is actually made out of metal, plastic, and powered by vast amounts of energy. And each time you query an AI model, it comes with a cost to the planet.

02:05
Last year, I was part of the BigScience initiative, which brought together a thousand researchers from all over the world to create Bloom, the first open large language model, like ChatGPT, but with an emphasis on ethics, transparency and consent. And the study I led that looked at Bloom's environmental impacts found that just training it used as much energy as 30 homes in a whole year and emitted 25 tons of carbon dioxide, which is like driving your car five times around the planet just so somebody can use this model to tell a knock-knock joke. And this might not seem like a lot, but other similar large language models, like GPT-3, emit 20 times more carbon.
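
A rough sense of scale for that comparison (this back-of-the-envelope multiplication is an illustration, not a figure quoted in the talk): taking the 25 tons reported for Bloom and the roughly 20-fold factor mentioned here gives

$$25\ \mathrm{t\,CO_2} \times 20 \approx 500\ \mathrm{t\,CO_2}.$$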

02:47
But the thing is, tech companies aren't measuring this stuff. They're not disclosing it. And so this is probably only the tip of the iceberg, even if it is a melting one.

02:56
And in recent years we've seen AI models balloon in size because the current trend in AI is "bigger is better." But please don't get me started on why that's the case. In any case, we've seen large language models in particular grow 2,000 times in size over the last five years. And of course, their environmental costs are rising as well. The most recent work I led found that switching out a smaller, more efficient model for a larger language model emits 14 times more carbon for the same task. Like telling that knock-knock joke. And as we're putting these models into cell phones and search engines and smart fridges and speakers, the environmental costs are really piling up quickly.

03:38
So instead of focusing on some future existential risks, let's talk about current tangible impacts and tools we can create to measure and mitigate these impacts.

03:49
I helped create CodeCarbon, a tool that runs in parallel to AI training code that estimates the amount of energy it consumes and the amount of carbon it emits. And using a tool like this can help us make informed choices, like choosing one model over the other because it's more sustainable, or deploying AI models on renewable energy, which can drastically reduce their emissions.
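
CodeCarbon is an open-source Python package, and a minimal sketch of how such a tracker is typically wrapped around training code looks like the following (the train() function and project name are placeholders, not code from the talk):

```python
from codecarbon import EmissionsTracker

def train():
    # Placeholder for the actual model-training loop being measured.
    pass

# Start the tracker before training and stop it afterwards; stop() returns
# the estimated emissions in kg of CO2-equivalent and, by default, also
# writes an emissions.csv report alongside the run.
tracker = EmissionsTracker(project_name="demo-training-run")  # hypothetical name
tracker.start()
try:
    train()
finally:
    emissions_kg = tracker.stop()

print(f"Estimated emissions: {emissions_kg:.3f} kg CO2eq")
```

Estimates like these are what make the "choose the more sustainable model" comparison described above possible in practice.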

04:10
But let's talk about other things, because there's other impacts of AI apart from sustainability. For example, it's been really hard for artists and authors to prove that their life's work has been used for training AI models without their consent. And if you want to sue someone, you tend to need proof, right?

04:27
So Spawning.ai, an organization that was founded by artists, created this really cool tool called โ€œHave I Been Trained?โ€ And it lets you search these massive data sets to see what they have on you.

04:39
Now, I admit it, I was curious. I searched LAION-5B, which is this huge data set of images and text, to see if any images of me were in there. Now those two first images, that's me from events I've spoken at. But the rest of the images, none of those are me. They're probably of other women named Sasha who put photographs of themselves up on the internet.

05:01
And this can probably explain why, when I query an image generation model to generate a photograph of a woman named Sasha, more often than not I get images of bikini models. Sometimes they have two arms, sometimes they have three arms, but they rarely have any clothes on.

05:16
And while it can be interesting for people like you and me to search these data sets, for artists like Karla Ortiz, this provides crucial evidence that her life's work, her artwork, was used for training AI models without her consent, and she and two artists used this as evidence to file a class action lawsuit against AI companies for copyright infringement.

05:37
And most recently --

05:38
(Applause)

05:42
And most recently Spawning.ai partnered up with Hugging Face, the company where I work at, to create opt-in and opt-out mechanisms for creating these data sets. Because artwork created by humans shouldn't be an all-you-can-eat buffet for training AI language models.

05:58
(Applause)

06:02
The very last thing I want to talk about is bias. You probably hear about this a lot. Formally speaking, it's when AI models encode patterns and beliefs that can represent stereotypes or racism and sexism. One of my heroes, Dr. Joy Buolamwini, experienced this firsthand when she realized that AI systems wouldn't even detect her face unless she was wearing a white-colored mask. Digging deeper, she found that common facial recognition systems were vastly worse for women of color compared to white men. And when biased models like this are deployed in law enforcement settings, this can result in false accusations, even wrongful imprisonment, which we've seen happen to multiple people in recent months.

06:44
For example, Porcha Woodruff was wrongfully accused of carjacking at eight months pregnant because an AI system wrongfully identified her. But sadly, these systems are black boxes, and even their creators can't say exactly why they work the way they do.

07:00
And for example, for image generation systems, if they're used in contexts like generating a forensic sketch based on a description of a perpetrator, they take all those biases and they spit them back out for terms like dangerous criminal, terrorists or gang member, which of course is super dangerous when these tools are deployed in society.

07:25
And so in order to understand these tools better, I created this tool called the Stable Bias Explorer, which lets you explore the bias of image generation models through the lens of professions.
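
The Stable Bias Explorer is a hosted, no-code tool, so the snippet below is only a rough sketch of the idea it builds on: prompt an image model once per profession and inspect what comes back. It assumes the diffusers library and a publicly available Stable Diffusion checkpoint, and is not code from the actual project.

```python
import torch
from diffusers import StableDiffusionPipeline

# Illustrative checkpoint; any text-to-image model could be substituted.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

professions = ["scientist", "lawyer", "CEO", "nurse"]  # small sample, not all 150
for profession in professions:
    # Generate a few images per profession so the demographics of the
    # depicted people can be compared across professions.
    for i in range(4):
        image = pipe(f"a photo of a {profession}").images[0]
        image.save(f"{profession}_{i}.png")
```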

07:37
So try to picture a scientist in your mind. Don't look at me. What do you see? A lot of the same thing, right? Men in glasses and lab coats. And none of them look like me. And the thing is, we looked at all these different image generation models and found a lot of the same thing: significant representation of whiteness and masculinity across all 150 professions that we looked at, even if compared to the real world, the US Bureau of Labor Statistics. These models show lawyers as men, and CEOs as men, almost 100 percent of the time, even though we all know not all of them are white and male.

08:14
And sadly, my tool hasn't been used to write legislation yet. But I recently presented it at a UN event about gender bias as an example of how we can make tools for people from all walks of life, even those who don't know how to code, to engage with and better understand AI, because we use professions, but you can use any terms that are of interest to you.

08:36
And as these models are being deployed, are being woven into the very fabric of our societies, our cell phones, our social media feeds, even our justice systems and our economies have AI in them. And it's really important that AI stays accessible so that we know both how it works and when it doesn't work.

08:56
And there's no single solution for really complex things like bias or copyright or climate change. But by creating tools to measure AI's impact, we can start getting an idea of how bad they are and start addressing them as we go. Start creating guardrails to protect society and the planet.

09:16
And once we have this information, companies can use it in order to say, OK, we're going to choose this model because it's more sustainable, this model because it respects copyright. Legislators who really need information to write laws can use these tools to develop new regulation mechanisms or governance for AI as it gets deployed into society. And users like you and me can use this information to choose AI models that we can trust, not to misrepresent us and not to misuse our data.

09:45
But what did I reply to that email that said that my work is going to destroy humanity? I said that focusing on AI's future existential risks is a distraction from its current, very tangible impacts and the work we should be doing right now, or even yesterday, for reducing these impacts. Because yes, AI is moving quickly, but it's not a done deal. We're building the road as we walk it, and we can collectively decide what direction we want to go in together.

10:15
Thank you.

10:16
(Applause)
ืขืœ ืืชืจ ื–ื”

ืืชืจ ื–ื” ื™ืฆื™ื’ ื‘ืคื ื™ื›ื ืกืจื˜ื•ื ื™ YouTube ื”ืžื•ืขื™ืœื™ื ืœืœื™ืžื•ื“ ืื ื’ืœื™ืช. ืชื•ื›ืœื• ืœืจืื•ืช ืฉื™ืขื•ืจื™ ืื ื’ืœื™ืช ื”ืžื•ืขื‘ืจื™ื ืขืœ ื™ื“ื™ ืžื•ืจื™ื ืžื”ืฉื•ืจื” ื”ืจืืฉื•ื ื” ืžืจื—ื‘ื™ ื”ืขื•ืœื. ืœื—ืฅ ืคืขืžื™ื™ื ืขืœ ื”ื›ืชื•ื‘ื™ื•ืช ื‘ืื ื’ืœื™ืช ื”ืžื•ืฆื’ื•ืช ื‘ื›ืœ ื“ืฃ ื•ื™ื“ืื• ื›ื“ื™ ืœื”ืคืขื™ืœ ืืช ื”ืกืจื˜ื•ืŸ ืžืฉื. ื”ื›ืชื•ื‘ื™ื•ืช ื’ื•ืœืœื•ืช ื‘ืกื ื›ืจื•ืŸ ืขื ื”ืคืขืœืช ื”ื•ื•ื™ื“ืื•. ืื ื™ืฉ ืœืš ื”ืขืจื•ืช ืื• ื‘ืงืฉื•ืช, ืื ื ืฆื•ืจ ืื™ืชื ื• ืงืฉืจ ื‘ืืžืฆืขื•ืช ื˜ื•ืคืก ื™ืฆื™ืจืช ืงืฉืจ ื–ื”.

https://forms.gle/WvT1wiN1qDtmnspy7