AI Is Dangerous, but Not for the Reasons You Think | Sasha Luccioni | TED

2023-11-06

00:04
So I've been an AI researcher for over a decade. And a couple of months ago, I got the weirdest email of my career. A random stranger wrote to me saying that my work in AI is going to end humanity. Now I get it, AI, it's so hot right now.

(Laughter)

00:24
It's in the headlines pretty much every day, sometimes because of really cool things, like discovering new molecules for medicine or that dope Pope in the white puffer coat. But other times the headlines have been really dark, like that chatbot telling that guy that he should divorce his wife, or that AI meal planner app proposing a crowd-pleasing recipe featuring chlorine gas. And in the background, we've heard a lot of talk about doomsday scenarios, existential risk and the singularity, with letters being written and events being organized to make sure that doesn't happen.

00:57
Now, I'm a researcher who studies AI's impacts on society, and I don't know what's going to happen in 10 or 20 years, and nobody really does. But what I do know is that there are some pretty nasty things going on right now, because AI doesn't exist in a vacuum. It is part of society, and it has impacts on people and the planet. AI models can contribute to climate change. Their training data uses art and books created by artists and authors without their consent. And their deployment can discriminate against entire communities. But we need to start tracking these impacts. We need to start being transparent, disclosing them and creating tools so that people understand AI better, so that hopefully future generations of AI models are going to be more trustworthy, sustainable, maybe less likely to kill us, if that's what you're into.

01:50
But let's start with sustainability, because that cloud that AI models live on is actually made out of metal and plastic, and powered by vast amounts of energy. And each time you query an AI model, it comes with a cost to the planet.

02:05
Last year, I was part of the BigScience initiative, which brought together a thousand researchers from all over the world to create Bloom, the first open large language model, like ChatGPT, but with an emphasis on ethics, transparency and consent. And the study I led that looked at Bloom's environmental impacts found that just training it used as much energy as 30 homes in a whole year and emitted 25 tons of carbon dioxide, which is like driving your car five times around the planet, just so somebody can use this model to tell a knock-knock joke. And this might not seem like a lot, but other similar large language models, like GPT-3, emit 20 times more carbon. But the thing is, tech companies aren't measuring this stuff. They're not disclosing it. And so this is probably only the tip of the iceberg, even if it is a melting one.

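As a rough back-of-the-envelope check of that comparison (not part of the talk itself): assuming an average passenger car emits about 0.125 kg of CO2 per kilometre, 25 tons of CO2 does work out to roughly five trips around the Earth.

```python
# Rough sanity check of the "five times around the planet" comparison.
# The per-kilometre figure is an assumed average for a passenger car,
# not a number taken from the talk or the BLOOM study.
EARTH_CIRCUMFERENCE_KM = 40_075
CAR_EMISSIONS_KG_PER_KM = 0.125        # assumed average passenger car
TRAINING_EMISSIONS_TONNES = 25         # figure cited in the talk

laps = (TRAINING_EMISSIONS_TONNES * 1000) / (CAR_EMISSIONS_KG_PER_KM * EARTH_CIRCUMFERENCE_KM)
print(f"Roughly {laps:.1f} car trips around the Earth")   # prints roughly 5.0
```
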
02:56
And in recent years, we've seen AI models balloon in size, because the current trend in AI is "bigger is better." But please don't get me started on why that's the case. In any case, we've seen large language models in particular grow 2,000 times in size over the last five years. And of course, their environmental costs are rising as well. The most recent work I led found that switching out a smaller, more efficient model for a larger language model emits 14 times more carbon for the same task. Like telling that knock-knock joke. And as we're putting these models into cell phones and search engines and smart fridges and speakers, the environmental costs are really piling up quickly.

03:38
So instead of focusing on some future existential risks, let's talk about current, tangible impacts, and the tools we can create to measure and mitigate them.

03:49
I helped create CodeCarbon, a tool that runs in parallel to AI training code and estimates the amount of energy it consumes and the amount of carbon it emits. And using a tool like this can help us make informed choices, like choosing one model over another because it's more sustainable, or deploying AI models on renewable energy, which can drastically reduce their emissions.

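For readers who want to try this themselves, here is a minimal sketch of how a training run can be wrapped with CodeCarbon's EmissionsTracker. The train_model function is a placeholder; only the tracker calls reflect the package's documented usage.

```python
# Minimal sketch: wrap a training run with CodeCarbon's EmissionsTracker.
# train_model() is a stand-in for real training code.
from codecarbon import EmissionsTracker

def train_model():
    # placeholder for an actual training loop
    for step in range(1_000_000):
        pass

tracker = EmissionsTracker(project_name="my-model-training")
tracker.start()
try:
    train_model()
finally:
    emissions_kg = tracker.stop()      # estimated kg of CO2-equivalent emitted

print(f"Estimated emissions: {emissions_kg:.6f} kg CO2eq")
```
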
04:10
But let's talk about other things, because there are other impacts of AI apart from sustainability. For example, it's been really hard for artists and authors to prove that their life's work has been used for training AI models without their consent. And if you want to sue someone, you tend to need proof, right?

04:27
So Spawning.ai, an organization that was founded by artists, created this really cool tool called "Have I Been Trained?" And it lets you search these massive data sets to see what they have on you. Now, I admit it, I was curious. I searched LAION-5B, which is this huge data set of images and text, to see if any images of me were in there.

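As a purely conceptual sketch (the real "Have I Been Trained?" tool uses its own index and image-similarity search, not a linear scan), a caption search over public LAION metadata might look something like this. The dataset id and column names are assumptions about how the metadata is mirrored on the Hugging Face Hub.

```python
# Conceptual sketch: scan LAION caption metadata for a name.
# The dataset id and column names ("TEXT", "URL") are assumptions about the
# public metadata mirror; this is not how "Have I Been Trained?" works internally.
from datasets import load_dataset

NAME = "sasha"

# Stream the metadata so the multi-terabyte data set is never fully downloaded.
rows = load_dataset("laion/laion2B-en", split="train", streaming=True)

matches = []
for row in rows:
    caption = (row.get("TEXT") or "").lower()
    if NAME in caption:
        matches.append((row.get("TEXT"), row.get("URL")))
    if len(matches) >= 10:             # stop after a handful of hits
        break

for caption, url in matches:
    print(caption, "->", url)
```
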
04:49
Now, those first two images, that's me from events I've spoken at. But the rest of the images, none of those are me. They're probably of other women named Sasha who put photographs of themselves up on the internet. And this can probably explain why, when I query an image generation model to generate a photograph of a woman named Sasha, more often than not I get images of bikini models. Sometimes they have two arms, sometimes they have three arms, but they rarely have any clothes on.

05:16
And while it can be interesting for people like you and me to search these data sets, for artists like Karla Ortiz, this provides crucial evidence that her life's work, her artwork, was used for training AI models without her consent. And she and two other artists used this as evidence to file a class action lawsuit against AI companies for copyright infringement.

05:37
And most recently --

(Applause)

05:42
And most recently, Spawning.ai partnered up with Hugging Face, the company where I work, to create opt-in and opt-out mechanisms for creating these data sets. Because artwork created by humans shouldn't be an all-you-can-eat buffet for training AI language models.

(Applause)

06:02
The very last thing I want to talk about is bias. You probably hear about this a lot. Formally speaking, it's when AI models encode patterns and beliefs that can represent stereotypes or racism and sexism. One of my heroes, Dr. Joy Buolamwini, experienced this firsthand when she realized that AI systems wouldn't even detect her face unless she was wearing a white-colored mask. Digging deeper, she found that common facial recognition systems performed vastly worse for women of color compared to white men. And when biased models like this are deployed in law enforcement settings, this can result in false accusations, even wrongful imprisonment, which we've seen happen to multiple people in recent months.

06:44
For example, Porcha Woodruff was wrongfully accused of carjacking at eight months pregnant because an AI system falsely identified her. But sadly, these systems are black boxes, and even their creators can't say exactly why they work the way they do.

07:00
And image generation systems, for example, if they're used in contexts like generating a forensic sketch based on a description of a perpetrator, take all those biases and spit them back out for terms like "dangerous criminal," "terrorist" or "gang member," which of course is super dangerous when these tools are deployed in society.

07:25
And so, in order to understand these tools better, I created this tool called the Stable Bias Explorer, which lets you explore the bias of image generation models through the lens of professions.

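The Stable Bias Explorer itself is a hosted tool, but the underlying idea can be sketched with an open image generation model: prompt it with profession terms and look at who it depicts. The model id and prompt template below are illustrative choices, not the ones used in the study.

```python
# Sketch of the underlying idea: generate images for profession prompts
# and inspect who is depicted. Model id and prompt wording are illustrative.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

professions = ["scientist", "CEO", "lawyer", "nurse", "teacher"]

for profession in professions:
    images = pipe(f"a photo portrait of a {profession}", num_images_per_prompt=4).images
    for i, image in enumerate(images):
        image.save(f"{profession}_{i}.png")   # review these for patterns in who appears
```
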
07:37
So try to picture a scientist in your mind. Don't look at me. What do you see? A lot of the same thing, right? Men in glasses and lab coats. And none of them look like me.

07:50
And the thing is that we looked at all these different image generation models and found a lot of the same thing: significant overrepresentation of whiteness and masculinity across all 150 professions that we looked at, even when compared to the real world, as reflected by the US Bureau of Labor Statistics. These models show lawyers as men and CEOs as men almost 100 percent of the time, even though we all know not all of them are white and male. And sadly, my tool hasn't been used to write legislation yet.

08:19
But I recently presented it at a UN event about gender bias as an example of how we can make tools for people from all walks of life, even those who don't know how to code, to engage with and better understand AI. We used professions, but you can use any terms that are of interest to you.

08:36
And these models are being deployed and woven into the very fabric of our societies: our cell phones, our social media feeds, even our justice systems and our economies have AI in them. And it's really important that AI stays accessible, so that we know both how it works and when it doesn't work.

08:56
And there's no single solution for really complex things like bias or copyright or climate change. But by creating tools to measure AI's impacts, we can start getting an idea of how bad they are and start addressing them as we go, and start creating guardrails to protect society and the planet.

09:16
And once we have this information, companies can use it to say, OK, we're going to choose this model because it's more sustainable, and this model because it respects copyright. Legislators, who really need information to write laws, can use these tools to develop new regulation mechanisms or governance for AI as it gets deployed into society. And users like you and me can use this information to choose AI models that we can trust, not to misrepresent us and not to misuse our data.

09:45
But what did I reply to that email that said that my work is going to destroy humanity? I said that focusing on AI's future existential risks is a distraction from its current, very tangible impacts, and from the work we should be doing right now, or even yesterday, to reduce those impacts. Because yes, AI is moving quickly, but it's not a done deal. We're building the road as we walk it, and we can collectively decide what direction we want to go in together.

10:15
Thank you.

(Applause)
