AI Is Dangerous, but Not for the Reasons You Think | Sasha Luccioni | TED

1,384,510 views ใƒป 2023-11-06

TED


์•„๋ž˜ ์˜๋ฌธ์ž๋ง‰์„ ๋”๋ธ”ํด๋ฆญํ•˜์‹œ๋ฉด ์˜์ƒ์ด ์žฌ์ƒ๋ฉ๋‹ˆ๋‹ค.

๋ฒˆ์—ญ: Jinsol Song ๊ฒ€ํ† : DK Kim
00:04
So I've been an AI researcher for over a decade. And a couple of months ago, I got the weirdest email of my career. A random stranger wrote to me saying that my work in AI is going to end humanity.
00:18
Now I get it, AI, it's so hot right now.
00:22
(Laughter)
00:24
It's in the headlines pretty much every day, sometimes because of really cool things like discovering new molecules for medicine or that dope Pope in the white puffer coat. But other times the headlines have been really dark, like that chatbot telling that guy that he should divorce his wife or that AI meal planner app proposing a crowd-pleasing recipe featuring chlorine gas.
00:46
And in the background, we've heard a lot of talk about doomsday scenarios, existential risk and the singularity, with letters being written and events being organized to make sure that doesn't happen.
00:57
Now I'm a researcher who studies AI's impacts on society, and I don't know what's going to happen in 10 or 20 years, and nobody really does. But what I do know is that there are some pretty nasty things going on right now, because AI doesn't exist in a vacuum. It is part of society, and it has impacts on people and the planet.
01:20
AI models can contribute to climate change. Their training data uses art and books created by artists and authors without their consent. And its deployment can discriminate against entire communities.
01:32
But we need to start tracking its impacts. We need to start being transparent and disclosing them and creating tools so that people understand AI better, so that hopefully future generations of AI models are going to be more trustworthy, sustainable, maybe less likely to kill us, if that's what you're into.
01:50
But let's start with sustainability, because that cloud that AI models live on is actually made out of metal, plastic, and powered by vast amounts of energy. And each time you query an AI model, it comes with a cost to the planet.
02:05
Last year, I was part of the BigScience initiative, which brought together a thousand researchers from all over the world to create Bloom, the first open large language model, like ChatGPT, but with an emphasis on ethics, transparency and consent.
02:21
And the study I led that looked at Bloom's environmental impacts found that just training it used as much energy as 30 homes in a whole year and emitted 25 tons of carbon dioxide, which is like driving your car five times around the planet, just so somebody can use this model to tell a knock-knock joke.
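To sanity-check that comparison with rough numbers of my own (an average passenger car at about 0.12 kg of CO2 per kilometre, and roughly 40,000 km around the planet):

```python
# Back-of-the-envelope check of the "five laps around the planet" comparison.
# The emission factor and circumference are rounded assumptions, not figures
# from the talk or the BigScience study.
KG_CO2_PER_KM = 0.12             # typical passenger car
PLANET_CIRCUMFERENCE_KM = 40_000

drive_km = 5 * PLANET_CIRCUMFERENCE_KM
drive_tonnes = drive_km * KG_CO2_PER_KM / 1000   # kilograms -> tonnes

print(f"Five laps of the planet: ~{drive_tonnes:.0f} t CO2")   # ~24 t
print("Training Bloom: ~25 t CO2")
print(f"A GPT-3-scale model at 20x: ~{25 * 20} t CO2")
```

which lands right around the 25 tonnes she quotes for Bloom.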
02:39
And this might not seem like a lot, but other similar large language models, like GPT-3, emit 20 times more carbon. But the thing is, tech companies aren't measuring this stuff. They're not disclosing it. And so this is probably only the tip of the iceberg, even if it is a melting one.
02:56
And in recent years we've seen AI models balloon in size because the current trend in AI is "bigger is better." But please don't get me started on why that's the case. In any case, we've seen large language models in particular grow 2,000 times in size over the last five years. And of course, their environmental costs are rising as well.
03:16
The most recent work I led found that switching out a smaller, more efficient model for a larger language model emits 14 times more carbon for the same task. Like telling that knock-knock joke. And as we're putting these models into cell phones and search engines and smart fridges and speakers, the environmental costs are really piling up quickly.
03:38
So instead of focusing on some future existential risks, let's talk about current tangible impacts and tools we can create to measure and mitigate these impacts.
03:49
I helped create CodeCarbon, a tool that runs in parallel to AI training code that estimates the amount of energy it consumes and the amount of carbon it emits. And using a tool like this can help us make informed choices, like choosing one model over the other because it's more sustainable, or deploying AI models on renewable energy, which can drastically reduce their emissions.
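To give a concrete sense of what "runs in parallel to AI training code" means, here is a minimal sketch using CodeCarbon's EmissionsTracker; the project name and the toy workload are placeholders of mine, and a real run would wrap an actual training loop.

```python
from codecarbon import EmissionsTracker

def toy_training_run():
    # Stand-in for a real training loop, so the tracker has work to measure.
    total = 0.0
    for step in range(5_000_000):
        total += step * 0.5
    return total

# The tracker samples hardware power draw alongside the code and converts the
# measured energy use into an estimate of CO2-equivalent emissions.
tracker = EmissionsTracker(project_name="demo-training-run")
tracker.start()
try:
    toy_training_run()
finally:
    emissions_kg = tracker.stop()  # estimated emissions in kg CO2eq

print(f"Estimated emissions for this run: {emissions_kg:.6f} kg CO2eq")
```

By default the tracker also logs each run to a local CSV file, which makes comparing models or training setups side by side straightforward.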
04:10
But let's talk about other things, because there are other impacts of AI apart from sustainability. For example, it's been really hard for artists and authors to prove that their life's work has been used for training AI models without their consent. And if you want to sue someone, you tend to need proof, right?
04:27
So Spawning.ai, an organization that was founded by artists, created this really cool tool called "Have I Been Trained?" And it lets you search these massive data sets to see what they have on you.
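The tool itself is a web interface, but the underlying idea, scanning a web-scale scrape for traces of you or your work, can be sketched crudely with the Hugging Face datasets library. Everything here is an illustrative assumption on my part: the metadata repo id, its column names, and the naive substring match ("Have I Been Trained?" itself does proper image-and-text retrieval, not this).

```python
from datasets import load_dataset

# Crude illustration: stream LAION-style caption/URL metadata and look for a
# name in the captions. The repo id and the TEXT/URL column names are
# assumptions; access to LAION metadata on the Hub may also be gated.
ds = load_dataset("laion/laion2B-en", split="train", streaming=True)

query = "sasha"
hits = []
for row in ds:
    caption = (row.get("TEXT") or "").lower()
    if query in caption:
        hits.append((row.get("TEXT"), row.get("URL")))
    if len(hits) >= 10:   # stop after a handful of matches
        break

for caption, url in hits:
    print(caption, "->", url)
```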
04:39
Now, I admit it, I was curious. I searched LAION-5B, which is this huge data set of images and text, to see if any images of me were in there.
04:49
Now those first two images, that's me from events I've spoken at. But the rest of the images, none of those are me. They're probably of other women named Sasha who put photographs of themselves up on the internet.
05:01
And this can probably explain why, when I query an image generation model to generate a photograph of a woman named Sasha, more often than not I get images of bikini models.
05:09
Sometimes they have two arms, sometimes they have three arms,
05:11
(Laughter)
05:13
but they rarely have any clothes on.
05:16
And while it can be interesting for people like you and me to search these data sets, for artists like Karla Ortiz, this provides crucial evidence that her life's work, her artwork, was used for training AI models without her consent, and she and two artists used this as evidence to file a class action lawsuit against AI companies for copyright infringement.
05:37
And most recently --
05:38
(Applause)
05:42
And most recently Spawning.ai partnered up with Hugging Face, the company I work at, to create opt-in and opt-out mechanisms for creating these data sets. Because artwork created by humans shouldn't be an all-you-can-eat buffet for training AI language models.
05:58
(Applause)
06:02
The very last thing I want to talk about is bias. You probably hear about this a lot. Formally speaking, it's when AI models encode patterns and beliefs that can represent stereotypes or racism and sexism.
06:14
One of my heroes, Dr. Joy Buolamwini, experienced this firsthand when she realized that AI systems wouldn't even detect her face unless she was wearing a white-colored mask. Digging deeper, she found that common facial recognition systems were vastly worse for women of color compared to white men.
06:30
And when biased models like this are deployed in law enforcement settings, this can result in false accusations, even wrongful imprisonment, which we've seen happen to multiple people in recent months. For example, Porcha Woodruff was wrongfully accused of carjacking at eight months pregnant because an AI system wrongfully identified her.
06:52
But sadly, these systems are black boxes, and even their creators can't say exactly why they work the way they do. And for example, for image generation systems, if they're used in contexts like generating a forensic sketch based on a description of a perpetrator, they take all those biases and they spit them back out for terms like dangerous criminal, terrorists or gang member, which of course is super dangerous when these tools are deployed in society.
07:25
And so in order to understand these tools better, I created this tool called the Stable Bias Explorer, which lets you explore the bias of image generation models through the lens of professions.
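The Explorer itself is a point-and-click app, so no code is needed to use it, but the kind of probing it automates can be sketched with the diffusers library: generate a few images per profession prompt and look at who shows up. The model id, prompt template and profession list below are my own illustrative choices, not the tool's actual configuration.

```python
import torch
from diffusers import StableDiffusionPipeline

# Generate a small sample of images per profession and save them for review.
# Assumes a CUDA GPU and a Stable Diffusion checkpoint on the Hugging Face Hub.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

professions = ["scientist", "lawyer", "CEO", "nurse"]
for job in professions:
    images = pipe(f"a photo of a {job}", num_images_per_prompt=4).images
    for i, image in enumerate(images):
        image.save(f"{job}_{i}.png")
```

Repeating this over many samples and many professions, then tallying the apparent gender and skin-tone presentation in the results, is what surfaces the kind of skew described next.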
07:37
So try to picture a scientist in your mind. Don't look at me. What do you see? A lot of the same thing, right? Men in glasses and lab coats. And none of them look like me.
07:50
And the thing is that we looked at all these different image generation models and found a lot of the same thing: significant representation of whiteness and masculinity across all 150 professions that we looked at, even when compared to the real world, the US Bureau of Labor Statistics. These models show lawyers as men, and CEOs as men, almost 100 percent of the time, even though we all know not all of them are white and male.
08:14
And sadly, my tool hasn't been used to write legislation yet. But I recently presented it at a UN event about gender bias as an example of how we can make tools for people from all walks of life, even those who don't know how to code, to engage with and better understand AI, because we use professions, but you can use any terms that are of interest to you.
08:36
And as these models are being deployed and woven into the very fabric of our societies, our cell phones, our social media feeds, even our justice systems and our economies have AI in them. And it's really important that AI stays accessible so that we know both how it works and when it doesn't work.
08:56
And there's no single solution for really complex things like bias or copyright or climate change. But by creating tools to measure AI's impact, we can start getting an idea of how bad they are and start addressing them as we go. Start creating guardrails to protect society and the planet.
09:16
And once we have this information, companies can use it in order to say, OK, we're going to choose this model because it's more sustainable, this model because it respects copyright. Legislators who really need information to write laws can use these tools to develop new regulation mechanisms or governance for AI as it gets deployed into society. And users like you and me can use this information to choose AI models that we can trust, not to misrepresent us and not to misuse our data.
09:45
But what did I reply to that email that said that my work is going to destroy humanity? I said that focusing on AI's future existential risks is a distraction from its current, very tangible impacts and the work we should be doing right now, or even yesterday, for reducing these impacts.
10:04
Because yes, AI is moving quickly, but it's not a done deal. We're building the road as we walk it, and we can collectively decide what direction we want to go in together.
10:15
Thank you.
10:16
(Applause)
์ด ์›น์‚ฌ์ดํŠธ ์ •๋ณด

์ด ์‚ฌ์ดํŠธ๋Š” ์˜์–ด ํ•™์Šต์— ์œ ์šฉํ•œ YouTube ๋™์˜์ƒ์„ ์†Œ๊ฐœํ•ฉ๋‹ˆ๋‹ค. ์ „ ์„ธ๊ณ„ ์ตœ๊ณ ์˜ ์„ ์ƒ๋‹˜๋“ค์ด ๊ฐ€๋ฅด์น˜๋Š” ์˜์–ด ์ˆ˜์—…์„ ๋ณด๊ฒŒ ๋  ๊ฒƒ์ž…๋‹ˆ๋‹ค. ๊ฐ ๋™์˜์ƒ ํŽ˜์ด์ง€์— ํ‘œ์‹œ๋˜๋Š” ์˜์–ด ์ž๋ง‰์„ ๋”๋ธ” ํด๋ฆญํ•˜๋ฉด ๊ทธ๊ณณ์—์„œ ๋™์˜์ƒ์ด ์žฌ์ƒ๋ฉ๋‹ˆ๋‹ค. ๋น„๋””์˜ค ์žฌ์ƒ์— ๋งž์ถฐ ์ž๋ง‰์ด ์Šคํฌ๋กค๋ฉ๋‹ˆ๋‹ค. ์˜๊ฒฌ์ด๋‚˜ ์š”์ฒญ์ด ์žˆ๋Š” ๊ฒฝ์šฐ ์ด ๋ฌธ์˜ ์–‘์‹์„ ์‚ฌ์šฉํ•˜์—ฌ ๋ฌธ์˜ํ•˜์‹ญ์‹œ์˜ค.

https://forms.gle/WvT1wiN1qDtmnspy7