Your Right to Repair AI Systems | Rumman Chowdhury | TED

44,322 views ・ 2024-06-05

TED


μ•„λž˜ μ˜λ¬Έμžλ§‰μ„ λ”λΈ”ν΄λ¦­ν•˜μ‹œλ©΄ μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€.

λ²ˆμ—­: Jihoo Cha κ²€ν† : Jihyeon J. Kim
00:04
I want to tell you a story
0
4543
1835
인곡지λŠ₯κ³Ό 농뢀듀에 λŒ€ν•œ
00:06
about artificial intelligence and farmers.
1
6419
4171
이야기λ₯Ό λ“€λ €λ“œλ¦¬κ³ μž ν•©λ‹ˆλ‹€.
00:10
Now, what a strange combination, right?
2
10966
3044
정말 μ΄μƒν•œ 쑰합이죠?
00:14
Two topics could not sound more different from each other.
3
14010
4088
이 두 가지 μ£Όμ œλŠ” μ„œλ‘œ 관련이 μ—†μ–΄ λ³΄μž…λ‹ˆλ‹€.
00:18
But did you know that modern farming actually involves a lot of technology?
4
18431
5047
ν•˜μ§€λ§Œ ν˜„λŒ€ λ†μ—…μ—λŠ” μ‹€μ œλ‘œ λ§Žμ€ 기술이 ν•„μš”ν•˜λ‹€λŠ” 사싀을 μ•Œκ³  κ³„μ…¨λ‚˜μš”?
00:23
So computer vision is used to predict crop yields.
5
23478
3796
컴퓨터 비전은 λ†μž‘λ¬Ό μˆ˜ν™•λŸ‰μ„ μ˜ˆμΈ‘ν•˜λŠ” 데 μ‚¬μš©λ©λ‹ˆλ‹€.
00:27
And artificial intelligence is used to find,
6
27315
2836
그리고 인곡 지λŠ₯은 곀좩을 μ°Ύκ³ ,
00:30
identify and get rid of insects.
7
30151
2878
μ‹λ³„ν•˜κ³ , μ—†μ• λŠ” 데 μ‚¬μš©λ©λ‹ˆλ‹€.
00:33
Predictive analytics helps figure out extreme weather conditions
8
33071
4129
예츑 뢄석은 κ°€λ­„μ΄λ‚˜ ν—ˆλ¦¬μΌ€μΈκ³Ό
00:37
like drought or hurricanes.
9
37242
1960
같은 κ·Ήν•œμ˜ 기상 쑰건을 νŒŒμ•…ν•˜λŠ” 데 도움이 λ©λ‹ˆλ‹€.
00:39
But this technology is also alienating to farmers.
10
39744
4547
ν•˜μ§€λ§Œ 이 κΈ°μˆ μ€ λ†λΆ€λ“€μ—κ²Œλ„ μ†Œμ™Έκ°μ„ μ•ˆκ²¨μ£Όμ£ .
00:44
And this all came to a head in 2017
11
44291
3003
2017λ…„, νŠΈλž™ν„° νšŒμ‚¬μΈ μ£€ λ””μ–΄κ°€
00:47
with the tractor company John Deere when they introduced smart tractors.
12
47335
4922
슀마트 νŠΈλž™ν„°λ₯Ό μΆœμ‹œν•˜λ©΄μ„œ 이 λͺ¨λ“  것이 극에 λ‹¬ν–ˆμŠ΅λ‹ˆλ‹€.
00:52
So before then, if a farmer's tractor broke,
13
52299
2961
이 μ΄μ „μ—λŠ” λ†λΆ€μ˜ νŠΈλž™ν„°κ°€ κ³ μž₯λ‚˜λ©΄
00:55
they could just repair it themselves or take it to a mechanic.
14
55302
3920
직접 μˆ˜λ¦¬ν•˜κ±°λ‚˜ μ •λΉ„μ‚¬μ—κ²Œ κ°€μ Έκ°ˆ 수 μžˆμ—ˆμ£ .
00:59
Well, the company actually made it illegal
15
59222
2961
κΈ€μŽ„μš”, νšŒμ‚¬λŠ” 농뢀듀이 직접 μž₯λΉ„λ₯Ό κ³ μΉ˜λŠ” 것을 사싀상
01:02
for farmers to fix their own equipment.
16
62225
2586
λΆˆλ²•μœΌλ‘œ κ·œμ •ν–ˆμ–΄μš”.
01:04
You had to use a licensed technician
17
64811
2836
λ©΄ν—ˆλ₯Ό 받은 기술자λ₯Ό κ³ μš©ν•΄μ•Όλ§Œ ν–ˆκ³ 
01:07
and farmers would have to wait for weeks
18
67689
2628
농뢀듀은 λ†μž‘λ¬Όμ΄ 썩고 해좩이
01:10
while their crops rot and pests took over.
19
70317
3044
κΈ°μŠΉμ„ 뢀릴 λ•ŒκΉŒμ§€ λͺ‡ μ£Όλ₯Ό κΈ°λ‹€λ €μ•Ό ν–ˆμŠ΅λ‹ˆλ‹€.
01:14
So they took matters into their own hands.
20
74279
3170
κ·Έλž˜μ„œ 그듀은 문제λ₯Ό 직접 ν•΄κ²°ν–ˆμ£ .
01:17
Some of them learned to program,
21
77490
1544
κ·Έλ“€ 쀑 λͺ‡λͺ‡μ€ ν”„λ‘œκ·Έλž˜λ°μ„ λ°°μ› κ³ ,
01:19
and they worked with hackers to create patches to repair their own systems.
22
79075
4922
해컀듀과 ν•¨κ»˜ μžμ‹ λ“€μ˜ μ‹œμŠ€ν…œμ„ κ³ μΉ  수 μžˆλŠ” 패치λ₯Ό λ§Œλ“€μ—ˆμŠ΅λ‹ˆλ‹€.
01:23
In 2022,
23
83997
1418
2022λ…„, μ„Έκ³„μ—μ„œ κ°€μž₯ 큰 해컀 컨퍼런슀 쀑 ν•˜λ‚˜μΈ DEFCONμ—μ„œ
01:25
at one of the largest hacker conferences in the world, DEFCON,
24
85457
3712
Sick CodesλΌλŠ” 해컀와 κ·Έ 의 νŒ€μ€ μ‘΄ λ””μ–΄ νŠΈλž™ν„°μ—
01:29
a hacker named Sick Codes and his team
25
89210
2586
μΉ¨μž…ν•˜λŠ” 방법을 λͺ¨λ‘μ—κ²Œ λ³΄μ—¬μ€¬μŠ΅λ‹ˆλ‹€.
01:31
showed everybody how to break into a John Deere tractor,
26
91838
3212
μ΄λŠ” 기술이 μ·¨μ•½ν•˜λ‹€λŠ” 사싀 뿐만 μ•„λ‹ˆλΌ
01:35
showing that, first of all, the technology was vulnerable,
27
95091
3629
당신이 λ‹Ήμ‹ λ§Œμ˜ μž₯λΉ„λ₯Ό μ†Œμœ ν•  수 있으며
01:38
but also that you can and should own your own equipment.
28
98762
4337
μ†Œμœ ν•΄μ•Όλ§Œ ν•œλ‹€λŠ” 것을 λ‚˜νƒ€λƒˆμ£ .
01:43
To be clear, this is illegal,
29
103683
2836
λΆ„λͺ…νžˆ λ§μ”€λ“œλ¦¬μžλ©΄, 이것은 λΆˆλ²•μž„μ—λ„ λΆˆκ΅¬ν•˜κ³ 
01:46
but there are people trying to change that.
30
106561
2795
이것을 λ°”κΎΈλ €λŠ” μ‚¬λžŒλ“€μ΄ μžˆμŠ΅λ‹ˆλ‹€.
01:49
Now that movement is called the β€œright to repair.”
31
109689
3796
그리고 이 μš΄λ™μ€ β€œμˆ˜λ¦¬ν•  κΆŒλ¦¬β€λΌκ³  λΆˆλ¦½λ‹ˆλ‹€.
01:53
The right to repair goes something like this.
32
113526
2169
μˆ˜λ¦¬ν•  κΆŒλ¦¬λž€ λ‹€μŒκ³Ό κ°™μŠ΅λ‹ˆλ‹€.
01:55
If you own a piece of technology,
33
115737
1668
당신이 κΈ°μˆ μ„ μ†Œμœ ν•˜κ³  μžˆλ‹€λ©΄
01:57
it could be a tractor, a smart toothbrush,
34
117405
2544
μ΄λŠ” νŠΈλž™ν„°, 슀마트 칫솔,
01:59
a washing machine,
35
119991
1377
세탁기 등이 될 수 μžˆμŠ΅λ‹ˆλ‹€.
02:01
you should have the right to repair it if it breaks.
36
121368
3086
그리고 이듀이 κ³ μž₯났닀면 μˆ˜λ¦¬ν•  κΆŒλ¦¬κ°€ μžˆμ–΄μ•Ό ν•˜μ£ .
02:05
So why am I telling you this story?
37
125246
2253
그럼 μ œκ°€ μ™œ 이런 이야기λ₯Ό ν•˜λŠ” κ±ΈκΉŒμš”?
02:08
The right to repair needs to extend to artificial intelligence.
38
128541
5214
수리 κΆŒλ¦¬λŠ” 인곡 지λŠ₯μ—κΉŒμ§€ ν™•λŒ€λ˜μ–΄μ•Ό ν•©λ‹ˆλ‹€.
02:14
Now it seems like every week
39
134214
2127
AI산업은 맀주 μƒˆλ‘­κ³  λ†€λΌμš΄
02:16
there is a new and mind-blowing innovation in AI.
40
136383
3336
ν˜μ‹ μ΄ νŽΌμ³μ§€λŠ” 곳처럼 λ³΄μž…λ‹ˆλ‹€.
02:19
But did you know that public confidence is actually declining?
41
139719
4880
ν•˜μ§€λ§Œ λŒ€μ€‘μ˜ 신뒰도가 μ‹€μ œλ‘œ 떨어지고 μžˆλ‹€λŠ” 사싀을 μ•Œκ³  κ³„μ…¨λ‚˜μš”?
02:24
A recent Pew poll showed that more Americans are concerned
42
144599
4963
졜근 Pew μ—¬λ‘  쑰사에 λ”°λ₯΄λ©΄ κΈ°μˆ μ— λŒ€ν•΄ κΈ°λŒ€ν•˜λŠ” 것보닀
02:29
than they are excited about the technology.
43
149604
2503
μš°λ €ν•˜λŠ” 미ꡭ인이 더 λ§Žμ€ κ²ƒμœΌλ‘œ λ‚˜νƒ€λ‚¬μŠ΅λ‹ˆλ‹€.
02:32
This is echoed throughout the world.
44
152148
2211
μ΄λŠ” μ „ μ„Έκ³„μ—μ„œλ„ 울렀 퍼지고 μžˆμŠ΅λ‹ˆλ‹€.
02:34
The World Risk Poll shows
45
154401
1418
World Risk Poll에 λ”°λ₯΄λ©΄
02:35
that respondents from Central and South America and Africa
46
155860
3462
쀑남미, μ•„ν”„λ¦¬μΉ΄μ˜ μ‘λ‹΅μžλ“€μ€
02:39
all said that they felt AI would lead to more harm than good for their people.
47
159322
6090
λͺ¨λ‘ AIκ°€ κ΅­λ―Όμ—κ²Œ 득보닀 싀이 더 λ§Žμ„ 것이라고 μƒκ°ν•œλ‹€κ³  λ‹΅ν–ˆμŠ΅λ‹ˆλ‹€.
02:46
As a social scientist and an AI developer,
48
166287
2503
μ‚¬νšŒκ³Όν•™μžμ΄μž AI 개발자인 μ €μ—κ²ŒλŠ”
02:48
this frustrates me.
49
168790
1585
이 점이 μ’Œμ ˆκ°μ„ μ•ˆκ²¨μ€λ‹ˆλ‹€.
02:50
I'm a tech optimist
50
170417
1293
μ €λŠ” 기술 λ‚™κ΄€λ‘ μžμž…λ‹ˆλ‹€.
02:51
because I truly believe this technology can lead to good.
51
171751
4338
μ™œλƒν•˜λ©΄ μ €λŠ” 이 기술이 쒋은 κ²°κ³Όλ₯Ό κ°€μ Έμ˜¬ 수 μžˆλ‹€κ³  μ§„μ‹¬μœΌλ‘œ λ―ΏκΈ° λ•Œλ¬Έμž…λ‹ˆλ‹€.
02:56
So what's the disconnect?
52
176464
1919
κ·Έλ ‡λ‹€λ©΄ 무엇이 λ‹¨μ μΌκΉŒμš”?
02:58
Well, I've talked to hundreds of people over the last few years.
53
178758
3754
κΈ€μŽ„μš”, μ €λŠ” μ§€λ‚œ λͺ‡ λ…„ λ™μ•ˆ 수백 λͺ…μ˜ μ‚¬λžŒλ“€κ³Ό 이야기λ₯Ό λ‚˜λˆ΄μ–΄μš”.
03:02
Architects and scientists, journalists and photographers,
54
182554
3670
건좕가와 κ³Όν•™μž, μ €λ„λ¦¬μŠ€νŠΈμ™€ μ‚¬μ§„μž‘κ°€,
03:06
ride-share drivers and doctors,
55
186224
1710
곡유 μ°¨λŸ‰ μš΄μ „μžμ™€ μ˜μ‚¬κ°€
03:07
and they all say the same thing.
56
187976
2961
λͺ¨λ‘ 같은 말을 ν•©λ‹ˆλ‹€.
03:12
People feel like an afterthought.
57
192272
3795
μ‚¬λžŒλ“€μ€ λͺ¨λ‘ λ‚˜μ€‘μ— ν•˜κ³  μ‹Άμ–΄ν•΄μš”.
03:17
They all know that their data is harvested often without their permission
58
197485
4338
μ‚¬λžŒλ“€μ€ λͺ¨λ‘ μžμ‹ μ˜ 데이터가 μ •κ΅ν•œ μ‹œμŠ€ν…œμ„ λ§Œλ“€ λ•Œ
03:21
to create these sophisticated systems.
59
201823
2461
ν—ˆλ½μ—†μ΄ μˆ˜μ§‘λ˜λŠ” κ²½μš°κ°€ λ§Žλ‹€λŠ” 것을 μ•Œκ³  μžˆμŠ΅λ‹ˆλ‹€.
03:24
They know that these systems are determining their life opportunities.
60
204325
4171
그듀은 μ΄λŸ¬ν•œ μ‹œμŠ€ν…œμ΄ μ‚Άμ˜ 기회λ₯Ό κ²°μ •ν•œλ‹€λŠ” 것을 μ•Œκ³  μžˆμŠ΅λ‹ˆλ‹€.
03:28
They also know that nobody ever bothered to ask them
61
208496
3545
ν•˜μ§€λ§Œ κ·Έ λˆ„κ΅¬λ„ μ‹œμŠ€ν…œμ„ μ–΄λ–»κ²Œ ꡬ좕해야 ν•˜λŠ”μ§€λŠ”
03:32
how the system should be built,
62
212083
1502
물어보지 μ•ŠλŠ”λ‹€λŠ” 것도 μ•Œκ³  μžˆμŠ΅λ‹ˆλ‹€.
03:33
and they certainly have no idea where to go if something goes wrong.
63
213585
5964
그리고 λ¬Έμ œκ°€ 생기면 μ–΄λ””λ‘œ κ°€μ•Ό 할지도 μ „ν˜€ λͺ¨λ₯΄κ³  있죠.
03:40
We may not own AI systems,
64
220383
2336
μš°λ¦¬κ°€ AI μ‹œμŠ€ν…œμ„ μ†Œμœ ν•˜κ³  μžˆμ§€ μ•Šμ„ μˆ˜λ„ μžˆμ§€λ§Œ
03:42
but they are slowly dominating our lives.
65
222761
2794
AIλŠ” μ„œμ„œνžˆ 우리 삢을 μ§€λ°°ν•˜κ³  μžˆμŠ΅λ‹ˆλ‹€.
03:45
We need a better feedback loop
66
225597
1501
μš°λ¦¬μ—κ² μ΄λŸ¬ν•œ μ‹œμŠ€ν…œμ„ λ§Œλ“œλŠ” μ‚¬λžŒλ“€κ³Ό
03:47
between the people who are making these systems,
67
227140
2961
μ΄λŸ¬ν•œ AI μ‹œμŠ€ν…œμ΄ 우리의 세계 μ•ˆμ—μ„œ
03:50
and the people who are best determined to tell us
68
230101
3337
μ–΄λ–»κ²Œ μƒν˜Έμž‘μš©ν•΄μ•Όν•˜λŠ”μ§€ κ°€μž₯ 잘 μ•Œλ €μ€„ 수 μžˆλŠ” μ‚¬λžŒλ“€
03:53
how these AI systems should interact in their world.
69
233480
3628
사이에 더 λ‚˜μ€ ν”Όλ“œλ°± 루프가 ν•„μš”ν•©λ‹ˆλ‹€.
03:57
One step towards this is a process called red teaming.
70
237609
3837
이λ₯Ό μœ„ν•œ ν•œ κ±ΈμŒμ€ λ ˆλ“œ νŒ€ ꡬ성 (red teaming) μ΄λΌλŠ” ν”„λ‘œμ„ΈμŠ€μž…λ‹ˆλ‹€.
04:01
Now, red teaming is a practice that was started in the military,
71
241821
3003
λ ˆλ“œ νŒ€ ꡬ성은 κ΅°λŒ€μ—μ„œ μ‹œμž‘λœ κ΄€ν–‰μœΌλ‘œ
04:04
and it's used in cybersecurity.
72
244866
1919
사이버 λ³΄μ•ˆμ— μ‚¬μš©λ˜κ³  μžˆμŠ΅λ‹ˆλ‹€.
04:06
In a traditional red-teaming exercise,
73
246826
2628
기쑴의 λ ˆλ“œνŒ€ ꡬ성 λ°©μ‹μ—μ„œλŠ”
04:09
external experts are brought in to break into a system,
74
249454
3920
μ™ΈλΆ€ μ „λ¬Έκ°€λ₯Ό μ΄ˆλΉ™ν•˜μ—¬ μ‹œμŠ€ν…œμ— μΉ¨μž…ν•©λ‹ˆλ‹€.
04:13
sort of like what Sick Codes did with tractors, but legal.
75
253374
4338
Sick Codesκ°€ νŠΈλž™ν„°λ‘œ ν–ˆλ˜ 것과 λΉ„μŠ·ν•˜μ§€λ§Œ 합법적이죠.
04:17
So red teaming acts as a way of testing your defenses
76
257712
3670
λ”°λΌμ„œ λ ˆλ“œ νŒ€ ꡬ성은 λ°©μ–΄ 체계λ₯Ό ν…ŒμŠ€νŠΈν•˜λŠ”
04:21
and when you can figure out where something will go wrong,
77
261424
3545
ν•˜λ‚˜μ˜ μˆ˜λ‹¨μœΌλ‘œ μž‘μš©ν•˜λ©°, μ–΄λ””μ„œ λ¬Έμ œκ°€ λ°œμƒν• μ§€
04:25
you can figure out how to fix it.
78
265011
3003
νŒŒμ•…ν•  수 μžˆμ„ λ•Œ 이λ₯Ό κ³ μΉ  방법을 μ°Ύμ•„λ‚Ό 수 μžˆμŠ΅λ‹ˆλ‹€.
04:28
But when AI systems go rogue,
79
268056
2377
ν•˜μ§€λ§Œ AI μ‹œμŠ€ν…œμ΄ λ¬΄μš©μ§€λ¬Όμ΄ λœλ‹€λ©΄
04:30
it's more than just a hacker breaking in.
80
270433
2920
μ΄λŠ” λ‹¨μˆœνžˆ 해컀가 μΉ¨μž…ν•˜λŠ” 것을 λ„˜μ–΄μ„œκ²Œ λ©λ‹ˆλ‹€.
04:33
The model could malfunction or misrepresent reality.
81
273394
3754
이 λͺ¨λΈμ€ μ˜€μž‘λ™μ„ μΌμœΌν‚€κ±°λ‚˜ ν˜„μ‹€μ„ 잘λͺ» ν‘œν˜„ν•  수 μžˆμŠ΅λ‹ˆλ‹€.
04:37
So, for example, not too long ago,
82
277190
2044
예λ₯Ό λ“€μ–΄, μ–Όλ§ˆ μ „κΉŒμ§€λ§Œ 해도 우린
04:39
we saw an AI system attempting diversity
83
279234
2627
AI μ‹œμŠ€ν…œμ΄ κ³Όκ±°μ—λŠ” μ •ν™•ν•˜μ§€ μ•Šμ€ 사진을 λ³΄μ—¬μ€ŒμœΌλ‘œμ¨
04:41
by showing historically inaccurate photos.
84
281903
3503
닀양성을 μ‹œλ„ν•˜λŠ” 것을 λ³Ό 수 μžˆμ—ˆμŠ΅λ‹ˆλ‹€.
04:45
Anybody with a basic understanding of Western history
85
285406
2670
μ„œμ–‘ 역사에 λŒ€ν•œ 기본적인 이해가 μžˆλŠ” μ‚¬λžŒμ΄λΌλ©΄
04:48
could have told you that neither the Founding Fathers
86
288076
2502
λˆ„κ΅¬λ‚˜ 건ꡭ의 μ•„λ²„μ§€λ‚˜ λ‚˜μΉ˜ μ‹œλŒ€ ꡰ인 λͺ¨λ‘
04:50
nor Nazi-era soldiers would have been Black.
87
290620
2169
흑인이 μ•„λ‹ˆμ—ˆμ„ 것이라고 말할 수 μžˆμ—ˆμ„ κ²ƒμž…λ‹ˆλ‹€.
04:54
In that case, who qualifies as an expert?
88
294123
3629
κ·Έλ ‡λ‹€λ©΄ μ „λ¬Έκ°€κ°€ 될 자격이 μžˆλŠ” μ‚¬λžŒμ€ λˆ„κ΅¬μΌκΉŒμš”?
04:58
You.
89
298670
1168
λ°”λ‘œ λ‹Ήμ‹ μž…λ‹ˆλ‹€.
05:00
I'm working with thousands of people all around the world
90
300380
2836
μ €λŠ” μ „ 세계 수천 λͺ…μ˜ μ‚¬λžŒλ“€κ³Ό ν•¨κ»˜
05:03
on large and small red-teaming exercises,
91
303258
2252
크고 μž‘μ€ νŒ€ ꡬ성 μ—°μŠ΅μ„ ν•˜κ³  있으며,
05:05
and through them we found and fixed mistakes in AI models.
92
305552
4629
이λ₯Ό 톡해 AI λͺ¨λΈμ˜ μ‹€μˆ˜λ₯Ό λ°œκ²¬ν•˜κ³  μˆ˜μ •ν–ˆμŠ΅λ‹ˆλ‹€.
05:10
We also work with some of the biggest tech companies in the world:
93
310181
3587
μš°λ¦¬λŠ” λ˜ν•œ OpenAI, Meta, Anthropic, Googleκ³Ό 같은
05:13
OpenAI, Meta, Anthropic, Google.
94
313810
2753
세계 μ΅œλŒ€μ˜ 기술 νšŒμ‚¬λ“€κ³Όλ„ ν˜‘λ ₯ν•˜κ³  μžˆμŠ΅λ‹ˆλ‹€.
05:16
And through this, we've made models work better for more people.
95
316604
4630
이λ₯Ό 톡해 μš°λ¦¬λŠ” λͺ¨λΈμ΄ 더 λ§Žμ€ μ‚¬λžŒλ“€μ—κ²Œ 더 잘 μž‘λ™ν•˜λ„λ‘ λ§Œλ“€μ—ˆμŠ΅λ‹ˆλ‹€.
05:22
Here's a bit of what we've learned.
96
322277
2168
μš°λ¦¬κ°€ μ•Œκ²Œ 된 λͺ‡ 가지 λ‚΄μš©μ€ λ‹€μŒκ³Ό κ°™μŠ΅λ‹ˆλ‹€.
05:24
We partnered with the Royal Society in London to do a scientific,
97
324988
3670
μš°λ¦¬λŠ” 런던의 왕립 ν•™νšŒμ™€ ν˜‘λ ₯ν•˜μ—¬ μ§ˆλ³‘ κ³Όν•™μžλ“€κ³Ό ν•¨κ»˜
05:28
mis- and disinformation event with disease scientists.
98
328658
3545
κ³Όν•™μ μœΌλ‘œ 잘λͺ»λœ 정보 및 ν—ˆμœ„ 정보 이벀트λ₯Ό μ—΄μ—ˆμŠ΅λ‹ˆλ‹€ .
05:32
What these scientists found
99
332203
1335
이 κ³Όν•™μžλ“€μ€
05:33
is that AI models actually had a lot of protections
100
333580
2794
AI λͺ¨λΈμ΄ μ‹€μ œλ‘œ COVID의 잘λͺ»λœ μ •λ³΄λ‘œλΆ€ν„°
05:36
against COVID misinformation.
101
336416
2294
λ§Žμ€ 보호 κΈ°λŠ₯을 μ œκ³΅ν–ˆμŒμ„ λ°œκ²¬ν–ˆμŠ΅λ‹ˆλ‹€.
05:38
But for other diseases like measles, mumps and the flu,
102
338751
3546
ν•˜μ§€λ§Œ 홍역, μœ ν–‰μ„± μ΄ν•˜μ„ μ—Ό, 독감과 같은 λ‹€λ₯Έ μ§ˆλ³‘μ—λŠ”
05:42
the same protections didn't apply.
103
342297
2460
λ™μΌν•œ 보호 μ‘°μΉ˜κ°€ μ μš©λ˜μ§€ μ•Šμ•˜μŠ΅λ‹ˆλ‹€.
05:44
We reported these changes,
104
344757
1293
μš°λ¦¬λŠ” μ΄λŸ¬ν•œ λ³€ν™”λ₯Ό λ³΄κ³ ν–ˆκ³ , μˆ˜μ •ν–ˆμœΌλ©°,
05:46
they’re fixed and now we are all better protected
105
346050
3170
이제 과학적인 μ˜€ν•΄μ™€ ν—ˆμœ„ μ •λ³΄λ‘œλΆ€ν„°
05:49
against scientific mis- and disinformation.
106
349220
2670
우리 λͺ¨λ‘λ₯Ό 더 잘 λ³΄ν˜Έν•  수 있게 λ˜μ—ˆμŠ΅λ‹ˆλ‹€.
05:52
We did a really similar exercise with architects at Autodesk University,
107
352724
4838
Autodesk University의 건좕가듀과 정말 λΉ„μŠ·ν•œ μ‹€ν—˜μ„ ν•˜μ˜€κ³ ,
05:57
and we asked them a simple question:
108
357562
1877
κ·Έλ“€μ—κ²Œ κ°„λ‹¨ν•œ μ§ˆλ¬Έμ„ ν–ˆμŠ΅λ‹ˆλ‹€.
05:59
Will AI put them out of a job?
109
359439
2961
β€œAIκ°€ 그듀을 직μž₯μ—μ„œ μ«“μ•„λ‚ΌκΉŒμš”?”
06:02
Or more specifically,
110
362442
2169
μ’€ 더 ꡬ체적으둜 λ§ν•˜μžλ©΄,
06:04
could they imagine a modern AI system
111
364611
2043
β€œν˜„λŒ€ λ―Έμˆ κ΄€μ˜ 사양을 섀계할 수 μžˆλŠ”
06:06
that would be able to design the specs of a modern art museum?
112
366696
3670
ν˜„λŒ€μ‹ AI μ‹œμŠ€ν…œμ„ 상상할 수 μžˆμ„κΉŒμš”?”
06:10
The answer, resoundingly, was no.
113
370408
3962
닡은 단연코 β€˜μ•„λ‹ˆμš”β€™ μ˜€μŠ΅λ‹ˆλ‹€.
06:14
Here's why, architects do more than just draw buildings.
114
374787
3629
건좕가듀이 λ‹¨μˆœνžˆ 건물을 κ·Έλ¦¬λŠ” 것 μ΄μƒμ˜ 일을 ν•˜λŠ” μ΄μœ κ°€ λ°”λ‘œ 여기에 μžˆμŠ΅λ‹ˆλ‹€.
06:18
They have to understand physics and material science.
115
378458
3336
학생듀은 물리학과 재료 과학을 이해해야 ν•©λ‹ˆλ‹€.
06:21
They have to know building codes,
116
381836
1585
건좕 λ²•κ·œλ₯Ό μ•Œμ•„μ•Ό ν•˜κ³ ,
06:23
and they have to do that
117
383463
1251
감정을 λΆˆλŸ¬μΌμœΌν‚€λŠ” 무언가λ₯Ό λ§Œλ“€λ©΄μ„œ
06:24
while making something that evokes emotion.
118
384756
2794
이λ₯Ό μˆ™μ§€ν•΄μ•Ό ν•©λ‹ˆλ‹€.
06:28
What the architects wanted was an AI system
119
388426
2169
건좕가듀이 μ›ν–ˆλ˜ 것은
06:30
that interacted with them, that would give them feedback,
120
390595
2753
κ·Έλ“€κ³Ό μƒν˜Έμž‘μš©ν•˜μ—¬ ν”Όλ“œλ°±μ„ μ£Όκ³  사전에
06:33
maybe proactively offer design recommendations.
121
393389
2753
섀계 ꢌμž₯ 사항을 μ œμ‹œν•  수 μžˆλŠ” AI μ‹œμŠ€ν…œμ΄μ—ˆμŠ΅λ‹ˆλ‹€.
06:36
And today's AI systems, not quite there yet.
122
396142
3670
그리고 μ˜€λŠ˜λ‚ μ˜ AI μ‹œμŠ€ν…œμ€ 아직 μ™„μ„±λ˜μ§€ μ•Šμ•˜μŠ΅λ‹ˆλ‹€.
06:39
But those are technical problems.
123
399854
1752
ν•˜μ§€λ§Œ 이건 기술적인 λ¬Έμ œμž…λ‹ˆλ‹€.
06:41
People building AI are incredibly smart,
124
401648
2043
AIλ₯Ό λ§Œλ“œλŠ” μ‚¬λžŒλ“€μ€ λ†€λΌμšΈ μ •λ„λ‘œ λ˜‘λ˜‘ν•©λ‹ˆλ‹€.
06:43
and maybe they could solve all that in a few years.
125
403733
2753
μ–΄μ©Œλ©΄ λͺ‡ λ…„ μ•ˆμ— 이 λͺ¨λ“  것을 ν•΄κ²°ν•  수 μžˆμ„μ§€λ„ λͺ¨λ₯΄μ§€μš”.
06:46
But that wasn't their biggest concern.
126
406527
1961
ν•˜μ§€λ§Œ 그게 κ·Έλ“€μ˜ κ°€μž₯ 큰 κ΄€μ‹¬μ‚¬λŠ” μ•„λ‹ˆμ—ˆμ–΄μš”.
06:48
Their biggest concern was trust.
127
408529
2795
κ·Έλ“€μ˜ κ°€μž₯ 큰 κ΄€μ‹¬μ‚¬λŠ” μ‹ λ’°μ˜€μŠ΅λ‹ˆλ‹€.
06:51
Now architects are liable if something goes wrong with their buildings.
128
411991
4546
이제 κ±΄μΆ•κ°€λŠ” 건물에 λ¬Έμ œκ°€ 생기면 μ±…μž„μ„ μ Έμ•Ό ν•©λ‹ˆλ‹€.
06:56
They could lose their license,
129
416537
1669
λ©΄ν—ˆλ₯Ό μžƒμ„ μˆ˜λ„ 있고, 벌금이 뢀과될 μˆ˜λ„ 있으며
06:58
they could be fined, they could even go to prison.
130
418206
3211
심지어 감μ˜₯에 갈 μˆ˜λ„ μžˆμŠ΅λ‹ˆλ‹€.
07:01
And failures can happen in a million different ways.
131
421459
3086
μ‹€νŒ¨λŠ” 백만 가지 λ°©μ‹μœΌλ‘œ 일어날 수 μžˆμŠ΅λ‹ˆλ‹€.
07:04
For example, exit doors that open the wrong way,
132
424545
3045
예λ₯Ό λ“€μ–΄ 비상ꡬ가 잘λͺ» μ—΄λ €μ„œ
07:07
leading to people being crushed in an evacuation crisis,
133
427632
4296
μ—΄λ €μ„œ μ‚¬λžŒλ“€μ΄ λŒ€ν”Ό μœ„κΈ°μ— μ²˜ν•˜κ±°λ‚˜,
07:11
or broken glass raining down onto pedestrians in the street
134
431928
4129
κΈΈκ±°λ¦¬μ—μ„œ λ°”λžŒμ΄ λ„ˆλ¬΄ μ„Έκ²Œ λΆˆμ–΄ 창문이 λΆ€μ„œμ Έ
07:16
because the wind blows too hard and shatters windows.
135
436099
3837
깨진 μœ λ¦¬κ°€ λ³΄ν–‰μžμ—κ²Œ μŸμ•„μ§€λŠ” κ²½μš°κ°€ 이에 ν•΄λ‹Ήν•©λ‹ˆλ‹€.
07:20
So why would an architect trust an AI system with their job,
136
440311
3921
κ·Έλ ‡λ‹€λ©΄ μ™œ κ±΄μΆ•κ°€λŠ” μ‹€μˆ˜λ₯Ό μ°Ύμ•„λƒˆμ„ λ•Œ
07:24
with their literal freedom,
137
444273
2670
직접 λ“€μ–΄κ°€ κ³ μΉ  수 μ—†μŒμ—λ„ λΆˆκ΅¬ν•˜κ³ 
07:26
if they couldn't go in and fix a mistake if they found it?
138
446943
3628
AIμ‹œμŠ€ν…œμ„ μžμ‹ μ˜ 업무와 μžμœ λ‘œμ›€μœΌλ‘œ μ‹ λ’°ν• κΉŒμš”?
07:31
So we need to figure out these problems today, and I'll tell you why.
139
451030
4171
μš°λ¦¬λŠ” 였늘 이 λ¬Έμ œλ“€μ„ μ•Œμ•„λ‚΄μ•Ό ν•©λ‹ˆλ‹€. κ·Έ 이유λ₯Ό λ§μ”€λ“œλ¦¬κ² μŠ΅λ‹ˆλ‹€.
07:35
The next wave of artificial intelligence systems, called agentic AI,
140
455201
5047
μ°¨μ„ΈλŒ€ 인곡 지λŠ₯ μ‹œμŠ€ν…œμΈ μ—μ΄μ „νŠΈ AIλŠ”
07:40
is a true tipping point
141
460289
1335
μΈκ°„μ˜ 주체성을 μœ μ§€ν• μ§€,
07:41
between whether or not we retain human agency,
142
461624
4213
μ•„λ‹ˆλ©΄ AI μ‹œμŠ€ν…œμ΄ λŒ€μ‹  결정을 내릴지
07:45
or whether or not AI systems make our decisions for us.
143
465837
3795
μ—¬λΆ€λ₯Ό κ°€λ₯΄λŠ” μ§„μ •ν•œ μ „ν™˜μ μž…λ‹ˆλ‹€.
07:50
Imagine an AI agent as kind of like a personal assistant.
144
470008
3211
AI μ—μ΄μ „νŠΈκ°€ μΌμ’…μ˜ 개인 λΉ„μ„œλΌκ³  상상해 λ³΄μ„Έμš”.
07:53
So, for example, a medical agent might determine
145
473261
3712
예λ₯Ό λ“€μ–΄, 의료 상담원은 κ°€μ‘±μ—κ²Œ
07:57
whether or not your family needs doctor's appointments,
146
477015
2585
μ§„λ£Œ μ˜ˆμ•½μ΄ ν•„μš”ν•œμ§€ μ—¬λΆ€λ₯Ό κ²°μ •ν•˜κ±°λ‚˜,
07:59
it might refill prescription medications, or in case of an emergency,
147
479642
3379
μ²˜λ°©μ•½μ„ λ³΄μΆ©ν•˜κ±°λ‚˜, 응급 상황이 λ°œμƒν–ˆμ„ λ•Œμ—λŠ”
08:03
send medical records to the hospital.
148
483062
2586
의료 기둝을 병원에 보낼 수 μžˆμŠ΅λ‹ˆλ‹€.
08:05
But AI agents can't and won't exist
149
485690
2836
ν•˜μ§€λ§Œ μš°λ¦¬μ—κ²Œ μ§„μ •ν•œ 수리 κΆŒλ¦¬κ°€ μ—†λ‹€λ©΄
08:08
unless we have a true right to repair.
150
488568
2210
AI μ—μ΄μ „νŠΈλŠ” μ‘΄μž¬ν•  수 μ—†μœΌλ©° μ•žμœΌλ‘œλ„ μ‘΄μž¬ν•˜μ§€ μ•Šμ„ κ²ƒμž…λ‹ˆλ‹€.
08:10
What parent would trust their child's health to an AI system
151
490820
4922
AIκ°€ 기본적인 진단도 ν•  수 μ—†λ‹€λ©΄
08:15
unless you could run some basic diagnostics?
152
495783
2795
μ–΄λ–€ λΆ€λͺ¨κ°€ μ•„μ΄μ˜ 건강을 AI μ‹œμŠ€ν…œμ— λ§‘κΈΈκΉŒμš”?
08:18
What professional would trust an AI system with job decisions,
153
498619
4213
또, μ£Όλ‹ˆμ–΄ μ§μ›μ²˜λŸΌ AI μ‹œμŠ€ν…œμ„ μž¬κ΅μœ‘ν•  수 μ—†λ‹€λ©΄
08:22
unless you could retrain it the way you might a junior employee?
154
502874
4296
μ–΄λ–€ μ „λ¬Έκ°€κ°€ 직무 결정을 내릴 λ•Œ AI μ‹œμŠ€ν…œμ„ μ‹ λ’°ν• κΉŒμš”?
08:28
Now, a right to repair might look something like this.
155
508129
2878
수리 κΆŒλ¦¬λŠ” 이런 λ°©μ‹μœΌλ‘œ λ‚˜νƒ€λ‚  수 μžˆμŠ΅λ‹ˆλ‹€.
08:31
You could have a diagnostics board
156
511007
2210
당신은 직접 μ„€κ³„ν•œ κΈ°λ³Έ ν…ŒμŠ€νŠΈλ₯Ό
08:33
where you run basic tests that you design,
157
513259
3003
μ‹€ν–‰ν•˜λŠ” 진단 λ³΄λ“œλ₯Ό μ‚¬μš©ν•  수 μžˆμŠ΅λ‹ˆλ‹€.
08:36
and if something's wrong, you could report it to the company
158
516304
2836
λ¬Έμ œκ°€ λ°œμƒν•˜λ©΄ νšŒμ‚¬μ— λ³΄κ³ ν•˜κ³ 
08:39
and hear back when it's fixed.
159
519140
1585
λ¬Έμ œκ°€ ν•΄κ²°λ˜λ©΄ 닡변을 듀을 수 μžˆμŠ΅λ‹ˆλ‹€.
08:40
Or you could work with third parties like ethical hackers
160
520725
3128
μ•„λ‹ˆλ©΄ μ˜€λŠ˜λ‚  우리처럼 μ‹œμŠ€ν…œμš© 패치λ₯Ό λ§Œλ“œλŠ”
08:43
who make patches for systems like we do today.
161
523895
2586
윀리적 해컀와 같은 제3μžμ™€ ν˜‘λ ₯ν•  μˆ˜λ„ μžˆμŠ΅λ‹ˆλ‹€.
08:46
You can download them and use them to improve your system
162
526481
2711
패치λ₯Ό λ‹€μš΄λ‘œλ“œν•˜μ—¬ μ‹œμŠ€ν…œμ„ μ›ν•˜λŠ” λ°©μ‹μœΌλ‘œ
08:49
the way you want it to be improved.
163
529233
1919
κ°œμ„ ν•˜λŠ” 데 μ‚¬μš©ν•  수 μžˆμŠ΅λ‹ˆλ‹€.
08:51
Or you could be like these intrepid farmers and learn to program
164
531194
3628
μ•„λ‹ˆλ©΄ μš©κ°ν•œ λ†λΆ€λ“€μ²˜λŸΌ μžμ‹ λ§Œμ˜ μ‹œμŠ€ν…œμ„
08:54
and fine-tune your own systems.
165
534864
3003
ν”„λ‘œκ·Έλž˜λ°ν•˜κ³  λ―Έμ„Έ μ‘°μ •ν•˜λŠ” 법을 배울 μˆ˜λ„ 있겠죠.
08:58
We won't achieve the promised benefits of artificial intelligence
166
538618
4171
μ‚¬λžŒλ“€μ„ 개발 ν”„λ‘œμ„ΈμŠ€μ— μ°Έμ—¬μ‹œν‚€λŠ” 방법을 μ•Œμ•„λ‚΄μ§€ λͺ»ν•˜λ©΄
09:02
unless we figure out how to bring people into the development process.
167
542789
5046
μ•½μ†ν–ˆλ˜ 인곡 지λŠ₯의 이점을 얻을 수 없을 κ²ƒμž…λ‹ˆλ‹€.
09:08
I've dedicated my career to responsible AI,
168
548377
3504
μ €λŠ” μ±…μž„κ° μžˆλŠ” AI에 제 κ²½λ ₯을 λ°”μ³€λŠ”λ°,
09:11
and in that field we ask the question,
169
551923
3253
κ·Έ λΆ„μ•Όμ—μ„œ μš°λ¦¬λŠ” β€œμ‚¬λžŒλ“€μ΄ AIλ₯Ό μ‹ λ’°ν•  수 있으렀면
09:15
what can companies build to ensure that people trust AI?
170
555218
4963
기업은 무엇을 ꡬ좕해야 ν•˜λŠ”κ°€β€œ λΌλŠ” μ§ˆλ¬Έμ„ λ˜μ§‘λ‹ˆλ‹€.
09:20
Now, through these red-teaming exercises, and by talking to you,
171
560723
3837
μ΄λŸ¬ν•œ νŒ€μ› ꡬ성 μ—°μŠ΅μ„ 톡해, 그리고 μ—¬λŸ¬λΆ„κ³Ό 이야기λ₯Ό λ‚˜λˆ„λ©΄μ„œ
09:24
I've come to realize that we've been asking the wrong question all along.
172
564602
5339
μš°λ¦¬λŠ” 쀄곧 잘λͺ»λœ μ§ˆλ¬Έμ„ ν•΄μ™”λ‹€λŠ” 것을 κΉ¨λ‹«κ²Œ λ˜μ—ˆμŠ΅λ‹ˆλ‹€.
09:30
What we should have been asking is what tools can we build
173
570566
3921
μš°λ¦¬κ°€ λ¬Όμ—ˆμ–΄μ•Ό ν•  것은 β€œμ‚¬λžŒλ“€μ΄ AIλ₯Ό μœ μš©ν•˜κ²Œ
09:34
so people can make AI beneficial for them?
174
574529
4045
ν™œμš©ν•  수 μžˆλ„λ‘ μ–΄λ–€ 도ꡬλ₯Ό λ§Œλ“€ 수 μžˆλŠ”κ°€β€ μž…λ‹ˆλ‹€.
09:39
Technologists can't do it alone.
175
579117
2460
기술자 ν˜Όμžμ„œλŠ” ν•  수 μ—†μŠ΅λ‹ˆλ‹€.
09:41
We can only do it with you.
176
581619
2419
였직 μ—¬λŸ¬λΆ„λ§Œμ΄ ν•  수 μžˆμŠ΅λ‹ˆλ‹€.
09:44
Thank you.
177
584080
1168
κ°μ‚¬ν•©λ‹ˆλ‹€.
09:45
(Applause)
178
585248
2794
(λ°•μˆ˜)
이 μ›Ήμ‚¬μ΄νŠΈ 정보

이 μ‚¬μ΄νŠΈλŠ” μ˜μ–΄ ν•™μŠ΅μ— μœ μš©ν•œ YouTube λ™μ˜μƒμ„ μ†Œκ°œν•©λ‹ˆλ‹€. μ „ 세계 졜고의 μ„ μƒλ‹˜λ“€μ΄ κ°€λ₯΄μΉ˜λŠ” μ˜μ–΄ μˆ˜μ—…μ„ 보게 될 κ²ƒμž…λ‹ˆλ‹€. 각 λ™μ˜μƒ νŽ˜μ΄μ§€μ— ν‘œμ‹œλ˜λŠ” μ˜μ–΄ μžλ§‰μ„ 더블 ν΄λ¦­ν•˜λ©΄ κ·Έκ³³μ—μ„œ λ™μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€. λΉ„λ””μ˜€ μž¬μƒμ— 맞좰 μžλ§‰μ΄ μŠ€ν¬λ‘€λ©λ‹ˆλ‹€. μ˜κ²¬μ΄λ‚˜ μš”μ²­μ΄ μžˆλŠ” 경우 이 문의 양식을 μ‚¬μš©ν•˜μ—¬ λ¬Έμ˜ν•˜μ‹­μ‹œμ˜€.

https://forms.gle/WvT1wiN1qDtmnspy7