How we can build AI to help humans, not hurt us | Margaret Mitchell

81,177 views ・ 2018-03-12

TED


μ•„λž˜ μ˜λ¬Έμžλ§‰μ„ λ”λΈ”ν΄λ¦­ν•˜μ‹œλ©΄ μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€.

Translator: Jeannie Yoo   Reviewer: Sojeong KIM
00:13
I work on helping computers communicate about the world around us.
00:17
There are a lot of ways to do this,
00:19
and I like to focus on helping computers
00:22
to talk about what they see and understand.
00:25
Given a scene like this,
00:27
a modern computer-vision algorithm
00:29
can tell you that there's a woman and there's a dog.
00:32
It can tell you that the woman is smiling.
00:34
It might even be able to tell you that the dog is incredibly cute.
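As a rough illustration of the kind of description she is talking about, the sketch below runs a photo through off-the-shelf open-source vision models. The Hugging Face transformers pipelines, the model names (hustvl/yolos-tiny, Salesforce/blip-image-captioning-base), the file name, and the confidence threshold are all assumptions chosen for illustration; this is not the system used in the talk.

```python
# Hedged sketch, not the speaker's system: describing a photo with open-source models.
from transformers import pipeline

detector = pipeline("object-detection", model="hustvl/yolos-tiny")                     # assumed model
captioner = pipeline("image-to-text", model="Salesforce/blip-image-captioning-base")   # assumed model

image_path = "beach_scene.jpg"  # hypothetical photo of a woman and a dog on a beach

# Object detection: labels such as "person" and "dog", each with a confidence score.
for det in detector(image_path):
    if det["score"] > 0.8:  # illustrative threshold
        print(f'{det["label"]}: {det["score"]:.2f}')

# Captioning: a short natural-language description of the whole scene.
print(captioner(image_path)[0]["generated_text"])  # e.g. "a woman playing with her dog on the beach"
```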
00:38
I work on this problem
00:40
thinking about how humans understand and process the world.
00:45
The thoughts, memories and stories
00:48
that a scene like this might evoke for humans.
00:51
All the interconnections of related situations.
00:55
Maybe you've seen a dog like this one before,
00:58
or you've spent time running on a beach like this one,
01:01
and that further evokes thoughts and memories of a past vacation,
01:06
past times to the beach,
01:08
times spent running around with other dogs.
01:11
One of my guiding principles is that by helping computers to understand
01:16
what it's like to have these experiences,
01:19
to understand what we share and believe and feel,
01:26
then we're in a great position to start evolving computer technology
01:30
in a way that's complementary with our own experiences.
01:35
So, digging more deeply into this,
01:38
a few years ago I began working on helping computers to generate human-like stories
01:44
from sequences of images.
01:47
So, one day,
01:49
I was working with my computer to ask it what it thought about a trip to Australia.
01:54
It took a look at the pictures, and it saw a koala.
01:58
It didn't know what the koala was,
01:59
but it said it thought it was an interesting-looking creature.
02:04
Then I shared with it a sequence of images about a house burning down.
02:09
It took a look at the images and it said,
02:13
"This is an amazing view! This is spectacular!"
02:17
It sent chills down my spine.
02:20
It saw a horrible, life-changing and life-destroying event
02:25
and thought it was something positive.
02:27
I realized that it recognized the contrast,
02:31
the reds, the yellows,
02:34
and thought it was something worth remarking on positively.
02:37
And part of why it was doing this
02:39
was because most of the images I had given it
02:42
were positive images.
02:44
That's because people tend to share positive images
02:48
when they talk about their experiences.
02:51
When was the last time you saw a selfie at a funeral?
02:55
I realized that, as I worked on improving AI
02:58
task by task, dataset by dataset,
03:02
that I was creating massive gaps,
03:05
holes and blind spots in what it could understand.
03:10
And while doing so,
03:11
I was encoding all kinds of biases.
03:15
Biases that reflect a limited viewpoint,
03:18
limited to a single dataset --
03:21
biases that can reflect human biases found in the data,
03:25
such as prejudice and stereotyping.
03:29
I thought back to the evolution of the technology
03:32
that brought me to where I was that day --
03:35
how the first color images
03:38
were calibrated against a white woman's skin,
03:41
meaning that color photography was biased against black faces.
03:46
And that same bias, that same blind spot
03:49
continued well into the '90s.
03:51
And the same blind spot continues even today
03:54
in how well we can recognize different people's faces
03:58
in facial recognition technology.
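The face-recognition blind spot she mentions is typically exposed by disaggregated evaluation: reporting accuracy per demographic group rather than one aggregate number. The sketch below assumes you already have per-example correctness and group labels; the group names and values are hypothetical placeholders.

```python
# Hedged sketch of a disaggregated evaluation: per-group accuracy can reveal
# a gap that a single overall number hides. Records are hypothetical.
from collections import defaultdict

# (demographic group, was the model's prediction correct?)
results = [
    ("group_a", True), ("group_a", True), ("group_a", True), ("group_a", False),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", False),
]

totals, correct = defaultdict(int), defaultdict(int)
for group, ok in results:
    totals[group] += 1
    correct[group] += ok

print(f"overall: {sum(correct.values()) / sum(totals.values()):.0%}")   # 50% overall
for group in sorted(totals):
    print(f"{group}: {correct[group] / totals[group]:.0%}")             # 75% vs 25%
```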
04:01
I thought about the state of the art in research today,
04:04
where we tend to limit our thinking to one dataset and one problem.
04:09
And that in doing so, we were creating more blind spots and biases
04:14
that the AI could further amplify.
04:17
I realized then that we had to think deeply
04:19
about how the technology we work on today looks in five years, in 10 years.
04:25
Humans evolve slowly, with time to correct for issues
04:29
in the interaction of humans and their environment.
04:33
In contrast, artificial intelligence is evolving at an incredibly fast rate.
04:39
And that means that it really matters
04:40
that we think about this carefully right now --
04:44
that we reflect on our own blind spots,
04:47
our own biases,
04:49
and think about how that's informing the technology we're creating
04:53
and discuss what the technology of today will mean for tomorrow.
04:58
CEOs and scientists have weighed in on what they think
05:01
the artificial intelligence technology of the future will be.
05:05
Stephen Hawking warns that
05:06
"Artificial intelligence could end mankind."
05:10
Elon Musk warns that it's an existential risk
05:13
and one of the greatest risks that we face as a civilization.
05:17
Bill Gates has made the point,
05:19
"I don't understand why people aren't more concerned."
05:23
But these views --
05:25
they're part of the story.
05:28
The math, the models,
05:30
the basic building blocks of artificial intelligence
05:33
are something that we can all access and work with.
05:36
We have open-source tools for machine learning and intelligence
05:40
that we can contribute to.
05:42
And beyond that, we can share our experience.
05:46
We can share our experiences with technology and how it concerns us
05:50
and how it excites us.
05:52
We can discuss what we love.
05:55
We can communicate with foresight
05:57
about the aspects of technology that could be more beneficial
06:02
or could be more problematic over time.
06:05
If we all focus on opening up the discussion on AI
06:09
with foresight towards the future,
06:13
this will help create a general conversation and awareness
06:17
about what AI is now,
06:21
what it can become
06:23
and all the things that we need to do
06:25
in order to enable that outcome that best suits us.
06:29
We already see and know this in the technology that we use today.
06:33
We use smart phones and digital assistants and Roombas.
06:38
Are they evil?
06:40
Maybe sometimes.
06:42
Are they beneficial?
06:45
Yes, they're that, too.
06:48
And they're not all the same.
06:50
And there you already see a light shining on what the future holds.
06:54
The future continues on from what we build and create right now.
06:59
We set into motion that domino effect
07:01
that carves out AI's evolutionary path.
07:05
In our time right now, we shape the AI of tomorrow.
07:08
Technology that immerses us in augmented realities
07:12
bringing to life past worlds.
07:15
Technology that helps people to share their experiences
07:20
when they have difficulty communicating.
07:23
Technology built on understanding the streaming visual worlds
07:27
used as technology for self-driving cars.
07:32
Technology built on understanding images and generating language,
07:35
evolving into technology that helps people who are visually impaired
07:40
be better able to access the visual world.
07:42
And we also see how technology can lead to problems.
07:46
We have technology today
07:48
that analyzes physical characteristics we're born with --
07:52
such as the color of our skin or the look of our face --
07:55
in order to determine whether or not we might be criminals or terrorists.
07:59
We have technology that crunches through our data,
08:02
even data relating to our gender or our race,
08:05
in order to determine whether or not we might get a loan.
08:09
All that we see now
08:11
is a snapshot in the evolution of artificial intelligence.
08:15
Because where we are right now,
08:17
is within a moment of that evolution.
08:20
That means that what we do now will affect what happens down the line
08:24
and in the future.
08:26
If we want AI to evolve in a way that helps humans,
08:30
then we need to define the goals and strategies
08:32
that enable that path now.
08:35
What I'd like to see is something that fits well with humans,
08:39
with our culture and with the environment.
08:43
Technology that aids and assists those of us with neurological conditions
08:47
or other disabilities
08:49
in order to make life equally challenging for everyone.
08:54
Technology that works
08:55
regardless of your demographics or the color of your skin.
09:00
And so today, what I focus on is the technology for tomorrow
09:05
and for 10 years from now.
09:08
AI can turn out in many different ways.
09:11
But in this case,
09:12
it isn't a self-driving car without any destination.
09:16
This is the car that we are driving.
09:19
We choose when to speed up and when to slow down.
09:23
We choose if we need to make a turn.
09:26
We choose what the AI of the future will be.
09:31
There's a vast playing field
09:32
of all the things that artificial intelligence can become.
09:36
It will become many things.
09:39
And it's up to us now,
09:41
in order to figure out what we need to put in place
09:44
to make sure the outcomes of artificial intelligence
09:48
are the ones that will be better for all of us.
09:51
Thank you.
09:52
(Applause)
이 μ›Ήμ‚¬μ΄νŠΈ 정보

이 μ‚¬μ΄νŠΈλŠ” μ˜μ–΄ ν•™μŠ΅μ— μœ μš©ν•œ YouTube λ™μ˜μƒμ„ μ†Œκ°œν•©λ‹ˆλ‹€. μ „ 세계 졜고의 μ„ μƒλ‹˜λ“€μ΄ κ°€λ₯΄μΉ˜λŠ” μ˜μ–΄ μˆ˜μ—…μ„ 보게 될 κ²ƒμž…λ‹ˆλ‹€. 각 λ™μ˜μƒ νŽ˜μ΄μ§€μ— ν‘œμ‹œλ˜λŠ” μ˜μ–΄ μžλ§‰μ„ 더블 ν΄λ¦­ν•˜λ©΄ κ·Έκ³³μ—μ„œ λ™μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€. λΉ„λ””μ˜€ μž¬μƒμ— 맞좰 μžλ§‰μ΄ μŠ€ν¬λ‘€λ©λ‹ˆλ‹€. μ˜κ²¬μ΄λ‚˜ μš”μ²­μ΄ μžˆλŠ” 경우 이 문의 양식을 μ‚¬μš©ν•˜μ—¬ λ¬Έμ˜ν•˜μ‹­μ‹œμ˜€.

https://forms.gle/WvT1wiN1qDtmnspy7