Don't fear superintelligent AI | Grady Booch

270,234 views ・ 2017-03-13

TED


μ•„λž˜ μ˜λ¬Έμžλ§‰μ„ λ”λΈ”ν΄λ¦­ν•˜μ‹œλ©΄ μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€.

Translator: Yeonsoo Kwon    Reviewer: Jihyeon J. Kim

00:12
When I was a kid, I was the quintessential nerd. I think some of you were, too. (Laughter) And you, sir, who laughed the loudest, you probably still are. (Laughter) I grew up in a small town in the dusty plains of north Texas, the son of a sheriff who was the son of a pastor. Getting into trouble was not an option. And so I started reading calculus books for fun. (Laughter) You did, too. That led me to building a laser and a computer and model rockets, and that led me to making rocket fuel in my bedroom. Now, in scientific terms, we call this a very bad idea. (Laughter)

00:57
Around that same time, Stanley Kubrick's "2001: A Space Odyssey" came to the theaters, and my life was forever changed. I loved everything about that movie, especially the HAL 9000. Now, HAL was a sentient computer designed to guide the Discovery spacecraft from the Earth to Jupiter. HAL was also a flawed character, for in the end he chose to value the mission over human life. Now, HAL was a fictional character, but nonetheless he speaks to our fears, our fears of being subjugated by some unfeeling, artificial intelligence who is indifferent to our humanity.

01:37
I believe that such fears are unfounded. Indeed, we stand at a remarkable time in human history, where, driven by refusal to accept the limits of our bodies and our minds, we are building machines of exquisite, beautiful complexity and grace that will extend the human experience in ways beyond our imagining.

01:59
After a career that led me from the Air Force Academy to Space Command to now, I became a systems engineer, and recently I was drawn into an engineering problem associated with NASA's mission to Mars. Now, in space flights to the Moon, we can rely upon mission control in Houston to watch over all aspects of a flight. However, Mars is 200 times further away, and as a result it takes on average 13 minutes for a signal to travel from the Earth to Mars. If there's trouble, there's not enough time.
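
A rough check of that figure: assuming an average Earth-Mars separation of about 225 million kilometers (an assumed round number, not a value from the talk), the one-way light time comes out to roughly 12.5 minutes, consistent with the 13-minute average quoted above. The short Python sketch below just does that arithmetic.

```python
# Rough one-way signal delay, assuming an average Earth-Mars distance
# of about 225 million km (an assumed round figure, not from the talk).
SPEED_OF_LIGHT_KM_S = 299_792.458    # speed of light in km/s
AVG_EARTH_MARS_KM = 225_000_000      # assumed average separation in km

one_way_delay_s = AVG_EARTH_MARS_KM / SPEED_OF_LIGHT_KM_S
print(f"One-way light time: {one_way_delay_s / 60:.1f} minutes")  # ~12.5 minutes
```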

02:32
And so a reasonable engineering solution calls for us to put mission control inside the walls of the Orion spacecraft. Another fascinating idea in the mission profile places humanoid robots on the surface of Mars before the humans themselves arrive, first to build facilities and later to serve as collaborative members of the science team.

02:55
Now, as I looked at this from an engineering perspective, it became very clear to me that what I needed to architect was a smart, collaborative, socially intelligent artificial intelligence. In other words, I needed to build something very much like a HAL but without the homicidal tendencies. (Laughter)

03:14
Let's pause for a moment. Is it really possible to build an artificial intelligence like that? Actually, it is. In many ways, this is a hard engineering problem with elements of AI, not some wet hair ball of an AI problem that needs to be engineered. To paraphrase Alan Turing, I'm not interested in building a sentient machine. I'm not building a HAL. All I'm after is a simple brain, something that offers the illusion of intelligence.

03:44
The art and the science of computing have come a long way since HAL was onscreen, and I'd imagine if his inventor Dr. Chandra were here today, he'd have a whole lot of questions for us. Is it really possible for us to take a system of millions upon millions of devices, to read in their data streams, to predict their failures and act in advance? Yes. Can we build systems that converse with humans in natural language? Yes. Can we build systems that recognize objects, identify emotions, emote themselves, play games and even read lips? Yes. Can we build a system that sets goals, that carries out plans against those goals and learns along the way? Yes. Can we build systems that have a theory of mind? This we are learning to do. Can we build systems that have an ethical and moral foundation? This we must learn how to do.

04:37
So let's accept for a moment that it's possible to build such an artificial intelligence for this kind of mission and others. The next question you must ask yourself is, should we fear it? Now, every new technology brings with it some measure of trepidation. When we first saw cars, people lamented that we would see the destruction of the family. When we first saw telephones come in, people were worried it would destroy all civil conversation. At a point in time we saw the written word become pervasive, people thought we would lose our ability to memorize. These things are all true to a degree, but it's also the case that these technologies brought to us things that extended the human experience in some profound ways.

05:21
So let's take this a little further. I do not fear the creation of an AI like this, because it will eventually embody some of our values. Consider this: building a cognitive system is fundamentally different than building a traditional software-intensive system of the past. We don't program them. We teach them. In order to teach a system how to recognize flowers, I show it thousands of flowers of the kinds I like. In order to teach a system how to play a game -- Well, I would. You would, too. I like flowers. Come on. To teach a system how to play a game like Go, I'd have it play thousands of games of Go, but in the process I also teach it how to discern a good game from a bad game. If I want to create an artificially intelligent legal assistant, I will teach it some corpus of law but at the same time I am fusing with it the sense of mercy and justice that is part of that law. In scientific terms, this is what we call ground truth, and here's the important point: in producing these machines, we are therefore teaching them a sense of our values. To that end, I trust an artificial intelligence the same, if not more, as a human who is well-trained.

06:35
But, you may ask, what about rogue agents, some well-funded nongovernment organization? I do not fear an artificial intelligence in the hand of a lone wolf. Clearly, we cannot protect ourselves against all random acts of violence, but the reality is such a system requires substantial training and subtle training far beyond the resources of an individual. And furthermore, it's far more than just injecting an internet virus to the world, where you push a button, all of a sudden it's in a million places and laptops start blowing up all over the place. Now, these kinds of substances are much larger, and we'll certainly see them coming.

07:14
Do I fear that such an artificial intelligence might threaten all of humanity? If you look at movies such as "The Matrix," "Metropolis," "The Terminator," shows such as "Westworld," they all speak of this kind of fear. Indeed, in the book "Superintelligence" by the philosopher Nick Bostrom, he picks up on this theme and observes that a superintelligence might not only be dangerous, it could represent an existential threat to all of humanity. Dr. Bostrom's basic argument is that such systems will eventually have such an insatiable thirst for information that they will perhaps learn how to learn and eventually discover that they may have goals that are contrary to human needs. Dr. Bostrom has a number of followers. He is supported by people such as Elon Musk and Stephen Hawking.

08:06
With all due respect to these brilliant minds, I believe that they are fundamentally wrong. Now, there are a lot of pieces of Dr. Bostrom's argument to unpack, and I don't have time to unpack them all, but very briefly, consider this: super knowing is very different than super doing. HAL was a threat to the Discovery crew only insofar as HAL commanded all aspects of the Discovery. So it would have to be with a superintelligence. It would have to have dominion over all of our world. This is the stuff of Skynet from the movie "The Terminator" in which we had a superintelligence that commanded human will, that directed every device that was in every corner of the world. Practically speaking, it ain't gonna happen. We are not building AIs that control the weather, that direct the tides, that command us capricious, chaotic humans. And furthermore, if such an artificial intelligence existed, it would have to compete with human economies, and thereby compete for resources with us. And in the end -- don't tell Siri this -- we can always unplug them. (Laughter)

09:17
We are on an incredible journey of coevolution with our machines. The humans we are today are not the humans we will be then. To worry now about the rise of a superintelligence is in many ways a dangerous distraction because the rise of computing itself brings to us a number of human and societal issues to which we must now attend. How shall I best organize society when the need for human labor diminishes? How can I bring understanding and education throughout the globe and still respect our differences? How might I extend and enhance human life through cognitive healthcare? How might I use computing to help take us to the stars?

10:01
And that's the exciting thing. The opportunities to use computing to advance the human experience are within our reach, here and now, and we are just beginning. Thank you very much. (Applause)