Rodney Brooks: How robots will invade our lives

60,837 views · 2008-10-10

TED


์•„๋ž˜ ์˜๋ฌธ์ž๋ง‰์„ ๋”๋ธ”ํด๋ฆญํ•˜์‹œ๋ฉด ์˜์ƒ์ด ์žฌ์ƒ๋ฉ๋‹ˆ๋‹ค.

Translation: Han-Me Kim · Review: Ian Park
00:18
What I want to tell you about today is how I see robots invading our lives
00:23
at multiple levels, over multiple timescales.
00:26
And when I look out in the future, I can't imagine a world, 500 years from now,
00:30
where we don't have robots everywhere.
00:32
Assuming -- despite all the dire predictions from many people about our future --
00:37
assuming we're still around, I can't imagine the world not being populated with robots.
00:41
And then the question is, well, if they're going to be here in 500 years,
00:44
are they going to be everywhere sooner than that?
00:46
Are they going to be around in 50 years?
00:48
Yeah, I think that's pretty likely -- there's going to be lots of robots everywhere.
00:51
And in fact I think that's going to be a lot sooner than that.
00:54
I think we're sort of on the cusp of robots becoming common,
00:58
and I think we're sort of around 1978 or 1980 in personal computer years,
01:04
where the first few robots are starting to appear.
01:07
Computers sort of came around through games and toys.
01:11
And you know, the first computer most people had in the house
01:14
may have been a computer to play Pong,
01:16
a little microprocessor embedded,
01:18
and then other games that came after that.
01:21
And we're starting to see that same sort of thing with robots:
01:24
LEGO Mindstorms, Furbies -- who here -- did anyone here have a Furby?
01:28
Yeah, there's 38 million of them sold worldwide.
01:31
They are pretty common. And they're a little tiny robot,
01:33
a simple robot with some sensors,
01:35
a little bit of processing actuation.
01:37
On the right there is another robot doll, who you could get a couple of years ago.
01:40
And just as in the early days,
01:42
when there was a lot of sort of amateur interaction over computers,
01:47
you can now get various hacking kits, how-to-hack books.
01:51
And on the left there is a platform from Evolution Robotics,
01:55
where you put a PC on, and you program this thing with a GUI
01:58
to wander around your house and do various stuff.
02:01
And then there's a higher price point sort of robot toys --
02:04
the Sony Aibo. And on the right there is one that NEC developed,
02:08
the PaPeRo, which I don't think they're going to release.
02:11
But nevertheless, those sorts of things are out there.
02:14
And we've seen, over the last two or three years, lawn-mowing robots,
02:18
Husqvarna on the bottom, Friendly Robotics on top there, an Israeli company.
02:24
And then in the last 12 months or so
02:26
we've started to see a bunch of home-cleaning robots appear.
02:30
The top left one is a very nice home-cleaning robot
02:33
from a company called Dyson, in the U.K. Except it was so expensive --
02:37
3,500 dollars -- they didn't release it.
02:39
But at the bottom left, you see Electrolux, which is on sale.
02:42
Another one from Karcher.
02:44
At the bottom right is one that I built in my lab
02:46
about 10 years ago, and we finally turned that into a product.
02:49
And let me just show you that.
02:51
We're going to give this away I think, Chris said, after the talk.
02:55
This is a robot that you can go out and buy, and that will clean up your floor.
03:05
And it starts off sort of just going around in ever-increasing circles.
03:10
If it hits something -- you people see that?
03:14
Now it's doing wall-following, it's following around my feet
03:17
to clean up around me. Let's see, let's --
03:21
oh, who stole my Rice Krispies? They stole my Rice Krispies!
03:26
(Laughter)
03:32
Don't worry, relax, no, relax, it's a robot, it's smart!
03:35
(Laughter)
03:38
See, the three-year-old kids, they don't worry about it.
03:42
It's grown-ups that get really upset.
03:44
(Laughter)
03:45
We'll just put some crap here.
03:47
(Laughter)
03:51
Okay.
03:53
(Laughter)
03:57
I don't know if you see -- so, I put a bunch of Rice Krispies there,
04:00
I put some pennies, let's just shoot it at that, see if it cleans up.
04:10
Yeah, OK. So --
04:12
we'll leave that for later.
04:16
(Applause)
04:22
Part of the trick was building a better cleaning mechanism, actually;
04:26
the intelligence on board was fairly simple.
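That "fairly simple" on-board intelligence — spiral outward in ever-increasing circles, then switch to wall-following after a bump — can be sketched as a tiny state machine. This is a hypothetical illustration with invented state names, not iRobot's actual control code.

```python
import random

# Behavior states for a Roomba-style coverage strategy.
SPIRAL, WALL_FOLLOW, CROSS = "spiral", "wall_follow", "cross"

def next_state(state, bumped):
    """Pick the next behavior from the current one and the bump sensor."""
    if state == SPIRAL:
        # A bump ends the ever-increasing circles and starts wall-following.
        return WALL_FOLLOW if bumped else SPIRAL
    if state == WALL_FOLLOW:
        # After hugging the wall for a while, strike out across the room.
        return CROSS if random.random() < 0.1 else WALL_FOLLOW
    # While crossing open floor, a bump puts the robot back on the wall.
    return WALL_FOLLOW if bumped else CROSS
```

The point is that reactive rules keyed to a bump sensor, with no map of the room, are enough to produce the coverage behavior demonstrated on stage.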
04:30
And that's true with a lot of robots.
04:32
We've all, I think, become, sort of computational chauvinists,
04:36
and think that computation is everything,
04:38
but the mechanics still matter.
04:40
Here's another robot, the PackBot,
04:43
that we've been building for a bunch of years.
04:45
It's a military surveillance robot, to go in ahead of troops --
04:51
looking at caves, for instance.
04:54
But we had to make it fairly robust,
04:56
much more robust than the robots we build in our labs.
05:03
(Laughter)
05:12
On board that robot is a PC running Linux.
05:16
It can withstand a 400G shock. The robot has local intelligence:
05:22
it can flip itself over, can get itself into communication range,
05:28
can go upstairs by itself, et cetera.
05:38
Okay, so it's doing local navigation there.
05:42
A soldier gives it a command to go upstairs, and it does.
05:49
That was not a controlled descent.
05:52
(Laughter)
05:54
Now it's going to head off.
05:56
And the big breakthrough for these robots, really, was September 11th.
06:01
We had the robots down at the World Trade Center late that evening.
06:06
Couldn't do a lot in the main rubble pile,
06:08
things were just too -- there was nothing left to do.
06:11
But we did go into all the surrounding buildings that had been evacuated,
06:16
and searched for possible survivors in the buildings
06:19
that were too dangerous to go into.
06:21
Let's run this video.
06:23
Reporter: ...battlefield companions are helping to reduce the combat risks.
06:26
Nick Robertson has that story.
06:31
Rodney Brooks: Can we have another one of these?
06:38
Okay, good.
06:43
So, this is a corporal who had seen a robot two weeks previously.
06:48
He's sending robots into caves, looking at what's going on.
06:52
The robot's being totally autonomous.
06:54
The worst thing that's happened in the cave so far
06:58
was one of the robots fell down ten meters.
07:08
So one year ago, the US military didn't have these robots.
07:11
Now they're on active duty in Afghanistan every day.
07:13
And that's one of the reasons they say a robot invasion is happening.
07:16
There's a sea change happening in how -- where technology's going.
07:20
Thanks.
07:23
And over the next couple of months,
07:25
we're going to be sending robots in production
07:28
down producing oil wells to get that last few years of oil out of the ground.
07:32
Very hostile environments, 150°C, 10,000 PSI.
07:36
Autonomous robots going down, doing this sort of work.
07:40
But robots like this, they're a little hard to program.
07:43
How, in the future, are we going to program our robots
07:45
and make them easier to use?
07:47
And I want to actually use a robot here --
07:50
a robot named Chris -- stand up. Yeah. Okay.
07:57
Come over here. Now notice, he thinks robots have to be a bit stiff.
08:01
He sort of does that. But I'm going to --
08:04
Chris Anderson: I'm just British. RB: Oh.
08:06
(Laughter)
08:08
(Applause)
08:10
I'm going to show this robot a task. It's a very complex task.
08:13
Now notice, he nodded there, he was giving me some indication
08:16
he was understanding the flow of communication.
08:19
And if I'd said something completely bizarre
08:21
he would have looked askance at me, and regulated the conversation.
08:24
So now I brought this up in front of him.
08:27
I'd looked at his eyes, and I saw his eyes looked at this bottle top.
08:31
And I'm doing this task here, and he's checking up.
08:33
His eyes are going back and forth up to me, to see what I'm looking at --
08:36
so we've got shared attention.
08:38
And so I do this task, and he looks, and he looks to me
08:41
to see what's happening next. And now I'll give him the bottle,
08:45
and we'll see if he can do the task. Can you do that?
08:47
(Laughter)
08:50
Okay. He's pretty good. Yeah. Good, good, good.
08:54
I didn't show you how to do that.
08:56
Now see if you can put it back together.
08:58
(Laughter)
09:00
And he thinks a robot has to be really slow.
09:01
Good robot, that's good.
09:03
So we saw a bunch of things there.
09:06
We saw when we're interacting,
09:09
we're trying to show someone how to do something, we direct their visual attention.
09:13
The other thing communicates their internal state to us,
09:17
whether he's understanding or not, regulates a social interaction.
09:20
There was shared attention looking at the same sort of thing,
09:22
and recognizing socially communicated reinforcement at the end.
09:26
And we've been trying to put that into our lab robots
09:29
because we think this is how you're going to want to interact with robots in the future.
09:33
I just want to show you one technical diagram here.
09:35
The most important thing for building a robot that you can interact with socially
09:39
is its visual attention system.
09:41
Because what it pays attention to is what it's seeing
09:44
and interacting with, and what you're understanding what it's doing.
09:47
So in the videos I'm about to show you,
09:50
you're going to see a visual attention system on a robot
09:54
which has -- it looks for skin tone in HSV space,
09:58
so it works across all human colorings.
10:02
It looks for highly saturated colors, from toys.
10:04
And it looks for things that move around.
10:06
And it weights those together into an attention window,
10:09
and it looks for the highest-scoring place --
10:11
the stuff where the most interesting stuff is happening --
10:13
and that is what its eyes then segue to.
10:17
And it looks right at that.
10:19
At the same time, some top-down sort of stuff:
10:22
might decide that it's lonely and look for skin tone,
10:25
or might decide that it's bored and look for a toy to play with.
10:28
And so these weights change.
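The scheme just described — feature maps for skin tone, saturated color, and motion, weighted together into an attention window whose peak the eyes move to, with top-down drives simply changing the weights — can be sketched as follows. The names and weight values are invented for illustration, and real feature extraction (HSV skin-tone matching, frame differencing) is omitted; the maps are just arrays.

```python
import numpy as np

def attention_target(skin, saturation, motion, weights):
    """Weight the feature maps together; return (row, col) of the peak."""
    saliency = (weights["skin"] * skin
                + weights["saturation"] * saturation
                + weights["motion"] * motion)
    return tuple(np.unravel_index(np.argmax(saliency), saliency.shape))

# Top-down drives just reweight the same maps: a "lonely" robot favors
# skin tone, a "bored" one favors the saturated colors of toys.
LONELY = {"skin": 2.0, "saturation": 0.5, "motion": 1.0}
BORED = {"skin": 0.5, "saturation": 2.0, "motion": 1.0}
```

With the same three maps, switching from the LONELY weights to the BORED weights moves the winning location from the face-like region to the toy-like region.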
10:30
And over here on the right,
10:32
this is what we call the Steven Spielberg memorial module.
10:35
Did people see the movie "AI"? (Audience: Yes.)
10:37
RB: Yeah, it was really bad, but --
10:39
remember, especially when Haley Joel Osment, the little robot,
10:43
looked at the blue fairy for 2,000 years without taking his eyes off it?
10:47
Well, this gets rid of that,
10:49
because this is a habituation Gaussian that gets negative,
10:53
and more and more intense as it looks at one thing.
10:56
And it gets bored, so it will then look away at something else.
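That habituation term can be sketched numerically: a negative Gaussian centred on the fixated spot grows more intense the longer the robot stares, suppressing that spot's saliency until somewhere else wins. A minimal illustration with invented names and parameters:

```python
import numpy as np

def habituation(shape, center, intensity, sigma=1.0):
    """Negative Gaussian penalty of the given intensity around `center`."""
    rows, cols = np.indices(shape)
    d2 = (rows - center[0]) ** 2 + (cols - center[1]) ** 2
    return -intensity * np.exp(-d2 / (2.0 * sigma ** 2))

def gaze(saliency, fixated, intensity):
    """Gaze target once the habituation penalty is added to the saliency."""
    combined = saliency + habituation(saliency.shape, fixated, intensity)
    return tuple(np.unravel_index(np.argmax(combined), combined.shape))
```

Starting with the strongest spot, growing the penalty there eventually makes a weaker, unvisited spot the new maximum — the robot "gets bored" and looks away, instead of staring at the blue fairy for 2,000 years.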
10:59
So, once you've got that -- and here's a robot, here's Kismet,
11:03
looking around for a toy. You can tell what it's looking at.
11:07
You can estimate its gaze direction from those eyeballs covering its camera,
11:12
and you can tell when it's actually seeing the toy.
11:15
And it's got a little bit of an emotional response here.
11:17
(Laughter)
11:18
But it's still going to pay attention
11:20
if something more significant comes into its field of view --
11:24
such as Cynthia Breazeal, the builder of this robot, from the right.
11:28
It sees her, pays attention to her.
11:33
Kismet has an underlying, three-dimensional emotional space,
11:37
a vector space, of where it is emotionally.
11:40
And at different places in that space, it expresses --
11:46
can we have the volume on here?
11:48
Can you hear that now, out there? (Audience: Yeah.)
11:50
Kismet: Do you really think so? Do you really think so?
11:57
Do you really think so?
12:00
RB: So it's expressing its emotion through its face
12:03
and the prosody in its voice.
12:05
And when I was dealing with my robot over here,
12:09
Chris, the robot, was measuring the prosody in my voice,
12:12
and so we have the robot measure prosody for four basic messages
12:17
that mothers give their children pre-linguistically.
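Those four pre-linguistic messages — approval, prohibition, attention-getting, and soothing, each with a characteristic pitch contour — can in principle be separated with crude pitch statistics. The toy rule-based classifier below is purely illustrative: the thresholds and feature names are invented, not Kismet's actual prosody model.

```python
def classify_prosody(mean_pitch_hz, pitch_range_hz, energy):
    """Map crude pitch/energy statistics to one of the four messages."""
    if mean_pitch_hz > 300 and pitch_range_hz > 150:
        return "approval"      # high, exaggerated rise-fall contours
    if mean_pitch_hz < 200 and pitch_range_hz < 50 and energy > 0.7:
        return "prohibition"   # low, clipped, forceful bursts
    if pitch_range_hz > 150:
        return "attention"     # sweeping rises that bid for attention
    return "soothing"          # low, smooth, quiet
```

The key point from the talk survives even in this caricature: none of these classes depends on recognizing words, only on how the utterance is delivered.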
12:21
Here we've got naive subjects praising the robot:
12:26
Voice: Nice robot.
12:29
You're such a cute little robot.
12:31
(Laughter)
12:33
RB: And the robot's reacting appropriately.
12:35
Voice: ...very good, Kismet.
12:40
(Laughter)
12:42
Voice: Look at my smile.
12:46
RB: It smiles. She imitates the smile. This happens a lot.
12:49
These are naive subjects.
12:51
Here we asked them to get the robot's attention
12:54
and indicate when they have the robot's attention.
12:57
Voice: Hey, Kismet, ah, there it is.
13:01
RB: So she realizes she has the robot's attention.
13:08
Voice: Kismet, do you like the toy? Oh.
13:13
RB: Now, here they're asked to prohibit the robot,
13:15
and this first woman really pushes the robot into an emotional corner.
13:19
Voice: No. No. You're not to do that. No.
13:24
(Laughter)
13:27
Not appropriate. No. No.
230
807330
6000
๊ทธ๊ฒƒ์€ ๋‚˜์œ๊ฑฐ์•ผ. ํ•˜์ง€๋งˆ. ํ•˜์ง€๋งˆ.
13:33
(Laughter)
231
813330
3000
(์›ƒ์Œ)
13:36
RB: I'm going to leave it at that.
232
816330
2000
๊ทธ๋Ÿผ ์—ฌ๊ธฐ๊นŒ์ง€ ํ•˜๋„๋ก ํ•˜๊ณ ์š”.
13:38
We put that together. Then we put in turn taking.
233
818330
2000
์ด ๋ชจ๋“  ๊ฒƒ์„ ํฌํ•จํ•œ ์™„์ „ํ•œ ๋Œ€ํ™”๋ฅผ ๋ณด๋„๋ก ํ•˜๊ฒ ์Šต๋‹ˆ๋‹ค.
13:40
When we talk to someone, we talk.
234
820330
3000
์šฐ๋ฆฌ๋Š” ๋ˆ„๊ตฐ๊ฐ€์™€ ๋Œ€ํ™”๋ฅผ ํ•  ๋•Œ
13:43
Then we sort of raise our eyebrows, move our eyes,
235
823330
4000
๋ˆˆ์น์„ ์น˜์ผœ์˜ฌ๋ฆฐ๋‹ค๋˜์ง€, ๋ˆˆ์„ ์›€์ง์ธ๋‹ค๋“ ์ง€ ํ•˜๋ฉด์„œ
13:47
give the other person the idea it's their turn to talk.
236
827330
3000
์ƒ๋Œ€๋ฐฉ์—๊ฒŒ ์–˜๊ธฐ ํ•  ์ฐจ๋ก€๋ผ๋Š” ๊ฒƒ์€ ์•”์‹œํ•˜๊ณ ๋Š” ํ•ฉ๋‹ˆ๋‹ค.
13:50
And then they talk, and then we pass the baton back and forth between each other.
237
830330
4000
๊ทธ๋Ÿฌ๋ฉด, ๊ทธ ์‚ฌ๋žŒ์ด ์–˜๊ธฐ๋ฅผ ํ•˜๊ณ , ๋‹ค์‹œ ์ƒ๋Œ€๋ฐฉ์—๊ฒŒ ๋„˜๊ธฐ๊ณ  ๋ฐ›๊ณ  ํ•˜๋ฉด์„œ ๋Œ€ํ™”๋ฅผ ํ•˜์ง€์š”.
13:54
So we put this in the robot.
238
834330
2000
๊ทธ๋ž˜์„œ, ์šฐ๋ฆฌ๋Š” ๋กœ๋ด‡์ด ์ด๋Ÿฌํ•œ ๊ฒƒ๋“ค์„ ํ•  ์ˆ˜ ์žˆ๋„๋ก ํ–ˆ์Šต๋‹ˆ๋‹ค.
13:56
We got a bunch of naive subjects in,
239
836330
2000
์œ„์—์„œ ๋ณด์—ฌ๋“œ๋ฆฐ ์—ฌ๋Ÿฌ๊ฐ€์ง€ ๊ฐ„๋‹จํ•œ ๊ฒƒ๋“ค์„ ๋กœ๋ด‡์— ํ”„๋กœ๊ทธ๋žจ์„ ํ•œ ํ›„
13:58
we didn't tell them anything about the robot,
240
838330
2000
์‹คํ—˜์ž๋“ค์—๊ฒŒ ๋กœ๋ด‡์— ๋Œ€ํ•ด ์•„๋ฌด๋Ÿฐ ์–˜๊ธฐ๋ฅผ ํ•ด์ฃผ์ง€์•Š๊ณ 
14:00
sat them down in front of the robot and said, talk to the robot.
241
840330
2000
๊ทธ๋ƒฅ ๋กœ๋ด‡ ์•ž์— ์•‰์•„์„œ ๋กœ๋ด‡๊ณผ ๋Œ€ํ™”๋ฅผ ๋‚˜๋ˆ„๋ผ๊ณ  ํ–ˆ์Šต๋‹ˆ๋‹ค.
14:02
Now what they didn't know was,
242
842330
2000
์ฆ‰, ์‹คํ—˜์ž๋“ค์€
14:04
the robot wasn't understanding a word they said,
243
844330
2000
๋กœ๋ด‡์ด ์ž์‹ ๋“ค์˜ ๋ง์„ ์ดํ•ดํ•˜์ง€ ๋ชปํ•œ๋‹ค๋Š” ๊ฒƒ๊ณผ
14:06
and that the robot wasn't speaking English.
244
846330
3000
๋กœ๋ด‡์ด ์˜์–ด๋กœ ๋ง์„ ํ•˜๋Š” ๊ฒƒ์ด ์•„๋‹ˆ๋ผ
14:09
It was just saying random English phonemes.
245
849330
2000
์ž„์˜์˜ ์˜์–ด ๊ธ€์ž๋“ค์„ ์†Œ๋ฆฌ๋‚ธ๋‹ค๋Š” ๊ฒƒ์„ ์•Œ์ง€ ๋ชปํ–ˆ์Šต๋‹ˆ๋‹ค.
14:11
And I want you to watch carefully, at the beginning of this,
246
851330
2000
์ด ๋น„๋””์˜ค ์‹œ์ž‘ ๋ถ€๋ถ„์„ ์ฃผ์˜๊นŠ๊ฒŒ ์‚ดํŽด๋ณด์‹œ๊ธธ ๋ฐ”๋ž๋‹ˆ๋‹ค.
14:13
where this person, Ritchie, who happened to talk to the robot for 25 minutes --
247
853330
4000
์—ฌ๊ธฐ ๋ฆฌ์น˜๋ผ๋Š” ์‚ฌ๋žŒ์ด ๋‚˜์™€์„œ 25๋ถ„๋™์•ˆ ๋กœ๋ด‡๊ณผ ์–˜๊ธฐ๋ฅผ ํ•˜๋Š”๋ฐ
14:17
(Laughter)
248
857330
2000
(์›ƒ์Œ)
14:19
-- says, "I want to show you something.
249
859330
2000
์ด ์‚ฌ๋žŒ์ด "๋‚˜ ๋„ˆ์—๊ฒŒ ๋ณด์—ฌ์ฃผ๊ณ  ์‹ถ์€ ๊ฒƒ์ด ์žˆ์–ด"
14:21
I want to show you my watch."
250
861330
2000
"๋‚ด ์‹œ๊ณ„ ์ข€ ๋ด๋ฐ”"๋ผ๊ณ  ๋กœ๋ด‡์—๊ฒŒ ๋ง์„ ํ•˜๋ฉด์„œ
14:23
And he brings the watch center, into the robot's field of vision,
251
863330
5000
์ž์‹ ์˜ ์‹œ๊ณ„๋ฅผ ๋กœ๋ด‡์˜ ์‹œ์„  ๊ฐ€์šด๋ฐ๋กœ ๊ฐ€์ง€๊ณ  ์˜ต๋‹ˆ๋‹ค.
14:28
points to it, gives it a motion cue,
252
868330
2000
๊ทธ๋ฆฌ๊ณ , ๊ทธ ์‹œ๊ณ„๋ฅผ ์†๊ฐ€๋ฝ์œผ๋กœ ๊ฐ€๋ฅดํ‚ค๋ฉด์„œ ์›€์ง์ž„์œผ๋กœ ํžŒํŠธ๋ฅผ ์ฃผ์ง€์š”.
14:30
and the robot looks at the watch quite successfully.
253
870330
2000
๊ทธ๋Ÿฌ๋ฉด, ๋กœ๋ด‡์€ ์•„์ฃผ ์„ฑ๊ณต์ ์œผ๋กœ ๊ทธ ์‹œ๊ณ„๋ฅผ ๋ฐ”๋ผ๋ด…๋‹ˆ๋‹ค.
14:32
We don't know whether he understood or not that the robot --
254
872330
3000
์šฐ๋ฆฌ๋Š” ๊ทธ ์‚ฌ๋žŒ์ด ์ด ๋กœ๋ด‡์„ ์ดํ•ดํ–ˆ๋Š”์ง€ ์•ˆํ–ˆ๋Š”์ง€๋Š” ๋ชจ๋ฆ…๋‹ˆ๋‹ค.
14:36
Notice the turn-taking.
255
876330
2000
์„œ๋กœ ์ฃผ๊ณ ๋ฐ›๋Š” ๋Œ€ํ™”๋ฅผ ์ฃผ์˜๊นŠ๊ฒŒ ๋ณด์„ธ์š”.
14:38
Ritchie: OK, I want to show you something. OK, this is a watch
256
878330
3000
๋ฆฌ์น˜: ๋„ˆ์—๊ฒŒ ๋ณด์—ฌ์ค„ ๊ฒƒ์ด ์žˆ์–ด.
14:41
that my girlfriend gave me.
257
881330
3000
์ด ์‹œ๊ณ„ ๋‚ด ์—ฌ์ž์นœ๊ตฌ๊ฐ€ ์ค€๊ฑฐ๋‹ค.
14:44
Robot: Oh, cool.
258
884330
2000
๋กœ๋ด‡: ์˜ค, ๋ฉ‹์ง„๋ฐ.
14:46
Ritchie: Yeah, look, it's got a little blue light in it too. I almost lost it this week.
259
886330
4000
๋ฆฌ์น˜: ์‘, ๋ด๋ฐ”, ์ž‘์€ ํŒŒ๋ž€ ๋ถˆ๋„ ๋‚˜์˜จ๋‹ค. ์ด๋ฒˆ ์ฃผ์— ํ•˜๋งˆํ„ฐ๋ฉด ์žƒ์–ด๋ฒ„๋ฆด๋ป”ํ–ˆ์–ด.
14:51
(Laughter)
260
891330
4000
(์›ƒ์Œ)
14:55
RB: So it's making eye contact with him, following his eyes.
261
895330
3000
๋กœ๋ด‡์ด ์ง€๊ธˆ ์ € ์‚ฌ๋žŒ๊ณผ ๋ˆˆ์„ ๋งˆ์ฃผ์น˜๊ณ  ์žˆ๊ณ , ๊ทธ์˜ ์‹œ์„ ์„ ๋”ฐ๋ผ ๋ด…๋‹ˆ๋‹ค.
14:58
Ritchie: Can you do the same thing? Robot: Yeah, sure.
262
898330
2000
๋„ˆ๋„ ๋˜‘๊ฐ™์ด ํ•  ์ˆ˜ ์žˆ์–ด? ์‘, ๊ทธ๋Ÿผ.
15:00
RB: And they successfully have that sort of communication.
263
900330
2000
์ด๋ ‡๊ฒŒ ๊ทธ๋“ค์€ ๋Œ€ํ™”๋ฅผ ์„ฑ๊ณต์ ์œผ๋กœ ์ฃผ๊ณ  ๋ฐ›์Šต๋‹ˆ๋‹ค.
15:02
And here's another aspect of the sorts of things that Chris and I were doing.
264
902330
4000
๊ทธ๋ฆฌ๊ณ , ์ด๋ฒˆ์—๋Š” ์•„๊นŒ ํฌ๋ฆฌ์Šค์™€ ํ•จ๊ป˜ ๋ณด์—ฌ๋“œ๋ฆฐ ๊ฒƒ์ฒ˜๋Ÿผ
15:06
This is another robot, Cog.
265
906330
2000
์—ฌ๊ธฐ "์ฝ”๊ทธ"๋ผ๋Š” ๋กœ๋ด‡์ด
15:08
They first make eye contact, and then, when Christie looks over at this toy,
266
908330
6000
ํฌ๋ฆฌ์Šคํ‹ฐ์™€ ๋ˆˆ์„ ๋งˆ์ฃผ๋ณด๊ณ  ์žˆ๋‹ค๊ฐ€, ํฌ๋ฆฌ์Šคํ‹ฐ๊ฐ€ ์žฅ๋‚œ๊ฐ์„ ์ณ๋‹ค๋ณด๋ฉด
15:14
the robot estimates her gaze direction
267
914330
2000
๊ทธ๋…€๊ฐ€ ๋ฐ”๋ผ๋ณด๊ณ  ์žˆ๋Š” ๊ณณ์„ ์ง์ž‘ํ•˜์—ฌ
15:16
and looks at the same thing that she's looking at.
268
916330
2000
๊ทธ๋…€์™€ ๊ฐ™์€ ๊ณณ์„ ๋ฐ”๋ผ๋ณด๋Š” ๊ฒƒ์„ ๋ณด์‹ค ์ˆ˜ ์žˆ์Šต๋‹ˆ๋‹ค.
15:18
(Laughter)
269
918330
1000
(์›ƒ์Œ)
15:19
So we're going to see more and more of this sort of robot
270
919330
3000
์šฐ๋ฆฌ๋Š” ํ–ฅํ›„ ๋ช‡๋…„๋™์•ˆ ์ด์™€ ๊ฐ™์€ ๋กœ๋ด‡๋“ค์„
15:22
over the next few years in labs.
271
922330
2000
์ ์  ๋” ๋งŽ์ด ์ ‘ํ•˜๊ฒŒ ๋  ๊ฒƒ์ž…๋‹ˆ๋‹ค.
15:24
But then the big questions, two big questions that people ask me are:
272
924330
5000
์‚ฌ๋žŒ๋“ค์ด ์ €์—๊ฒŒ 2๊ฐ€์ง€ ์งˆ๋ฌธ์„ ๊ฐ€์žฅ ๋งŽ์ด ํ•ฉ๋‹ˆ๋‹ค.
15:29
if we make these robots more and more human-like,
273
929330
2000
์ฒซ๋ฒˆ์งธ๋กœ๋Š”, ๋งŒ์•ฝ ์šฐ๋ฆฌ๊ฐ€ ์ ์  ๋” ์‚ฌ๋žŒ๊ณผ ๊ฐ™์€ ๋กœ๋ด‡์„ ๋งŒ๋“ ๋‹ค๋ฉด
15:31
will we accept them, will we -- will they need rights eventually?
274
931330
5000
์šฐ๋ฆฌ๊ฐ€ ๊ทธ๋“ค์„ ๋ฐ›์•„๋“ค์ผ ์ˆ˜ ์žˆ์„๊นŒ์š”? ๊ฒฐ๊ตญ ๊ทธ๋“ค๋„ ์‚ฌ๋žŒ๊ณผ ๊ฐ™์€ ๊ถŒ๋ฆฌ๊ฐ€ ํ•„์š”ํ• ๊นŒ์š”?
15:36
And the other question people ask me is, will they want to take over?
275
936330
3000
๊ทธ๋ฆฌ๊ณ  ๋‘๋ฒˆ์งธ ์งˆ๋ฌธ์œผ๋กœ๋Š”. ๊ทธ๋“ค์ด ์šฐ๋ฆฌ ์ธ๊ฐ„์„ ์ง€๋ฐฐํ•˜๊ฒŒ ๋ ๊นŒ์š”?
15:39
(Laughter)
276
939330
1000
(์›ƒ์Œ)
15:40
And on the first -- you know, this has been a very Hollywood theme
277
940330
3000
๊ทธ๋Ÿผ ๋จผ์ € ์ฒซ๋ฒˆ์งธ ์งˆ๋ฌธ, ๊ทธ์™€ ๊ฐ™์€ ์–˜๊ธฐ๋Š” ๊ทธ๋™์•ˆ ๋งŽ์€ ํ—๋ฆฌ์šฐ๋“œ ์˜ํ™”์˜ ํ…Œ๋งˆ๊ฐ€ ๋์—ˆ์Šต๋‹ˆ๋‹ค.
15:43
with lots of movies. You probably recognize these characters here --
278
943330
3000
์—ฌ๊ธฐ ํ™”๋ฉด์— ๋‚˜์˜ค๋Š” ๋กœ๋ด‡ ์ผ€๋ฆญํ„ฐ๋“ค์ด ์ต์ˆ™ํ•˜์‹คํ…๋ฐ์š”.
15:46
where in each of these cases, the robots want more respect.
279
946330
4000
์—ฌ๊ธฐ์žˆ๋Š” ๋ชจ๋“  ๋กœ๋ด‡๋“ค์€ ์ธ๊ฐ„๋“ค์ด ์ข€ ๋” ์ž๊ธฐ๋ฅผ ์กด์ค‘ํ•ด์ฃผ๊ธธ ๋ฐ”๋žฌ์Šต๋‹ˆ๋‹ค.
15:50
Well, do you ever need to give robots respect?
280
950330
3000
๊ธ€์Ž„์š”, ์šฐ๋ฆฌ๊ฐ€ ๋กœ๋ด‡์„ ์กด์ค‘ํ•ด์•ผ ํ•  ํ•„์š”๊ฐ€ ์žˆ์„๊นŒ์š”?
15:54
They're just machines, after all.
281
954330
2000
๊ทธ๋ƒฅ ๊ทธ๋“ค์€ ์šฐ๋ฆฌ๊ฐ€ ๋งŒ๋“  ๊ธฐ๊ณ„์ผ๋ฟ์ธ๋ฐ์š”.
15:56
But I think, you know, we have to accept that we are just machines.
282
956330
4000
๊ทธ๋Ÿฌ๋‚˜ ์ €๋Š” ์šฐ๋ฆฌ ์ธ๊ฐ„๋„ ๊ฒฐ๊ตญ ๊ธฐ๊ณ„๋ผ๋Š” ๊ฒƒ์„ ๋ฐ›์•„๋“ค์—ฌ์•ผ ํ•œ๋‹ค๊ณ  ์ƒ๊ฐํ•ฉ๋‹ˆ๋‹ค.
16:00
After all, that's certainly what modern molecular biology says about us.
283
960330
5000
์ด๊ฒƒ์€ ํ˜„๋Œ€ ๋ถ„์ž ์ƒ๋ฌผํ•™์—์„œ ์ฃผ์žฅํ•˜๋Š” ๊ฒƒ์ธ๋ฐ์š”.
16:05
You don't see a description of how, you know,
284
965330
3000
๋ถ„์ž์ƒ๋ฌผํ•™์˜ ์„ค๋ช… ์–ด๋””์—๋„
16:08
Molecule A, you know, comes up and docks with this other molecule.
285
968330
4000
A๋ผ๋Š” ๋ถ„์ž๊ฐ€ ๋‹ค๊ฐ€์™€ ๋‹ค๋ฅธ ๋ถ„์ž์™€ ๊ฒฐํ•ฉํ•˜๊ณ 
16:12
And it's moving forward, you know, propelled by various charges,
286
972330
3000
์—ฌ๋Ÿฌ ์ „ํ•˜์˜ ํž˜์— ๋ฐ€๋ ค ์•ž์œผ๋กœ ๋‚˜์•„๊ฐ€๋Š” ๋„์ค‘์—
16:15
and then the soul steps in and tweaks those molecules so that they connect.
287
975330
4000
์˜ํ˜ผ์ด ๋ผ์–ด๋“ค์–ด ๋ถ„์ž๋“ค์ด ์—ฐ๊ฒฐ๋˜๋„๋ก ์กฐ์ •ํ•ด ์ค€๋‹ค๋Š” ์‹์˜ ์„ค๋ช…์€ ๋‚˜์˜ค์ง€ ์•Š์Šต๋‹ˆ๋‹ค.
16:19
It's all mechanistic. We are mechanism.
288
979330
3000
๋ชจ๋“  ๊ฒƒ์ด ๊ธฐ๊ณ„์ ์ธ ๊ณผ์ •์ด๋ฉฐ, ์šฐ๋ฆฌ ์ธ๊ฐ„ ์ž์ฒด๊ฐ€ ํ•˜๋‚˜์˜ ๋ฉ”์ปค๋‹ˆ์ฆ˜์ด๋ผ๋Š” ๊ฒƒ์ด์ฃ .
16:22
If we are machines, then in principle at least,
289
982330
3000
๋งŒ์•ฝ ์ด ์„ค๋ช…์ฒ˜๋Ÿผ ์šฐ๋ฆฌ ์ธ๊ฐ„๋„ ๊ธฐ๊ณ„์˜ ์ผ์ข…์ด๋ผ๋ฉด, ์ตœ์†Œํ•œ ์ด๋ก ์ ์œผ๋กœ
16:25
we should be able to build machines out of other stuff,
290
985330
4000
์šฐ๋ฆฌ ์ธ๊ฐ„๊ณผ ๊ฐ™์€ ์ƒ๋ช…์ ์ธ ์š”์†Œ๋“ค์€ ๋บ€
16:29
which are just as alive as we are.
291
989330
4000
๋กœ๋ด‡์„ ๊ฐœ๋ฐœ ํ•  ์ˆ˜ ์žˆ๋‹ค๋Š” ์–˜๊ธฐ์ž…๋‹ˆ๋‹ค.
16:33
But I think for us to admit that,
292
993330
2000
๊ทธ๋Ÿฌ๋‚˜, ์šฐ๋ฆฌ๊ฐ€ ์ด๋Ÿฌํ•œ ๊ฒƒ๋“ค์„ ๋ฐ›์•„๋“ค์ด๋ ค๋ฉด
16:35
we have to give up on our special-ness, in a certain way.
293
995330
3000
์šฐ๋ฆฌ๋Š” ์ธ๊ฐ„์ด ํŠน๋ณ„ํ•œ ์กด์žฌ๋ผ๋Š” ๊ฒƒ์„ ํฌ๊ธฐํ•ด์•ผ๊ฒ ์ง€์š”.
16:38
And we've had the retreat from special-ness
294
998330
2000
๊ณผ๊ฑฐ๋ฅผ ๋Œ์•„๋ณด๋ฉด, ์šฐ๋ฆฌ๋Š” ์ง€๋‚œ ๋ช‡๋ฐฑ๋…„๋™์•ˆ
16:40
under the barrage of science and technology many times
295
1000330
3000
๊ณผํ•™ ๊ธฐ์ˆ ์˜ ๋ฐœ์ „์— ์˜ํ•ด
16:43
over the last few hundred years, at least.
296
1003330
2000
์ธ๊ฐ„ ์ค‘์‹ฌ์ ์ธ ๋งŽ์€ ์ด๋ก ๋“ค์„ ํฌ๊ธฐํ•ด์™”์Šต๋‹ˆ๋‹ค.
16:45
500 years ago we had to give up the idea
297
1005330
2000
์˜ˆ๋ฅผ๋“ค๋ฉด, 500๋…„ ์ „์— ์šฐ๋ฆฌ๋Š”
16:47
that we are the center of the universe
298
1007330
3000
์ง€๊ตฌ๊ฐ€ ํƒœ์–‘ ์ฃผ์˜๋ฅผ ๋ˆ๋‹ค๋Š” ๊ฒƒ์„ ๋ฐœ๊ฒฌํ•˜๋ฉด์„œ
16:50
when the earth started to go around the sun;
299
1010330
2000
์ง€๊ตฌ๊ฐ€ ์šฐ์ฃผ์˜ ์ค‘์‹ฌ์ด๋ผ๋Š” ์ƒ๊ฐ์„ ๋ฒ„๋ ค์•ผํ–ˆ์—ˆ์ฃ .
16:52
150 years ago, with Darwin, we had to give up the idea we were different from animals.
300
1012330
5000
๊ทธ๋ฆฌ๊ณ , 150๋…„ ์ „์—๋Š” ๋‹ค์œˆ์— ์˜ํ•ด์„œ ์ธ๊ฐ„์€ ๋™๋ฌผ๊ณผ ๋‹ค๋ฅธ ์กด์žฌ๋ผ๋Š” ์ƒ๊ฐ์„ ๋ฒ„๋ ค์•ผํ–ˆ๊ณ ์š”.
16:57
And to imagine -- you know, it's always hard for us.
301
1017330
3000
์ •๋ง ๋ฐ›์•„๋“ค์ด๊ธฐ ํž˜๋“ค์ง€๋งŒ ์ƒ์ƒํ•ด๋ณด์ฃ .
17:00
Recently we've been battered with the idea that maybe
302
1020330
3000
์ตœ๊ทผ ์ธ๊ฐ„์˜ ์ฐฝ์กฐ๋ก ์กฐ์ฐจ ์—†์—ˆ๋˜ ๊ฒƒ์ด๋ผ๋Š” ์ฃผ์žฅ์—
17:03
we didn't even have our own creation event, here on earth,
303
1023330
2000
๋งŽ์€ ์‚ฌ๋žŒ๋“ค์ด ๊ทธ ์ฃผ์žฅ์„ ์‹ซ์–ดํ–ˆ์—ˆ๊ณ ,
17:05
which people didn't like much. And then the human genome said,
304
1025330
3000
์šฐ๋ฆฌ ์ธ๊ฐ„์ด ๊ฒจ์šฐ 35,000๊ฐœ์˜ ์œ ์ „์ž๋ฐ–์— ๊ฐ€์ง€์ง€ ์•Š์•˜๋‹ค๋Š” ์ฃผ์žฅ๋„
17:08
maybe we only have 35,000 genes. And that was really --
305
1028330
3000
๋งŽ์€ ์‚ฌ๋žŒ๋“ค์ด ์ •๋ง ์‹ซ์–ดํ–ˆ์Šต๋‹ˆ๋‹ค.
17:11
people didn't like that, we've got more genes than that.
306
1031330
3000
์šฐ๋ฆฌ ์ธ๊ฐ„์€ ๊ทธ๋ณด๋‹ค ๋งŽ์€ ์œ ์ „์ž๋ฅผ ๊ฐ€์ง€๊ณ  ์žˆ๋‹ค๊ณ  ํ•˜๋ฉด์„œ ๋ง์ด์ฃ .
17:14
We don't like to give up our special-ness, so, you know,
307
1034330
3000
์ด๋ ‡๊ฒŒ ์šฐ๋ฆฌ๋Š” ์ธ๊ฐ„์˜ ํŠน์ˆ˜์„ฑ์„ ํฌ๊ธฐํ•˜๊ธฐ ์‹ซ์–ดํ•ฉ๋‹ˆ๋‹ค.
17:17
having the idea that robots could really have emotions,
308
1037330
2000
์ด๋Ÿฌํ•œ ์ƒ๊ฐ๋“ค์„ ๊ฐ€์ง€๊ณ , ๋กœ๋ด‡์ด ๊ฐ์ •์„ ๊ฐ€์ง€๊ฒŒ ๋œ๋‹ค?
17:19
or that robots could be living creatures --
309
1039330
2000
๋˜๋Š” ๋กœ๋ด‡์ด ์‚ด์•„์žˆ๋Š” ์ƒ๋ช…์ฒด๊ฐ€ ๋œ๋‹ค?
17:21
I think is going to be hard for us to accept.
310
1041330
2000
์ œ ์ƒ๊ฐ์—๋Š” ์ด๋Ÿฌํ•œ ๊ฒƒ๋“ค์ด ๋ฐ›์•„๋“ค์—ฌ์ง€๊ธฐ๋Š” ๋ฌด์ฒ™ ํž˜๋“ค๊บผ๋ผ๊ณ  ์ƒ๊ฐํ•ฉ๋‹ˆ๋‹ค.
17:23
But we're going to come to accept it over the next 50 years or so.
311
1043330
4000
๊ทธ๋Ÿฌ๋‚˜, ์šฐ๋ฆฌ๋“ค์€ ํ–ฅํ›„ 50๋…„ ๋™์•ˆ ์ด๋Ÿฌํ•œ ๊ฒƒ๋“ค์„ ๋ฐ›์•„๋“ค์—ฌ์•ผ ํ• ์ง€๋„ ๋ชจ๋ฅด์ฃ .
17:27
And the second question is, will the machines want to take over?
312
1047330
3000
๋‹ค์Œ์œผ๋กœ ๋‘๋ฒˆ์งธ ์งˆ๋ฌธ์„ ์ƒ๊ฐํ•ด๋ณด์ฃ . ๊ธฐ๊ณ„๋“ค์ด ์šฐ๋ฆฌ ์ธ๊ฐ„์„ ์ง€๋ฐฐํ•˜๊ฒŒ๋ ๊นŒ์š”?
17:30
And here the standard scenario is that we create these things,
313
1050330
5000
์—ฌ๊ธฐ์— ์œ ๋ช…ํ•œ ์‹œ๋‚˜๋ฆฌ์˜ค๊ฐ€ ์žˆ์Šต๋‹ˆ๋‹ค. ์šฐ๋ฆฌ๊ฐ€ ์ด๋Ÿฐ ๋กœ๋ด‡๋“ค์„ ๊ฐœ๋ฐœํ•˜๊ณ ,
17:35
they grow, we nurture them, they learn a lot from us,
314
1055330
3000
์ด๋“ค์ด ์šฐ๋ฆฌ ์ธ๊ฐ„๋“ค์— ์˜ํ•ด ์ ์  ์ง„ํ™” ํ•˜๋‹ค๊ฐ€
17:38
and then they start to decide that we're pretty boring, slow.
315
1058330
4000
์–ด๋Š ์ˆœ๊ฐ„๋ถ€ํ„ฐ ์šฐ๋ฆฌ ์ธ๊ฐ„๋“ค์ด ๋Š๋ คํ„ฐ์ง€๊ณ , ์ง€๋ฃจํ•œ ์กด์žฌ๋ผ๊ณ  ์ƒ๊ฐํ•˜๊ธฐ ์‹œ์ž‘ํ•˜๊ณ 
17:42
They want to take over from us.
316
1062330
2000
๊ฒฐ๊ตญ์—” ์šฐ๋ฆฌ ์ธ๊ฐ„๋“ค์„ ์ •๋ณตํ•˜๋ ค๊ณ  ํ•œ๋‹ค.
17:44
And for those of you that have teenagers, you know what that's like.
317
1064330
3000
์ด๋Ÿฐ ์‹œ๋‚˜๋ฆฌ์˜ค๋Š” 10๋Œ€ ์ž๋…€๋ฅผ ๋‘์‹  ๋ถ„์ด๋ผ๋ฉด ๋”์šฑ ์ž˜ ์•„์‹ค๊บผ๋ผ ์ƒ๊ฐํ•ฉ๋‹ˆ๋‹ค.
17:47
(Laughter)
318
1067330
1000
(์›ƒ์Œ)
17:48
But Hollywood extends it to the robots.
319
1068330
3000
ํ—๋ฆฌ์šฐ๋“œ์—์„œ๋Š” ์ด๋Ÿฐ ์‹œ๋‚˜๋ฆฌ์˜ค๋ฅผ ๋กœ๋ด‡์—๊ฒŒ ์ ์šฉํ•˜์—ฌ ๋งŽ์€ ์˜ํ™”๋ฅผ ๋งŒ๋“ค๊ณ ์žˆ์ฃ .
17:51
And the question is, you know,
320
1071330
3000
์ž, ์—ฌ๋Ÿฌ๋ถ„๋“ค์—๊ฒŒ ํ•œ๋ฒˆ ๋ฌป๊ฒ ์Šต๋‹ˆ๋‹ค.
17:54
will someone accidentally build a robot that takes over from us?
321
1074330
4000
์–ด๋Š ๋ˆ„๊ตฐ๊ฐ€๊ฐ€ ์šฐ์—ฐํžˆ ์šฐ๋ฆฌ๋ฅผ ์ •๋ณตํ•  ๋กœ๋ด‡์„ ๋งŒ๋“ค๊ฒŒ ๋ ๊นŒ์š”?
17:58
And that's sort of like this lone guy in the backyard,
322
1078330
3000
๋งˆ์น˜ ๋’ท๋งˆ๋‹น์—์„œ ํ˜ผ์ž ๋ญ”๊ฐ€๋ฅผ ๋งŒ๋“ค๋˜ ๋‚จ์ž๊ฐ€
18:01
you know -- "I accidentally built a 747."
323
1081330
3000
"์•—, ๋‚ด๊ฐ€ ์šฐ์—ฐํžˆ ๋ณด์ž‰ 747์„ ๋งŒ๋“ค์–ด๋ฒ„๋ ธ์–ด!"๋ผ๊ณ  ํ•˜๋Š” ๊ฒฉ์ด์ฃ .
18:04
I don't think that's going to happen.
324
1084330
2000
์ €๋Š” ๊ทธ๋Ÿฐ ์ผ์€ ์ผ์–ด๋‚˜์ง€ ์•Š์„ ๊ฑฐ๋ผ๊ณ  ์ƒ๊ฐํ•ฉ๋‹ˆ๋‹ค.
18:06
And I don't think --
325
1086330
2000
๊ทธ๋ฆฌ๊ณ , ์ €๋Š”
18:08
(Laughter)
326
1088330
1000
(์›ƒ์Œ)
18:09
-- I don't think we're going to deliberately build robots
327
1089330
3000
๊ทธ๋ฆฌ๊ณ , ์ €๋Š” ์ธ๋ฅ˜์—๊ฒŒ ์ข‹์€ ์˜ํ–ฅ์„ ๋ผ์น˜๋Š”
18:12
that we're uncomfortable with.
328
1092330
2000
๋กœ๋ด‡๋“ค๋งŒ ๋งŒ๋“ค์–ด์งˆ ๊ฑฐ๋ผ๊ณ  ๋ฏฟ์Šต๋‹ˆ๋‹ค.
18:14
We'll -- you know, they're not going to have a super bad robot.
329
1094330
2000
์Œ.. ๊ทธ ๋กœ๋ด‡๋“ค์ด ์Šค์Šค๋กœ ์•…๋‹น๋กœ๋ด‡์ด ๋˜์–ด ๋‚˜ํƒ€๋‚˜์ง€๋Š” ๋ชปํ• ๊ฒ๋‹ˆ๋‹ค.
18:16
Before that has to come to be a mildly bad robot,
330
1096330
3000
์•…๋‹น๋กœ๋ด‡์ด ๋˜๋ ค๋ฉด ์šฐ์„  ๊ฝค ๋ชป๋œ ๋กœ๋ด‡์œผ๋กœ ์ง„ํ™”ํ•ด์•ผ ํ•˜๊ณ ,
18:19
and before that a not so bad robot.
331
1099330
2000
๊ทธ๋ณด๋‹ค ๋” ์•ž์„œ ๊ทธ๋‹ค์ง€ ๋‚˜์˜์ง€ ์•Š์€ ๋กœ๋ด‡์ด ๋‚˜์™€์•ผ ํ•˜๋Š”๋ฐ,
18:21
(Laughter)
332
1101330
1000
(์›ƒ์Œ)
18:22
And we're just not going to let it go that way.
333
1102330
2000
๋กœ๋ด‡๋“ค์ด ๊ทธ๋ ‡๊ฒŒ ์‚๋šค์–ด์ง€๋„๋ก ๋†”๋‘๋ฉด ๋‹น์—ฐํžˆ ์•ˆ๋˜๊ฒ ์ง€์š”?
18:24
(Laughter)
334
1104330
1000
(์›ƒ์Œ)
18:25
So, I think I'm going to leave it at that: the robots are coming,
335
1105330
6000
๊ทธ๋Ÿผ, ์ €๋Š” ๋กœ๋ด‡์ด ๋Œ€์ค‘ํ™”๋˜๋Š” ์‹œ๋Œ€๊ฐ€ ๋‹ค๊ฐ€์˜ค๊ณ  ์žˆ๋‹ค๋Š” ๊ฒƒ์„ ๋ง์”€๋“œ๋ฆฌ๋ฉฐ ์ด ๊ฐ•์—ฐ์„ ๋งˆ์น ๊นŒํ•ฉ๋‹ˆ๋‹ค.
18:31
we don't have too much to worry about, it's going to be a lot of fun,
336
1111330
3000
๋„ˆ๋ฌด ๊ฑฑ์ •ํ•˜์‹ค ํ•„์š”๋Š” ์—†์Šต๋‹ˆ๋‹ค. ์˜คํžˆ๋ ค ์žฌ๋ฏธ์žˆ๋Š” ์ผ๋“ค์ด ๋งŽ์„ ๊ฑฐ์˜ˆ์š”.
18:34
and I hope you all enjoy the journey over the next 50 years.
337
1114330
4000
์—ฌ๊ธฐ ๊ณ„์‹  ๋ชจ๋“  ๋ถ„๋“ค๊ป˜์„œ ์•ž์œผ๋กœ์˜ ๊ทธ 50๋…„์„ ์ถฉ๋ถ„ํžˆ ์ฆ๊ธฐ์‹ค ์ˆ˜ ์žˆ๊ธฐ๋ฅผ ๋ฐ”๋ž๋‹ˆ๋‹ค.
18:38
(Applause)
338
1118330
2000
(๋ฐ•์ˆ˜)
์ด ์›น์‚ฌ์ดํŠธ ์ •๋ณด

์ด ์‚ฌ์ดํŠธ๋Š” ์˜์–ด ํ•™์Šต์— ์œ ์šฉํ•œ YouTube ๋™์˜์ƒ์„ ์†Œ๊ฐœํ•ฉ๋‹ˆ๋‹ค. ์ „ ์„ธ๊ณ„ ์ตœ๊ณ ์˜ ์„ ์ƒ๋‹˜๋“ค์ด ๊ฐ€๋ฅด์น˜๋Š” ์˜์–ด ์ˆ˜์—…์„ ๋ณด๊ฒŒ ๋  ๊ฒƒ์ž…๋‹ˆ๋‹ค. ๊ฐ ๋™์˜์ƒ ํŽ˜์ด์ง€์— ํ‘œ์‹œ๋˜๋Š” ์˜์–ด ์ž๋ง‰์„ ๋”๋ธ” ํด๋ฆญํ•˜๋ฉด ๊ทธ๊ณณ์—์„œ ๋™์˜์ƒ์ด ์žฌ์ƒ๋ฉ๋‹ˆ๋‹ค. ๋น„๋””์˜ค ์žฌ์ƒ์— ๋งž์ถฐ ์ž๋ง‰์ด ์Šคํฌ๋กค๋ฉ๋‹ˆ๋‹ค. ์˜๊ฒฌ์ด๋‚˜ ์š”์ฒญ์ด ์žˆ๋Š” ๊ฒฝ์šฐ ์ด ๋ฌธ์˜ ์–‘์‹์„ ์‚ฌ์šฉํ•˜์—ฌ ๋ฌธ์˜ํ•˜์‹ญ์‹œ์˜ค.

https://forms.gle/WvT1wiN1qDtmnspy7