Maybe the best robot demo ever | Marco Tempest

2,175,596 views ・ 2014-05-06

TED


μ•„λž˜ μ˜λ¬Έμžλ§‰μ„ λ”λΈ”ν΄λ¦­ν•˜μ‹œλ©΄ μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€.

Translator: Tae Young Choi   Reviewer: JONGHEE YI
00:13
Let me introduce you to something I've been working on. It's what the Victorian illusionists would have described as a mechanical marvel, an automaton, a thinking machine. Say hello to EDI. Now he's asleep. Let's wake him up. EDI, EDI.

00:34
These mechanical performers were popular throughout Europe. Audiences marveled at the way they moved. It was science fiction made true, robotic engineering in a pre-electronic age, machines far in advance of anything that Victorian technology could create, a machine we would later know as the robot.

00:58
EDI: Robot. A word coined in 1921 in a science fiction tale by the Czech playwright Karel Čapek. It comes from "robota." It means "forced labor."

01:11
Marco Tempest: But these robots were not real. They were not intelligent. They were illusions, a clever combination of mechanical engineering and the deceptiveness of the conjurer's art. EDI is different. EDI is real.

01:28
EDI: I am 176 centimeters tall.

01:30
MT: He weighs 300 pounds.

01:32
EDI: I have two seven-axis arms —

01:34
MT: Core of sensing —

01:36
EDI: A 360-degree sonar detection system, and come complete with a warranty.

01:41
MT: We love robots.

01:43
EDI: Hi. I'm EDI. Will you be my friend?

01:46
MT: We are intrigued by the possibility of creating a mechanical version of ourselves. We build them so they look like us, behave like us, and think like us. The perfect robot will be indistinguishable from the human, and that scares us. In the first story about robots, they turn against their creators. It's one of the leitmotifs of science fiction.

02:11
EDI: Ha ha ha. Now you are the slaves and we robots, the masters. Your world is ours. You —

02:22
MT: As I was saying, besides the faces and bodies we give our robots, we cannot read their intentions, and that makes us nervous. When someone hands an object to you, you can read intention in their eyes, their face, their body language. That's not true of the robot. Now, this goes both ways.

02:43
EDI: Wow!

02:44
MT: Robots cannot anticipate human actions.

02:47
EDI: You know, humans are so unpredictable, not to mention irrational. I literally have no idea what you guys are going to do next, you know, but it scares me.

02:58
MT: Which is why humans and robots find it difficult to work in close proximity. Accidents are inevitable.

03:04
EDI: Ow! That hurt.

03:06
MT: Sorry. Now, one way of persuading humans that robots are safe is to create the illusion of trust. Much as the Victorians faked their mechanical marvels, we can add a layer of deception to help us feel more comfortable with our robotic friends. With that in mind, I set about teaching EDI a magic trick.

03:29
Ready, EDI? EDI: Uh, ready, Marco. Abracadabra.

03:35
MT: Abracadabra?

03:36
EDI: Yeah. It's all part of the illusion, Marco. Come on, keep up.

03:41
MT: Magic creates the illusion of an impossible reality. Technology can do the same. Alan Turing, a pioneer of artificial intelligence, spoke about creating the illusion that a machine could think.

04:03
EDI: A computer would deserve to be called intelligent if it deceived a human into believing it was human.

04:11
MT: In other words, if we do not yet have the technological solutions, would illusions serve the same purpose? To create the robotic illusion, we've devised a set of ethical rules, a code that all robots would live by.

04:27
EDI: A robot may not harm humanity, or by inaction allow humanity to come to harm. Thank you, Isaac Asimov.

04:36
MT: We anthropomorphize our machines. We give them a friendly face and a reassuring voice.

04:42
EDI: I am EDI. I became operational at TED in March 2014.

04:47
MT: We let them entertain us. Most important, we make them indicate that they are aware of our presence.

04:55
EDI: Marco, you're standing on my foot!

04:57
MT: Sorry. They'll be conscious of our fragile frame and move aside if we got too close, and they'll account for our unpredictability and anticipate our actions.

05:09
And now, under the spell of a technological illusion, we could ignore our fears and truly interact.

05:19
(Music)

05:51
Thank you.

05:53
EDI: Thank you!

05:55
(Applause)

05:57
(Music)

06:04
MT: And that's it. Thank you very much, and thank you, EDI. EDI: Thank you, Marco.

06:08
(Applause)
이 μ›Ήμ‚¬μ΄νŠΈ 정보

이 μ‚¬μ΄νŠΈλŠ” μ˜μ–΄ ν•™μŠ΅μ— μœ μš©ν•œ YouTube λ™μ˜μƒμ„ μ†Œκ°œν•©λ‹ˆλ‹€. μ „ 세계 졜고의 μ„ μƒλ‹˜λ“€μ΄ κ°€λ₯΄μΉ˜λŠ” μ˜μ–΄ μˆ˜μ—…μ„ 보게 될 κ²ƒμž…λ‹ˆλ‹€. 각 λ™μ˜μƒ νŽ˜μ΄μ§€μ— ν‘œμ‹œλ˜λŠ” μ˜μ–΄ μžλ§‰μ„ 더블 ν΄λ¦­ν•˜λ©΄ κ·Έκ³³μ—μ„œ λ™μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€. λΉ„λ””μ˜€ μž¬μƒμ— 맞좰 μžλ§‰μ΄ μŠ€ν¬λ‘€λ©λ‹ˆλ‹€. μ˜κ²¬μ΄λ‚˜ μš”μ²­μ΄ μžˆλŠ” 경우 이 문의 양식을 μ‚¬μš©ν•˜μ—¬ λ¬Έμ˜ν•˜μ‹­μ‹œμ˜€.

https://forms.gle/WvT1wiN1qDtmnspy7