A funny look at the unintended consequences of technology | Chuck Nice

272,906 views

2018-02-27 ・ TED



Translation: JY Kang Β· Review: Eunice Nam
00:12
Future tech always comes with two things: promise
00:17
and unintended consequences.
00:19
And it's those consequences that I want to explore.
00:23
And before we get to how future tech may affect us,
00:26
I'd like to spend a little time exploring the unintended consequences
00:30
of some of our recent tech,
00:32
namely, social media.
00:34
Social media, a few short years ago, was the tech of future you.
00:39
Now it just is you.
00:42
Social media was supposed to bring us together
00:44
in ways we could never imagine.
00:47
And the predictors were correct.
00:50
These three girls are talking to one another
00:53
without the awkward discomfort of eye contact.
00:56
(Laughter)
00:57
I call that advancement.
01:01
We were supposed to be caught up in a communication tsunami,
01:05
the likes of which the world has never seen.
01:07
And that did happen.
01:09
And so did this.
01:13
(Sings) One of these things is not like the other.
01:16
(Speaks) Now, look at this picture.
01:17
If you picked the guy with the book, you’re wrong --
01:20
or, as a certain president would say, "Wrong!"
01:23
(Laughter)
01:27
Clearly, three of these guys are reading,
01:29
and one guy, on the end, is listening to music
01:32
and playing "Candy Crush."
01:33
(Laughter)
01:35
So are we more connected,
01:37
or are we just more connected to our devices?
01:42
Social media was supposed to place us in a veritable town square,
01:45
where we could engage one another with challenging ideas and debates.
01:50
And instead what we got were trolls.
01:53
This is an actual tweet that I received.
01:58
"Chuck, no one wants to hear your stupid, ill-informed political views!
02:02
I hope you get leprosy and die.
02:04
Love, Dad"
02:06
(Laughter)
02:08
Now, the great thing about that tweet if you look at it,
02:11
just like most trolls, it's not that bad,
02:13
because he wished "leporsy" on me instead of "leprosy,"
02:17
and "leporsy" is not dangerous at all.
02:20
(Laughter)
02:21
(Applause)
02:26
Along with trolls, we got a brand new way of torturing teenagers --
02:30
cyberbullying.
02:32
A concept that my 75-year-old mother just can't seem to wrap her head around.
02:38
"So, uh, did they hit him?"
02:40
"No, Mom, they didn't hit him."
02:42
"Did they take his money?"
02:43
"No, Mom, they didn't take his money."
02:45
"Did they put his face in the toilet?"
02:47
"No, Mom, they didn't --"
02:48
"Well, what did they do?"
02:50
"They attacked him on the internet."
02:52
"Attacked him on the internet?"
02:54
(Laughter)
02:55
"Well, why don't you just turn off the internet?"
02:57
(Laughter)
02:59
"Your whole generation is a bunch of wussies."
03:01
(Laughter)
03:03
She's got a point.
03:04
(Laughter)
03:06
She's got a point.
03:07
And I don't even want to talk about what social media has done to dating.
03:11
I was on Grindr until I found out it wasn't a sandwich app.
03:16
(Laughter)
03:19
And I can't even tell you about Tinder,
03:23
except for the fact that if you think there is a limit
03:27
to the amount of anonymous sex we can have on this planet,
03:30
you are sadly mistaken.
03:32
(Laughter)
03:33
So where do we go from here?
03:35
Well, let's just jump right in and play the hits.
03:38
Driverless cars.
03:39
Something that has already been around for many years,
03:42
just without the assistance of computers.
03:44
(Laughter)
03:47
(Applause)
03:50
Because for years, we have been driving while texting,
03:54
putting on makeup,
03:56
shaving, reading -- actually reading --
03:59
that would be me.
04:00
(Laughter)
04:01
The other thing is that since driverless cars will be shared,
04:04
most people won't own cars,
04:06
and that means the DMV will go away.
04:09
The DMV -- I know what you're saying right now.
04:11
"There's no way this guy is going to stand up here
04:13
and make a case for the DMV."
04:15
Well, I don't know about you, but I do not want to live in a world
04:18
where harsh fluorescent lights,
04:20
endless lines,
04:22
terrible forms to fill out
04:24
and disaffected, soulless bureaucrats remind me
04:28
that I am pretty damn lucky not to work here.
04:31
(Laughter)
04:33
That is the real service they provide.
04:36
The DMV:
04:37
come for the registration renewal,
04:40
stay for the satisfaction of knowing you made some pretty good life choices.
04:44
(Laughter)
04:49
Nobody will own their car in the future,
04:51
and that means teenagers will not have a place to make out.
04:55
So you know what that means.
04:56
That means they will order driverless cars to do just that.
05:00
I do not want to step into a vehicle and ask the question:
05:04
"Why does this car smell like awkwardness, failure and shame?"
05:09
(Laughter)
05:12
If I want to ask that question, I'll walk into my own bedroom.
05:15
(Laughter)
05:17
So what else do we have to look forward to?
05:19
That's right, artificial intelligence.
05:21
Artificial intelligence, yes.
05:23
You know, there was a time when artificial intelligence was a joke.
05:26
I mean, literally a quip that you would hear at a cocktail party
05:30
when somebody would bring it up in conversation:
05:32
"Artificial intelligence.
05:34
The only real artificial intelligence is our American Congress.
05:38
Ha, ha, ha, ha, ha."
05:40
Well, it's not funny anymore.
05:41
(Laughter)
05:48
Stephen Hawking, Elon Musk and Bill Gates have all gone on record
05:51
expressing grave reservations about artificial intelligence.
05:55
That's like Jesus, Moses and Muhammad coming together and saying,
05:59
"Guys, guys -- here's something we can all believe in."
06:02
(Laughter)
06:03
You might want to go with that, is all I'm saying.
06:07
We are actually teaching machines how to think,
06:11
how to understand our behavior,
06:14
how to defend themselves and even practice deception.
06:18
What could possibly go wrong?
06:20
(Laughter)
06:23
The one thing that's for sure:
06:25
the creation always despises its creator.
06:29
OK?
06:30
The Titans rose up against the gods;
06:32
Lucifer against Jehovah.
06:34
And anybody who has a teenager has heard these words:
06:37
"I hate you and you're ruining my life!
06:39
I hate you!"
06:42
Now just imagine that sentiment with a machine that can outthink you
06:46
and is heavily armed.
06:48
(Laughter)
06:51
The result?
06:52
Absolutely.
06:54
(Laughter)
06:58
What we need to do before we perfect artificial intelligence
07:02
is perfect artificial emotions.
07:04
That way, we can teach the robots or machines
07:08
how to love us unconditionally,
07:11
so that when they figure out that the only real problem on this planet
07:15
is us,
07:16
instead of destroying us --
07:18
which, by the way, is totally logical --
07:21
they will find us adorable --
07:24
(Laughter)
07:25
like baby poop.
07:26
(Laughter)
07:27
"Oh my god, I just love the way you just destroyed the planet.
07:30
I can't stay mad at you, you're so cute!
07:33
You're so cute!"
07:34
(Laughter)
07:36
Can't talk about this without talking about robotics. OK?
07:43
Remember when you thought robotics were cool?
07:45
I remember when I thought robotics were cool,
07:47
until I figured out that they were going to take everybody's place,
07:50
from the delivery guy down to the heart surgeon.
07:52
The one thing, though, that is very disappointing about robotics
07:55
is the holy grail of robotics,
07:57
and it hasn't even happened.
07:58
I'm talking about the robot girlfriend,
08:00
the dream of one lonely geek in a windowless basement
08:04
who vowed one day: "I am going to marry my creation."
08:08
And there actually is a movement underway to stop this from happening,
08:13
for fear of exploitation.
08:16
And I, for one, am against that movement.
08:20
I believe we should have robot girlfriends.
08:23
I just believe that they should come with a feminist protocol
08:27
and artificial intelligence,
08:29
so she can take one look at that guy and go, "I am too good for you.
08:33
I'm leaving."
08:35
(Laughter)
08:36
(Applause)
08:40
And finally,
08:42
I have to talk about bioengineering,
08:44
an area of science that promises to end disease before it even begins,
08:51
to help us live longer, fuller, healthier lives.
08:57
And when you couple that with implantable hardware,
09:00
you are looking at the next incarnation of human evolution.
09:04
And all of that sounds great,
09:07
until you figure out where it's really going.
09:09
One place:
09:11
designer babies,
09:13
where, no matter where you are on the globe
09:15
or what your ethnicity,
09:17
babies will end up looking like that.
09:19
(Laughter)
09:21
That boy is surprised
09:24
because he just found out both his parents are black.
09:27
(Laughter)
09:35
Can you imagine him at a cocktail party in 20 years?
09:39
"Yeah, both my parents are black.
09:41
I mean, it's a little awkward at times,
09:43
but you should see my credit rating.
09:45
Impressive, very impressive."
09:46
(Laughter)
09:49
Now, all of this seems scary,
09:51
and everybody in this room knows that it isn't.
09:54
Technology isn't scary.
09:56
Never has been and it never will be.
10:00
What's scary is us
10:02
and what we will do with technology.
10:05
Will we allow it to expose our humanity,
10:09
showing our true selves
10:11
and reinforcing the fact that we are indeed our brother's keeper?
10:16
Or will we allow it to reveal our deepest, darkest demons?
10:22
The true question is not whether or not technology is scary.
10:28
The true question is:
10:30
How human
10:31
are you?
10:33
Thank you.
10:34
(Applause)