How AI can save our humanity | Kai-Fu Lee

912,029 views ・ 2018-08-27

TED


μ•„λž˜ μ˜λ¬Έμžλ§‰μ„ λ”λΈ”ν΄λ¦­ν•˜μ‹œλ©΄ μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€.

Translation: Hyemin LEE    Review: Jihyeon J. Kim
00:12
I'm going to talk about how AI and mankind can coexist, but first, we have to rethink our human values. So let me first make a confession about my errors in my values.

00:25
It was 11 o'clock, December 16, 1991. I was about to become a father for the first time. My wife, Shen-Ling, lay in the hospital bed, going through a very difficult 12-hour labor. I sat by her bedside but looked anxiously at my watch, and I knew something that she didn't. I knew that if in one hour our child didn't come, I was going to leave her there and go back to work and make a presentation about AI to my boss, Apple's CEO.

01:03
Fortunately, my daughter was born at 11:30 --

(Laughter)

(Applause)

01:11
sparing me from doing the unthinkable, and to this day, I am so sorry for letting my work ethic take precedence over love for my family.

(Applause)

01:28
My AI talk, however, went off brilliantly.

(Laughter)

01:33
Apple loved my work and decided to announce it at TED 1992, 26 years ago on this very stage. I thought I had made one of the biggest, most important discoveries in AI, and so did the "Wall Street Journal" on the following day.

01:51
But as far as discoveries went, it turned out, I didn't discover India, or America. Perhaps I discovered a little island off of Portugal.

02:02
But the AI era of discovery continued, and more scientists poured their souls into it. About 10 years ago, the grand AI discovery was made by three North American scientists, and it's known as deep learning.

02:17
Deep learning is a technology that can take a huge amount of data within one single domain and learn to predict or decide at superhuman accuracy. For example, if we show the deep learning network a massive number of food photos, it can recognize food, such as hot dog or no hot dog.

(Applause)

02:41
Or if we show it many pictures and videos and sensor data from driving on the highway, it can actually drive a car as well as a human being on the highway.
02:55
And what if we showed this deep learning network all the speeches made by President Trump? Then this artificially intelligent President Trump, actually the network --

(Laughter)

03:09
can --

(Applause)

03:14
You like double oxymorons, huh?

(Laughter)

(Applause)

03:27
So this network, if given the request to make a speech about AI, he, or it, might say --

03:36
(Recording) Donald Trump: It's a great thing to build a better world with artificial intelligence.

03:41
Kai-Fu Lee: And maybe in another language?

03:43
DT: (Speaking Chinese)

(Laughter)

03:46
KFL: You didn't know he knew Chinese, did you?

03:50
So deep learning has become the core in the era of AI discovery, and that's led by the US. But we're now in the era of implementation, where what really matters is execution, product quality, speed and data. And that's where China comes in.

04:07
Chinese entrepreneurs, who I fund as a venture capitalist, are incredible workers, amazing work ethic. My example in the delivery room is nothing compared to how hard people work in China. As an example, one startup tried to claim work-life balance: "Come work for us because we are 996." And what does that mean? It means the work hours of 9am to 9pm, six days a week. That's contrasted with other startups that do 997.

04:39
And the Chinese product quality has consistently gone up in the past decade, and that's because of a fiercely competitive environment. In Silicon Valley, entrepreneurs compete in a very gentlemanly fashion, sort of like in old wars in which each side took turns to fire at each other.

(Laughter)

05:00
But in the Chinese environment, it's truly a gladiatorial fight to the death. In such a brutal environment, entrepreneurs learn to grow very rapidly, they learn to make their products better at lightning speed, and they learn to hone their business models until they're impregnable. As a result, great Chinese products like WeChat and Weibo are arguably better than the equivalent American products from Facebook and Twitter.
05:31
And the Chinese market embraces this change and accelerated change and paradigm shifts. As an example, if any of you go to China, you will see it's almost cashless and credit card-less, because that thing that we all talk about, mobile payment, has become the reality in China. In the last year, 18.8 trillion US dollars were transacted on mobile internet, and that's because of very robust technologies built behind it. It's even bigger than the China GDP.

06:04
And this technology, you can say, how can it be bigger than the GDP? Because it includes all transactions: wholesale, channels, retail, online, offline, going into a shopping mall or going into a farmers market like this. The technology is used by 700 million people to pay each other, not just merchants, so it's peer to peer, and it's almost transaction-fee-free. And it's instantaneous, and it's used everywhere.

06:34
And finally, the China market is enormous. This market is large, which helps give entrepreneurs more users, more revenue, more investment, but most importantly, it gives the entrepreneurs a chance to collect a huge amount of data, which becomes rocket fuel for the AI engine. So as a result, the Chinese AI companies have leaped ahead, so that today, the most valuable companies in computer vision, speech recognition, speech synthesis, machine translation and drones are all Chinese companies.

07:11
So with the US leading the era of discovery and China leading the era of implementation, we are now in an amazing age where the dual engines of the two superpowers are working together to drive the fastest revolution in technology that we have ever seen as humans. And this will bring tremendous wealth, unprecedented wealth: 16 trillion dollars, according to PwC, in terms of added GDP to the worldwide GDP by 2030.

07:46
It will also bring immense challenges in terms of potential job replacements. Whereas in the Industrial Age, craftsman jobs were decomposed into jobs in the assembly line, so more jobs were created. But AI completely replaces the individual jobs in the assembly line with robots. And it's not just in factories, but truckers, drivers and even jobs like telesales, customer service and hematologists as well as radiologists over the next 15 years are going to be gradually replaced by artificial intelligence.
08:30
And only the creative jobs --

(Laughter)

08:34
I have to make myself safe, right?

08:38
Really, the creative jobs are the ones that are protected, because AI can optimize but not create.

08:45
But what's more serious than the loss of jobs is the loss of meaning, because the work ethic in the Industrial Age has brainwashed us into thinking that work is the reason we exist, that work defined the meaning of our lives. And I was a prime and willing victim to that type of workaholic thinking. I worked incredibly hard. That's why I almost left my wife in the delivery room, that's why I worked 996 alongside my entrepreneurs. And that obsession that I had with work ended abruptly a few years ago when I was diagnosed with fourth-stage lymphoma.

09:31
The PET scan here shows over 20 malignant tumors jumping out like fireballs, melting away my ambition. But more importantly, it helped me reexamine my life. Knowing that I may only have a few months to live caused me to see how foolish it was for me to base my entire self-worth on how hard I worked and the accomplishments from hard work. My priorities were completely out of order. I neglected my family. My father had passed away, and I never had a chance to tell him I loved him. My mother had dementia and no longer recognized me, and my children had grown up.

10:16
During my chemotherapy, I read a book by Bronnie Ware, who talked about the dying wishes and regrets of people on their deathbeds. She found that facing death, nobody regretted that they didn't work hard enough in this life. They only regretted that they didn't spend enough time with their loved ones and that they didn't spread their love.

10:42
So I am fortunately today in remission.

(Applause)
10:53
So I can be back at TED again to share with you that I have changed my ways. I now only work 965 -- occasionally 996, but usually 965. I moved closer to my mother, my wife usually travels with me, and when my kids have vacation, if they don't come home, I go to them.

11:15
So it's a new form of life that helped me recognize how important love is for me, and facing death helped me change my life, but it also helped me see a new way of how AI should impact mankind and work and coexist with mankind: that really, AI is taking away a lot of routine jobs, but routine jobs are not what we're about.

11:44
Why we exist is love. When we hold our newborn baby, love at first sight, or when we help someone in need, humans are uniquely able to give and receive love, and that's what differentiates us from AI.

12:00
Despite what science fiction may portray, I can responsibly tell you that AI has no love. When AlphaGo defeated the world champion Ke Jie, while Ke Jie was crying and loving the game of go, AlphaGo felt no happiness from winning and certainly no desire to hug a loved one.

12:23
So how do we differentiate ourselves as humans in the age of AI? We talked about the axis of creativity, and certainly that is one possibility, and now we introduce a new axis that we can call compassion, love, or empathy. Those are things that AI cannot do.

12:43
So as AI takes away the routine jobs, I like to think we can, we should and we must create jobs of compassion. You might ask how many of those there are, but I would ask you: Do you not think that we are going to need a lot of social workers to help us make this transition? Do you not think we need a lot of compassionate caregivers to give more medical care to more people? Do you not think we're going to need 10 times more teachers to help our children find their way to survive and thrive in this brave new world? And with all the newfound wealth, should we not also make labors of love into careers and let elderly accompaniment or homeschooling become careers also?

(Applause)

13:36
This graph is surely not perfect, but it points at four ways that we can work with AI. AI will come and take away the routine jobs, and in due time, we will be thankful. AI will become great tools for the creatives, so that scientists, artists, musicians and writers can be even more creative. AI will work with humans as analytical tools that humans can wrap their warmth around for the high-compassion jobs. And we can always differentiate ourselves with the uniquely capable jobs that are both compassionate and creative, using and leveraging our irreplaceable brains and hearts.

14:21
So there you have it: a blueprint of coexistence for humans and AI.

14:27
AI is serendipity. It is here to liberate us from routine jobs, and it is here to remind us what it is that makes us human. So let us choose to embrace AI and to love one another.

14:40
Thank you.

(Applause)
이 μ›Ήμ‚¬μ΄νŠΈ 정보

이 μ‚¬μ΄νŠΈλŠ” μ˜μ–΄ ν•™μŠ΅μ— μœ μš©ν•œ YouTube λ™μ˜μƒμ„ μ†Œκ°œν•©λ‹ˆλ‹€. μ „ 세계 졜고의 μ„ μƒλ‹˜λ“€μ΄ κ°€λ₯΄μΉ˜λŠ” μ˜μ–΄ μˆ˜μ—…μ„ 보게 될 κ²ƒμž…λ‹ˆλ‹€. 각 λ™μ˜μƒ νŽ˜μ΄μ§€μ— ν‘œμ‹œλ˜λŠ” μ˜μ–΄ μžλ§‰μ„ 더블 ν΄λ¦­ν•˜λ©΄ κ·Έκ³³μ—μ„œ λ™μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€. λΉ„λ””μ˜€ μž¬μƒμ— 맞좰 μžλ§‰μ΄ μŠ€ν¬λ‘€λ©λ‹ˆλ‹€. μ˜κ²¬μ΄λ‚˜ μš”μ²­μ΄ μžˆλŠ” 경우 이 문의 양식을 μ‚¬μš©ν•˜μ—¬ λ¬Έμ˜ν•˜μ‹­μ‹œμ˜€.

https://forms.gle/WvT1wiN1qDtmnspy7