A fascinating time capsule of human feelings toward AI | Lucy Farey-Jones

53,268 views

2020-04-14 ใƒป TED


์•„๋ž˜ ์˜๋ฌธ์ž๋ง‰์„ ๋”๋ธ”ํด๋ฆญํ•˜์‹œ๋ฉด ์˜์ƒ์ด ์žฌ์ƒ๋ฉ๋‹ˆ๋‹ค.

Translator: Ji Seon Lee / Reviewer: Jihyeon J. Kim
์ œ๊ฐ€ ์—ฌ๊ธฐ ์žˆ๋Š” ์ด์œ ๋Š” ๊ฒฐ๊ตญ ๋ˆ„๊ฐ€ ์ด๊ธฐ๋Š”์ง€
๋งŽ์€ ์‹œ๊ฐ„ ๋™์•ˆ ๊ฑฑ์ •ํ•˜๊ณ  ๊ถ๊ธˆํ–ˆ๊ธฐ ๋•Œ๋ฌธ์ž…๋‹ˆ๋‹ค.
์ธ๊ฐ„๊ณผ ๋กœ๋ด‡ ์ค‘ ๋ˆ„๊ฐ€ ์ด๊ธธ๊นŒ์š”?
00:12
I'm here, because I've spent far too many nights lying awake,
0
12802
4571
๊ธฐ์ˆ  ์ „๋žต๊ฐ€๋กœ์„œ
์ œ ์ง์—…์€ ํ–‰๋™ ๋ณ€ํ™”์™€ ๊ด€๋ จ ์žˆ์Šต๋‹ˆ๋‹ค.
00:17
worrying and wondering who wins in the end.
1
17397
2486
์‚ฌ๋žŒ๋“ค์ด ์‹ ๊ธฐ์ˆ ์„ ์“ฐ๋Š” ์ด์œ ์™€ ๋ฐฉ๋ฒ•์„ ์ดํ•ดํ•ด์•ผ ํ•ฉ๋‹ˆ๋‹ค.
00:19
Is it humans or is it robots?
2
19907
2450
์ด ๋ชจ๋“  ๊ฒƒ์ด ์–ด๋–ป๊ฒŒ ๋๋‚ ์ง€๋ฅผ ๋ณด๋ฉฐ
00:22
You see, as a technology strategist,
3
22918
1723
00:24
my job involves behavior change:
4
24665
1904
์‚ด๊ธธ ์›ํ•˜์ง€ ์•Š๋Š”๋‹ค๋Š” ์‚ฌ์‹ค์ด ์ •๋ง ์ ˆ๋ง์Šค๋Ÿฝ์Šต๋‹ˆ๋‹ค.
00:26
understanding why and how people adopt new technologies.
5
26593
3096
์‚ฌ์‹ค, ์ด๊ฑธ ๋ณด๊ณ  ์žˆ๋Š” ์ตœ์—ฐ์†Œ์ž๊ฐ€ 14์„ธ
00:30
And that means I'm really frustrated
6
30101
2373
์ตœ๊ณ ๋ น์ž๊ฐ€ ํŒ”ํŒ”ํ•œ 99์„ธ๋ผ๋ฉด
00:32
that I know I won't live to see how this all ends up.
7
32498
3460
์šฐ๋ฆฌ์˜ ์ง‘๋‹จ์˜์‹์€ ๋‹จ 185๋…„์— ๊ฑธ์ณ ์ด์–ด์ง‘๋‹ˆ๋‹ค.
00:36
And in fact, if the youngest person watching this is 14
8
36577
3273
00:39
and the oldest, a robust 99,
9
39874
2813
์ด ์„ธ์ƒ์—์„œ์˜ ์ง„ํ™”์™€
00:42
then together,
10
42711
1152
00:43
our collective consciousnesses span just 185 years.
11
43887
4174
์ƒํ™ฉ์— ๋Œ€ํ•ด ์ƒ๊ฐํ•˜๋ฉด
๊ทผ์‹œ์•ˆ์ ์ด๊ณ  ์„ฑ๊ฐ€์‹  ์‹œ๊ฐ„์ž…๋‹ˆ๋‹ค.
์šฐ๋ฆฌ ๋ชจ๋‘์˜ ๊ฒฌํ•ด๋กœ ๋ดค์„ ๋•Œ
00:48
That is a myopic pinprick of time
12
48593
3366
์ƒํ™ฉ์ด ์–ด๋–ป๊ฒŒ ์ง„ํ–‰ ๋˜๋Š”์ง€ ์ง€์ผœ๋ณด๋ฉฐ ์‚ฌ๋Š” ์‚ฌ๋žŒ์€ ์—†์„ ๊ฒƒ์ž…๋‹ˆ๋‹ค.
00:51
when you think of the evolution and the story of life on this planet.
13
51983
3492
์ €ํฌ ํšŒ์‚ฌ์—์„œ๋Š” ์ด ๋ฌธ์ œ๋ฅผ ๋ฐ”๋กœ์žก๊ณ  ์‹ถ์Šต๋‹ˆ๋‹ค.
์šฐ๋ฆฌ๋Š” ๊ณ ์ •๋œ ์ผ์‹œ์ ์ธ ์‹œ์ ์„ ๋„˜์–ด
00:55
Turns out we're all in the cheap seats
14
55912
1825
00:57
and none of us will live to see how it all pans out.
15
57761
2992
์บ”ํ‹ธ๋ ˆ๋ฒ„๊ฐ€ ์–ด๋–ป๊ฒŒ ํ•ด๊ฒฐํ•˜๋Š”์ง€
01:00
So at my company, we wanted a way around this.
16
60777
2348
ํ™•์ธํ•˜๊ณ  ์‹ถ์—ˆ์Šต๋‹ˆ๋‹ค.
๊ทธ๋ฆฌ๊ณ  ์šฐ๋ฆฌ๋Š” 1,200๋ช…์˜ ๋ฏธ๊ตญ์ธ๋“ค ์ค‘์—์„œ
01:03
We wanted to see if there was a way to cantilever out,
17
63149
2738
01:05
beyond our fixed temporal vantage point,
18
65911
3008
๋ฏธ๊ตญ ์ธ๊ตฌ์กฐ์‚ฌ๋ฅผ ๋Œ€ํ‘œํ•˜๋Š”
01:08
to get a sense of how it all shakes up.
19
68943
2270
์ˆ˜๋งŽ์€ ์‚ฌ๊ณ ๋ฐฉ์‹ ์งˆ๋ฌธ์„
01:11
And to do this, we conducted a study amongst 1,200 Americans
20
71237
3977
๋กœ๋ด‡๊ณตํ•™๊ณผ AI๋กœ ๋ฌผ์–ด๋ดค๊ณ 
๊ธฐ์ˆ  ์ฑ„ํƒ๊ณผ ๊ด€๋ จ๋œ ํ–‰๋™์ด ์žกํ˜”์Šต๋‹ˆ๋‹ค.
01:15
representative of the US census,
21
75238
2150
01:17
in which we asked a battery of attitudinal questions
22
77412
2524
๋Œ€๋Œ€์ ์ธ ์—ฐ๊ตฌ ์ง„ํ–‰์„ ํ†ตํ•ด
์„ฑ๋ณ„๊ณผ ์„ธ๋Œ€ ์ฐจ์ด
01:19
around robotics and AI
23
79960
1801
01:21
and also captured behavioral ones around technology adoption.
24
81785
3373
๊ทธ๋ฆฌ๊ณ  ์ข…๊ต์™€ ์ •์น˜์  ๋ฏฟ์Œ
์‹ฌ์ง€์–ด ์—…๋ฌด ๊ธฐ๋Šฅ๊ณผ ์„ฑ๊ฒฉ ํŠน์„ฑ์˜ ์ฐจ์ด๋„ ๋ถ„์„ํ–ˆ์Šต๋‹ˆ๋‹ค.
01:26
We had a big study
25
86011
1285
๋กœ๋ด‡ ๊ณตํ•™ ์‹œ๋Œ€๊ฐ€ ์—ด๋ฆฌ๊ธฐ ์ „
01:27
so that we could analyze differences in gender and generations,
26
87320
2977
01:30
between religious and political beliefs,
27
90321
1968
์ธ๊ฐ„์˜ ๋‚˜์•ฝํ•จ์„ ๋ณด์—ฌ์ฃผ๋Š”
์‹œ๊ฐ„์ œํ•œ์ด ์žˆ๋Š” ๋งค๋ ฅ์ ์ธ ํƒ€์ž„์บก์Š์ž…๋‹ˆ๋‹ค.
01:32
even job function and personality trait.
28
92313
2769
๊ทธ๋ฆฌ๊ณ  5๋ถ„๊ฐ„ ์—ฌ๋Ÿฌ๋ถ„๊ป˜ ๋“œ๋ฆด ๋ง์”€์ด ์žˆ์Šต๋‹ˆ๋‹ค.
01:35
It is a fascinating, time-bound time capsule
29
95106
3503
๋จผ์ € ์•Œ์•„์•ผ ํ•  ๊ฒƒ์€ ํ˜„์žฌ ๋ฐ ์ž ์žฌ์ ์ธ AI ๋กœ๋ด‡ ๊ณตํ•™
01:38
of our human frailty
30
98633
1666
01:40
in this predawn of the robotic era.
31
100323
2586
์‹œ๋‚˜๋ฆฌ์˜ค ๋ชฉ๋ก์„ ๋ธŒ๋ ˆ์ธ์Šคํ† ๋ฐํ–ˆ๋‹ค๋Š” ๊ฒƒ์ž…๋‹ˆ๋‹ค.
01:43
And I have five minutes to tell you about it.
32
103220
2278
01:46
The first thing you should know is that we brainstormed
33
106260
2579
ํ‰๋ฒ”ํ•œ ์‚ฌ๋žŒ๋“ค์˜ ์ŠคํŽ™ํŠธ๋Ÿผ์„ ์‚ดํŽด๋ดค๋Š”๋ฐ
01:48
a list of scenarios of current and potential AI robotics.
34
108863
6227
๋กœ๋ด‡ ์ฒญ์†Œ๊ธฐ ์žˆ์œผ์‹ ๊ฐ€์š”?
์กฐ๊ธˆ ์ง–๊ถ‚์€ ๊ฒƒ๋“ค์„ ์‚ดํŽด๋ณผ๊นŒ์š”
๋กœ๋ด‡ ํŽซ์‹œํ„ฐ, ํ˜น์€ ๋กœ๋ด‡ ๋ณ€ํ˜ธ์‚ฌ
01:55
They ran the spectrum from the mundane,
35
115693
2542
์„น์Šค ํŒŒํŠธ๋„ˆ๋Š” ์–ด๋– ์‹ ๊ฐ€์š”?
01:58
so, a robot house cleaner, anyone?
36
118259
1976
์„ฌ๋œฉํ•œ ์‚ฌ์ด๋ณด๊ทธ๊ฐ€ ๋˜์–ด
02:00
Through to the mischievous,
37
120259
1302
02:01
the idea of a robot pet sitter, or maybe a robot lawyer,
38
121585
3095
์ธ๊ฐ„๊ณผ ๋กœ๋ด‡์„ ๋’ค์„ž๊ณ 
๋‡Œ๋ฅผ ์—…๋กœ๋“œํ•ด ์‚ฌํ›„์—๋„ ์‚ฐ๋‹ค๋Š” ์ƒ๊ฐ์€ ์–ด๋–ค๊ฐ€์š”?
02:04
or maybe a sex partner.
39
124704
1533
02:06
Through to the downright macabre, the idea of being a cyborg,
40
126553
2921
์ €ํฌ๋Š” ์ด๋Ÿฌํ•œ ๋‹ค์–‘ํ•œ ์‹œ๋‚˜๋ฆฌ์˜ค๋กœ
์‚ฌ๋žŒ๋“ค์˜ ํŽธ์•ˆํ•จ ์ˆ˜์ค€์„ ๋„ํ‘œํ™” ํ–ˆ์Šต๋‹ˆ๋‹ค.
02:09
blending human and robot,
41
129498
1293
02:10
or uploading your brain so it could live on after your death.
42
130815
3881
์‚ฌ์‹ค์ƒ 31๊ฐœ์˜ ์—ฐ๊ตฌ๊ฐ€ ์žˆ์—ˆ์ง€๋งŒ
ํŽธ์˜๋ฅผ ์œ„ํ•ด ๋ช‡ ๊ฐœ๋งŒ ๋ณด์—ฌ๋“œ๋ฆฌ๊ฒ ์Šต๋‹ˆ๋‹ค
02:15
And we plotted people's comfort levels with these various scenarios.
43
135045
3897
๋ฌผ๋ก  ๊ฐ€์žฅ ๋จผ์ € ๋ˆˆ์— ๋„๋Š” ๊ฒƒ์€ ๋ถ‰์€ ๋ถ€๋ถ„์ด์ฃ .
02:18
There were actually 31 in the study,
44
138966
1730
๋ฏธ๊ตญ์€ ์ด๊ฒƒ์„ ๊ต‰์žฅํžˆ ๋ถˆํŽธํ•ด ํ•ฉ๋‹ˆ๋‹ค.
02:20
but for ease, I'm going to show you just a few of them here.
45
140720
3214
์พŒ์ ์ง€์ˆ˜๊ฐ€ ์•„๋‹Œ ๋ถˆ์พŒ์ง€์ˆ˜๋ผ๊ณ  ๋ถ€๋ฅด๋Š” ์ด์œ  ์ค‘ ํ•˜๋‚˜์ฃ .
02:24
The first thing you'll notice, of course, is the sea of red.
46
144355
2833
๋ฏธ๊ตญ์ธ ๋Œ€๋‹ค์ˆ˜๊ฐ€ ์ฐฌ์„ฑํ•˜๋Š” ๊ฒƒ์€ ๋‹จ ๋‘๊ฐ€์ง€์˜€์Šต๋‹ˆ๋‹ค.
02:27
America is very uncomfortable with this stuff.
47
147212
2809
02:30
That's why we call it the discomfort index,
48
150901
2835
๋กœ๋ด‡ AI ๊ฐ€์ • ์ฒญ์†Œ๊ธฐ์™€
02:33
not the comfort index.
49
153760
1420
๋กœ๋ด‡ AI ํƒ๋ฐฐ ๋ฐฐ๋‹ฌ์›์ž…๋‹ˆ๋‹ค.
02:35
There were only two things the majority of America is OK with.
50
155204
3691
๋‹ค์ด์Šจ๊ณผ ์•„๋งˆ์กด์€ ์ฐธ๊ณ ํ•ด์•ผ๊ฒ ๊ตฐ์š”.
02:38
And that's the idea of a robot AI house cleaner
51
158919
2874
๋‘ ๊ธฐ์—…์€ ๊ธฐํšŒ๊ฐ€ ๋˜๊ฒ ๋„ค์š”.
์šฐ๋ฆฌ๋Š” ๋กœ๋ด‡ ์นœ๊ตฌ๋“ค์—๊ฒŒ ์šฐ๋ฆฌ์˜ ์žก์ผ์„ ๋งก๊ธธ ์ค€๋น„๊ฐ€ ๋œ ๊ฒƒ ๊ฐ™์Šต๋‹ˆ๋‹ค.
02:41
and a robot AI package deliverer,
52
161817
1992
02:43
so Dyson and Amazon, you guys should talk.
53
163833
2626
๋กœ๋ด‡ AI ๋ณ€ํ˜ธ์‚ฌ ํ˜น์€ ํˆฌ์ž ์ž๋ฌธ์—ญ๊ณผ ๊ฐ™์€
02:46
There's an opportunity there.
54
166848
1548
์šฉ์—ญ์ด ์ƒ๊ธด๋‹ค๋ฉด ์šฐ๋ฆฌ๋Š” ๋ถ„๋ช… ์• ๋งคํ•œ ํƒœ๋„๋ฅผ ์ทจํ•  ๊ฒƒ์ž…๋‹ˆ๋‹ค.
02:48
It seems we're ready to off-load our chores to our robot friends.
55
168420
3960
๊ทธ๋Ÿฌ๋‚˜ ๊ฐ„ํ˜ธ์‚ฌ, ์˜์‚ฌ, ๋ณด์œก์›๊ณผ ๊ด€๋ จ๋œ
02:52
We're kind of definitely on the fence when it comes to services,
56
172404
3015
๋กœ๋ด‡ ๋Œ๋ด„์˜ ๊ฐœ๋…์€ ํ™•์‹คํžˆ ํ์‡„์ ์ž…๋‹ˆ๋‹ค.
02:55
so robot AI lawyer or a financial adviser, maybe.
57
175443
3770
์ด๋Ÿฌํ•œ ์ƒํ™ฉ์„ ๋ณด๊ณ  ์‚ฌ๋žŒ๋“ค์€ ๋งํ•ฉ๋‹ˆ๋‹ค.
โ€œ๊ดœ์ฐฎ์•„, ๋ฃจ์‹œ, ๊ทธ๊ฑฐ ์•Œ์•„?
๊ฑฑ์ • ๊ทธ๋งŒํ•˜๊ณ  ์ž ์ด๋‚˜ ์ž, ๊ฒฐ๊ตญ ์ธ๊ฐ„์ด ์ด๊ธธ ๊ฑฐ๋‹ˆ๊นŒ.โ€
02:59
But we're firmly closed to the idea of robot care,
58
179237
2785
ํ•˜์ง€๋งŒ ๋จธ์ง€์•Š์•˜์Šต๋‹ˆ๋‹ค.
03:02
whether it be a nurse, a doctor, child care.
59
182046
2439
์ œ ์ž๋ฃŒ๋ฅผ ์ž์„ธํžˆ ๋ณด์‹œ๋ฉด
03:04
So from this, you'd go,
60
184509
1353
์ธ๊ฐ„์€ ์ƒ๊ฐ๋ณด๋‹ค ์ทจ์•ฝํ•˜๋‹ค๋Š” ๊ฑธ ์•Œ ์ˆ˜ ์žˆ์Šต๋‹ˆ๋‹ค.
03:05
"It's OK, Lucy, you know what?
61
185886
1453
03:07
Go back to sleep, stop worrying, the humans win in the end."
62
187363
3055
AI๋Š” ๋ธŒ๋žœ๋”ฉ ๋ฌธ์ œ๊ฐ€ ์žˆ์Šต๋‹ˆ๋‹ค.
๊ฐœ์ธ ๋น„์„œ์˜ ์ƒ๊ฐ์„
03:10
But actually not so fast.
63
190442
1222
์ ˆ๋Œ€์ ์œผ๋กœ ๊ฑฐ๋ถ€ํ•˜๊ฒ ๋‹ค๊ณ  ๋งํ•œ ์‚ฌ๋žŒ๋“ค ์ค‘
03:11
If you look at my data very closely,
64
191688
1722
03:13
you can see we're more vulnerable than we think.
65
193434
2246
45%๋Š” ์‚ฌ์‹ค ์ฃผ๋จธ๋‹ˆ ์†์—
03:15
AI has a branding problem.
66
195704
1285
์•Œ๋ ‰์‚ฌ, ๊ตฌ๊ธ€, ์‹œ๋ฆฌ์™€ ๊ฐ™์€ ๊ธฐ๊ธฐ๋ฅผ ๊ฐ€์ง€๊ณ  ์žˆ์—ˆ์Šต๋‹ˆ๋‹ค.
03:17
So of those folks who said
67
197013
2157
03:19
that they would absolutely reject the idea of a personal assistant,
68
199194
3143
AI ์ค‘๋งค๋ฅผ ๋ฐ˜๋Œ€ํ•˜๋Š” ๋‹ค์„ฏ ๋ช… ์ค‘ ํ•œ ๋ช…์€
03:22
45 percent of them had, in fact, one in their pockets,
69
202361
2886
์˜จ๋ผ์ธ ๋ฐ์ดํŠธ๋ฅผ ํ•œ ์ ์ด ์žˆ์„ ๊ฒ๋‹ˆ๋‹ค.
03:25
in terms of a device with Alexa, Google or Siri.
70
205271
3603
๋˜ํ•œ, ์ €ํฌ ์ค‘ 80%๋Š” ํŒŒ์ผ๋Ÿฟ์˜ ์ง€์›์œผ๋กœ
03:28
One in five of those who were against the idea of AI matchmaking
71
208898
3072
์ž์œจ ๋น„ํ–‰๊ธฐ์— ํƒ‘์Šนํ•˜๋Š” ์•„์ด๋””์–ด๋ฅผ ๊ฑฐ์ ˆํ–ˆ์Šต๋‹ˆ๋‹ค.
์‹ค์ œ๋กœ, ์ œ๊ฐ€ ์ง์ ‘ ๊ฒฝํ—˜ํ–ˆ์Šต๋‹ˆ๋‹ค. ๋ฐด์ฟ ๋ฒ„์— ์˜ค๊ธฐ ์œ„ํ•ด
03:31
had of course, you guessed it, done online dating.
72
211994
2655
์ƒ์—… ๋น„ํ–‰์„ ํ–ˆ์Šต๋‹ˆ๋‹ค.
03:34
And 80 percent of those of us who refuse the idea
73
214673
2289
๋ชจ๋‘๊ฐ€ ๋ฌด์„œ์›Œํ• ๊นŒ ๋ด ๊ฑฑ์ •๋˜์‹œ๊ฒ ์ง€๋งŒ
03:36
of boarding an autonomous plane with a pilot backup
74
216986
2456
๊ฐ€์šด๋ฐ์— ์žˆ๋Š” ๋ฉ‹์ง„ ์‚ฌ๋žŒ๋“ค๋„ ์žˆ์Šต๋‹ˆ๋‹ค.
์ด๋“ค์€ ์ค‘๋ฆฝ์ž…๋‹ˆ๋‹ค.
03:39
had in fact, just like me to get here to Vancouver,
75
219466
2555
์—ฌ๋Ÿฌ๋ถ„์˜ ๋ง์— ์ด๋Ÿฐ ๋ฐ˜์‘์„ ํ•  ์‚ฌ๋žŒ๋“ค์ž…๋‹ˆ๋‹ค.
โ€œ๋กœ๋ด‡ ์นœ๊ตฌ๋ž˜.โ€
03:42
flown commercial.
76
222045
1167
03:43
Lest you think everybody was scared, though,
77
223236
2055
๊ทธ๋Ÿฌ๋ฉด ์ด๋“ค์€ โ€œ๋กœ๋ด‡ ์นœ๊ตฌ๋ผ, ๊ทธ๋Ÿด์ง€๋„ ๋ชจ๋ฅด์ง€.โ€
03:45
here are the marvelous folk in the middle.
78
225315
2135
ํ˜น์€, โ€œAI ๋ฐ˜๋ ค๋™๋ฌผ์ด๋ž˜.โ€
03:47
These are the neutrals.
79
227474
1247
โ€œ์ ˆ๋Œ€ ์•ˆ ๋œ๋‹ค๊ณ  ํ•˜์ง€๋งˆ.โ€
03:48
These are people for whom you say,
80
228745
1642
03:50
"OK, robot friend,"
81
230411
1269
์ œ๋Œ€๋กœ ๋œ ์ •์น˜ ์ •๋ณด์›์ด๋ผ๋ฉด ์•Œ๊ฒ ์ง€๋งŒ
03:51
and they're like, "Hm, robot friend. Maybe."
82
231704
2935
์–‘๋ฉด์ ์ธ ์˜๊ฒฌ์„ ๋’ค์ง‘๋Š” ๊ฒƒ์€ ํŒ๋„๋ฅผ ๋ฐ”๊ฟ€ ์ˆ˜ ์žˆ์Šต๋‹ˆ๋‹ค.
03:54
Or, "AI pet,"
83
234663
1621
03:56
and they go, "Never say never."
84
236308
2062
๋˜ ๋‹ค๋ฅธ ์ด์œ ๋กœ ๋‚จ์„ฑ์€ ์ทจ์•ฝํ•œ ๋ถ€๋ถ„์ด ์žˆ์ฃ .
03:58
And as any decent political operative knows,
85
238843
2666
์ฃ„์†กํ•˜์ง€๋งŒ, ๋‚จ์„ฑ์€ ๋ฌด์ธ์ž๋™์ฐจ ํƒ‘์Šน๊ณผ
04:01
flipping the ambivalent middle can change the game.
86
241533
2722
ํ›„์„ธ๋ฅผ ์œ„ํ•ด ๋‡Œ๋ฅผ ์—…๋กœ๋”ฉํ•˜๋Š” ๊ฒƒ์ด ์žฌ๋ฐŒ๋‹ค๊ณ 
04:04
Another reason I know we're vulnerable is men --
87
244644
2319
๋ฏฟ์„ ๊ฐ€๋Šฅ์„ฑ์ด ์—ฌ์„ฑ๋ณด๋‹ค ๋‘ ๋ฐฐ ๋†’์Šต๋‹ˆ๋‹ค.
04:06
I'm sorry, but men, you are twice as likely than women
88
246987
2576
๋˜ํ•œ, ์‚ฌ์ด๋ณด๊ทธ๊ฐ€ ๋˜๋Š” ๊ฒƒ์ด ๋ฉ‹์ง€๋‹ค ๋ฏฟ๋Š” ๋น„์œจ์€ 2.5๋ฐฐ ๋” ๋†’์•˜๊ณ 
04:09
to believe that getting into an autonomous car is a good idea,
89
249587
3411
์ €๋Š” ์ด ์ƒํ™ฉ์— ๋Œ€ํ•ด ํ• ๋ฆฌ์šฐ๋“œ๋ฅผ ๋น„๋‚œํ•ฉ๋‹ˆ๋‹ค.
(์›ƒ์Œ)
04:13
that uploading your brain for posterity is fun,
90
253022
2817
๊ทน์žฅ์„ ๋‘˜๋Ÿฌ ๋ณด๋ฉด
04:15
and two and a half times more likely to believe that becoming a cyborg is cool,
91
255863
3722
๋‚จ์„ฑ ๋„ค ๋ช… ์ค‘์— ํ•œ ๋ช… ๊ผด๋กœ ๋กœ๋ด‡๊ณผ ์„ฑ๊ด€๊ณ„ ๋งบ๋Š” ๊ฒƒ์— ํ˜ธ์˜์ ์ž…๋‹ˆ๋‹ค.
04:19
and for this, I blame Hollywood.
92
259609
1679
์ด๋Š” ์—ฌ์„ฑ ์—ด ๋ช… ์ค‘ ํ•œ ๋ช…์— ๋ถˆ๊ณผํ–ˆ๋˜
04:21
(Laughter)
93
261312
1296
04:22
And this is where I want you to look around the theater
94
262632
2596
๋ฐ€๋ ˆ๋‹ˆ์–ผ ๋‚จ์„ฑ๋“ค์˜ 44%์— ๋‹ฌํ•˜๋Š” ์ˆ˜์น˜์ž…๋‹ˆ๋‹ค.
์ €๋Š” ์ด๊ฒƒ์ด ๊ธฐ๊ณ„์ ์ธ ์„ฑ๊ด€๊ณ„ ๋ถˆํ‰์— ๋ฐ˜์ „์„ ๊ฐ€์ ธ์˜จ๋‹ค ์ƒ๊ฐํ•ฉ๋‹ˆ๋‹ค.
04:25
and know that one in four men are OK with the idea of sex with a robot.
95
265252
3357
04:28
That goes up to 44 percent of millennial men
96
268633
3137
(์›ƒ์Œ)
์‹ฌ์ง€์–ด ๋” ๋†€๋ผ์šด ์ ์€
04:31
compared to just one in 10 women,
97
271794
1594
์ด ํ–‰๋™ ์ฐจ์ด์ž…๋‹ˆ๋‹ค
04:33
which I think puts a whole new twist on the complaint of mechanical sex.
98
273412
3389
์ด ํ‘œ๋Š” ์Šค๋งˆํŠธ ์Šคํ”ผ์ปค ํ™ˆ ํ—ˆ๋ธŒ ํ˜น์€ ์Šค๋งˆํŠธํฐ ๊ฐ™์€
04:36
(Laughter)
99
276825
1968
04:38
Even more astounding than that though, to be honest,
100
278817
2429
์Œ์„ฑ ์ง€์› ์žฅ์น˜๋ฅผ ์‚ฌ์šฉํ•˜๋Š” ์‚ฌ๋žŒ๊ณผ
04:41
is this behavioral difference.
101
281270
1570
๊ทธ๋ ‡์ง€ ์•Š์€ ์‚ฌ๋žŒ๋“ค์„ ๋‚˜ํƒ€๋ƒ…๋‹ˆ๋‹ค.
04:42
So here we have people who have a device with a voice assistant in it,
102
282864
4213
๊ทธ๋ž˜ํ”„์—์„œ ๋ณผ ์ˆ˜ ์žˆ๋“ฏ์ด
ํŠธ๋กœ์ด ๋ชฉ๋งˆ๋Š” ์ด๋ฏธ ์šฐ๋ฆฌ ๊ฑฐ์‹ค์— ์žˆ์Šต๋‹ˆ๋‹ค.
04:47
so a smart speaker, a home hub or a smart phone,
103
287101
2746
์ด๋Ÿฌํ•œ ์žฅ์น˜๋“ค์ด ํ™•์‚ฐํ•˜๊ณ 
04:49
versus those who don't.
104
289871
1722
์ง‘๋‹จ ๋ฐฉ์–ด๊ฐ€ ์•ฝํ•ด์ง€๋ฉด
04:51
And you can see from this graph
105
291617
1500
04:53
that the Trojan horse is already in our living room.
106
293141
3723
์šฐ๋ฆฌ๋Š” ์–ด๋–ป๊ฒŒ ๋๋‚  ์ˆ˜ ์žˆ๋Š”์ง€ ์•Œ ์ˆ˜ ์žˆ์Šต๋‹ˆ๋‹ค.
์‚ฌ์‹ค, ์ €๋Š” ํœด๊ฐ€ ๋•Œ ์•Œ๋ ‰์‚ฌ ๋‹ท์„ ๊ฐ€์ ธ๊ฐ”์Šต๋‹ˆ๋‹ค.
04:56
And as these devices proliferate
107
296888
2087
04:58
and our collective defenses soften,
108
298999
3217
๋งˆ์ง€๋ง‰์œผ๋กœ ์ œ๊ฐ€ ์ฐพ์„ ์ˆ˜ ์žˆ๋Š” ๊ฒƒ์€ ์„ธ๋Œ€์ ์ธ ๊ฒƒ์ž…๋‹ˆ๋‹ค.
05:02
we all see how it can end.
109
302240
1997
3์„ธ๋Œ€์˜ ์ฐจ์ด๋ฅผ ๋ณด์„ธ์š”
05:04
In fact, this may be as good a time as any to admit
110
304261
2389
05:06
I did take my Alexa Dot on vacation with me.
111
306674
2406
๋ฌดํ‘œ์ธ ์‚ฌ๋žŒ๋“ค์—์„œ ๋ฒ ์ด๋น„๋ถ€๋จธ์™€
๋ฐ€๋ ˆ๋‹ˆ์–ด ์„ธ๋Œ€๋กœ์˜ ๋„์•ฝ์ž…๋‹ˆ๋‹ค.
๋ณด๋‹ค ํฅ๋ฏธ๋กœ์šด ์ ์€ ์—ฌ๋Ÿฌ๋ถ„์ด ์ด ๊ฐ™์€ ๋ณ€ํ™” ์†๋„๋ฅผ ์ถ”์ •ํ•œ๋‹ค๋ฉด,
05:10
Final finding I have time for is generational.
112
310192
2151
05:12
So look at the difference just three generations make.
113
312367
2555
05:14
This is the leap from silent to boomer to millennial.
114
314946
3155
์ œ๊ฐ€ ์‹ค์ œ๋กœ ๋ฏฟ๋Š” ๊ฐ€์†๋„๊ฐ€ ์•„๋‹Œ ๊ฐ™์€ ์†๋„์ž…๋‹ˆ๋‹ค.
05:18
And what's more fascinating than this is if you extrapolate this out,
115
318125
3245
๊ทธ๋Ÿฌ๋ฉด ๋ชจ๋“  ๋ฏธ๊ตญ์ธ๋“ค์ด ์—ฌ๊ธฐ ์žˆ๋Š” ๋‹ค์ˆ˜์˜ ๊ฒƒ๋“ค์ด
05:21
the same rate of change,
116
321394
1215
์ •์ƒ์ด๋ผ๊ณ  ์ƒ๊ฐํ•  ๋•Œ 8์„ธ๋Œ€๋‚˜ ๊ฑธ๋ฆด ๊ฒƒ์ž…๋‹ˆ๋‹ค.
05:22
just the same pace,
117
322633
1190
05:23
not the accelerated one I actually believe will be the case,
118
323847
2835
05:26
the same pace,
119
326706
1175
2222๋…„์—๋Š” ์ด๊ณณ์˜ ๋ชจ๋“  ๊ฒƒ์ด
05:27
then it is eight generations away
120
327905
1697
05:29
when we hear every single American
121
329626
2340
์ฃผ๋ฅ˜๊ฐ€ ๋˜๋Š” ๊ฒฝ์•…์Šค๋Ÿฌ์šด ๊ณณ์ž…๋‹ˆ๋‹ค.
05:31
thinking the majority of these things here are normal.
122
331990
3381
๋”์ด์ƒ ์„ค๋“์ด ํ•„์š”ํ•˜์ง€ ์•Š๋„๋ก
์„ธ๋Œ€๋ณ„ โ€œAI์— ๋Œ€ํ•œ ํฅ๋ฏธ๋„โ€๋ฅผ ๋ณด์—ฌ๋“œ๋ฆฌ๊ฒ ์Šต๋‹ˆ๋‹ค.
05:35
So the year 2222 is an astounding place
123
335395
3039
๋‹น์—ฐํ•˜๊ฒŒ๋„, ์šฐ๋ฆฌ ๋ง‰๋‚ด๊ฐ€ ์ œ์ผ ์ข‹์•„ํ•˜๋„ค์š”.
05:38
where everything here is mainstream.
124
338458
2294
05:40
And lest you needed any more convincing,
125
340776
1945
ํ•˜์ง€๋งŒ ์•„๋งˆ๋„ ์ œ ๊ฒฝ๋ ฅ ์ค‘์—์„œ ๊ฐ€์žฅ ์—ญ์„ค์ ์ธ ๋ฐœ๊ฒฌ์€
05:42
here is the generation's "excitement level with AI."
126
342745
2808
์ด ์‚ฌ๋žŒ๋“ค์—๊ฒŒ 3am ์งˆ๋ฌธ์„ ํ–ˆ์„ ๋•Œ์ผ ๊ฒ๋‹ˆ๋‹ค.
05:45
So not surprisingly, the youngest of us are more excited.
127
345577
3340
" ๊ฒฐ๊ตญ์— ๋ˆ„๊ฐ€ ์ด๊ธธ๊นŒ์š”?โ€
05:49
But, and possibly the most paradoxical finding of my career,
128
349331
3976
๋งž์ถฐ๋ณด์„ธ์š”.
AI์™€ ๋กœ๋ด‡๊ณตํ•™์— ๋” ๋งŽ์€ ํฅ๋ฏธ๋ฅผ ๋Š๋‚„์ˆ˜๋ก
05:53
when I asked these people my 3am question,
129
353331
3000
๋กœ๋ด‡์ด ์ด๊ธธ ๊ฑฐ๋ผ ๋‹ต๋ณ€ํ•ฉ๋‹ˆ๋‹ค.
05:56
"Who wins in the end?"
130
356355
1467
์‹ ๊ฒฝ๋ง์„ ์ž‘๋™์‹œํ‚ค๋Š” ํŒจํ„ด ์ธ์‹ ์†Œํ”„ํŠธ์›จ์–ด๊ฐ€
05:58
Guess what.
131
358188
1150
05:59
The more excited you are about AI and robotics,
132
359736
2238
๋ชจ๋“  ๊ฒƒ์„ ์•Œ ํ•„์š”๋Š” ์—†์Šต๋‹ˆ๋‹ค.
06:01
the more likely you are to say it's the robots.
133
361998
2685
โ€˜๋“๋Š” ๋ฌผ์— ๋น ์ง„ ๊ฐœ๊ตฌ๋ฆฌโ€™๋ผ๋Š” ์†๋‹ด์ด ์žˆ์ฃ .
2222๋…„ ํ…Œ๋“œ์—์„œ ๋กœ๋ด‡์ด ํ›„์„ธ๋ฅผ ์œ„ํ•ด ๋ณด๊ณ  ์žˆ๋‹ค๋ฉด
06:05
And I don't think we need a neural net running pattern-recognition software
134
365347
3532
06:08
to see where this is all headed.
135
368903
1705
06:10
We are the proverbial frogs in boiling water.
136
370632
2705
์‚ฌ์ด๋ณด๊ทธ๋ฅผ ๋ณด๋‚ด์„œ, ์ €๋ฅผ ํŒŒํ—ค์นœ ํ›„์—
์ œ๊ฐ€ ๋งž์•˜๋‹ค๊ณ  ๋งํ•ด ์ค„๋ž˜์š”?
06:13
So if the robots at TED2222 are watching this for posterity,
137
373361
5153
(์›ƒ์Œ)
๊ฐ์‚ฌํ•ฉ๋‹ˆ๋‹ค.
(๋ฐ•์ˆ˜)
06:18
could you send a cyborg, dig me up and tell me if I was right?
138
378538
2976
06:21
(Laughter)
139
381538
1165
06:22
Thank you.
140
382727
1177
06:23
(Applause)
141
383928
1619
์ด ์›น์‚ฌ์ดํŠธ ์ •๋ณด

์ด ์‚ฌ์ดํŠธ๋Š” ์˜์–ด ํ•™์Šต์— ์œ ์šฉํ•œ YouTube ๋™์˜์ƒ์„ ์†Œ๊ฐœํ•ฉ๋‹ˆ๋‹ค. ์ „ ์„ธ๊ณ„ ์ตœ๊ณ ์˜ ์„ ์ƒ๋‹˜๋“ค์ด ๊ฐ€๋ฅด์น˜๋Š” ์˜์–ด ์ˆ˜์—…์„ ๋ณด๊ฒŒ ๋  ๊ฒƒ์ž…๋‹ˆ๋‹ค. ๊ฐ ๋™์˜์ƒ ํŽ˜์ด์ง€์— ํ‘œ์‹œ๋˜๋Š” ์˜์–ด ์ž๋ง‰์„ ๋”๋ธ” ํด๋ฆญํ•˜๋ฉด ๊ทธ๊ณณ์—์„œ ๋™์˜์ƒ์ด ์žฌ์ƒ๋ฉ๋‹ˆ๋‹ค. ๋น„๋””์˜ค ์žฌ์ƒ์— ๋งž์ถฐ ์ž๋ง‰์ด ์Šคํฌ๋กค๋ฉ๋‹ˆ๋‹ค. ์˜๊ฒฌ์ด๋‚˜ ์š”์ฒญ์ด ์žˆ๋Š” ๊ฒฝ์šฐ ์ด ๋ฌธ์˜ ์–‘์‹์„ ์‚ฌ์šฉํ•˜์—ฌ ๋ฌธ์˜ํ•˜์‹ญ์‹œ์˜ค.

https://forms.gle/WvT1wiN1qDtmnspy7