A fascinating time capsule of human feelings toward AI | Lucy Farey-Jones

53,260 views · 2020-04-14

TED



ืชืจื’ื•ื: zeeva livshitz ืขืจื™ื›ื”: Ido Dekkers
00:12
I'm here, because I've spent far too many nights lying awake, worrying and wondering who wins in the end. Is it humans or is it robots?

00:22
You see, as a technology strategist, my job involves behavior change: understanding why and how people adopt new technologies.

00:30
And that means I'm really frustrated that I know I won't live to see how this all ends up. And in fact, if the youngest person watching this is 14 and the oldest, a robust 99, then together, our collective consciousnesses span just 185 years. That is a myopic pinprick of time when you think of the evolution and the story of life on this planet. Turns out we're all in the cheap seats, and none of us will live to see how it all pans out.

01:00
So at my company, we wanted a way around this. We wanted to see if there was a way to cantilever out, beyond our fixed temporal vantage point, to get a sense of how it all shakes up. And to do this, we conducted a study amongst 1,200 Americans representative of the US census, in which we asked a battery of attitudinal questions around robotics and AI and also captured behavioral ones around technology adoption. We had a big study so that we could analyze differences in gender and generations, between religious and political beliefs, even job function and personality trait.

01:35
It is a fascinating, time-bound time capsule of our human frailty in this predawn of the robotic era. And I have five minutes to tell you about it.

01:46
The first thing you should know is that we brainstormed a list of scenarios of current and potential AI robotics. They ran the spectrum from the mundane (so, a robot house cleaner, anyone?) through to the mischievous: the idea of a robot pet sitter, or maybe a robot lawyer, or maybe a sex partner. Through to the downright macabre: the idea of being a cyborg, blending human and robot, or uploading your brain so it could live on after your death. And we plotted people's comfort levels with these various scenarios. There were actually 31 scenarios in the study, but for ease, I'm going to show you just a few of them here.

02:24
The first thing you'll notice, of course, is the sea of red. America is very uncomfortable with this stuff. That's why we call it the discomfort index, not the comfort index. There were only two things the majority of America is OK with, and that's the idea of a robot AI house cleaner and a robot AI package deliverer. So Dyson and Amazon, you guys should talk; there's an opportunity there. It seems we're ready to off-load our chores to our robot friends. We're kind of definitely on the fence when it comes to services, so robot AI lawyer or a financial adviser, maybe. But we're firmly closed to the idea of robot care, whether it be a nurse, a doctor, child care.

03:04
So from this, you'd go, "It's OK, Lucy, you know what? Go back to sleep, stop worrying, the humans win in the end." But actually, not so fast. If you look at my data very closely, you can see we're more vulnerable than we think. AI has a branding problem. So of those folks who said that they would absolutely reject the idea of a personal assistant, 45 percent of them had, in fact, one in their pockets, in terms of a device with Alexa, Google or Siri. One in five of those who were against the idea of AI matchmaking had, of course, you guessed it, done online dating. And 80 percent of those of us who refuse the idea of boarding an autonomous plane with a pilot backup had in fact, just like me to get here to Vancouver, flown commercial.

03:43
Lest you think everybody was scared, though, here are the marvelous folk in the middle. These are the neutrals. These are people for whom you say, "OK, robot friend," and they're like, "Hm, robot friend. Maybe." Or, "AI pet," and they go, "Never say never." And as any decent political operative knows, flipping the ambivalent middle can change the game.

04:04
Another reason I know we're vulnerable is men. I'm sorry, but men, you are twice as likely as women to believe that getting into an autonomous car is a good idea, that uploading your brain for posterity is fun, and two and a half times more likely to believe that becoming a cyborg is cool. And for this, I blame Hollywood.

(Laughter)

04:22
And this is where I want you to look around the theater and know that one in four men are OK with the idea of sex with a robot. That goes up to 44 percent of millennial men, compared to just one in 10 women, which I think puts a whole new twist on the complaint of mechanical sex.

(Laughter)

04:38
Even more astounding than that, though, to be honest, is this behavioral difference. So here we have people who have a device with a voice assistant in it (so a smart speaker, a home hub or a smartphone) versus those who don't. And you can see from this graph that the Trojan horse is already in our living room. And as these devices proliferate and our collective defenses soften, we all see how it can end. In fact, this may be as good a time as any to admit I did take my Alexa Dot on vacation with me.

05:10
The final finding I have time for is generational. So look at the difference just three generations make. This is the leap from silent to boomer to millennial. And what's more fascinating than this is, if you extrapolate this out at the same rate of change, just the same pace, not the accelerated one I actually believe will be the case, then it is eight generations away when we hear every single American thinking the majority of these things here are normal. So the year 2222 is an astounding place where everything here is mainstream. And lest you needed any more convincing, here is the generations' "excitement level with AI." So not surprisingly, the youngest of us are more excited.

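The "eight generations away" claim lands near the year 2222 under simple generational arithmetic. A minimal sketch of that reasoning, assuming a generation spans roughly 25 years and counting forward from the talk's year of 2020 (both figures are my assumptions, not stated in the study):

```python
# Rough check of the talk's "eight generations away" extrapolation.
# Assumptions (mine, not from the study): one generation ~= 25 years,
# counting forward from the talk's year, 2020.
GENERATION_YEARS = 25
TALK_YEAR = 2020
GENERATIONS_AHEAD = 8

target_year = TALK_YEAR + GENERATIONS_AHEAD * GENERATION_YEARS
print(target_year)  # 2220, in the ballpark of the speaker's 2222
```

With these round numbers the extrapolation lands at 2220, consistent with the speaker's 2222.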
05:49
But, and possibly the most paradoxical finding of my career, when I asked these people my 3am question, "Who wins in the end?" guess what: the more excited you are about AI and robotics, the more likely you are to say it's the robots. And I don't think we need a neural net running pattern-recognition software to see where this is all headed. We are the proverbial frogs in boiling water.

06:13
So if the robots at TED2222 are watching this for posterity, could you send a cyborg, dig me up and tell me if I was right?

(Laughter)

Thank you.

(Applause)
ืขืœ ืืชืจ ื–ื”

ืืชืจ ื–ื” ื™ืฆื™ื’ ื‘ืคื ื™ื›ื ืกืจื˜ื•ื ื™ YouTube ื”ืžื•ืขื™ืœื™ื ืœืœื™ืžื•ื“ ืื ื’ืœื™ืช. ืชื•ื›ืœื• ืœืจืื•ืช ืฉื™ืขื•ืจื™ ืื ื’ืœื™ืช ื”ืžื•ืขื‘ืจื™ื ืขืœ ื™ื“ื™ ืžื•ืจื™ื ืžื”ืฉื•ืจื” ื”ืจืืฉื•ื ื” ืžืจื—ื‘ื™ ื”ืขื•ืœื. ืœื—ืฅ ืคืขืžื™ื™ื ืขืœ ื”ื›ืชื•ื‘ื™ื•ืช ื‘ืื ื’ืœื™ืช ื”ืžื•ืฆื’ื•ืช ื‘ื›ืœ ื“ืฃ ื•ื™ื“ืื• ื›ื“ื™ ืœื”ืคืขื™ืœ ืืช ื”ืกืจื˜ื•ืŸ ืžืฉื. ื”ื›ืชื•ื‘ื™ื•ืช ื’ื•ืœืœื•ืช ื‘ืกื ื›ืจื•ืŸ ืขื ื”ืคืขืœืช ื”ื•ื•ื™ื“ืื•. ืื ื™ืฉ ืœืš ื”ืขืจื•ืช ืื• ื‘ืงืฉื•ืช, ืื ื ืฆื•ืจ ืื™ืชื ื• ืงืฉืจ ื‘ืืžืฆืขื•ืช ื˜ื•ืคืก ื™ืฆื™ืจืช ืงืฉืจ ื–ื”.

https://forms.gle/WvT1wiN1qDtmnspy7