What Is an AI Anyway? | Mustafa Suleyman | TED

1,902,222 views ใƒป 2024-04-22

TED


00:04
I want to tell you what I see coming.
00:07
I've been lucky enough to be working on AI for almost 15 years now. Back when I started, to describe it as fringe would be an understatement. Researchers would say, โ€œNo, no, weโ€™re only working on machine learning.โ€ Because working on AI was seen as way too out there.
00:25
In 2010, just the very mention of the phrase โ€œAGI,โ€ artificial general intelligence, would get you some seriously strange looks and even a cold shoulder. "You're actually building AGI?" people would say. "Isn't that something out of science fiction?" People thought it was 50 years away or 100 years away, if it was even possible at all. Talk of AI was, I guess, kind of embarrassing. People generally thought we were weird. And I guess in some ways we kind of were.
00:56
It wasn't long, though, before AI started beating humans at a whole range of tasks that people previously thought were way out of reach. Understanding images, translating languages, transcribing speech, playing Go and chess and even diagnosing diseases.
01:15
People started waking up to the fact that AI was going to have an enormous impact, and they were rightly asking technologists like me some pretty tough questions. Is it true that AI is going to solve the climate crisis? Will it make personalized education available to everyone? Does it mean we'll all get universal basic income and we won't have to work anymore? Should I be afraid? What does it mean for weapons and war? And of course, will China win? Are we in a race? Are we headed for a mass misinformation apocalypse? All good questions.
01:51
But it was actually a simpler and much more kind of fundamental question that left me puzzled. One that actually gets to the very heart of my work every day.
02:03
One morning over breakfast, my six-year-old nephew Caspian was playing with Pi, the AI I created at my last company, Inflection. With a mouthful of scrambled eggs, he looked at me plain in the face and said, "But Mustafa, what is an AI anyway?"
02:21
He's such a sincere and curious and optimistic little guy. He'd been talking to Pi about how cool it would be if one day in the future, he could visit dinosaurs at the zoo. And how he could make infinite amounts of chocolate at home. And why Pi couldnโ€™t yet play I Spy.
02:39
"Well," I said, "it's a clever piece of software that's read most of the text on the open internet, and it can talk to you about anything you want."
02:48
"Right. So like a person then?"
02:54
I was stumped. Genuinely left scratching my head.
03:00
All my boring stock answers came rushing through my mind. "No, but AI is just another general-purpose technology, like printing or steam." "It will be a tool that will augment us and make us smarter and more productive. And when it gets better over time, it'll be like an all-knowing oracle that will help us solve grand scientific challenges."
03:22
You know, all of these responses started to feel, I guess, a little bit defensive. And actually better suited to a policy seminar than breakfast with a no-nonsense six-year-old. "Why am I hesitating?" I thought to myself.
03:37
You know, let's be honest. My nephew was asking me a simple question that those of us in AI just don't confront often enough. What is it that we are actually creating? What does it mean to make something totally new, fundamentally different to any invention that we have known before?
04:00
It is clear that we are at an inflection point in the history of humanity. On our current trajectory, we're headed towards the emergence of something that we are all struggling to describe, and yet we cannot control what we don't understand.
04:19
And so the metaphors, the mental models, the names, these all matter if weโ€™re to get the most out of AI whilst limiting its potential downsides. As someone who embraces the possibilities of this technology, but who's also always cared deeply about its ethics, we should, I think, be able to easily describe what it is we are building. And that includes the six-year-olds.
04:44
So it's in that spirit that I offer up today the following metaphor for helping us to try to grapple with what this moment really is.
04:52
I think AI should best be understood as something like a new digital species.
05:00
Now, don't take this too literally, but I predict that we'll come to see them as digital companions, new partners in the journeys of all our lives. Whether you think weโ€™re on a 10-, 20- or 30-year path here, this is, in my view, the most accurate and most fundamentally honest way of describing what's actually coming. And above all, it enables everybody to prepare for and shape what comes next.
05:29
Now I totally get, this is a strong claim, and I'm going to explain to everyone as best I can why I'm making it. But first, let me just try to set the context.
05:39
From the very first microscopic organisms, life on Earth stretches back billions of years. Over that time, life evolved and diversified. Then a few million years ago, something began to shift. After countless cycles of growth and adaptation, one of lifeโ€™s branches began using tools, and that branch grew into us.
06:04
We went on to produce a mesmerizing variety of tools, at first slowly and then with astonishing speed, we went from stone axes and fire to language, writing and eventually industrial technologies. One invention unleashed a thousand more. And in time, we became homo technologicus.
06:29
Around 80 years ago, another new branch of technology began. With the invention of computers, we quickly jumped from the first mainframes and transistors to today's smartphones and virtual-reality headsets. Information, knowledge, communication, computation. In this revolution, creation has exploded like never before.
06:53
And now a new wave is upon us. Artificial intelligence. These waves of history are clearly speeding up, as each one is amplified and accelerated by the last. And if you look back, it's clear that we are in the fastest and most consequential wave ever.
07:11
The journeys of humanity and technology are now deeply intertwined. In just 18 months, over a billion people have used large language models. We've witnessed one landmark event after another. Just a few years ago, people said that AI would never be creative. And yet AI now feels like an endless river of creativity, making poetry and images and music and video that stretch the imagination. People said it would never be empathetic. And yet today, millions of people enjoy meaningful conversations with AIs, talking about their hopes and dreams and helping them work through difficult emotional challenges. AIs can now drive cars, manage energy grids and even invent new molecules. Just a few years ago, each of these was impossible.
08:03
And all of this is turbocharged by spiraling exponentials of data and computation. Last year, Inflection 2.5, our last model, used five billion times more computation than the DeepMind AI that beat the old-school Atari games just over 10 years ago. That's nine orders of magnitude more computation. 10x per year, every year for almost a decade.
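(A quick back-of-envelope check of those figures, added for clarity; this sketch is not part of the talk.)

import math

# "Five billion times more computation" expressed in orders of magnitude:
factor = 5e9
orders = math.log10(factor)  # log10(5e9) is about 9.7, i.e. roughly nine orders of magnitude
# At 10x growth per year, covering that range takes about log10(5e9) ~ 9.7 years,
# consistent with "10x per year, every year for almost a decade."
print(f"{orders:.1f} orders of magnitude, ~{round(orders)} years at 10x/year")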
08:34
Over the same time, the size of these models has grown from first tens of millions of parameters to then billions of parameters, and very soon, tens of trillions of parameters.
08:45
If someone did nothing but read 24 hours a day for their entire life, they'd consume eight billion words. And of course, that's a lot of words. But today, the most advanced AIs consume more than eight trillion words in a single month of training.
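(The words comparison can be sanity-checked the same way; the reading speed and lifespan below are assumptions for illustration, not figures from the talk.)

# Rough check of the "eight billion words in a lifetime" comparison.
words_per_minute = 200  # assumed continuous reading speed
years = 80              # assumed lifetime of nonstop, around-the-clock reading
lifetime_words = words_per_minute * 60 * 24 * 365 * years
print(f"{lifetime_words:.2e}")  # ~8.4e9, close to the talk's "eight billion words"

training_words = 8e12  # "more than eight trillion words" in one month of training
print(round(training_words / lifetime_words))  # ~951: about a thousand lifetimes of reading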
09:03
And all of this is set to continue. The long arc of technological history is now in an extraordinary new phase.
09:12
So what does this mean in practice? Well, just as the internet gave us the browser and the smartphone gave us apps, the cloud-based supercomputer is ushering in a new era of ubiquitous AIs. Everything will soon be represented by a conversational interface. Or, to put it another way, a personal AI.
09:35
And these AIs will be infinitely knowledgeable, and soon they'll be factually accurate and reliable. They'll have near-perfect IQ. Theyโ€™ll also have exceptional EQ. Theyโ€™ll be kind, supportive, empathetic.
09:53
These elements on their own would be transformational. Just imagine if everybody had a personalized tutor in their pocket and access to low-cost medical advice. A lawyer and a doctor, a business strategist and coach -- all in your pocket 24 hours a day.
10:08
But things really start to change when they develop what I call AQ, their โ€œactions quotient.โ€ This is their ability to actually get stuff done in the digital and physical world.
10:20
And before long, it won't just be people that have AIs. Strange as it may sound, every organization, from small business to nonprofit to national government, each will have their own. Every town, building and object will be represented by a unique interactive persona.
10:39
And these won't just be mechanistic assistants. They'll be companions, confidants, colleagues, friends and partners, as varied and unique as we all are.
10:52
At this point, AIs will convincingly imitate humans at most tasks. And we'll feel this at the most intimate of scales. An AI organizing a community get-together for an elderly neighbor. A sympathetic expert helping you make sense of a difficult diagnosis. But we'll also feel it at the largest scales. Accelerating scientific discovery, autonomous cars on the roads, drones in the skies. They'll both order the takeout and run the power station. Theyโ€™ll interact with us and, of course, with each other. They'll speak every language, take in every pattern of sensor data, sights, sounds, streams and streams of information, far surpassing what any one of us could consume in a thousand lifetimes.
11:40
So what is this? What are these AIs?
11:46
If we are to prioritize safety above all else, to ensure that this new wave always serves and amplifies humanity, then we need to find the right metaphors for what this might become.
12:01
For years, we in the AI community, and I specifically, have had a tendency to refer to this as just tools. But that doesn't really capture what's actually happening here. AIs are clearly more dynamic, more ambiguous, more integrated and more emergent than mere tools, which are entirely subject to human control.
12:25
So to contain this wave, to put human agency at its center and to mitigate the inevitable unintended consequences that are likely to arise, we should start to think about them as we might a new kind of digital species.
12:41
Now it's just an analogy, it's not a literal description, and it's not perfect. For a start, they clearly aren't biological in any traditional sense, but just pause for a moment and really think about what they already do. They communicate in our languages. They see what we see. They consume unimaginably large amounts of information. They have memory. They have personality. They have creativity. They can even reason to some extent and formulate rudimentary plans. They can act autonomously if we allow them. And they do all this at levels of sophistication that is far beyond anything that we've ever known from a mere tool.
13:27
And so saying AI is mainly about the math or the code is like saying we humans are mainly about carbon and water. It's true, but it completely misses the point.
13:42
And yes, I get it, this is a super arresting thought, but I honestly think this frame helps sharpen our focus on the critical issues.
13:52
What are the risks? What are the boundaries that we need to impose? What kind of AI do we want to build or allow to be built?
14:04
This is a story that's still unfolding. Nothing should be accepted as a given. We all must choose what we create. What AIs we bring into the world, or not.
14:18
These are the questions for all of us here today, and all of us alive at this moment.
14:24
For me, the benefits of this technology are stunningly obvious, and they inspire my life's work every single day. But quite frankly, they'll speak for themselves.
14:37
Over the years, I've never shied away from highlighting risks and talking about downsides. Thinking in this way helps us focus on the huge challenges that lie ahead for all of us. But let's be clear. There is no path to progress where we leave technology behind. The prize for all of civilization is immense.
15:00
We need solutions in health care and education, to our climate crisis. And if AI delivers just a fraction of its potential, the next decade is going to be the most productive in human history.
15:13
Here's another way to think about it. In the past, unlocking economic growth often came with huge downsides. The economy expanded as people discovered new continents and opened up new frontiers. But they colonized populations at the same time. We built factories, but they were grim and dangerous places to work. We struck oil, but we polluted the planet.
15:42
Now because we are still designing and building AI, we have the potential and opportunity to do it better, radically better. And today, we're not discovering a new continent and plundering its resources. We're building one from scratch.
15:58
Sometimes people say that data or chips are the 21st centuryโ€™s new oil, but that's totally the wrong image. AI is to the mind what nuclear fusion is to energy. Limitless, abundant, world-changing.
16:17
And AI really is different, and that means we have to think about it creatively and honestly. We have to push our analogies and our metaphors to the very limits to be able to grapple with what's coming. Because this is not just another invention. AI is itself an infinite inventor. And yes, this is exciting and promising and concerning and intriguing all at once.
16:45
To be quite honest, it's pretty surreal. But step back, see it on the long view of glacial time, and these really are the very most appropriate metaphors that we have today.
16:57
Since the beginning of life on Earth, we've been evolving, changing and then creating everything around us in our human world today. And AI isn't something outside of this story. In fact, it's the very opposite. It's the whole of everything that we have created, distilled down into something that we can all interact with and benefit from. It's a reflection of humanity across time, and in this sense, it isn't a new species at all.
17:31
This is where the metaphors end. Here's what I'll tell Caspian next time he asks. AI isn't separate. AI isn't even, in some senses, new. AI is us. It's all of us.
17:47
And this is perhaps the most promising and vital thing of all that even a six-year-old can get a sense for. As we build out AI, we can and must reflect all that is good, all that we love, all that is special about humanity: our empathy, our kindness, our curiosity and our creativity.
18:09
This, I would argue, is the greatest challenge of the 21st century, but also the most wonderful, inspiring and hopeful opportunity for all of us.
18:20
Thank you.
18:21
(Applause)
18:26
Chris Anderson: Thank you, Mustafa. It's an amazing vision and a super powerful metaphor. You're in an amazing position right now. I mean, you were connected at the hip to the amazing work happening at OpenAI. Youโ€™re going to have resources made available, there are reports of these giant new data centers, 100 billion dollars invested and so forth. And a new species can emerge from it.
18:52
I mean, in your book, you did, as well as painting an incredibly optimistic vision, you were super eloquent on the dangers of AI. And I'm just curious, from the view that you have now, what is it that most keeps you up at night?
19:06
Mustafa Suleyman: I think the great risk is that we get stuck in what I call the pessimism aversion trap. You know, we have to have the courage to confront the potential of dark scenarios in order to get the most out of all the benefits that we see. So the good news is that if you look at the last two or three years, there have been very, very few downsides, right? Itโ€™s very hard to say explicitly what harm an LLM has caused. But that doesnโ€™t mean that thatโ€™s what the trajectory is going to be over the next 10 years.
19:35
So I think if you pay attention to a few specific capabilities, take, for example, autonomy. Autonomy is very obviously a threshold over which we increase risk in our society. And it's something that we should step towards very, very closely. The other would be something like recursive self-improvement. If you allow the model to independently self-improve, update its own code, explore an environment without oversight, and, you know, without a human in control to change how it operates, that would obviously be more dangerous.
20:06
But I think that we're still some way away from that. I think it's still a good five to 10 years before we have to really confront that. But it's time to start talking about it now.
20:15
CA: A digital species, unlike any biological species, can replicate not in nine months, but in nine nanoseconds, and produce an indefinite number of copies of itself, all of which have more power than we have in many ways. I mean, the possibility for unintended consequences seems pretty immense. And isn't it true that if a problem happens, it could happen in an hour?
20:37
MS: No. That is really not true. I think there's no evidence to suggest that. And I think that, you know, thatโ€™s often referred to as the โ€œintelligence explosion.โ€ And I think it is a theoretical, hypothetical maybe that we're all kind of curious to explore, but there's no evidence that we're anywhere near anything like that. And I think it's very important that we choose our words super carefully.
20:56
Because you're right, that's one of the weaknesses of the species framing, that we will design the capability for self-replication into it if people choose to do that. And I would actually argue that we should not, that would be one of the dangerous capabilities that we should step back from, right? So there's no chance that this will "emerge" accidentally. I really think that's a very low probability. It will happen if engineers deliberately design those capabilities in. And if they don't take enough efforts to deliberately design them out. And so this is the point of being explicit and transparent about trying to introduce safety by design very early on.
21:39
CA: Thank you, your vision of humanity injecting into this new thing the best parts of ourselves, avoiding all those weird, biological, freaky, horrible tendencies that we can have in certain circumstances, I mean, that is a very inspiring vision. And thank you so much for coming here and sharing it at TED. Thank you, good luck.
21:59
(Applause)
ืขืœ ืืชืจ ื–ื”

ืืชืจ ื–ื” ื™ืฆื™ื’ ื‘ืคื ื™ื›ื ืกืจื˜ื•ื ื™ YouTube ื”ืžื•ืขื™ืœื™ื ืœืœื™ืžื•ื“ ืื ื’ืœื™ืช. ืชื•ื›ืœื• ืœืจืื•ืช ืฉื™ืขื•ืจื™ ืื ื’ืœื™ืช ื”ืžื•ืขื‘ืจื™ื ืขืœ ื™ื“ื™ ืžื•ืจื™ื ืžื”ืฉื•ืจื” ื”ืจืืฉื•ื ื” ืžืจื—ื‘ื™ ื”ืขื•ืœื. ืœื—ืฅ ืคืขืžื™ื™ื ืขืœ ื”ื›ืชื•ื‘ื™ื•ืช ื‘ืื ื’ืœื™ืช ื”ืžื•ืฆื’ื•ืช ื‘ื›ืœ ื“ืฃ ื•ื™ื“ืื• ื›ื“ื™ ืœื”ืคืขื™ืœ ืืช ื”ืกืจื˜ื•ืŸ ืžืฉื. ื”ื›ืชื•ื‘ื™ื•ืช ื’ื•ืœืœื•ืช ื‘ืกื ื›ืจื•ืŸ ืขื ื”ืคืขืœืช ื”ื•ื•ื™ื“ืื•. ืื ื™ืฉ ืœืš ื”ืขืจื•ืช ืื• ื‘ืงืฉื•ืช, ืื ื ืฆื•ืจ ืื™ืชื ื• ืงืฉืจ ื‘ืืžืฆืขื•ืช ื˜ื•ืคืก ื™ืฆื™ืจืช ืงืฉืจ ื–ื”.

https://forms.gle/WvT1wiN1qDtmnspy7