How AI Could Hack Democracy | Lawrence Lessig | TED

00:03
So on January 6, 2021, my nation suffered a little bit of a democracy heart attack. Thousands of Americans had been told that the election had been stolen, and tens of thousands of them showed up because they believed the election had been stolen.
00:29
And indeed, in polling immediately after January 6, "The Washington Post" found 70 percent of Republicans believed the election had been stolen, and a majority of college-educated Republicans believed the election had been stolen. That was their perception. And I don't know what the right way to act on that perception is. They thought the right way was to defend what they thought was a democracy stolen.
01:03
Now, these numbers were astonishing: 30 percent, or two thirds of Republicans, believing the election was stolen. But even more extraordinary are these numbers: the fact that in the three years since that astonishing event, the numbers have not changed. The same number believe today that the election was stolen as believed it was stolen three years ago, despite the fact that we've had investigations and overwhelming evidence that there was no fraud sufficient to ever change even a single state.
01:39
This is something new. When Richard Nixon went through the Watergate scandal, as the news was being reported, Nixon's popularity collapsed not just among Democrats, but among independents and Republicans. But we're at a stage where it doesn't matter what happens. This is Donald Trump's popularity over the course of his administration. Nothing changes. The facts don't matter.
02:08
Now I think this truth should bother us a lot. I think we need to develop a kind of paranoia about what produces this reality. A particular paranoia, the paranoia of the hunted. Think of the kids in "The Birds" first realizing that those crows were attacking them, or "Black Mirror's" Metalheads, when you see these creatures chasing and surrounding you. The point is, we need to recognize that there is an intelligence out there to get us. Because our perceptions, our collective perceptions, our collective misimpressions are not accidental. They are expected, they are intended, they are the product of the thing.
03:05
(Laughter)
03:06
OK, I want to be careful introducing the thing. I'm going to talk a little bit about AI, but I'm not going to slag on AI. I think AI is the most extraordinary technology humanity has ever even conceived of. But I also think it has the potential to end humanity. But I'm not going to slag on AI, because I'm pretty sure that our robot overlord is going to be listening to these TED talks someday, and I don't want to be on the wrong side of the overlord. So AI is just fine.
03:36
I'm not going to talk about this AI first. I want to instead put AI in a little bit of a perspective, because I think that we're too obsessed with the new, and we fail to recognize the significance of AI in the old. We think about intelligence, and we're distinguishing between artificial and natural intelligence. And we, of course, as humans, claim pride of kingdom in the world of natural intelligence. And then we build artificial intelligence. It's intelligence that we make. But here's the critical point. We have already, for a long time, lived with systems of artificial intelligence. I don't mean digital AI, I mean analog AI.
04:24
Any entity or institution that we build with a purpose, that acts instrumentally in the world, is in this sense an AI. It is an instrumentally rational entity that's mapping how it should behave, given the way the world evolves and responds to it. So think about democracy as an AI. It has institutions, elections, parliaments, constitutions, for the purpose of some collective ends. Our Constitution says it's for the common good. So the democracy in our Constitution is an analog artificial intelligence devoted to the common good. Or think about corporations as an AI. They have institutions, boards, management, finance, for the purpose of making money; or at least, conceived of narrowly, today that's the way it is. The corporation is an analog intelligence devoted to maximizing shareholder value.
05:22
These are AIs. They have purposes and objectives, sometimes complementing each other. So the purpose of a school bus company complements the purpose of a school board to produce school bus transportation in a district. That's just beautiful. But sometimes they're competing. The purpose of a government in having a clean environment conflicts with the purpose of a coal company designed to produce electricity by spewing carbon and soot into the environment.
05:55
And when they conflict, we tell ourselves this happy story. We tell ourselves the story that democracy is going to stand up and discipline that evil corporation, to get the corporation to do the right thing, to do the thing that's in the interest of all of us. That's our happy story. It's also a fantasy. Because at least in my country, corporations are more effective AIs than democracy.
06:25
Think about it a little bit like this. If we think about instrumental rationality along one axis of this graph and time across the other, humans, of course, are the first instrumentally rational entity we care about. We're better than cows, maybe not as good as ants, but the point is, we're pretty good as individuals at figuring out how to do things strategically. And then we built democracy to do that a little bit better, to act collectively for all of us. And that's a more instrumentally rational entity than we, individual humans, can be. Then we created corporations.
07:01
And it turns out, they have become, at least in corrupted political regimes โ€” which, Iโ€™ll just submit, my political regime is โ€” better than democracy in bringing about their objective ends.

07:15
Now, of course, in this system, each of these layers has an aspiration to control the higher layer. So humans try to control democracy through elections. Democracy tries to control corporations through regulation. But the reality of control is, of course, a little bit different. In the United States, corporations control democracy through the extraordinary amount of money they pour into elections, making our representatives dependent not on us, but on them. And democracy then controls the humans by making representation not actually representation, corrupting representation.
07:54
Now, this structure, this layer of higher-order intelligence or instrumental rationality, might evoke, for those of you who think about AI, a statement by the godfather of AI, Geoffrey Hinton. Hinton warns us, "There are a few examples of a more intelligent thing being controlled by a less intelligent thing." Or we could say, a more instrumentally rational thing being controlled by a less instrumentally rational thing. And that is consistent with this picture of AIs.
08:31
And then we add digital AI into this mix. And here too, once again, we have corporations attempting to control their digital AI. But the reality of that control is not quite perfect. Facebook, in September of 2017, was revealed to have a term in their ad system called "Jew Haters." You could buy ads targeting Jew haters. Now, nobody in Facebook created that category. There was not a human in Facebook who decided, "We're going to start targeting Jew haters." Its AI created that category, because its AI figured Jew haters would be a profitable category for them to begin to sell ads to. And the company was, of course, embarrassed that it turned out they didn't actually have control over the machine that ran their machines that run our lives.
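To make that mechanism concrete, here is a minimal, hypothetical sketch of how an ad system could surface such a category with no human decision: mine frequent phrases from user profiles and rank the resulting segments by audience size. This is a toy illustration, not Facebook's actual pipeline; every name and threshold in it is assumed.

```python
from collections import Counter

def auto_generate_ad_categories(profiles, min_users=50):
    """Hypothetical sketch: turn any phrase that enough users put in
    their profiles into a sellable targeting category, ranked by
    audience size. No human reviews what the phrases mean."""
    counts = Counter()
    for profile in profiles:
        for phrase in profile.get("interests", []):
            counts[phrase.lower()] += 1
    return [
        {"category": phrase, "audience": n}
        for phrase, n in counts.most_common()
        if n >= min_users
    ]

# The optimizer only sees audience size, a proxy for ad revenue;
# an abhorrent phrase shared by enough users becomes a category
# exactly like any other, unless a human adds a filter.
```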
09:30
The real difference in this story, though, is the extraordinary potential of this instrumentally rational entity versus us. This massively better instrumentally rational entity, versus even corporations and certainly democracies, because it's going to be more efficient at achieving its objective than we are. And here's where we cue the paranoia I began to seed, because our collective perceptions, our collective misperceptions are not accidental. They are expected, intended, the product of this AI. We could think of it as the AI perception machine. We are its targets.
10:19
Now, the first contact we had with this AI, as Tristan Harris described it, came from social media. Tristan Harris, who started, co-founded, the Center for Humane Technology, famous in this extraordinary documentary, "The Social Dilemma"; before he was famous, he was just an engineer at Google. And at Google, he was focused on the science of attention, using AI to engineer attention, to overcome resistance, to increase human engagement with the platform, because engagement is the business model.
10:55
Compare this to, think of it as, brain hacking. We could compare it to what we could call body hacking. This is the exploiting of food science. Scientists engineer food to exploit our evolution, our mix of salt, fat and sugar, to overcome the natural resistance so you can't stop eating food, so that they can sell food, or sell "food," more profitably. Brain hacking is the same, but focused on attention. It's exploiting evolution: the fact that we have an irrational response to random rewards, or can't stop consuming bottomless pits of content, with the aim to increase engagement, to sell more ads.
11:40
And it just so happens, too bad for us, that we engage more the more extreme, the more polarizing, the more hate-filled this content is. So that is what we're fed by these AIs. With the consequence that we produce a people more polarized and ignorant and angry than at any time in democracy's history in America since the Civil War, and democracy is thereby weakened. They give us what we want. What we want makes us into this.
12:21
OK, but recognize something really critically important. This is not because AI is so strong. It's because we are so weak. Here's Tristan Harris describing this.
12:33
(Video) "We're all looking out for the moment when technology would overwhelm human strengths and intelligence. When is it going to cross the singularity, replace our jobs, be smarter than humans? But there's this much earlier moment when technology exceeds and overwhelms human weaknesses. This point being crossed is at the root of addiction, polarization, radicalization, outrage-ification, vanity-ification, the entire thing. This is overpowering human nature. And this is checkmate on humanity."
13:10
Lawrence Lessig: So Tristanโ€™s point is weโ€™re always focused on this corner: when AGI comes, when it's super intelligent, when it's more intelligent than any of us. And that's what we now fear. Whether we will get there in three years or 20 years, what will happen then? But his point is, it's actually this place that we must begin to worry, because at this place, it can overcome our weaknesses.
13:39
"The Social Dilemma" was about the individual weaknesses we have: not to be able to turn away from our phones, or to convince our children to turn away from their phones. But I want you to see that there's also a collective human weakness. That this technology drives us to disable our capacity to act collectively in ways that any of us would want. So we are surrounded individually by these metalheads, and we are also surrounded as a people by these metalheads, long before AGI is anywhere on the horizon. It overwhelms us.
14:17
AI gets us to do what it seeks, which is engagement, and we get democracy hacked in return. Now, if the first contact that we had gave us that, if social media circa 2020 gave us that, what's this second contact with AI going to produce? When AI is capable not just of figuring out how to target you with the content it knows will elicit the most reaction and engagement from you, but can create content that it knows will get you to react or engage more directly, whether true or false, whether slanderous or not. What does that contact do?
15:08
I so hate the writers of "Game of Thrones," because in their last season, they so completely ruined the whole series.

15:18
(Laughter and applause)

15:21
We can't use memes from "Game of Thrones" anymore. But if we could, I would say winter is coming, friends. I'm just going to say it anyway: winter is coming, friends. And these AIs are the source that we have to worry about.
15:39
So then, what is to be done? Well, you know, if there's a flood, what you do is you turn around and run. You move. You move to higher ground or protected ground.
15:55
You find a way to insulate democracy, or to shelter democracy from AI's force, or from AI's harmful force. And, you know, the law does this in America with juries. We have juries. They deliberate, but they are protected in the types of information that they're allowed to hear or talk about or deliberate upon, because we know we need to protect them if they're to reach a judgment that is just.

16:22
And democracy reformers, especially across Europe, are trying to do this right now.
16:27
Reformers are building citizen assemblies, across Europe mainly, and in Japan as well. And the citizen assemblies are these random, representative, informed and deliberative bodies that take on particular democratic questions and address them in a way that could be protected from this corruption of the AI.
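One way to read "random, representative": these assemblies are typically selected by sortition, a stratified random draw that mirrors census proportions. A minimal sketch, with invented strata and shares:

```python
import random

def sortition(pool, assembly_size, key, census_shares, seed=0):
    """Toy stratified random draw for a citizen assembly: sample each
    stratum in proportion to its share of the population."""
    rng = random.Random(seed)
    chosen = []
    for stratum, share in census_shares.items():
        candidates = [p for p in pool if p[key] == stratum]
        quota = round(assembly_size * share)
        chosen.extend(rng.sample(candidates, min(quota, len(candidates))))
    return chosen

# Hypothetical population of 10,000, stratified by age group only;
# real assemblies also stratify by gender, region, education, etc.
pool = [
    {"id": i, "age_group": g}
    for i, g in enumerate(random.Random(1).choices(
        ["18-34", "35-64", "65+"], weights=[3, 5, 2], k=10_000))
]
assembly = sortition(pool, 100, "age_group",
                     {"18-34": 0.30, "35-64": 0.50, "65+": 0.20})
assert len(assembly) == 100
```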
16:50
So Iceland was able to craft a constitution out of a process like this. Ireland was able to approve gay marriage and deregulation of abortion through a process like this. France has addressed climate change and also end-of-life decisions. And across Germany, there are many of these entities that are boiling up to find ways for a different democratic voice, to find voice.
17:15
But here's the point. These are extraordinarily hopeful and exciting, no doubt. But they are not just a good idea. They are existential for democracy. They are security for democracy. They are a way to protect us from this AI hacking that steers against a public will. This is change not just to make democracy better, a tweak to just make it a little bit more democratic. It's a change to let democracy survive, given what we know technology will become.
18:00
This is a terrifying moment. It's an exhilarating moment. Long before superintelligence, long before AGI threatens us, a different AI threatens us. But there is something to do while we still can do something. We should know enough now to know we can't trust democracy just now. We should see that we still have time to build something different. We should act with the love that makes anything possible, not because we know we will succeed. I'm pretty sure we won't. But because this is what love means. You do whatever you can, whatever the odds, for your children, for your family, for your country, for humanity, while there is still time, while our robot overlord is still just a sci-fi fantasy.

19:10
Thank you very much.

19:12
(Applause)
ืขืœ ืืชืจ ื–ื”

ืืชืจ ื–ื” ื™ืฆื™ื’ ื‘ืคื ื™ื›ื ืกืจื˜ื•ื ื™ YouTube ื”ืžื•ืขื™ืœื™ื ืœืœื™ืžื•ื“ ืื ื’ืœื™ืช. ืชื•ื›ืœื• ืœืจืื•ืช ืฉื™ืขื•ืจื™ ืื ื’ืœื™ืช ื”ืžื•ืขื‘ืจื™ื ืขืœ ื™ื“ื™ ืžื•ืจื™ื ืžื”ืฉื•ืจื” ื”ืจืืฉื•ื ื” ืžืจื—ื‘ื™ ื”ืขื•ืœื. ืœื—ืฅ ืคืขืžื™ื™ื ืขืœ ื”ื›ืชื•ื‘ื™ื•ืช ื‘ืื ื’ืœื™ืช ื”ืžื•ืฆื’ื•ืช ื‘ื›ืœ ื“ืฃ ื•ื™ื“ืื• ื›ื“ื™ ืœื”ืคืขื™ืœ ืืช ื”ืกืจื˜ื•ืŸ ืžืฉื. ื”ื›ืชื•ื‘ื™ื•ืช ื’ื•ืœืœื•ืช ื‘ืกื ื›ืจื•ืŸ ืขื ื”ืคืขืœืช ื”ื•ื•ื™ื“ืื•. ืื ื™ืฉ ืœืš ื”ืขืจื•ืช ืื• ื‘ืงืฉื•ืช, ืื ื ืฆื•ืจ ืื™ืชื ื• ืงืฉืจ ื‘ืืžืฆืขื•ืช ื˜ื•ืคืก ื™ืฆื™ืจืช ืงืฉืจ ื–ื”.

https://forms.gle/WvT1wiN1qDtmnspy7