We're building a dystopia just to make people click on ads | Zeynep Tufekci

742,285 views · 2017-11-17

TED



ืžืชืจื’ื: Shlomo Adam ืžื‘ืงืจ: Ido Dekkers
00:12
So when people voice fears of artificial intelligence, very often, they invoke images of humanoid robots run amok. You know? Terminator? You know, that might be something to consider, but that's a distant threat. Or, we fret about digital surveillance with metaphors from the past. "1984," George Orwell's "1984," it's hitting the bestseller lists again. It's a great book, but it's not the correct dystopia for the 21st century.

00:44
What we need to fear most is not what artificial intelligence will do to us on its own, but how the people in power will use artificial intelligence to control us and to manipulate us in novel, sometimes hidden, subtle and unexpected ways. Much of the technology that threatens our freedom and our dignity in the near-term future is being developed by companies in the business of capturing and selling our data and our attention to advertisers and others: Facebook, Google, Amazon, Alibaba, Tencent.

01:26
Now, artificial intelligence has started bolstering their business as well. And it may seem like artificial intelligence is just the next thing after online ads. It's not. It's a jump in category. It's a whole different world, and it has great potential. It could accelerate our understanding of many areas of study and research. But to paraphrase a famous Hollywood philosopher, "With prodigious potential comes prodigious risk."
02:01
Now let's look at a basic fact of our digital lives, online ads. Right? We kind of dismiss them. They seem crude, ineffective. We've all had the experience of being followed on the web by an ad based on something we searched or read. You know, you look up a pair of boots and for a week, those boots are following you around everywhere you go. Even after you succumb and buy them, they're still following you around. We're kind of inured to that kind of basic, cheap manipulation. We roll our eyes and we think, "You know what? These things don't work." Except, online, the digital technologies are not just ads.

02:40
Now, to understand that, let's think of a physical world example. You know how, at the checkout counters at supermarkets, near the cashier, there's candy and gum at the eye level of kids? That's designed to make them whine at their parents just as the parents are about to sort of check out. Now, that's a persuasion architecture. It's not nice, but it kind of works. That's why you see it in every supermarket. Now, in the physical world, such persuasion architectures are kind of limited, because you can only put so many things by the cashier. Right? And the candy and gum, it's the same for everyone, even though it mostly works only for people who have whiny little humans beside them. In the physical world, we live with those limitations.

03:34
In the digital world, though, persuasion architectures can be built at the scale of billions and they can target, infer, understand and be deployed at individuals one by one by figuring out your weaknesses, and they can be sent to everyone's phone private screen, so it's not visible to us. And that's different. And that's just one of the basic things that artificial intelligence can do.

04:04
Now, let's take an example. Let's say you want to sell plane tickets to Vegas. Right? So in the old world, you could think of some demographics to target based on experience and what you can guess. You might try to advertise to, oh, men between the ages of 25 and 35, or people who have a high limit on their credit card, or retired couples. Right? That's what you would do in the past. With big data and machine learning, that's not how it works anymore.

04:33
So to imagine that, think of all the data that Facebook has on you: every status update you ever typed, every Messenger conversation, every place you logged in from, all your photographs that you uploaded there. If you start typing something and change your mind and delete it, Facebook keeps those and analyzes them, too. Increasingly, it tries to match you with your offline data. It also purchases a lot of data from data brokers. It could be everything from your financial records to a good chunk of your browsing history. Right? In the US, such data is routinely collected, collated and sold. In Europe, they have tougher rules.

05:23
So what happens then is, by churning through all that data, these machine-learning algorithms -- that's why they're called learning algorithms -- they learn to understand the characteristics of people who purchased tickets to Vegas before. When they learn this from existing data, they also learn how to apply this to new people. So if they're presented with a new person, they can classify whether that person is likely to buy a ticket to Vegas or not.
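To make that step concrete, here is a minimal sketch of the "learn from past buyers, then score new people" idea in Python. It is only an illustration under invented assumptions: the feature columns, the toy data and the choice of logistic regression are all made up here, and Facebook's actual targeting models are proprietary and far larger.

```python
# Hypothetical sketch: learn who bought Vegas tickets before, then score new people.
# NOT Facebook's system; the features and data are invented for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Each row is a user; columns are behavioral signals an ad platform might hold
# (e.g. travel-related posts, late-night activity score, past ticket purchases).
X_past = np.array([
    [12, 0.9, 1],
    [ 3, 0.1, 0],
    [ 8, 0.7, 1],
    [ 1, 0.2, 0],
])
y_past = np.array([1, 0, 1, 0])  # 1 = bought a ticket to Vegas before

model = LogisticRegression()
model.fit(X_past, y_past)        # "churning through all that data"

# A brand-new person the system has never seen before:
new_person = np.array([[9, 0.8, 0]])
prob = model.predict_proba(new_person)[0, 1]
print(f"Estimated probability of buying a Vegas ticket: {prob:.2f}")
# The ad is shown only if this score clears some threshold -- and nobody has to
# articulate *why* those particular signals mattered.
```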
05:57
Fine. You're thinking, an offer to buy tickets to Vegas. I can ignore that. But the problem isn't that. The problem is, we no longer really understand how these complex algorithms work. We don't understand how they're doing this categorization. It's giant matrices, thousands of rows and columns, maybe millions of rows and columns, and not the programmers and not anybody who looks at it, even if you have all the data, understands anymore how exactly it's operating any more than you'd know what I was thinking right now if you were shown a cross section of my brain. It's like we're not programming anymore, we're growing intelligence that we don't truly understand.

06:52
And these things only work if there's an enormous amount of data, so they also encourage deep surveillance on all of us so that the machine learning algorithms can work. That's why Facebook wants to collect all the data it can about you. The algorithms work better.

07:08
So let's push that Vegas example a bit. What if the system that we do not understand was picking up that it's easier to sell Vegas tickets to people who are bipolar and about to enter the manic phase. Such people tend to become overspenders, compulsive gamblers. They could do this, and you'd have no clue that's what they were picking up on. I gave this example to a bunch of computer scientists once and afterwards, one of them came up to me. He was troubled and he said, "That's why I couldn't publish it." I was like, "Couldn't publish what?" He had tried to see whether you can indeed figure out the onset of mania from social media posts before clinical symptoms, and it had worked, and it had worked very well, and he had no idea how it worked or what it was picking up on.

08:06
Now, the problem isn't solved if he doesn't publish it, because there are already companies that are developing this kind of technology, and a lot of the stuff is just off the shelf. This is not very difficult anymore.

08:21
Do you ever go on YouTube meaning to watch one video and an hour later you've watched 27? You know how YouTube has this column on the right that says, "Up next" and it autoplays something? It's an algorithm picking what it thinks that you might be interested in and maybe not find on your own. It's not a human editor. It's what algorithms do. It picks up on what you have watched and what people like you have watched, and infers that that must be what you're interested in, what you want more of, and just shows you more.
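A toy sketch of that "people like you watched this, so you get more of it" logic might look like the following. YouTube's real recommender is proprietary and vastly more complex; the watch histories and the similarity weighting below are invented for illustration.

```python
# Hypothetical "Up next" picker: recommend what similar users went on to watch.
# Invented data; YouTube's actual recommender is proprietary and far larger.
from collections import Counter

watch_history = {
    "you":    ["trump_rally", "campaign_speech"],
    "user_a": ["trump_rally", "campaign_speech", "white_nationalist_talk"],
    "user_b": ["trump_rally", "white_nationalist_talk", "even_more_extreme"],
    "user_c": ["cooking_basics", "knife_skills"],
}

def up_next(user: str) -> str:
    seen = set(watch_history[user])
    votes = Counter()
    for other, videos in watch_history.items():
        if other == user:
            continue
        overlap = seen & set(videos)          # "people like you"
        for video in videos:
            if video not in seen:
                votes[video] += len(overlap)  # weight unseen videos by similarity
    return votes.most_common(1)[0][0]

print(up_next("you"))  # -> "white_nationalist_talk": the rabbit hole begins
```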
08:56
It sounds like a benign and useful feature, except when it isn't.

09:01
So in 2016, I attended rallies of then-candidate Donald Trump to study as a scholar the movement supporting him. I study social movements, so I was studying it, too. And then I wanted to write something about one of his rallies, so I watched it a few times on YouTube. YouTube started recommending to me and autoplaying to me white supremacist videos in increasing order of extremism. If I watched one, it served up one even more extreme and autoplayed that one, too. If you watch Hillary Clinton or Bernie Sanders content, YouTube recommends and autoplays conspiracy left, and it goes downhill from there.

09:52
Well, you might be thinking, this is politics, but it's not. This isn't about politics. This is just the algorithm figuring out human behavior. I once watched a video about vegetarianism on YouTube and YouTube recommended and autoplayed a video about being vegan. It's like you're never hardcore enough for YouTube.

(Laughter)

10:14
So what's going on? Now, YouTube's algorithm is proprietary, but here's what I think is going on. The algorithm has figured out that if you can entice people into thinking that you can show them something more hardcore, they're more likely to stay on the site watching video after video going down that rabbit hole while Google serves them ads.

10:43
Now, with nobody minding the ethics of the store, these sites can profile people who are Jew haters, who think that Jews are parasites and who have such explicit anti-Semitic content, and let you target them with ads. They can also mobilize algorithms to find for you look-alike audiences, people who do not have such explicit anti-Semitic content on their profile but who the algorithm detects may be susceptible to such messages, and lets you target them with ads, too.
11:30
Now, this may sound like an implausible example, but this is real. ProPublica investigated this and found that you can indeed do this on Facebook, and Facebook helpfully offered up suggestions on how to broaden that audience. BuzzFeed tried it for Google, and very quickly they found, yep, you can do it on Google, too. And it wasn't even expensive. The ProPublica reporter spent about 30 dollars to target this category.

12:02
So last year, Donald Trump's social media manager disclosed that they were using Facebook dark posts to demobilize people, not to persuade them, but to convince them not to vote at all. And to do that, they targeted specifically, for example, African-American men in key cities like Philadelphia, and I'm going to read exactly what he said. I'm quoting. They were using "nonpublic posts whose viewership the campaign controls so that only the people we want to see it see it. We modeled this. It will dramatically affect her ability to turn these people out."

12:45
What's in those dark posts? We have no idea. Facebook won't tell us.

12:52
So Facebook also algorithmically arranges the posts that your friends put on Facebook, or the pages you follow. It doesn't show you everything chronologically. It puts the order in the way that the algorithm thinks will entice you to stay on the site longer.
13:11
Now, so this has a lot of consequences. You may be thinking somebody is snubbing you on Facebook. The algorithm may never be showing your post to them. The algorithm is prioritizing some of them and burying the others.

13:29
Experiments show that what the algorithm picks to show you can affect your emotions. But that's not all. It also affects political behavior.

13:41
So in 2010, in the midterm elections, Facebook did an experiment on 61 million people in the US that was disclosed after the fact. So some people were shown, "Today is election day," the simpler one, and some people were shown the one with that tiny tweak with those little thumbnails of your friends who clicked on "I voted." This simple tweak. OK? So the pictures were the only change, and that post shown just once turned out an additional 340,000 voters in that election, according to this research as confirmed by the voter rolls.

14:32
A fluke? No. Because in 2012, they repeated the same experiment. And that time, that civic message shown just once turned out an additional 270,000 voters. For reference, the 2016 US presidential election was decided by about 100,000 votes.

15:01
Now, Facebook can also very easily infer what your politics are, even if you've never disclosed them on the site. Right? These algorithms can do that quite easily. What if a platform with that kind of power decides to turn out supporters of one candidate over the other? How would we even know about it?

15:25
Now, we started from someplace seemingly innocuous -- online ads following us around -- and we've landed someplace else.
15:35
As a public and as citizens, we no longer know if we're seeing the same information or what anybody else is seeing, and without a common basis of information, little by little, public debate is becoming impossible, and we're just at the beginning stages of this.

15:54
These algorithms can quite easily infer things like your people's ethnicity, religious and political views, personality traits, intelligence, happiness, use of addictive substances, parental separation, age and genders, just from Facebook likes. These algorithms can identify protesters even if their faces are partially concealed. These algorithms may be able to detect people's sexual orientation just from their dating profile pictures.

16:33
Now, these are probabilistic guesses, so they're not going to be 100 percent right, but I don't see the powerful resisting the temptation to use these technologies just because there are some false positives, which will of course create a whole other layer of problems. Imagine what a state can do with the immense amount of data it has on its citizens. China is already using face detection technology to identify and arrest people.

17:05
And here's the tragedy: we're building this infrastructure of surveillance authoritarianism merely to get people to click on ads. And this won't be Orwell's authoritarianism. This isn't "1984." Now, if authoritarianism is using overt fear to terrorize us, we'll all be scared, but we'll know it, we'll hate it and we'll resist it. But if the people in power are using these algorithms to quietly watch us, to judge us and to nudge us, to predict and identify the troublemakers and the rebels, to deploy persuasion architectures at scale and to manipulate individuals one by one using their personal, individual weaknesses and vulnerabilities, and if they're doing it at scale through our private screens so that we don't even know what our fellow citizens and neighbors are seeing, that authoritarianism will envelop us like a spider's web and we may not even know we're in it.

18:22
So Facebook's market capitalization is approaching half a trillion dollars. It's because it works great as a persuasion architecture.
18:33
But the structure of that architecture is the same whether you're selling shoes or whether you're selling politics. The algorithms do not know the difference. The same algorithms set loose upon us to make us more pliable for ads are also organizing our political, personal and social information flows, and that's what's got to change.

19:02
Now, don't get me wrong, we use digital platforms because they provide us with great value. I use Facebook to keep in touch with friends and family around the world. I've written about how crucial social media is for social movements. I have studied how these technologies can be used to circumvent censorship around the world.

19:27
But it's not that the people who run, you know, Facebook or Google are maliciously and deliberately trying to make the country or the world more polarized and encourage extremism. I read the many well-intentioned statements that these people put out. But it's not the intent or the statements people in technology make that matter, it's the structures and business models they're building. And that's the core of the problem. Either Facebook is a giant con of half a trillion dollars and ads don't work on the site, it doesn't work as a persuasion architecture, or its power of influence is of great concern. It's either one or the other. It's similar for Google, too.

20:24
So what can we do? This needs to change. Now, I can't offer a simple recipe, because we need to restructure the whole way our digital technology operates. Everything from the way technology is developed to the way the incentives, economic and otherwise, are built into the system. We have to face and try to deal with the lack of transparency created by the proprietary algorithms, the structural challenge of machine learning's opacity, all this indiscriminate data that's being collected about us. We have a big task in front of us. We have to mobilize our technology, our creativity and yes, our politics so that we can build artificial intelligence that supports us in our human goals but that is also constrained by our human values. And I understand this won't be easy. We might not even easily agree on what those terms mean.

21:34
But if we take seriously how these systems that we depend on for so much operate, I don't see how we can postpone this conversation anymore. These structures are organizing how we function and they're controlling what we can and we cannot do. And many of these ad-financed platforms, they boast that they're free. In this context, it means that we are the product that's being sold. We need a digital economy where our data and our attention is not for sale to the highest-bidding authoritarian or demagogue.

(Applause)

22:30
So to go back to that Hollywood paraphrase, we do want the prodigious potential of artificial intelligence and digital technology to blossom, but for that, we must face this prodigious menace, open-eyed and now.

22:48
Thank you.

(Applause)
ืขืœ ืืชืจ ื–ื”

ืืชืจ ื–ื” ื™ืฆื™ื’ ื‘ืคื ื™ื›ื ืกืจื˜ื•ื ื™ YouTube ื”ืžื•ืขื™ืœื™ื ืœืœื™ืžื•ื“ ืื ื’ืœื™ืช. ืชื•ื›ืœื• ืœืจืื•ืช ืฉื™ืขื•ืจื™ ืื ื’ืœื™ืช ื”ืžื•ืขื‘ืจื™ื ืขืœ ื™ื“ื™ ืžื•ืจื™ื ืžื”ืฉื•ืจื” ื”ืจืืฉื•ื ื” ืžืจื—ื‘ื™ ื”ืขื•ืœื. ืœื—ืฅ ืคืขืžื™ื™ื ืขืœ ื”ื›ืชื•ื‘ื™ื•ืช ื‘ืื ื’ืœื™ืช ื”ืžื•ืฆื’ื•ืช ื‘ื›ืœ ื“ืฃ ื•ื™ื“ืื• ื›ื“ื™ ืœื”ืคืขื™ืœ ืืช ื”ืกืจื˜ื•ืŸ ืžืฉื. ื”ื›ืชื•ื‘ื™ื•ืช ื’ื•ืœืœื•ืช ื‘ืกื ื›ืจื•ืŸ ืขื ื”ืคืขืœืช ื”ื•ื•ื™ื“ืื•. ืื ื™ืฉ ืœืš ื”ืขืจื•ืช ืื• ื‘ืงืฉื•ืช, ืื ื ืฆื•ืจ ืื™ืชื ื• ืงืฉืจ ื‘ืืžืฆืขื•ืช ื˜ื•ืคืก ื™ืฆื™ืจืช ืงืฉืจ ื–ื”.

https://forms.gle/WvT1wiN1qDtmnspy7