We're building a dystopia just to make people click on ads | Zeynep Tufekci

738,629 views ・ 2017-11-17

TED


Translator: Ethan Ouyang  Reviewer: Phoebe Ling Zhang

00:12
So when people voice fears of artificial intelligence, very often, they invoke images of humanoid robots run amok. You know? Terminator? You know, that might be something to consider, but that's a distant threat. Or, we fret about digital surveillance with metaphors from the past. "1984," George Orwell's "1984," it's hitting the bestseller lists again. It's a great book, but it's not the correct dystopia for the 21st century.

00:44
What we need to fear most is not what artificial intelligence will do to us on its own, but how the people in power will use artificial intelligence to control us and to manipulate us in novel, sometimes hidden, subtle and unexpected ways. Much of the technology that threatens our freedom and our dignity in the near-term future is being developed by companies in the business of capturing and selling our data and our attention to advertisers and others: Facebook, Google, Amazon, Alibaba, Tencent.

01:26
Now, artificial intelligence has started bolstering their business as well. And it may seem like artificial intelligence is just the next thing after online ads. It's not. It's a jump in category. It's a whole different world, and it has great potential. It could accelerate our understanding of many areas of study and research.

01:53
But to paraphrase a famous Hollywood philosopher, "With prodigious potential comes prodigious risk."

02:01
Now let's look at a basic fact of our digital lives, online ads. Right? We kind of dismiss them. They seem crude, ineffective. We've all had the experience of being followed on the web by an ad based on something we searched or read. You know, you look up a pair of boots and for a week, those boots are following you around everywhere you go. Even after you succumb and buy them, they're still following you around. We're kind of inured to that kind of basic, cheap manipulation. We roll our eyes and we think, "You know what? These things don't work." Except, online, the digital technologies are not just ads.

02:40
Now, to understand that, let's think of a physical world example. You know how, at the checkout counters at supermarkets, near the cashier, there's candy and gum at the eye level of kids? That's designed to make them whine at their parents just as the parents are about to sort of check out. Now, that's a persuasion architecture. It's not nice, but it kind of works. That's why you see it in every supermarket. Now, in the physical world, such persuasion architectures are kind of limited, because you can only put so many things by the cashier. Right? And the candy and gum, it's the same for everyone, even though it mostly works only for people who have whiny little humans beside them. In the physical world, we live with those limitations.

03:34
In the digital world, though, persuasion architectures can be built at the scale of billions and they can target, infer, understand and be deployed at individuals one by one by figuring out your weaknesses, and they can be sent to everyone's phone private screen, so it's not visible to us. And that's different. And that's just one of the basic things that artificial intelligence can do.

04:04
Now, let's take an example. Let's say you want to sell plane tickets to Vegas. Right? So in the old world, you could think of some demographics to target based on experience and what you can guess. You might try to advertise to, oh, men between the ages of 25 and 35, or people who have a high limit on their credit card, or retired couples. Right? That's what you would do in the past. With big data and machine learning, that's not how it works anymore.

04:33
So to imagine that, think of all the data that Facebook has on you: every status update you ever typed, every Messenger conversation, every place you logged in from, all your photographs that you uploaded there. If you start typing something and change your mind and delete it, Facebook keeps those and analyzes them, too. Increasingly, it tries to match you with your offline data. It also purchases a lot of data from data brokers. It could be everything from your financial records to a good chunk of your browsing history. Right? In the US, such data is routinely collected, collated and sold. In Europe, they have tougher rules.

05:23
So what happens then is, by churning through all that data, these machine-learning algorithms -- that's why they're called learning algorithms -- they learn to understand the characteristics of people who purchased tickets to Vegas before. When they learn this from existing data, they also learn how to apply this to new people. So if they're presented with a new person, they can classify whether that person is likely to buy a ticket to Vegas or not.
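
What she is describing here is ordinary supervised classification: fit a model on people whose outcome is already known, then score people the system has never seen. A minimal sketch in Python, with invented features, invented numbers, and a toy training set -- none of this is Facebook's actual data or pipeline:

```python
# Minimal sketch of the supervised classification described above.
# Features and data are fabricated for illustration; real systems use
# thousands of signals and far larger training sets.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Each row is one past user: [age, credit_limit_k, travel_posts, late_night_activity]
X_train = np.array([
    [29, 12.0, 5, 0.8],   # bought a Vegas ticket
    [34, 20.0, 7, 0.6],   # bought
    [61,  8.0, 1, 0.1],   # did not buy
    [45,  5.0, 0, 0.2],   # did not buy
    [25, 15.0, 9, 0.9],   # bought
    [52,  6.0, 2, 0.3],   # did not buy
])
y_train = np.array([1, 1, 0, 0, 1, 0])  # 1 = purchased a ticket before

model = LogisticRegression().fit(X_train, y_train)

# A "new person" the system has never seen: it still produces a score.
new_person = np.array([[31, 18.0, 6, 0.7]])
print(model.predict_proba(new_person)[0, 1])  # estimated probability of buying
```

The talk's point is that production systems replace this readable toy with opaque models over millions of features, which is where the next paragraph goes.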

05:57
Fine. You're thinking, an offer to buy tickets to Vegas. I can ignore that. But the problem isn't that. The problem is, we no longer really understand how these complex algorithms work. We don't understand how they're doing this categorization. It's giant matrices, thousands of rows and columns, maybe millions of rows and columns, and not the programmers and not anybody who looks at it, even if you have all the data, understands anymore how exactly it's operating any more than you'd know what I was thinking right now if you were shown a cross section of my brain. It's like we're not programming anymore, we're growing intelligence that we don't truly understand.

06:52
And these things only work if there's an enormous amount of data, so they also encourage deep surveillance on all of us so that the machine learning algorithms can work. That's why Facebook wants to collect all the data it can about you. The algorithms work better.

07:08
So let's push that Vegas example a bit. What if the system that we do not understand was picking up that it's easier to sell Vegas tickets to people who are bipolar and about to enter the manic phase. Such people tend to become overspenders, compulsive gamblers. They could do this, and you'd have no clue that's what they were picking up on. I gave this example to a bunch of computer scientists once and afterwards, one of them came up to me. He was troubled and he said, "That's why I couldn't publish it." I was like, "Couldn't publish what?" He had tried to see whether you can indeed figure out the onset of mania from social media posts before clinical symptoms, and it had worked, and it had worked very well, and he had no idea how it worked or what it was picking up on.

08:06
Now, the problem isn't solved if he doesn't publish it, because there are already companies that are developing this kind of technology, and a lot of the stuff is just off the shelf. This is not very difficult anymore.

08:21
Do you ever go on YouTube meaning to watch one video and an hour later you've watched 27? You know how YouTube has this column on the right that says, "Up next" and it autoplays something? It's an algorithm picking what it thinks that you might be interested in and maybe not find on your own. It's not a human editor. It's what algorithms do. It picks up on what you have watched and what people like you have watched, and infers that that must be what you're interested in, what you want more of, and just shows you more. It sounds like a benign and useful feature, except when it isn't.
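
The "what people like you have watched" step is, at its simplest, recommendation by user similarity. Here is a toy user-based collaborative-filtering sketch with a fabricated watch matrix; YouTube's real system is proprietary and, as the talk notes below, also optimizes for keeping you watching:

```python
# Toy user-based collaborative filtering: recommend what similar users watched.
# The watch matrix is invented for illustration only.
import numpy as np

videos = ["boots review", "desert hike", "vegas vlog", "poker strategy", "knitting"]
# Rows = users, columns = videos, 1 = watched.
watched = np.array([
    [1, 1, 1, 1, 0],   # user 0
    [1, 1, 1, 0, 0],   # user 1
    [0, 0, 0, 0, 1],   # user 2
    [1, 0, 1, 1, 0],   # user 3
])

def recommend(user_idx: int, k: int = 2) -> list:
    target = watched[user_idx]
    # Cosine similarity between the target user and every user.
    norms = np.linalg.norm(watched, axis=1) * np.linalg.norm(target) + 1e-9
    sims = watched @ target / norms
    sims[user_idx] = 0.0                      # ignore self-similarity
    scores = sims @ watched                   # weight videos by similar users
    scores[target == 1] = -np.inf             # don't re-recommend watched videos
    top = np.argsort(scores)[::-1][:k]
    return [videos[i] for i in top]

print(recommend(1))  # -> ['poker strategy', 'knitting'], the "Up next" list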

09:01
So in 2016, I attended rallies of then-candidate Donald Trump to study as a scholar the movement supporting him. I study social movements, so I was studying it, too. And then I wanted to write something about one of his rallies, so I watched it a few times on YouTube.

09:23
YouTube started recommending to me and autoplaying to me white supremacist videos in increasing order of extremism. If I watched one, it served up one even more extreme and autoplayed that one, too. If you watch Hillary Clinton or Bernie Sanders content, YouTube recommends and autoplays conspiracy left, and it goes downhill from there.

09:52
Well, you might be thinking, this is politics, but it's not. This isn't about politics. This is just the algorithm figuring out human behavior. I once watched a video about vegetarianism on YouTube and YouTube recommended and autoplayed a video about being vegan. It's like you're never hardcore enough for YouTube.

10:12
(Laughter)

10:14
So what's going on? Now, YouTube's algorithm is proprietary, but here's what I think is going on. The algorithm has figured out that if you can entice people into thinking that you can show them something more hardcore, they're more likely to stay on the site watching video after video going down that rabbit hole while Google serves them ads.

10:43
Now, with nobody minding the ethics of the store, these sites can profile people who are Jew haters, who think that Jews are parasites and who have such explicit anti-Semitic content, and let you target them with ads. They can also mobilize algorithms to find for you look-alike audiences, people who do not have such explicit anti-Semitic content on their profile but who the algorithm detects may be susceptible to such messages, and lets you target them with ads, too. Now, this may sound like an implausible example, but this is real. ProPublica investigated this and found that you can indeed do this on Facebook, and Facebook helpfully offered up suggestions on how to broaden that audience. BuzzFeed tried it for Google, and very quickly they found, yep, you can do it on Google, too. And it wasn't even expensive. The ProPublica reporter spent about 30 dollars to target this category.
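
A "look-alike audience" can be thought of as nearest-neighbor search: start from a seed list of users and rank everyone else by how similar their profile vectors are to that seed. A toy sketch with fabricated profile vectors -- not Facebook's or Google's actual method:

```python
# Toy "look-alike audience": rank non-seed users by similarity to the
# centroid of a seed audience. Profile vectors are invented.
import numpy as np

profiles = {
    "user_a": np.array([0.9, 0.1, 0.8, 0.2]),
    "user_b": np.array([0.8, 0.2, 0.9, 0.1]),
    "user_c": np.array([0.1, 0.9, 0.2, 0.8]),
    "user_d": np.array([0.7, 0.3, 0.7, 0.3]),
    "user_e": np.array([0.2, 0.8, 0.1, 0.9]),
}
seed = ["user_a", "user_b"]  # the audience an advertiser starts from

centroid = np.mean([profiles[u] for u in seed], axis=0)

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

lookalikes = sorted(
    (u for u in profiles if u not in seed),
    key=lambda u: cosine(profiles[u], centroid),
    reverse=True,
)
print(lookalikes)  # users most similar to the seed audience come first
```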

12:02
So last year, Donald Trump's social media manager disclosed that they were using Facebook dark posts to demobilize people, not to persuade them, but to convince them not to vote at all. And to do that, they targeted specifically, for example, African-American men in key cities like Philadelphia, and I'm going to read exactly what he said. I'm quoting. They were using "nonpublic posts whose viewership the campaign controls so that only the people we want to see it see it. We modeled this. It will dramatically affect her ability to turn these people out."

12:45
What's in those dark posts? We have no idea. Facebook won't tell us.

12:52
So Facebook also algorithmically arranges the posts that your friends put on Facebook, or the pages you follow. It doesn't show you everything chronologically. It puts the order in the way that the algorithm thinks will entice you to stay on the site longer.
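
"The order the algorithm thinks will entice you" is, in the simplest reading, ranking by a predicted engagement score instead of by timestamp. A minimal sketch with made-up posts and a made-up scoring objective (real feed models predict engagement from many features per user-post pair):

```python
# Minimal sketch of an engagement-ranked feed vs. a chronological one.
# Scores and weights are invented for illustration.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    age_hours: float
    predicted_click_prob: float
    predicted_dwell_seconds: float

posts = [
    Post("close friend", age_hours=8.0, predicted_click_prob=0.05, predicted_dwell_seconds=4),
    Post("outrage bait", age_hours=2.0, predicted_click_prob=0.60, predicted_dwell_seconds=45),
    Post("family photo", age_hours=1.0, predicted_click_prob=0.20, predicted_dwell_seconds=10),
]

def engagement_score(p: Post) -> float:
    # Hypothetical objective: keep the user on the site longer.
    return p.predicted_click_prob * p.predicted_dwell_seconds

chronological = sorted(posts, key=lambda p: p.age_hours)
ranked = sorted(posts, key=engagement_score, reverse=True)

print([p.author for p in chronological])  # ['family photo', 'outrage bait', 'close friend']
print([p.author for p in ranked])         # ['outrage bait', 'family photo', 'close friend']
```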

13:11
Now, so this has a lot of consequences. You may be thinking somebody is snubbing you on Facebook. The algorithm may never be showing your post to them. The algorithm is prioritizing some of them and burying the others.

13:29
Experiments show that what the algorithm picks to show you can affect your emotions. But that's not all. It also affects political behavior.

13:41
So in 2010, in the midterm elections, Facebook did an experiment on 61 million people in the US that was disclosed after the fact. So some people were shown, "Today is election day," the simpler one, and some people were shown the one with that tiny tweak with those little thumbnails of your friends who clicked on "I voted." This simple tweak. OK? So the pictures were the only change, and that post shown just once turned out an additional 340,000 voters in that election, according to this research as confirmed by the voter rolls. A fluke? No. Because in 2012, they repeated the same experiment. And that time, that civic message shown just once turned out an additional 270,000 voters. For reference, the 2016 US presidential election was decided by about 100,000 votes.

15:01
Now, Facebook can also very easily infer what your politics are, even if you've never disclosed them on the site. Right? These algorithms can do that quite easily. What if a platform with that kind of power decides to turn out supporters of one candidate over the other? How would we even know about it?

15:25
Now, we started from someplace seemingly innocuous -- online ads following us around -- and we've landed someplace else. As a public and as citizens, we no longer know if we're seeing the same information or what anybody else is seeing, and without a common basis of information, little by little, public debate is becoming impossible, and we're just at the beginning stages of this.

15:54
These algorithms can quite easily infer things like your people's ethnicity, religious and political views, personality traits, intelligence, happiness, use of addictive substances, parental separation, age and genders, just from Facebook likes. These algorithms can identify protesters even if their faces are partially concealed. These algorithms may be able to detect people's sexual orientation just from their dating profile pictures.
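
Inferring traits from likes is the same supervised-classification pattern as the Vegas example, applied to a matrix of page likes: train on users whose attribute is known, then score users who never disclosed it. A toy sketch with a fabricated like matrix and labels:

```python
# Toy illustration of inferring an undisclosed attribute from "likes" alone.
# The like matrix and labels are fabricated; the published research behind
# this claim used millions of real profiles.
import numpy as np
from sklearn.linear_model import LogisticRegression

pages = ["page_0", "page_1", "page_2", "page_3", "page_4"]
# Rows = users, columns = pages, 1 = user liked that page.
likes = np.array([
    [1, 0, 1, 0, 0],
    [1, 1, 1, 0, 0],
    [0, 0, 0, 1, 1],
    [0, 1, 0, 1, 1],
    [1, 0, 1, 1, 0],
    [0, 0, 1, 1, 1],
])
# A self-reported attribute for the training users (e.g. a political label).
attribute = np.array([1, 1, 0, 0, 1, 0])

clf = LogisticRegression().fit(likes, attribute)

# A user who never disclosed the attribute, only a handful of likes.
undisclosed_user = np.array([[1, 1, 0, 0, 0]])
print(clf.predict_proba(undisclosed_user)[0, 1])  # inferred probability
```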

16:33
Now, these are probabilistic guesses, so they're not going to be 100 percent right, but I don't see the powerful resisting the temptation to use these technologies just because there are some false positives, which will of course create a whole other layer of problems. Imagine what a state can do with the immense amount of data it has on its citizens. China is already using face detection technology to identify and arrest people.

17:05
And here's the tragedy: we're building this infrastructure of surveillance authoritarianism merely to get people to click on ads.

17:17
And this won't be Orwell's authoritarianism. This isn't "1984." Now, if authoritarianism is using overt fear to terrorize us, we'll all be scared, but we'll know it, we'll hate it and we'll resist it. But if the people in power are using these algorithms to quietly watch us, to judge us and to nudge us, to predict and identify the troublemakers and the rebels, to deploy persuasion architectures at scale and to manipulate individuals one by one using their personal, individual weaknesses and vulnerabilities, and if they're doing it at scale through our private screens so that we don't even know what our fellow citizens and neighbors are seeing, that authoritarianism will envelop us like a spider's web and we may not even know we're in it.

18:22
So Facebook's market capitalization is approaching half a trillion dollars. It's because it works great as a persuasion architecture. But the structure of that architecture is the same whether you're selling shoes or whether you're selling politics. The algorithms do not know the difference. The same algorithms set loose upon us to make us more pliable for ads are also organizing our political, personal and social information flows, and that's what's got to change.

19:02
Now, don't get me wrong, we use digital platforms because they provide us with great value. I use Facebook to keep in touch with friends and family around the world. I've written about how crucial social media is for social movements. I have studied how these technologies can be used to circumvent censorship around the world.

19:27
But it's not that the people who run, you know, Facebook or Google are maliciously and deliberately trying to make the country or the world more polarized and encourage extremism. I read the many well-intentioned statements that these people put out. But it's not the intent or the statements people in technology make that matter, it's the structures and business models they're building. And that's the core of the problem. Either Facebook is a giant con of half a trillion dollars and ads don't work on the site, it doesn't work as a persuasion architecture, or its power of influence is of great concern. It's either one or the other. It's similar for Google, too.

20:24
So what can we do? This needs to change. Now, I can't offer a simple recipe, because we need to restructure the whole way our digital technology operates. Everything from the way technology is developed to the way the incentives, economic and otherwise, are built into the system.

20:48
We have to face and try to deal with the lack of transparency created by the proprietary algorithms, the structural challenge of machine learning's opacity, all this indiscriminate data that's being collected about us. We have a big task in front of us. We have to mobilize our technology, our creativity and yes, our politics so that we can build artificial intelligence that supports us in our human goals but that is also constrained by our human values. And I understand this won't be easy. We might not even easily agree on what those terms mean.

21:34
But if we take seriously how these systems that we depend on for so much operate, I don't see how we can postpone this conversation anymore. These structures are organizing how we function and they're controlling what we can and we cannot do. And many of these ad-financed platforms, they boast that they're free. In this context, it means that we are the product that's being sold.

22:10
We need a digital economy where our data and our attention is not for sale to the highest-bidding authoritarian or demagogue.

22:23
(Applause)

22:30
So to go back to that Hollywood paraphrase, we do want the prodigious potential of artificial intelligence and digital technology to blossom, but for that, we must face this prodigious menace, open-eyed and now.

22:48
Thank you.

22:49
(Applause)