We're building a dystopia just to make people click on ads | Zeynep Tufekci
738,629 views ・ 2017-11-17
Translator: Lilian Chiu
Reviewer: Helen Chang
00:12
So when people voice fears of artificial intelligence, very often, they invoke images of humanoid robots run amok. You know? Terminator? You know, that might be something to consider, but that's a distant threat. Or, we fret about digital surveillance with metaphors from the past. "1984," George Orwell's "1984," it's hitting the bestseller lists again. It's a great book, but it's not the correct dystopia for the 21st century.

00:44
What we need to fear most is not what artificial intelligence will do to us on its own, but how the people in power will use artificial intelligence to control us and to manipulate us in novel, sometimes hidden, subtle and unexpected ways.
01:04
Much of the technology that threatens our freedom and our dignity in the near-term future is being developed by companies in the business of capturing and selling our data and our attention to advertisers and others: Facebook, Google, Amazon, Alibaba, Tencent.

01:26
Now, artificial intelligence has started bolstering their business as well. And it may seem like artificial intelligence is just the next thing after online ads. It's not. It's a jump in category. It's a whole different world, and it has great potential. It could accelerate our understanding of many areas of study and research.

01:53
But to paraphrase a famous Hollywood philosopher, "With prodigious potential comes prodigious risk."
02:01
Now let's look at a basic fact of our digital lives, online ads. Right? We kind of dismiss them. They seem crude, ineffective. We've all had the experience of being followed on the web by an ad based on something we searched or read. You know, you look up a pair of boots and for a week, those boots are following you around everywhere you go. Even after you succumb and buy them, they're still following you around. We're kind of inured to that kind of basic, cheap manipulation. We roll our eyes and we think, "You know what? These things don't work."

02:33
Except, online, the digital technologies are not just ads.

02:40
Now, to understand that, let's think of a physical world example.
02:43
You know how, at the checkout counters at supermarkets, near the cashier, there's candy and gum at the eye level of kids? That's designed to make them whine at their parents just as the parents are about to sort of check out. Now, that's a persuasion architecture. It's not nice, but it kind of works. That's why you see it in every supermarket. Now, in the physical world, such persuasion architectures are kind of limited, because you can only put so many things by the cashier. Right? And the candy and gum, it's the same for everyone, even though it mostly works only for people who have whiny little humans beside them. In the physical world, we live with those limitations.
03:34
In the digital world, though, persuasion architectures can be built at the scale of billions and they can target, infer, understand and be deployed at individuals one by one by figuring out your weaknesses, and they can be sent to everyone's phone private screen, so it's not visible to us. And that's different. And that's just one of the basic things that artificial intelligence can do.
04:04
Now, let's take an example. Let's say you want to sell plane tickets to Vegas. Right? So in the old world, you could think of some demographics to target based on experience and what you can guess. You might try to advertise to, oh, men between the ages of 25 and 35, or people who have a high limit on their credit card, or retired couples. Right? That's what you would do in the past.
04:28
With big data and machine learning, that's not how it works anymore. So to imagine that, think of all the data that Facebook has on you: every status update you ever typed, every Messenger conversation, every place you logged in from, all your photographs that you uploaded there. If you start typing something and change your mind and delete it, Facebook keeps those and analyzes them, too. Increasingly, it tries to match you with your offline data. It also purchases a lot of data from data brokers. It could be everything from your financial records to a good chunk of your browsing history. Right? In the US, such data is routinely collected, collated and sold. In Europe, they have tougher rules.
05:23
So what happens then is, by churning through all that data, these machine-learning algorithms -- that's why they're called learning algorithms -- they learn to understand the characteristics of people who purchased tickets to Vegas before. When they learn this from existing data, they also learn how to apply this to new people. So if they're presented with a new person, they can classify whether that person is likely to buy a ticket to Vegas or not.
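To make that concrete, here is a minimal sketch, in Python, of the kind of classifier being described. The people, feature names and outcomes are invented for illustration; real ad systems train on vastly more data and far richer, less interpretable signals.

```python
# A minimal sketch of the "likely Vegas buyer" classifier described above.
# Every feature name and number here is hypothetical.
from sklearn.linear_model import LogisticRegression

# Each row is one past person; columns are invented behavioral signals,
# e.g. [flight_searches_last_30d, casino_page_likes, late_night_activity].
past_people = [
    [4, 2, 0.9],
    [0, 0, 0.1],
    [7, 5, 0.8],
    [1, 0, 0.2],
]
bought_vegas_ticket = [1, 0, 1, 0]  # observed outcomes in the historical data

model = LogisticRegression()
model.fit(past_people, bought_vegas_ticket)

# A brand-new person: the model returns a purchase probability,
# not an explanation of why this person looks like a likely buyer.
new_person = [[5, 3, 0.7]]
print(model.predict_proba(new_person)[0][1])
```

Even in this toy version, the output is just a probability; the "reasoning" lives in fitted coefficients that nobody inspects at production scale.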
05:57
Fine. You're thinking, an offer to buy tickets to Vegas. I can ignore that. But the problem isn't that. The problem is, we no longer really understand how these complex algorithms work. We don't understand how they're doing this categorization. It's giant matrices, thousands of rows and columns, maybe millions of rows and columns, and not the programmers and not anybody who looks at it, even if you have all the data, understands anymore how exactly it's operating any more than you'd know what I was thinking right now if you were shown a cross section of my brain.

06:44
It's like we're not programming anymore, we're growing intelligence that we don't truly understand.
06:52
And these things only work if there's an enormous amount of data, so they also encourage deep surveillance on all of us so that the machine learning algorithms can work. That's why Facebook wants to collect all the data it can about you. The algorithms work better.

07:08
So let's push that Vegas example a bit. What if the system that we do not understand was picking up that it's easier to sell Vegas tickets to people who are bipolar and about to enter the manic phase. Such people tend to become overspenders, compulsive gamblers. They could do this, and you'd have no clue that's what they were picking up on. I gave this example to a bunch of computer scientists once and afterwards, one of them came up to me. He was troubled and he said, "That's why I couldn't publish it." I was like, "Couldn't publish what?" He had tried to see whether you can indeed figure out the onset of mania from social media posts before clinical symptoms, and it had worked, and it had worked very well, and he had no idea how it worked or what it was picking up on.

08:06
Now, the problem isn't solved if he doesn't publish it, because there are already companies that are developing this kind of technology, and a lot of the stuff is just off the shelf. This is not very difficult anymore.
08:21
Do you ever go on YouTube meaning to watch one video and an hour later you've watched 27? You know how YouTube has this column on the right that says, "Up next" and it autoplays something? It's an algorithm picking what it thinks that you might be interested in and maybe not find on your own. It's not a human editor. It's what algorithms do. It picks up on what you have watched and what people like you have watched, and infers that that must be what you're interested in, what you want more of, and just shows you more. It sounds like a benign and useful feature, except when it isn't.
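As a rough illustration of that logic, here is a toy "Up next" picker that scores candidate videos purely by how much extra watch time they generated for people with a similar history. The video IDs and numbers are made up; YouTube's actual recommender is proprietary and far more elaborate.

```python
# Toy "Up next" ranking: maximize expected additional watch time.
# All IDs and co-watch statistics below are invented.
watch_history = ["clip_a", "clip_b"]

# For each watched video: which videos similar viewers went on to watch,
# and how many minutes they then stayed.
co_watch = {
    "clip_a": {"clip_c": 3.0, "clip_d": 9.5},
    "clip_b": {"clip_d": 8.0, "clip_e": 2.5},
}

def up_next(history):
    """Pick the unseen video with the highest expected extra watch time."""
    scores = {}
    for watched in history:
        for candidate, minutes in co_watch.get(watched, {}).items():
            if candidate not in history:
                scores[candidate] = scores.get(candidate, 0.0) + minutes
    # The winner is whatever keeps people on the site longest --
    # the objective says nothing about what the content actually is.
    return max(scores, key=scores.get)

print(up_next(watch_history))  # -> "clip_d"
```

Nothing in that objective distinguishes a cooking tutorial from an extremist video; only predicted watch time matters.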
09:01
So in 2016, I attended rallies of then-candidate Donald Trump to study as a scholar the movement supporting him. I study social movements, so I was studying it, too. And then I wanted to write something about one of his rallies, so I watched it a few times on YouTube.

09:23
YouTube started recommending to me and autoplaying to me white supremacist videos in increasing order of extremism. If I watched one, it served up one even more extreme and autoplayed that one, too. If you watch Hillary Clinton or Bernie Sanders content, YouTube recommends and autoplays conspiracy left, and it goes downhill from there.

09:52
Well, you might be thinking, this is politics, but it's not. This isn't about politics. This is just the algorithm figuring out human behavior. I once watched a video about vegetarianism on YouTube and YouTube recommended and autoplayed a video about being vegan. It's like you're never hardcore enough for YouTube.

10:12
(Laughter)
10:14
So what's going on? Now, YouTube's algorithm is proprietary, but here's what I think is going on. The algorithm has figured out that if you can entice people into thinking that you can show them something more hardcore, they're more likely to stay on the site watching video after video going down that rabbit hole while Google serves them ads.
10:43
Now, with nobody minding the ethics of the store, these sites can profile people who are Jew haters, who think that Jews are parasites and who have such explicit anti-Semitic content, and let you target them with ads. They can also mobilize algorithms to find for you look-alike audiences, people who do not have such explicit anti-Semitic content on their profile but who the algorithm detects may be susceptible to such messages, and lets you target them with ads, too.
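A bare-bones sketch of what "look-alike" expansion amounts to, under the assumption that each user has been reduced to a few numeric behavioral signals: rank everyone outside the seed audience by how close they sit to the seed's average profile. All names and numbers below are invented.

```python
# Toy look-alike audience expansion: find users closest to the seed audience.
import math

# Invented per-user feature vectors (e.g. engagement signals on various topics).
users = {
    "seed_1":  [0.90, 0.80, 0.10],
    "seed_2":  [0.80, 0.90, 0.20],
    "other_a": [0.85, 0.75, 0.15],
    "other_b": [0.10, 0.20, 0.90],
}
seed = {"seed_1", "seed_2"}

def centroid(ids):
    """Average feature vector of the seed audience."""
    columns = zip(*(users[i] for i in ids))
    return [sum(col) / len(ids) for col in columns]

center = centroid(seed)
candidates = [u for u in users if u not in seed]
# Rank outsiders by Euclidean distance to the seed's average profile.
ranked = sorted(candidates, key=lambda u: math.dist(users[u], center))
print(ranked)  # -> ['other_a', 'other_b']: other_a is the "look-alike"
```

The people surfaced this way never posted anything explicit themselves; they are included only because their measured behavior resembles the seed's.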
11:30
Now, this may sound like an implausible example, but this is real. ProPublica investigated this and found that you can indeed do this on Facebook, and Facebook helpfully offered up suggestions on how to broaden that audience. BuzzFeed tried it for Google, and very quickly they found, yep, you can do it on Google, too. And it wasn't even expensive. The ProPublica reporter spent about 30 dollars to target this category.
12:02
So last year, Donald Trump's social media manager disclosed that they were using Facebook dark posts to demobilize people, not to persuade them, but to convince them not to vote at all. And to do that, they targeted specifically, for example, African-American men in key cities like Philadelphia, and I'm going to read exactly what he said. I'm quoting. They were using "nonpublic posts whose viewership the campaign controls so that only the people we want to see it see it. We modeled this. It will dramatically affect her ability to turn these people out."

12:45
What's in those dark posts? We have no idea. Facebook won't tell us.
12:52
So Facebook also algorithmically arranges the posts that your friends put on Facebook, or the pages you follow. It doesn't show you everything chronologically. It puts the order in the way that the algorithm thinks will entice you to stay on the site longer.

13:11
Now, so this has a lot of consequences. You may be thinking somebody is snubbing you on Facebook. The algorithm may never be showing your post to them. The algorithm is prioritizing some of them and burying the others.

13:29
Experiments show that what the algorithm picks to show you can affect your emotions. But that's not all. It also affects political behavior.
13:41
So in 2010, in the midterm elections, Facebook did an experiment on 61 million people in the US that was disclosed after the fact. So some people were shown, "Today is election day," the simpler one, and some people were shown the one with that tiny tweak with those little thumbnails of your friends who clicked on "I voted."

14:09
This simple tweak. OK? So the pictures were the only change, and that post shown just once turned out an additional 340,000 voters in that election, according to this research as confirmed by the voter rolls.

14:32
A fluke? No. Because in 2012, they repeated the same experiment. And that time, that civic message shown just once turned out an additional 270,000 voters.

14:51
For reference, the 2016 US presidential election was decided by about 100,000 votes.
15:01
Now, Facebook can also very easily infer what your politics are, even if you've never disclosed them on the site. Right? These algorithms can do that quite easily. What if a platform with that kind of power decides to turn out supporters of one candidate over the other? How would we even know about it?
15:25
Now, we started from someplace seemingly innocuous -- online ads following us around -- and we've landed someplace else. As a public and as citizens, we no longer know if we're seeing the same information or what anybody else is seeing, and without a common basis of information, little by little, public debate is becoming impossible, and we're just at the beginning stages of this.

15:54
These algorithms can quite easily infer things like your people's ethnicity, religious and political views, personality traits, intelligence, happiness, use of addictive substances, parental separation, age and genders, just from Facebook likes.
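As a tiny, invented illustration of how much "likes" alone can leak, the sketch below estimates, for each page, how often its fans carry a given binary trait, then scores a new user from nothing but the pages they liked. The pages, users and trait are all made up; published studies do this with millions of users and many traits.

```python
# Toy trait inference from "likes" only. Every name and label is invented.
from collections import defaultdict

likes = {                      # user -> pages they liked
    "u1": {"page_x", "page_y"},
    "u2": {"page_x"},
    "u3": {"page_z"},
    "u4": {"page_z", "page_y"},
}
trait = {"u1": 1, "u2": 1, "u3": 0, "u4": 0}  # known binary trait for training users

# Estimate, per page, the share of its fans who have the trait.
page_labels = defaultdict(list)
for user, pages in likes.items():
    for page in pages:
        page_labels[page].append(trait[user])

def predict(pages_liked):
    """Score a new user by averaging the per-page trait rates of their likes."""
    rates = [sum(v) / len(v) for p, v in page_labels.items() if p in pages_liked]
    return sum(rates) / len(rates) if rates else 0.5

print(predict({"page_x"}))  # -> 1.0 in this toy data: likes alone reveal the trait
```

The new user never states the trait anywhere; the estimate comes entirely from which pages they happen to like.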
16:13
These algorithms can identify protesters even if their faces are partially concealed. These algorithms may be able to detect people's sexual orientation just from their dating profile pictures.

16:33
Now, these are probabilistic guesses, so they're not going to be 100 percent right, but I don't see the powerful resisting the temptation to use these technologies just because there are some false positives, which will of course create a whole other layer of problems. Imagine what a state can do with the immense amount of data it has on its citizens. China is already using face detection technology to identify and arrest people.
17:05
And here's the tragedy: we're building this infrastructure of surveillance authoritarianism merely to get people to click on ads.

17:17
And this won't be Orwell's authoritarianism. This isn't "1984." Now, if authoritarianism is using overt fear to terrorize us, we'll all be scared, but we'll know it, we'll hate it and we'll resist it. But if the people in power are using these algorithms to quietly watch us, to judge us and to nudge us, to predict and identify the troublemakers and the rebels, to deploy persuasion architectures at scale and to manipulate individuals one by one using their personal, individual weaknesses and vulnerabilities, and if they're doing it at scale through our private screens so that we don't even know what our fellow citizens and neighbors are seeing, that authoritarianism will envelop us like a spider's web and we may not even know we're in it.
18:22
So Facebook's market capitalization is approaching half a trillion dollars. It's because it works great as a persuasion architecture. But the structure of that architecture is the same whether you're selling shoes or whether you're selling politics. The algorithms do not know the difference. The same algorithms set loose upon us to make us more pliable for ads are also organizing our political, personal and social information flows, and that's what's got to change.
19:02
Now, don't get me wrong, we use digital platforms because they provide us with great value. I use Facebook to keep in touch with friends and family around the world. I've written about how crucial social media is for social movements. I have studied how these technologies can be used to circumvent censorship around the world.
19:27
But it's not that the people who run, you know, Facebook or Google are maliciously and deliberately trying to make the country or the world more polarized and encourage extremism. I read the many well-intentioned statements that these people put out. But it's not the intent or the statements people in technology make that matter, it's the structures and business models they're building. And that's the core of the problem.

20:04
Either Facebook is a giant con of half a trillion dollars and ads don't work on the site, it doesn't work as a persuasion architecture, or its power of influence is of great concern. It's either one or the other. It's similar for Google, too.
20:24
So what can we do? This needs to change. Now, I can't offer a simple recipe, because we need to restructure the whole way our digital technology operates. Everything from the way technology is developed to the way the incentives, economic and otherwise, are built into the system. We have to face and try to deal with the lack of transparency created by the proprietary algorithms, the structural challenge of machine learning's opacity, all this indiscriminate data that's being collected about us.

21:05
We have a big task in front of us. We have to mobilize our technology, our creativity and yes, our politics so that we can build artificial intelligence that supports us in our human goals but that is also constrained by our human values. And I understand this won't be easy. We might not even easily agree on what those terms mean.
21:34
But if we take seriously how these systems that we depend on for so much operate, I don't see how we can postpone this conversation anymore. These structures are organizing how we function and they're controlling what we can and we cannot do. And many of these ad-financed platforms, they boast that they're free. In this context, it means that we are the product that's being sold. We need a digital economy where our data and our attention is not for sale to the highest-bidding authoritarian or demagogue.
22:23
(Applause)

22:30
So to go back to that Hollywood paraphrase, we do want the prodigious potential of artificial intelligence and digital technology to blossom, but for that, we must face this prodigious menace, open-eyed and now.

22:48
Thank you.

22:49
(Applause)