How a handful of tech companies control billions of minds every day | Tristan Harris
947,059 views ・ 2017-07-28
00:12
I want you to imagine walking into a room, a control room with a bunch of people, a hundred people, hunched over a desk with little dials, and that that control room will shape the thoughts and feelings of a billion people.
00:32
This might sound like science fiction, but this actually exists right now, today.
00:40
I know because I used to be in one of those control rooms. I was a design ethicist at Google, where I studied how do you ethically steer people's thoughts? Because what we don't talk about is how the handful of people working at a handful of technology companies through their choices will steer what a billion people are thinking today. Because when you pull out your phone and they design how this works or what's on the feed, it's scheduling little blocks of time in our minds. If you see a notification, it schedules you to have thoughts that maybe you didn't intend to have. If you swipe over that notification, it schedules you into spending a little bit of time getting sucked into something that maybe you didn't intend to get sucked into.
01:27
When we talk about technology, we tend to talk about it as this blue sky opportunity. It could go any direction. And I want to get serious for a moment and tell you why it's going in a very specific direction. Because it's not evolving randomly. There's a hidden goal driving the direction of all of the technology we make, and that goal is the race for our attention.
01:52
Because every news site, TED, elections, politicians, games, even meditation apps have to compete for one thing, which is our attention, and there's only so much of it.
02:08
And the best way to get people's attention is to know how someone's mind works. And there's a whole bunch of persuasive techniques that I learned in college at a lab called the Persuasive Technology Lab to get people's attention.
02:21
A simple example is YouTube. YouTube wants to maximize how much time you spend. And so what do they do? They autoplay the next video. And let's say that works really well. They're getting a little bit more of people's time. Well, if you're Netflix, you look at that and say, well, that's shrinking my market share, so I'm going to autoplay the next episode. But then if you're Facebook, you say, that's shrinking all of my market share, so now I have to autoplay all the videos in the newsfeed before waiting for you to click play.
02:52
So the internet is not evolving at random. The reason it feels like it's sucking us in the way it is is because of this race for attention. We know where this is going. Technology is not neutral, and it becomes this race to the bottom of the brain stem of who can go lower to get it.
03:13
Let me give you an example of Snapchat. If you didn't know, Snapchat is the number one way that teenagers in the United States communicate. So if you're like me, and you use text messages to communicate, Snapchat is that for teenagers, and there's, like, a hundred million of them that use it. And they invented a feature called Snapstreaks, which shows the number of days in a row that two people have communicated with each other. In other words, what they just did is they gave two people something they don't want to lose.
03:44
Because if you're a teenager, and you have 150 days in a row, you don't want that to go away. And so think of the little blocks of time that that schedules in kids' minds.
03:54
This isn't theoretical: when kids go on vacation, it's been shown they give their passwords to up to five other friends to keep their Snapstreaks going, even when they can't do it. And they have, like, 30 of these things, and so they have to get through taking photos of just pictures or walls or ceilings just to get through their day.
04:13
So it's not even like they're having real conversations. We have a temptation to think about this as, oh, they're just using Snapchat the way we used to gossip on the telephone. It's probably OK. Well, what this misses is that in the 1970s, when you were just gossiping on the telephone, there wasn't a hundred engineers on the other side of the screen who knew exactly how your psychology worked and orchestrated you into a double bind with each other.
04:38
Now, if this is making you feel a little bit of outrage, notice that that thought just comes over you. Outrage is a really good way also of getting your attention, because we don't choose outrage. It happens to us.
04:52
And if you're the Facebook newsfeed, whether you'd want to or not, you actually benefit when there's outrage. Because outrage doesn't just schedule a reaction in emotional time, space, for you. We want to share that outrage with other people. So we want to hit share and say, "Can you believe the thing that they said?" And so outrage works really well at getting attention, such that if Facebook had a choice between showing you the outrage feed and a calm newsfeed, they would want to show you the outrage feed, not because someone consciously chose that, but because that worked better at getting your attention.
05:31
And the newsfeed control room is not accountable to us. It's only accountable to maximizing attention. It's also accountable, because of the business model of advertising, for anybody who can pay the most to actually walk into the control room and say, "That group over there, I want to schedule these thoughts into their minds." So you can target, you can precisely target a lie directly to the people who are most susceptible. And because this is profitable, it's only going to get worse.
06:05
So I'm here today because the costs are so obvious. I don't know a more urgent problem than this, because this problem is underneath all other problems. It's not just taking away our agency to spend our attention and live the lives that we want, it's changing the way that we have our conversations, it's changing our democracy, and it's changing our ability to have the conversations and relationships we want with each other. And it affects everyone, because a billion people have one of these in their pocket.
06:45
So how do we fix this? We need to make three radical changes to technology and to our society.
06:55
The first is we need to acknowledge that we are persuadable. Once you start understanding that your mind can be scheduled into having little thoughts or little blocks of time that you didn't choose, wouldn't we want to use that understanding and protect against the way that that happens? I think we need to see ourselves fundamentally in a new way. It's almost like a new period of human history, like the Enlightenment, but almost a kind of self-aware Enlightenment, that we can be persuaded, and there might be something we want to protect.
07:27
The second is we need new models and accountability systems so that as the world gets better and more and more persuasive over time -- because it's only going to get more persuasive -- that the people in those control rooms are accountable and transparent to what we want. The only form of ethical persuasion that exists is when the goals of the persuader are aligned with the goals of the persuadee. And that involves questioning big things, like the business model of advertising.
07:54
Lastly, we need a design renaissance, because once you have this view of human nature, that you can steer the timelines of a billion people -- just imagine, there's people who have some desire about what they want to do and what they want to be thinking and what they want to be feeling and how they want to be informed, and we're all just tugged into these other directions. And you have a billion people just tugged into all these different directions.

08:20
Well, imagine an entire design renaissance that tried to orchestrate the exact and most empowering time-well-spent way for those timelines to happen. And that would involve two things: one would be protecting against the timelines that we don't want to be experiencing, the thoughts that we wouldn't want to be happening, so that when that ding happens, not having the ding that sends us away; and the second would be empowering us to live out the timeline that we want.
08:43
So let me give you a concrete example. Today, let's say your friend cancels dinner on you, and you are feeling a little bit lonely. And so what do you do in that moment? You open up Facebook. And in that moment, the designers in the control room want to schedule exactly one thing, which is to maximize how much time you spend on the screen.

09:06
Now, instead, imagine if those designers created a different timeline that was the easiest way, using all of their data, to actually help you get out with the people that you care about? Just think, alleviating all loneliness in society, if that was the timeline that Facebook wanted to make possible for people.
09:26
Or imagine a different conversation. Let's say you wanted to post something supercontroversial on Facebook, which is a really important thing to be able to do, to talk about controversial topics. And right now, when there's that big comment box, it's almost asking you, what key do you want to type? In other words, it's scheduling a little timeline of things you're going to continue to do on the screen. And imagine instead that there was another button there saying, what would be most time well spent for you? And you click "host a dinner." And right there underneath the item it said, "Who wants to RSVP for the dinner?" And so you'd still have a conversation about something controversial, but you'd be having it in the most empowering place on your timeline, which would be at home that night with a bunch of friends over to talk about it.
10:09
So imagine we're running, like, a find and replace on all of the timelines that are currently steering us towards more and more screen time persuasively and replacing all of those timelines with what do we want in our lives.

10:26
It doesn't have to be this way. Instead of handicapping our attention, imagine if we used all of this data and all of this power and this new view of human nature to give us a superhuman ability to focus and a superhuman ability to put our attention to what we cared about and a superhuman ability to have the conversations that we need to have for democracy.
10:51
The most complex challenges in the world require not just us to use our attention individually. They require us to use our attention and coordinate it together. Climate change is going to require that a lot of people are being able to coordinate their attention in the most empowering way together. And imagine creating a superhuman ability to do that.
11:19
Sometimes the world's most pressing and important problems are not these hypothetical future things that we could create in the future. Sometimes the most pressing problems are the ones that are right underneath our noses, the things that are already directing a billion people's thoughts. And maybe instead of getting excited about the new augmented reality and virtual reality and these cool things that could happen, which are going to be susceptible to the same race for attention, if we could fix the race for attention on the thing that's already in a billion people's pockets. Maybe instead of getting excited about the most exciting new cool fancy education apps, we could fix the way kids' minds are getting manipulated into sending empty messages back and forth.

12:04
(Applause)
12:08
Maybe instead of worrying about hypothetical future runaway artificial intelligences that are maximizing for one goal, we could solve the runaway artificial intelligence that already exists right now, which are these newsfeeds maximizing for one thing. It's almost like instead of running away to colonize new planets, we could fix the one that we're already on.
12:32
(Applause)

12:40
Solving this problem is critical infrastructure for solving every other problem. There's nothing in your life or in our collective problems that does not require our ability to put our attention where we care about. At the end of our lives, all we have is our attention and our time. What will be time well spent for ours? Thank you.

13:04
(Applause)
13:17
Chris Anderson: Tristan, thank you. Hey, stay up here a sec. First of all, thank you. I know we asked you to do this talk on pretty short notice, and you've had quite a stressful week getting this thing together, so thank you.

13:30
Some people listening might say, what you complain about is addiction, and all these people doing this stuff, for them it's actually interesting. All these design decisions have built user content that is fantastically interesting. The world's more interesting than it ever has been. What's wrong with that?
13:46
Tristan Harris: I think it's really interesting. One way to see this is if you're just YouTube, for example, you want to always show the more interesting next video. You want to get better and better at suggesting that next video, but even if you could propose the perfect next video that everyone would want to watch, it would just be better and better at keeping you hooked on the screen. So what's missing in that equation is figuring out what our boundaries would be. You would want YouTube to know something about, say, falling asleep. The CEO of Netflix recently said, "our biggest competitors are Facebook, YouTube and sleep." And so what we need to recognize is that the human architecture is limited and that we have certain boundaries or dimensions of our lives that we want to be honored and respected, and technology could help do that.

14:28
(Applause)
14:31
CA: I mean, could you make the case that part of the problem here is that we've got a naïve model of human nature? So much of this is justified in terms of human preference, where we've got these algorithms that do an amazing job of optimizing for human preference, but which preference? There's the preferences of things that we really care about when we think about them versus the preferences of what we just instinctively click on. If we could implant that more nuanced view of human nature in every design, would that be a step forward?
15:01
TH: Absolutely. I mean, I think right now it's as if all of our technology is basically only asking our lizard brain what's the best way to just impulsively get you to do the next tiniest thing with your time, instead of asking you in your life what would be most time well spent for you? What would be the perfect timeline that might include something later, would be time well spent for you here at TED in your last day here?
15:22
CA: So if Facebook and Google and everyone said to us first up, "Hey, would you like us to optimize for your reflective brain or your lizard brain? You choose."

15:29
TH: Right. That would be one way. Yes.
15:34
CA: You said persuadability, that's an interesting word to me because to me there's two different types of persuadability. There's the persuadability that we're trying right now of reason and thinking and making an argument, but I think you're almost talking about a different kind, a more visceral type of persuadability, of being persuaded without even knowing that you're thinking.
15:52
TH: Exactly. The reason I care about this problem so much is I studied at a lab called the Persuasive Technology Lab at Stanford that taught [students how to recognize] exactly these techniques. There's conferences and workshops that teach people all these covert ways of getting people's attention and orchestrating people's lives. And it's because most people don't know that that exists that this conversation is so important.
16:11
CA: Tristan, you and I, we both know so many people from all these companies. There are actually many here in the room, and I don't know about you, but my experience of them is that there is no shortage of good intent. People want a better world. They are actually -- they really want it. And I don't think anything you're saying is that these are evil people. It's a system where there's these unintended consequences that have really got out of control --
16:38
TH: Of this race for attention. It's the classic race to the bottom when you have to get attention, and it's so tense. The only way to get more is to go lower on the brain stem, to go lower into outrage, to go lower into emotion, to go lower into the lizard brain.
16:51
CA: Well, thank you so much for helping us all get a little bit wiser about this. Tristan Harris, thank you. TH: Thank you very much.

16:57
(Applause)