How a handful of tech companies control billions of minds every day | Tristan Harris

975,569 views ・ 2017-07-28

TED



Translator: Lipeng Chen    Reviewer: Jing Peng
00:12
I want you to imagine walking into a room, a control room with a bunch of people, a hundred people, hunched over a desk with little dials, and that that control room will shape the thoughts and feelings of a billion people.

00:32
This might sound like science fiction, but this actually exists right now, today.

00:40
I know because I used to be in one of those control rooms. I was a design ethicist at Google, where I studied how do you ethically steer people's thoughts? Because what we don't talk about is how the handful of people working at a handful of technology companies through their choices will steer what a billion people are thinking today.

01:02
Because when you pull out your phone and they design how this works or what's on the feed, it's scheduling little blocks of time in our minds. If you see a notification, it schedules you to have thoughts that maybe you didn't intend to have. If you swipe over that notification, it schedules you into spending a little bit of time getting sucked into something that maybe you didn't intend to get sucked into.

01:27
When we talk about technology, we tend to talk about it as this blue sky opportunity. It could go any direction. And I want to get serious for a moment and tell you why it's going in a very specific direction. Because it's not evolving randomly. There's a hidden goal driving the direction of all of the technology we make, and that goal is the race for our attention.

01:52
Because every news site, TED, elections, politicians, games, even meditation apps have to compete for one thing, which is our attention, and there's only so much of it.

02:08
And the best way to get people's attention is to know how someone's mind works. And there's a whole bunch of persuasive techniques that I learned in college at a lab called the Persuasive Technology Lab to get people's attention.

02:21
A simple example is YouTube. YouTube wants to maximize how much time you spend. And so what do they do? They autoplay the next video. And let's say that works really well. They're getting a little bit more of people's time. Well, if you're Netflix, you look at that and say, well, that's shrinking my market share, so I'm going to autoplay the next episode. But then if you're Facebook, you say, that's shrinking all of my market share, so now I have to autoplay all the videos in the newsfeed before waiting for you to click play.

02:52
So the internet is not evolving at random. The reason it feels like it's sucking us in the way it is is because of this race for attention. We know where this is going. Technology is not neutral, and it becomes this race to the bottom of the brain stem of who can go lower to get it.

03:13
Let me give you an example of Snapchat. If you didn't know, Snapchat is the number one way that teenagers in the United States communicate. So if you're like me, and you use text messages to communicate, Snapchat is that for teenagers, and there's, like, a hundred million of them that use it. And they invented a feature called Snapstreaks, which shows the number of days in a row that two people have communicated with each other. In other words, what they just did is they gave two people something they don't want to lose. Because if you're a teenager, and you have 150 days in a row, you don't want that to go away. And so think of the little blocks of time that that schedules in kids' minds. This isn't theoretical: when kids go on vacation, it's been shown they give their passwords to up to five other friends to keep their Snapstreaks going, even when they can't do it. And they have, like, 30 of these things, and so they have to get through taking photos of just pictures or walls or ceilings just to get through their day.

04:13
So it's not even like they're having real conversations. We have a temptation to think about this as, oh, they're just using Snapchat the way we used to gossip on the telephone. It's probably OK. Well, what this misses is that in the 1970s, when you were just gossiping on the telephone, there wasn't a hundred engineers on the other side of the screen who knew exactly how your psychology worked and orchestrated you into a double bind with each other.

04:38
Now, if this is making you feel a little bit of outrage, notice that that thought just comes over you. Outrage is a really good way also of getting your attention, because we don't choose outrage. It happens to us. And if you're the Facebook newsfeed, whether you'd want to or not, you actually benefit when there's outrage. Because outrage doesn't just schedule a reaction in emotional time, space, for you. We want to share that outrage with other people. So we want to hit share and say, "Can you believe the thing that they said?" And so outrage works really well at getting attention, such that if Facebook had a choice between showing you the outrage feed and a calm newsfeed, they would want to show you the outrage feed, not because someone consciously chose that, but because that worked better at getting your attention.

05:31
And the newsfeed control room is not accountable to us. It's only accountable to maximizing attention. It's also accountable, because of the business model of advertising, for anybody who can pay the most to actually walk into the control room and say, "That group over there, I want to schedule these thoughts into their minds." So you can target, you can precisely target a lie directly to the people who are most susceptible. And because this is profitable, it's only going to get worse.

06:05
So I'm here today because the costs are so obvious. I don't know a more urgent problem than this, because this problem is underneath all other problems. It's not just taking away our agency to spend our attention and live the lives that we want, it's changing the way that we have our conversations, it's changing our democracy, and it's changing our ability to have the conversations and relationships we want with each other. And it affects everyone, because a billion people have one of these in their pocket.

06:45
So how do we fix this? We need to make three radical changes to technology and to our society.

06:55
The first is we need to acknowledge that we are persuadable. Once you start understanding that your mind can be scheduled into having little thoughts or little blocks of time that you didn't choose, wouldn't we want to use that understanding and protect against the way that that happens? I think we need to see ourselves fundamentally in a new way. It's almost like a new period of human history, like the Enlightenment, but almost a kind of self-aware Enlightenment, that we can be persuaded, and there might be something we want to protect.

07:27
The second is we need new models and accountability systems so that as the world gets better and more and more persuasive over time -- because it's only going to get more persuasive -- that the people in those control rooms are accountable and transparent to what we want. The only form of ethical persuasion that exists is when the goals of the persuader are aligned with the goals of the persuadee. And that involves questioning big things, like the business model of advertising.

07:54
Lastly, we need a design renaissance, because once you have this view of human nature, that you can steer the timelines of a billion people -- just imagine, there's people who have some desire about what they want to do and what they want to be thinking and what they want to be feeling and how they want to be informed, and we're all just tugged into these other directions. And you have a billion people just tugged into all these different directions. Well, imagine an entire design renaissance that tried to orchestrate the exact and most empowering time-well-spent way for those timelines to happen. And that would involve two things: one would be protecting against the timelines that we don't want to be experiencing, the thoughts that we wouldn't want to be happening, so that when that ding happens, not having the ding that sends us away; and the second would be empowering us to live out the timeline that we want.

08:43
So let me give you a concrete example. Today, let's say your friend cancels dinner on you, and you are feeling a little bit lonely. And so what do you do in that moment? You open up Facebook. And in that moment, the designers in the control room want to schedule exactly one thing, which is to maximize how much time you spend on the screen.

09:06
Now, instead, imagine if those designers created a different timeline that was the easiest way, using all of their data, to actually help you get out with the people that you care about? Just think, alleviating all loneliness in society, if that was the timeline that Facebook wanted to make possible for people. Or imagine a different conversation. Let's say you wanted to post something supercontroversial on Facebook, which is a really important thing to be able to do, to talk about controversial topics. And right now, when there's that big comment box, it's almost asking you, what key do you want to type? In other words, it's scheduling a little timeline of things you're going to continue to do on the screen. And imagine instead that there was another button there saying, what would be most time well spent for you? And you click "host a dinner." And right there underneath the item it said, "Who wants to RSVP for the dinner?" And so you'd still have a conversation about something controversial, but you'd be having it in the most empowering place on your timeline, which would be at home that night with a bunch of friends over to talk about it.

10:09
So imagine we're running, like, a find and replace on all of the timelines that are currently steering us towards more and more screen time persuasively and replacing all of those timelines with what do we want in our lives.

10:26
It doesn't have to be this way. Instead of handicapping our attention, imagine if we used all of this data and all of this power and this new view of human nature to give us a superhuman ability to focus and a superhuman ability to put our attention to what we cared about and a superhuman ability to have the conversations that we need to have for democracy.

10:51
The most complex challenges in the world require not just us to use our attention individually. They require us to use our attention and coordinate it together. Climate change is going to require that a lot of people are being able to coordinate their attention in the most empowering way together. And imagine creating a superhuman ability to do that.

11:19
Sometimes the world's most pressing and important problems are not these hypothetical future things that we could create in the future. Sometimes the most pressing problems are the ones that are right underneath our noses, the things that are already directing a billion people's thoughts. And maybe instead of getting excited about the new augmented reality and virtual reality and these cool things that could happen, which are going to be susceptible to the same race for attention, if we could fix the race for attention on the thing that's already in a billion people's pockets. Maybe instead of getting excited about the most exciting new cool fancy education apps, we could fix the way kids' minds are getting manipulated into sending empty messages back and forth.

12:04
(Applause)

12:08
Maybe instead of worrying about hypothetical future runaway artificial intelligences that are maximizing for one goal, we could solve the runaway artificial intelligence that already exists right now, which are these newsfeeds maximizing for one thing. It's almost like instead of running away to colonize new planets, we could fix the one that we're already on.

12:32
(Applause)

12:40
Solving this problem is critical infrastructure for solving every other problem. There's nothing in your life or in our collective problems that does not require our ability to put our attention where we care about. At the end of our lives, all we have is our attention and our time. What will be time well spent for ours? Thank you.

13:04
(Applause)

13:17
Chris Anderson: Tristan, thank you. Hey, stay up here a sec. First of all, thank you. I know we asked you to do this talk on pretty short notice, and you've had quite a stressful week getting this thing together, so thank you. Some people listening might say, what you complain about is addiction, and all these people doing this stuff, for them it's actually interesting. All these design decisions have built user content that is fantastically interesting. The world's more interesting than it ever has been. What's wrong with that?

13:46
Tristan Harris: I think it's really interesting. One way to see this is if you're just YouTube, for example, you want to always show the more interesting next video. You want to get better and better at suggesting that next video, but even if you could propose the perfect next video that everyone would want to watch, it would just be better and better at keeping you hooked on the screen. So what's missing in that equation is figuring out what our boundaries would be. You would want YouTube to know something about, say, falling asleep. The CEO of Netflix recently said, "our biggest competitors are Facebook, YouTube and sleep." And so what we need to recognize is that the human architecture is limited and that we have certain boundaries or dimensions of our lives that we want to be honored and respected, and technology could help do that.

14:28
(Applause)

14:31
CA: I mean, could you make the case that part of the problem here is that we've got a naïve model of human nature? So much of this is justified in terms of human preference, where we've got these algorithms that do an amazing job of optimizing for human preference, but which preference? There's the preferences of things that we really care about when we think about them versus the preferences of what we just instinctively click on. If we could implant that more nuanced view of human nature in every design, would that be a step forward?

15:01
TH: Absolutely. I mean, I think right now it's as if all of our technology is basically only asking our lizard brain what's the best way to just impulsively get you to do the next tiniest thing with your time, instead of asking you in your life what would be most time well spent for you? What would be the perfect timeline that might include something later, would be time well spent for you here at TED in your last day here?

15:22
CA: So if Facebook and Google and everyone said to us first up, "Hey, would you like us to optimize for your reflective brain or your lizard brain? You choose."

15:29
TH: Right. That would be one way. Yes.

15:34
CA: You said persuadability, that's an interesting word to me because to me there's two different types of persuadability. There's the persuadability that we're trying right now of reason and thinking and making an argument, but I think you're almost talking about a different kind, a more visceral type of persuadability, of being persuaded without even knowing that you're thinking.

15:52
TH: Exactly. The reason I care about this problem so much is I studied at a lab called the Persuasive Technology Lab at Stanford that taught [students how to recognize] exactly these techniques. There's conferences and workshops that teach people all these covert ways of getting people's attention and orchestrating people's lives. And it's because most people don't know that that exists that this conversation is so important.

16:11
CA: Tristan, you and I, we both know so many people from all these companies. There are actually many here in the room, and I don't know about you, but my experience of them is that there is no shortage of good intent. People want a better world. They are actually -- they really want it. And I don't think anything you're saying is that these are evil people. It's a system where there's these unintended consequences that have really got out of control --

16:38
TH: Of this race for attention. It's the classic race to the bottom when you have to get attention, and it's so tense. The only way to get more is to go lower on the brain stem, to go lower into outrage, to go lower into emotion, to go lower into the lizard brain.

16:51
CA: Well, thank you so much for helping us all get a little bit wiser about this. Tristan Harris, thank you. TH: Thank you very much.

16:57
(Applause)