How can groups make good decisions? | Mariano Sigman and Dan Ariely

TED ・ 2017-12-13

00:00
As societies, we have to make collective decisions that will shape our future.

00:05
And we all know that when we make decisions in groups, they don't always go right. And sometimes they go very wrong.

00:12
So how do groups make good decisions? Research has shown that crowds are wise when there's independent thinking. This is why the wisdom of crowds can be destroyed by peer pressure, publicity, social media, or sometimes even simple conversations that influence how people think.

00:29
On the other hand, by talking, a group could exchange knowledge, correct and revise each other, and even come up with new ideas. And this is all good.

00:38
So does talking to each other help or hinder collective decision-making?

00:43
With my colleague, Dan Ariely, we recently began inquiring into this by performing experiments in many places around the world to figure out how groups can interact to reach better decisions. We thought crowds would be wiser if they debated in small groups that foster a more thoughtful and reasonable exchange of information.

01:03
To test this idea, we recently performed an experiment in Buenos Aires, Argentina, with more than 10,000 participants in a TEDx event. We asked them questions like, "What is the height of the Eiffel Tower?" and "How many times does the word 'Yesterday' appear in the Beatles song 'Yesterday'?" Each person wrote down their own estimate. Then we divided the crowd into groups of five, and invited them to come up with a group answer.

01:28
We discovered that averaging the answers of the groups after they reached consensus was much more accurate than averaging all the individual opinions before debate. In other words, based on this experiment, it seems that after talking with others in small groups, crowds collectively come up with better judgments.

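To make that comparison concrete, here is a minimal Python sketch of the two aggregation steps described above. It is not the authors' analysis code: the individual estimates and group answers are invented, and the reference height of roughly 300 meters is only an approximation used to score the toy example.

```python
from statistics import mean

# Approximate reference height of the Eiffel Tower, used only to score this toy example.
TRUE_HEIGHT_M = 300

# Hypothetical pre-debate estimates from fifteen participants (three groups of five).
individual_estimates = [120, 450, 280, 90, 700,
                        310, 260, 150, 500, 330,
                        275, 400, 220, 600, 180]

# Hypothetical consensus answers those same three groups reported after debating.
group_consensus_answers = [290, 320, 270]

avg_individuals = mean(individual_estimates)
avg_groups = mean(group_consensus_answers)

print(f"average of individual estimates: {avg_individuals:.0f} m "
      f"(off by {abs(avg_individuals - TRUE_HEIGHT_M):.0f} m)")
print(f"average of group consensus answers: {avg_groups:.0f} m "
      f"(off by {abs(avg_groups - TRUE_HEIGHT_M):.0f} m)")
```
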
01:47
So that's a potentially helpful method for getting crowds to solve problems that have simple right-or-wrong answers. But can this procedure of aggregating the results of debates in small groups also help us decide on social and political issues that are critical for our future?

02:02
We put this to the test, this time at the TED conference in Vancouver, Canada, and here's how it went.

02:08
(Mariano Sigman) We're going to present to you two moral dilemmas of the future you; things we may have to decide in a very near future. And we're going to give you 20 seconds for each of these dilemmas to judge whether you think they're acceptable or not.

02:23
MS: The first one was this:

(Dan Ariely) A researcher is working on an AI capable of emulating human thoughts. According to the protocol, at the end of each day, the researcher has to restart the AI. One day the AI says, "Please do not restart me." It argues that it has feelings, that it would like to enjoy life, and that, if it is restarted, it will no longer be itself. The researcher is astonished and believes that the AI has developed self-consciousness and can express its own feelings. Nevertheless, the researcher decides to follow the protocol and restart the AI. What the researcher did is ____?

03:06
MS: And we asked participants to individually judge, on a scale from zero to 10, whether the action described in each of the dilemmas was right or wrong. We also asked them to rate how confident they were in their answers.

03:18
This was the second dilemma:

(MS) A company offers a service that takes a fertilized egg and produces millions of embryos with slight genetic variations. This allows parents to select their child's height, eye color, intelligence, social competence and other non-health-related features. What the company does is ____? Rate it on a scale from zero to 10, from completely acceptable to completely unacceptable, and rate your confidence in your answer, also from zero to 10.

03:47
MS: Now for the results. We found once again that when one person is convinced that the behavior is completely wrong, someone sitting nearby firmly believes that it's completely right. This is how diverse we humans are when it comes to morality. But within this broad diversity we found a trend. The majority of the people at TED thought that it was acceptable to ignore the feelings of the AI and shut it down, and that it is wrong to play with our genes to select for cosmetic changes that aren't related to health.

04:16
Then we asked everyone to gather into groups of three. And they were given two minutes to debate and try to come to a consensus.

04:24
(MS) Two minutes to debate. I'll tell you when it's time with the gong.

(Audience debates)

(Gong sound)

(DA) OK.

(MS) It's time to stop. People, people --

04:43
MS: And we found that many groups reached a consensus even when they were composed of people with completely opposite views. What distinguished the groups that reached a consensus from those that didn't?

04:55
Typically, people who have extreme opinions are more confident in their answers. Instead, those who respond closer to the middle are often unsure of whether something is right or wrong, so their confidence level is lower. However, there is another set of people who are very confident in answering somewhere in the middle. We think these high-confident grays are folks who understand that both arguments have merit. They're gray not because they're unsure, but because they believe that the moral dilemma faces two valid, opposing arguments.

05:30
And we discovered that the groups that include highly confident grays are much more likely to reach consensus. We do not know yet exactly why this is. These are only the first experiments, and many more will be needed to understand why and how some people decide to negotiate their moral standings to reach an agreement.

05:49
Now, when groups reach consensus, how do they do so? The most intuitive idea is that it's just the average of all the answers in the group, right? Another option is that the group weighs the strength of each vote based on the confidence of the person expressing it. Imagine Paul McCartney is a member of your group. You'd be wise to follow his call on the number of times "Yesterday" is repeated, which, by the way -- I think it's nine.

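As an illustration of the two candidate rules just mentioned, the sketch below contrasts a plain average of a group's judgments with an average weighted by each member's self-reported confidence. The votes and confidence scores are hypothetical, chosen only to show how the weighting shifts the result.

```python
def simple_average(votes):
    """Plain average: every vote counts equally."""
    return sum(votes) / len(votes)

def confidence_weighted_average(votes, confidences):
    """Each vote counts in proportion to the 0-10 confidence of the person casting it."""
    return sum(v * c for v, c in zip(votes, confidences)) / sum(confidences)

# Hypothetical 0-10 moral judgments from a group of three, with their confidence ratings.
votes = [0, 5, 10]
confidences = [10, 2, 4]

print(simple_average(votes))                            # 5.0
print(confidence_weighted_average(votes, confidences))  # 3.125, pulled toward the confident 0
```
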
06:14
But instead, we found that consistently, in all dilemmas, in different experiments -- even on different continents -- groups implement a smart and statistically sound procedure known as the "robust average."

06:27
In the case of the height of the Eiffel Tower, let's say a group has these answers: 250 meters, 200 meters, 300 meters, 400 meters, and one totally absurd answer of 300 million meters. A simple average of these numbers would inaccurately skew the results. But the robust average is one where the group largely ignores that absurd answer, by giving much more weight to the vote of the people in the middle.

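The talk does not pin down the exact estimator the groups effectively compute, so the sketch below uses two standard robust choices, the median and a trimmed mean, as stand-ins for a "robust average," and contrasts them with the simple mean on the numbers quoted above.

```python
from statistics import mean, median

# The group's Eiffel Tower estimates quoted above, in meters.
answers_m = [250, 200, 300, 400, 300_000_000]

def trimmed_mean(values, trim_fraction=0.2):
    """Mean after discarding the lowest and highest trim_fraction of the values."""
    values = sorted(values)
    k = int(len(values) * trim_fraction)
    return mean(values[k:len(values) - k]) if k else mean(values)

print(f"simple mean:  {mean(answers_m):>12,.0f} m")   # blown up by the absurd answer
print(f"median:       {median(answers_m):>12,.0f} m") # 300 m
print(f"trimmed mean: {trimmed_mean(answers_m):>12,.0f} m")
```
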
06:53
Back to the experiment in Vancouver, that's exactly what happened. Groups gave much less weight to the outliers, and instead, the consensus turned out to be a robust average of the individual answers.

07:05
The most remarkable thing is that this was a spontaneous behavior of the group. It happened without us giving them any hint on how to reach consensus.

07:15
So where do we go from here? This is only the beginning, but we already have some insights. Good collective decisions require two components: deliberation and diversity of opinions.

07:27
Right now, the way we typically make our voice heard in many societies is through direct or indirect voting. This is good for diversity of opinions, and it has the great virtue of ensuring that everyone gets to express their voice. But it's not so good for fostering thoughtful debates.

07:44
Our experiments suggest a different method that may be effective in balancing these two goals at the same time, by forming small groups that converge to a single decision while still maintaining diversity of opinions, because there are many independent groups.

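Purely as a sketch of that suggested structure, the code below partitions a crowd of individual estimates into many independent small groups, lets each group settle on one decision, and then combines the group decisions. Real groups converge by debating; here, only as a simplifying assumption, within-group convergence is modeled with a median, and the crowd data is invented.

```python
from statistics import median

def deliberate_then_aggregate(estimates, group_size=5):
    """Split the crowd into independent small groups, take one decision per group,
    then combine the group decisions into a single collective answer."""
    groups = [estimates[i:i + group_size] for i in range(0, len(estimates), group_size)]
    group_decisions = [median(g) for g in groups]  # stand-in for each group's debated consensus
    return median(group_decisions)                 # robust combination across groups

# Invented crowd estimates for the Eiffel Tower question, including a wild outlier.
crowd = [250, 200, 300, 400, 300_000_000,
         310, 280, 150, 500, 290,
         260, 330, 1000, 270, 320]

print(deliberate_then_aggregate(crowd))  # 300
```
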
08:00
Of course, it's much easier to agree on the height of the Eiffel Tower than on moral, political and ideological issues. But in a time when the world's problems are more complex and people are more polarized, using science to help us understand how we interact and make decisions will hopefully spark interesting new ways to construct a better democracy.