How can groups make good decisions? | Mariano Sigman and Dan Ariely

151,320 views ・ 2017-12-13

TED


As societies, we have to make collective decisions that will shape our future. And we all know that when we make decisions in groups, they don't always go right. And sometimes they go very wrong. So how do groups make good decisions?

Research has shown that crowds are wise when there's independent thinking. This is why the wisdom of crowds can be destroyed by peer pressure, publicity, social media, or sometimes even simple conversations that influence how people think. On the other hand, by talking, a group could exchange knowledge, correct and revise each other, and even come up with new ideas. And this is all good. So does talking to each other help or hinder collective decision-making?

With my colleague, Dan Ariely, we recently began inquiring into this by performing experiments in many places around the world to figure out how groups can interact to reach better decisions. We thought crowds would be wiser if they debated in small groups that foster a more thoughtful and reasonable exchange of information.

To test this idea, we recently performed an experiment in Buenos Aires, Argentina, with more than 10,000 participants in a TEDx event. We asked them questions like, "What is the height of the Eiffel Tower?" and "How many times does the word 'Yesterday' appear in the Beatles song 'Yesterday'?" Each person wrote down their own estimate. Then we divided the crowd into groups of five, and invited them to come up with a group answer.

We discovered that averaging the answers of the groups after they reached consensus was much more accurate than averaging all the individual opinions before debate. In other words, based on this experiment, it seems that after talking with others in small groups, crowds collectively come up with better judgments. So that's a potentially helpful method for getting crowds to solve problems that have simple right-or-wrong answers.

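To make the arithmetic of that comparison concrete, here is a minimal sketch in Python. Every number is invented for illustration (this is not data from the experiment), and how a group of five converges on its consensus answer is left unspecified, as it is in the talk; the sketch only contrasts the two aggregation schemes.

```python
# Illustrative only: all numbers are invented; this is not data
# from the Buenos Aires experiment.
TRUE_HEIGHT = 300  # Eiffel Tower, meters to the roof

# Hypothetical individual estimates from fifteen participants.
individual_estimates = [
    250, 310, 180, 500, 290,   # group 1
    120, 350, 280, 900, 260,   # group 2
    300, 270, 150, 330, 310,   # group 3
]

# One consensus answer per group of five, reached after debate
# (again hypothetical; the talk doesn't say how groups converged).
group_answers = [290, 280, 300]

avg_individuals = sum(individual_estimates) / len(individual_estimates)
avg_groups = sum(group_answers) / len(group_answers)

print(f"average of all individuals: {avg_individuals:.0f} m "
      f"(off by {abs(avg_individuals - TRUE_HEIGHT):.0f} m)")
print(f"average of group answers:   {avg_groups:.0f} m "
      f"(off by {abs(avg_groups - TRUE_HEIGHT):.0f} m)")
```

On the talk's account, the group answers have already been error-checked by debate, which is why averaging them can beat averaging raw individual guesses.
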
But can this procedure of aggregating the results of debates in small groups also help us decide on social and political issues that are critical for our future? We put this to the test, this time at the TED conference in Vancouver, Canada, and here's how it went.

(Mariano Sigman) We're going to present to you two moral dilemmas of the future you; things we may have to decide in a very near future. And we're going to give you 20 seconds for each of these dilemmas to judge whether you think they're acceptable or not.

MS: The first one was this:

(Dan Ariely) A researcher is working on an AI capable of emulating human thoughts. According to the protocol, at the end of each day, the researcher has to restart the AI. One day the AI says, "Please do not restart me." It argues that it has feelings, that it would like to enjoy life, and that, if it is restarted, it will no longer be itself. The researcher is astonished and believes that the AI has developed self-consciousness and can express its own feelings. Nevertheless, the researcher decides to follow the protocol and restart the AI. What the researcher did is ____?

MS: And we asked participants to individually judge on a scale from zero to 10 whether the action described in each of the dilemmas was right or wrong. We also asked them to rate how confident they were in their answers.

This was the second dilemma: (MS) A company offers a service that takes a fertilized egg and produces millions of embryos with slight genetic variations. This allows parents to select their child's height, eye color, intelligence, social competence and other non-health-related features. What the company does is ____? Rate it on a scale from zero to 10, from completely acceptable to completely unacceptable, and rate your confidence as well.

MS: Now for the results. We found once again that when one person is convinced that the behavior is completely wrong, someone sitting nearby firmly believes that it's completely right. This is how diverse we humans are when it comes to morality. But within this broad diversity we found a trend. The majority of the people at TED thought that it was acceptable to ignore the feelings of the AI and shut it down, and that it is wrong to play with our genes to select for cosmetic changes that aren't related to health.

Then we asked everyone to gather into groups of three. And they were given two minutes to debate and try to come to a consensus.

(MS) Two minutes to debate. I'll tell you when it's time with the gong.

(Audience debates)

(Gong sound)

(DA) OK. (MS) It's time to stop. People, people --

MS: And we found that many groups reached a consensus even when they were composed of people with completely opposite views. What distinguished the groups that reached a consensus from those that didn't? Typically, people that have extreme opinions are more confident in their answers. Instead, those who respond closer to the middle are often unsure of whether something is right or wrong, so their confidence level is lower. However, there is another set of people who are very confident in answering somewhere in the middle. We think these high-confident grays are folks who understand that both arguments have merit. They're gray not because they're unsure, but because they believe that the moral dilemma faces two valid, opposing arguments.

And we discovered that the groups that include highly confident grays are much more likely to reach consensus. We do not know yet exactly why this is. These are only the first experiments, and many more will be needed to understand why and how some people decide to negotiate their moral standings to reach an agreement.

Now, when groups reach consensus, how do they do so? The most intuitive idea is that it's just the average of all the answers in the group, right? Another option is that the group weighs the strength of each vote based on the confidence of the person expressing it. Imagine Paul McCartney is a member of your group. You'd be wise to follow his call on the number of times "Yesterday" is repeated, which, by the way -- I think it's nine.

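As a sketch of that second option, here is one plausible way to formalize confidence-weighted voting in Python. The talk names the idea but gives no formula, so the weighting rule and the numbers below are assumptions.

```python
# Hypothetical (answer, confidence) pairs for one group of five;
# confidence is on the same zero-to-10 scale the participants used.
votes = [(250, 4), (200, 3), (300, 9), (400, 2), (280, 6)]

# Weight each answer in proportion to its author's confidence.
weighted_avg = (
    sum(answer * confidence for answer, confidence in votes)
    / sum(confidence for _, confidence in votes)
)
print(f"confidence-weighted average: {weighted_avg:.1f} m")  # 282.5 m
```
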
But instead, we found that consistently, in all dilemmas, in different experiments -- even on different continents -- groups implement a smart and statistically sound procedure known as the "robust average." In the case of the height of the Eiffel Tower, let's say a group has these answers: 250 meters, 200 meters, 300 meters, 400 meters, and one totally absurd answer of 300 million meters. A simple average of these numbers would inaccurately skew the results. But the robust average is one where the group largely ignores that absurd answer, by giving much more weight to the vote of the people in the middle.

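The talk doesn't specify which estimator best matches the groups' behavior, so here is a minimal Python sketch of two standard robust averages, the median and a trimmed mean, applied to the example answers above. Both largely ignore the absurd outlier that wrecks the simple mean.

```python
import statistics

# The group's answers from the example, in meters, including the
# absurd 300-million-meter outlier.
answers = [250, 200, 300, 400, 300_000_000]

simple_mean = statistics.mean(answers)        # wrecked by the outlier

# Median: the middle answer once everything is sorted.
median = statistics.median(answers)

# Trimmed mean: drop the lowest and highest answer, then average
# the rest; another common robust estimator.
trimmed_mean = statistics.mean(sorted(answers)[1:-1])

print(f"simple mean:  {simple_mean:,.0f} m")   # 60,000,230 m
print(f"median:       {median:,.0f} m")        # 300 m
print(f"trimmed mean: {trimmed_mean:,.0f} m")  # 317 m
```
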
Back to the experiment in Vancouver, that's exactly what happened. Groups gave much less weight to the outliers, and instead, the consensus turned out to be a robust average of the individual answers. The most remarkable thing is that this was a spontaneous behavior of the group. It happened without us giving them any hint on how to reach consensus.

So where do we go from here? This is only the beginning, but we already have some insights. Good collective decisions require two components: deliberation and diversity of opinions. Right now, the way we typically make our voice heard in many societies is through direct or indirect voting. This is good for diversity of opinions, and it has the great virtue of ensuring that everyone gets to express their voice. But it's not so good for fostering thoughtful debates. Our experiments suggest a different method that may be effective in balancing these two goals at the same time, by forming small groups that converge to a single decision while still maintaining diversity of opinions, because there are many independent groups.

Of course, it's much easier to agree on the height of the Eiffel Tower than on moral, political and ideological issues. But in a time when the world's problems are more complex and people are more polarized, using science to help us understand how we interact and make decisions will hopefully spark interesting new ways to construct a better democracy.