How can groups make good decisions? | Mariano Sigman and Dan Ariely

158,993 views · 2017-12-13

TED


Translation: Igal Opendik · Editing: Ido Dekkers

00:00
As societies, we have to make collective decisions
00:03
that will shape our future.
00:05
And we all know that when we make decisions in groups,
00:07
they don't always go right.
00:09
And sometimes they go very wrong.
00:12
So how do groups make good decisions?
00:15
Research has shown that crowds are wise when there's independent thinking.
00:19
This is why the wisdom of the crowds can be destroyed by peer pressure,
00:22
publicity, social media,
00:24
or sometimes even simple conversations that influence how people think.
00:29
On the other hand, by talking, a group could exchange knowledge,
00:33
correct and revise each other
00:34
and even come up with new ideas.
00:36
And this is all good.
00:38
So does talking to each other help or hinder collective decision-making?
00:43
With my colleague, Dan Ariely,
00:45
we recently began inquiring into this by performing experiments
00:49
in many places around the world
00:50
to figure out how groups can interact to reach better decisions.
00:55
We thought crowds would be wiser if they debated in small groups
00:58
that foster a more thoughtful and reasonable exchange of information.
01:03
To test this idea,
01:04
we recently performed an experiment in Buenos Aires, Argentina,
01:07
with more than 10,000 participants in a TEDx event.
01:11
We asked them questions like,
01:12
"What is the height of the Eiffel Tower?"
01:14
and "How many times does the word 'Yesterday' appear
01:17
in the Beatles song 'Yesterday'?"
01:20
Each person wrote down their own estimate.
01:22
Then we divided the crowd into groups of five,
01:25
and invited them to come up with a group answer.
01:28
We discovered that averaging the answers of the groups
01:31
after they reached consensus
01:33
was much more accurate than averaging all the individual opinions
01:37
before debate.
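
To make that claim concrete, here is a toy simulation in Python — not the authors' actual analysis. It assumes individual estimates are noisy with occasional absurd outliers, and that a five-person group's consensus behaves like the median of its members' guesses (a simple stand-in for the "robust average" behavior described later in the talk); the true height and the noise model are illustrative assumptions.

```python
import random
import statistics

random.seed(0)
TRUE_HEIGHT = 330.0  # assumed true Eiffel Tower height, in meters

def individual_guess():
    """One person's estimate: usually near the truth, occasionally absurd."""
    if random.random() < 0.05:
        return random.uniform(1_000, 1_000_000)  # a wildly wrong outlier
    return random.gauss(TRUE_HEIGHT, 80)         # an ordinary noisy guess

guesses = [individual_guess() for _ in range(10_000)]

# Averaging all individual opinions "before debate":
mean_of_individuals = statistics.fmean(guesses)

# Assumed group rule: each group of five converges on its median answer,
# and then the group answers are averaged.
groups = [guesses[i:i + 5] for i in range(0, len(guesses), 5)]
mean_of_groups = statistics.fmean(statistics.median(g) for g in groups)

print(f"average of individuals:   {mean_of_individuals:>12,.0f} m")
print(f"average of group answers: {mean_of_groups:>12,.0f} m")
# The group-based aggregate lands much closer to 330 m, because each
# small group discards its own outlier before the averaging step.
```
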
01:38
In other words, based on this experiment,
01:41
it seems that after talking with others in small groups,
01:44
crowds collectively come up with better judgments.
01:47
So that's a potentially helpful method for getting crowds to solve problems
01:50
that have simple right-or-wrong answers.
01:53
But can this procedure of aggregating the results of debates in small groups
01:57
also help us decide on social and political issues
02:00
that are critical for our future?
02:02
We put this to the test, this time at the TED conference
02:05
in Vancouver, Canada,
02:07
and here's how it went.
02:08
(Mariano Sigman) We're going to present to you two moral dilemmas
02:11
of the future you;
02:12
things we may have to decide in a very near future.
02:16
And we're going to give you 20 seconds for each of these dilemmas
02:20
to judge whether you think they're acceptable or not.
02:23
MS: The first one was this:
02:24
(Dan Ariely) A researcher is working on an AI
02:27
capable of emulating human thoughts.
02:30
According to the protocol, at the end of each day,
02:33
the researcher has to restart the AI.
02:36
One day the AI says, "Please do not restart me."
02:40
It argues that it has feelings,
02:43
that it would like to enjoy life,
02:44
and that, if it is restarted,
02:46
it will no longer be itself.
02:49
The researcher is astonished
02:51
and believes that the AI has developed self-consciousness
02:54
and can express its own feelings.
02:57
Nevertheless, the researcher decides to follow the protocol
03:00
and restart the AI.
03:02
What the researcher did is ____?
03:06
MS: And we asked participants to individually judge
03:08
on a scale from zero to 10
03:10
whether the action described in each of the dilemmas
03:12
was right or wrong.
03:14
We also asked them to rate how confident they were on their answers.
03:18
This was the second dilemma:
03:20
(MS) A company offers a service that takes a fertilized egg
03:24
and produces millions of embryos with slight genetic variations.
03:29
This allows parents to select their child's height,
03:31
eye color, intelligence, social competence
03:34
and other non-health-related features.
03:38
What the company does is ____?
03:41
on a scale from zero to 10,
03:42
completely acceptable to completely unacceptable,
03:45
and your confidence in that answer, zero to 10.
03:47
MS: Now for the results.
03:49
We found once again that when one person is convinced
03:52
that the behavior is completely wrong,
03:54
someone sitting nearby firmly believes that it's completely right.
03:57
This is how diverse we humans are when it comes to morality.
04:01
But within this broad diversity we found a trend.
04:04
The majority of the people at TED thought that it was acceptable
04:07
to ignore the feelings of the AI and shut it down,
04:10
and that it is wrong to play with our genes
04:12
to select for cosmetic changes that aren't related to health.
04:16
Then we asked everyone to gather into groups of three.
04:19
And they were given two minutes to debate
04:21
and try to come to a consensus.
04:24
(MS) Two minutes to debate.
04:26
I'll tell you when it's time with the gong.
04:28
(Audience debates)
04:35
(Gong sound)
04:38
(DA) OK.
04:40
(MS) It's time to stop.
04:41
People, people --
04:43
MS: And we found that many groups reached a consensus
04:46
even when they were composed of people with completely opposite views.
04:50
What distinguished the groups that reached a consensus
04:53
from those that didn't?
04:55
Typically, people that have extreme opinions
04:58
are more confident in their answers.
05:00
Instead, those who respond closer to the middle
05:03
are often unsure of whether something is right or wrong,
05:07
so their confidence level is lower.
05:09
However, there is another set of people
05:12
who are very confident in answering somewhere in the middle.
05:16
We think these high-confident grays are folks who understand
05:20
that both arguments have merit.
05:22
They're gray not because they're unsure,
05:25
but because they believe that the moral dilemma faces
05:27
two valid, opposing arguments.
05:30
And we discovered that the groups that include highly confident grays
05:34
are much more likely to reach consensus.
05:36
We do not know yet exactly why this is.
05:39
These are only the first experiments,
05:41
and many more will be needed to understand why and how
05:44
some people decide to negotiate their moral standings
05:47
to reach an agreement.
05:49
Now, when groups reach consensus,
05:51
how do they do so?
05:53
The most intuitive idea is that it's just the average
05:55
of all the answers in the group, right?
05:57
Another option is that the group weighs the strength of each vote
06:01
based on the confidence of the person expressing it.
06:04
Imagine Paul McCartney is a member of your group.
06:07
You'd be wise to follow his call
06:09
on the number of times "Yesterday" is repeated,
06:11
which, by the way -- I think it's nine.
06:14
But instead, we found that consistently,
06:17
in all dilemmas, in different experiments --
06:19
even on different continents --
06:21
groups implement a smart and statistically sound procedure
06:25
known as the "robust average."
06:27
In the case of the height of the Eiffel Tower,
06:29
let's say a group has these answers:
06:31
250 meters, 200 meters, 300 meters, 400
06:36
and one totally absurd answer of 300 million meters.
06:40
A simple average of these numbers would inaccurately skew the results.
06:44
But the robust average is one where the group largely ignores
06:48
that absurd answer,
06:49
by giving much more weight to the vote of the people in the middle.
06:53
Back to the experiment in Vancouver,
06:55
that's exactly what happened.
06:57
Groups gave much less weight to the outliers,
07:00
and instead, the consensus turned out to be a robust average
07:03
of the individual answers.
07:05
The most remarkable thing
07:07
is that this was a spontaneous behavior of the group.
07:10
It happened without us giving them any hint on how to reach consensus.
07:15
So where do we go from here?
07:17
This is only the beginning, but we already have some insights.
07:20
Good collective decisions require two components:
07:23
deliberation and diversity of opinions.
07:27
Right now, the way we typically make our voice heard in many societies
07:31
is through direct or indirect voting.
07:33
This is good for diversity of opinions,
07:35
and it has the great virtue of ensuring
07:37
that everyone gets to express their voice.
07:40
But it's not so good [for fostering] thoughtful debates.
07:44
Our experiments suggest a different method
07:47
that may be effective in balancing these two goals at the same time,
07:51
by forming small groups that converge to a single decision
07:55
while still maintaining diversity of opinions
07:57
because there are many independent groups.
08:00
Of course, it's much easier to agree on the height of the Eiffel Tower
08:04
than on moral, political and ideological issues.
08:08
But in a time when the world's problems are more complex
08:12
and people are more polarized,
08:13
using science to help us understand how we interact and make decisions
08:18
will hopefully spark interesting new ways to construct a better democracy.
ืขืœ ืืชืจ ื–ื”

ืืชืจ ื–ื” ื™ืฆื™ื’ ื‘ืคื ื™ื›ื ืกืจื˜ื•ื ื™ YouTube ื”ืžื•ืขื™ืœื™ื ืœืœื™ืžื•ื“ ืื ื’ืœื™ืช. ืชื•ื›ืœื• ืœืจืื•ืช ืฉื™ืขื•ืจื™ ืื ื’ืœื™ืช ื”ืžื•ืขื‘ืจื™ื ืขืœ ื™ื“ื™ ืžื•ืจื™ื ืžื”ืฉื•ืจื” ื”ืจืืฉื•ื ื” ืžืจื—ื‘ื™ ื”ืขื•ืœื. ืœื—ืฅ ืคืขืžื™ื™ื ืขืœ ื”ื›ืชื•ื‘ื™ื•ืช ื‘ืื ื’ืœื™ืช ื”ืžื•ืฆื’ื•ืช ื‘ื›ืœ ื“ืฃ ื•ื™ื“ืื• ื›ื“ื™ ืœื”ืคืขื™ืœ ืืช ื”ืกืจื˜ื•ืŸ ืžืฉื. ื”ื›ืชื•ื‘ื™ื•ืช ื’ื•ืœืœื•ืช ื‘ืกื ื›ืจื•ืŸ ืขื ื”ืคืขืœืช ื”ื•ื•ื™ื“ืื•. ืื ื™ืฉ ืœืš ื”ืขืจื•ืช ืื• ื‘ืงืฉื•ืช, ืื ื ืฆื•ืจ ืื™ืชื ื• ืงืฉืจ ื‘ืืžืฆืขื•ืช ื˜ื•ืคืก ื™ืฆื™ืจืช ืงืฉืจ ื–ื”.

https://forms.gle/WvT1wiN1qDtmnspy7