Laurie Santos: How monkeys mirror human irrationality


TED



00:17
I want to start my talk today with two observations about the human species. The first observation is something that you might think is quite obvious, and that's that our species, Homo sapiens, is actually really, really smart -- like, ridiculously smart -- like you're all doing things that no other species on the planet does right now. And this is, of course, not the first time you've probably recognized this. Of course, in addition to being smart, we're also an extremely vain species. So we like pointing out the fact that we're smart. You know, so I could turn to pretty much any sage from Shakespeare to Stephen Colbert to point out things like the fact that we're noble in reason and infinite in faculties and just kind of awesome-er than anything else on the planet when it comes to all things cerebral.
00:58
But of course, there's a second observation about the human species that I want to focus on a little bit more, and that's the fact that even though we're actually really smart, sometimes uniquely smart, we can also be incredibly, incredibly dumb when it comes to some aspects of our decision making. Now I'm seeing lots of smirks out there. Don't worry, I'm not going to call anyone in particular out on any aspects of your own mistakes. But of course, just in the last two years we see these unprecedented examples of human ineptitude. And we've watched as the tools we uniquely make to pull the resources out of our environment kind of just blow up in our face. We've watched the financial markets that we uniquely create -- these markets that were supposed to be foolproof -- we've watched them kind of collapse before our eyes.

01:38
But both of these two embarrassing examples, I think, don't highlight what I think is most embarrassing about the mistakes that humans make, which is that we'd like to think that the mistakes we make are really just the result of a couple bad apples or a couple really sort of FAIL Blog-worthy decisions. But it turns out, what social scientists are actually learning is that most of us, when put in certain contexts, will actually make very specific mistakes. The errors we make are actually predictable. We make them again and again. And they're actually immune to lots of evidence. When we get negative feedback, we still, the next time we're faced with a certain context, tend to make the same errors.
02:15
And so this has been a real puzzle to me as a sort of scholar of human nature. What I'm most curious about is, how is a species that's as smart as we are capable of such bad and such consistent errors all the time? You know, we're the smartest thing out there, why can't we figure this out? In some sense, where do our mistakes really come from? And having thought about this a little bit, I see a couple different possibilities. One possibility is, in some sense, it's not really our fault. Because we're a smart species, we can actually create all kinds of environments that are super, super complicated, sometimes too complicated for us to even actually understand, even though we've actually created them. We create financial markets that are super complex. We create mortgage terms that we can't actually deal with. And of course, if we are put in environments where we can't deal with it, in some sense it makes sense that we actually might mess certain things up.

03:03
If this was the case, we'd have a really easy solution to the problem of human error. We'd actually just say, okay, let's figure out the kinds of technologies we can't deal with, the kinds of environments that are bad -- get rid of those, design things better, and we should be the noble species that we expect ourselves to be.
03:19
But there's another possibility that I find a little bit more worrying, which is, maybe it's not our environments that are messed up. Maybe it's actually us that's designed badly. This is a hint that I've gotten from watching the ways that social scientists have learned about human errors. And what we see is that people tend to keep making errors exactly the same way, over and over again. It feels like we might almost just be built to make errors in certain ways. This is a possibility that I worry a little bit more about, because, if it's us that's messed up, it's not actually clear how we go about dealing with it. We might just have to accept the fact that we're error prone and try to design things around it.

03:55
So this is the question my students and I wanted to get at. How can we tell the difference between possibility one and possibility two? What we need is a population that's basically smart, can make lots of decisions, but doesn't have access to any of the systems we have, any of the things that might mess us up -- no human technology, human culture, maybe even not human language. And so this is why we turned to these guys here. This is one of the guys I work with. This is a brown capuchin monkey. These guys are New World primates, which means they broke off from the human branch about 35 million years ago. This means that your great, great, great, great, great, great -- with about five million "greats" in there -- grandmother was probably the same great, great, great, great grandmother with five million "greats" in there as Holly up here. You know, so you can take comfort in the fact that this guy up here is a really, really distant, but albeit evolutionary, relative.

04:39
The good news about Holly though is that she doesn't actually have the same kinds of technologies we do. You know, she's a smart, very cute creature, a primate as well, but she lacks all the stuff we think might be messing us up. So she's the perfect test case. What if we put Holly into the same context as humans? Does she make the same mistakes as us? Does she not learn from them? And so on. And so this is the kind of thing we decided to do.
05:00
My students and I got very excited about this a few years ago. We said, all right, let's, you know, throw some problems at Holly, see if she messes these things up. First problem is just, well, where should we start? Because, you know, it's great for us, but bad for humans. We make a lot of mistakes in a lot of different contexts. You know, where are we actually going to start with this? And because we started this work around the time of the financial collapse, around the time when foreclosures were hitting the news, we said, hmm, maybe we should actually start in the financial domain. Maybe we should look at monkeys' economic decisions and try to see if they do the same kinds of dumb things that we do.

05:30
Of course, that's when we hit a sort of second problem -- a little bit more methodological -- which is that, maybe you guys don't know, but monkeys don't actually use money. I know, you haven't met them. But this is why, you know, they're not in the queue behind you at the grocery store or the ATM -- you know, they don't do this stuff. So now we faced, you know, a little bit of a problem here. How are we actually going to ask monkeys about money if they don't actually use it? So we said, well, maybe we should just, actually just suck it up and teach monkeys how to use money. So that's just what we did.
05:57
What you're looking at over here is actually the first unit that I know of of non-human currency. We weren't very creative at the time we started these studies, so we just called it a token. But this is the unit of currency that we've taught our monkeys at Yale to actually use with humans, to actually buy different pieces of food. It doesn't look like much -- in fact, it isn't much. Like most of our money, it's just a piece of metal. As those of you who've taken currencies home from your trip know, once you get home, it's actually pretty useless. It was useless to the monkeys at first before they realized what they could do with it. When we first gave it to them in their enclosures, they actually kind of picked them up, looked at them. They were these kind of weird things. But very quickly, the monkeys realized that they could actually hand these tokens over to different humans in the lab for some food.

06:40
And so you see one of our monkeys, Mayday, up here doing this. A and B are kind of the points where she's sort of a little bit curious about these things -- doesn't know. There's this waiting hand from a human experimenter, and Mayday quickly figures out, apparently the human wants this. Hands it over, and then gets some food. It turns out not just Mayday, all of our monkeys get good at trading tokens with human salesmen. So here's just a quick video of what this looks like. Here's Mayday. She's going to be trading a token for some food and waiting happily and getting her food. Here's Felix, I think. He's our alpha male; he's a kind of big guy. But he too waits patiently, gets his food and goes on. So the monkeys get really good at this. They're surprisingly good at this with very little training. We just allowed them to pick this up on their own.
07:18
The question is: is this anything like human money? Is this a market at all, or did we just do a weird psychologist's trick by getting monkeys to do something, looking smart, but not really being smart? And so we said, well, what would the monkeys spontaneously do if this was really their currency, if they were really using it like money? Well, you might actually imagine them to do all the kinds of smart things that humans do when they start exchanging money with each other. You might have them start paying attention to price, paying attention to how much they buy -- sort of keeping track of their monkey token, as it were. Do the monkeys do anything like this? And so our monkey marketplace was born.

07:54
The way this works is that our monkeys normally live in a kind of big zoo social enclosure. When they get a hankering for some treats, we actually allowed them a way out into a little smaller enclosure where they could enter the market. Upon entering the market -- it was actually a much more fun market for the monkeys than most human markets because, as the monkeys entered the door of the market, a human would give them a big wallet full of tokens so they could actually trade the tokens with one of these two guys here -- two different possible human salesmen that they could actually buy stuff from. The salesmen were students from my lab. They dressed differently; they were different people. And over time, they did basically the same thing so the monkeys could learn, you know, who sold what at what price -- you know, who was reliable, who wasn't, and so on. And you can see that each of the experimenters is actually holding up a little, yellow food dish, and that's what the monkey can buy for a single token. So everything costs one token, but as you can see, sometimes tokens buy more than others, sometimes more grapes than others. So I'll show you a quick video of what this marketplace actually looks like.
08:50
Here's a monkey-eye view. Monkeys are shorter, so it's a little short. But here's Honey. She's waiting for the market to open, a little impatiently. All of a sudden the market opens. Here's her choice: one grape or two grapes. You can see Honey, very good market economist, goes with the guy who gives more. She could teach our financial advisers a thing or two. So not just Honey, most of the monkeys went with guys who had more. Most of the monkeys went with guys who had better food. When we introduced sales, we saw the monkeys paid attention to that. They really cared about their monkey token dollar.

09:20
The more surprising thing was that when we collaborated with economists to actually look at the monkeys' data using economic tools, they basically matched, not just qualitatively, but quantitatively, with what we saw humans doing in a real market. So much so that, if you saw the monkeys' numbers, you couldn't tell whether they came from a monkey or a human in the same market. And what we'd really thought we'd done is that we'd actually introduced something that, at least for the monkeys and us, works like a real financial currency. The question is: do the monkeys start messing up in the same ways we do?
09:49
Well, we already saw anecdotally a couple of signs that they might. One thing we never saw in the monkey marketplace was any evidence of saving -- you know, just like our own species. The monkeys entered the market, spent their entire budget and then went back to everyone else. The other thing we also spontaneously saw, embarrassingly enough, is spontaneous evidence of larceny. The monkeys would rip off the tokens at every available opportunity -- from each other, often from us -- you know, things we didn't necessarily think we were introducing, but things we spontaneously saw.

10:17
So we said, this looks bad. Can we actually see if the monkeys are doing exactly the same dumb things as humans do? One possibility is just kind of let the monkey financial system play out, you know, see if they start calling us for bailouts in a few years. We were a little impatient so we wanted to sort of speed things up a bit. So we said, let's actually give the monkeys the same kinds of problems that humans tend to get wrong in certain kinds of economic challenges, or certain kinds of economic experiments. And so, since the best way to see how people go wrong is to actually do it yourself, I'm going to give you guys a quick experiment to sort of watch your own financial intuitions in action.
10:53
So imagine that right now I handed each and every one of you a thousand U.S. dollars -- so 10 crisp hundred-dollar bills. Take these, put them in your wallet and spend a second thinking about what you're going to do with it. Because it's yours now; you can buy whatever you want. Donate it, take it, and so on. Sounds great, but you get one more choice to earn a little bit more money. And here's your choice: you can either be risky, in which case I'm going to flip one of these monkey tokens. If it comes up heads, you're going to get a thousand dollars more. If it comes up tails, you get nothing. So it's a chance to get more, but it's pretty risky. Your other option is a bit safe. You're just going to get some money for sure. I'm just going to give you 500 bucks. You can stick it in your wallet and use it immediately.

11:31
So see what your intuition is here. Most people actually go with the play-it-safe option. Most people say, why should I be risky when I can get 1,500 dollars for sure? This seems like a good bet. I'm going to go with that. You might say, eh, that's not really irrational. People are a little risk-averse. So what? Well, the "so what?" comes when you start thinking about the same problem set up just a little bit differently.

11:51
So now imagine that I give each and every one of you 2,000 dollars -- 20 crisp hundred-dollar bills. Now you can buy double the stuff you were going to get before. Think about how you'd feel sticking it in your wallet. And now imagine that I have you make another choice. But this time, it's a little bit worse. Now, you're going to be deciding how you're going to lose money, but you're going to get the same choice. You can either take a risky loss -- so I'll flip a coin. If it comes up heads, you're going to actually lose a lot. If it comes up tails, you lose nothing, you're fine, get to keep the whole thing -- or you could play it safe, which means you have to reach back into your wallet and give me five of those $100 bills, for certain.

12:23
And I'm seeing a lot of furrowed brows out there. So maybe you're having the same intuitions as the subjects that were actually tested in this, which is that when presented with these options, people don't choose to play it safe. They actually tend to go a little risky. The reason this is irrational is that we've given people in both situations the same choice. It's a 50/50 shot of a thousand or 2,000, or just 1,500 dollars with certainty. But people's intuitions about how much risk to take vary depending on where they started.
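Written out as expected final wealth, the equivalence of the two framings is just arithmetic (a worked restatement of the numbers in the talk, not a calculation Santos puts on screen):

E[\text{risky} \mid \text{gain frame}] = \tfrac{1}{2}(1000 + 1000) + \tfrac{1}{2}(1000 + 0) = 1500
E[\text{risky} \mid \text{loss frame}] = \tfrac{1}{2}(2000 - 1000) + \tfrac{1}{2}(2000 - 0) = 1500
E[\text{safe}] = 1000 + 500 = 2000 - 500 = 1500

In both framings the choice is a 50/50 lottery over $1,000 or $2,000 against a certain $1,500; only the reference point has moved.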
12:51
So what's going on? Well, it turns out that this seems to be the result of at least two biases that we have at the psychological level. One is that we have a really hard time thinking in absolute terms. You really have to do work to figure out, well, one option's a thousand, 2,000; one is 1,500. Instead, we find it very easy to think in very relative terms as options change from one time to another. So we think of things as, "Oh, I'm going to get more," or "Oh, I'm going to get less." This is all well and good, except that changes in different directions actually affect whether or not we think options are good or not.

13:24
And this leads to the second bias, which economists have called loss aversion. The idea is that we really hate it when things go into the red. We really hate it when we have to lose out on some money. And this means that sometimes we'll actually switch our preferences to avoid this. What you saw in that last scenario is that subjects get risky because they want the small shot that there won't be any loss. That means when we're in a risk mindset -- excuse me, when we're in a loss mindset, we actually become more risky, which can actually be really worrying. These kinds of things play out in lots of bad ways in humans. They're why stock investors hold onto losing stocks longer -- because they're evaluating them in relative terms. They're why people in the housing market refused to sell their house -- because they don't want to sell at a loss.
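These two biases -- reference-dependent, relative thinking and loss aversion -- are usually formalized together as Kahneman and Tversky's prospect-theory value function. The talk describes the biases but doesn't name the model, so the following is an illustrative sketch rather than anything from the lecture:

v(x) = \begin{cases} (x - r)^{\alpha} & \text{if } x \ge r \\ -\lambda (r - x)^{\beta} & \text{if } x < r \end{cases}

Here r is the reference point (what you started with), and \lambda > 1 captures loss aversion; commonly cited fitted values are \alpha \approx \beta \approx 0.88 and \lambda \approx 2.25. Because outcomes are valued relative to r rather than as total wealth, shifting the starting amount from $1,000 to $2,000 reclassifies the very same final-wealth options from gains to losses, and the \lambda-weighted loss branch is what tips people toward the risky option.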
14:04
The question we were interested in is whether the monkeys show the same biases. If we set up those same scenarios in our little monkey market, would they do the same thing as people? And so this is what we did: we gave the monkeys choices between guys who were safe -- they did the same thing every time -- or guys who were risky -- they did things differently half the time. And then we gave them options that were bonuses -- like you guys did in the first scenario -- so they actually have a chance at more, or pieces where they were experiencing losses -- they actually thought they were going to get more than they really got.

14:33
And so this is what this looks like. We introduced the monkeys to two new monkey salesmen. The guys on the left and right both start with one piece of grape, so it looks pretty good. But they're going to give the monkeys bonuses. The guy on the left is a safe bonus. All the time, he adds one, to give the monkeys two. The guy on the right is actually a risky bonus. Sometimes the monkeys get no bonus -- so this is a bonus of zero. Sometimes the monkeys get two extra. For a big bonus, now they get three. But this is the same choice you guys just faced. Do the monkeys actually want to play it safe and then go with the guy who's going to do the same thing on every trial, or do they want to be risky and try to get a risky, but big, bonus, but risk the possibility of getting no bonus? People here played it safe. Turns out, the monkeys play it safe too.
15:15
Qualitatively and quantitatively, they choose exactly the same way as people, when tested in the same thing. You might say, well, maybe the monkeys just don't like risk. Maybe we should see how they do with losses. And so we ran a second version of this. Now, the monkeys meet two guys who aren't giving them bonuses; they're actually giving them less than they expect. So they look like they're starting out with a big amount. These are three grapes; the monkey's really psyched for this. But now they learn these guys are going to give them less than they expect. The guy on the left is a safe loss. Every single time, he's going to take one of these away and give the monkeys just two. The guy on the right is the risky loss. Sometimes he gives no loss, so the monkeys are really psyched, but sometimes he actually gives a big loss, taking away two to give the monkeys only one.

15:56
And so what do the monkeys do? Again, same choice; they can play it safe for always getting two grapes every single time, or they can take a risky bet and choose between one and three. The remarkable thing to us is that, when you give monkeys this choice, they do the same irrational thing that people do. They actually become more risky depending on how the experimenters started. This is crazy because it suggests that the monkeys too are evaluating things in relative terms and actually treating losses differently than they treat gains.
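To see that the bonus and loss conditions are statistically the same gamble in different clothing, here is a minimal simulation sketch of the four salesmen described above (the function names and the exact 50/50 split on the risky side are our reading of the setup, not the lab's published protocol):

import random

def safe_bonus():   # shows 1 grape, always adds 1 -> the monkey always gets 2
    return 1 + 1

def risky_bonus():  # shows 1 grape, adds 0 or 2 with equal chance -> 1 or 3
    return 1 + random.choice([0, 2])

def safe_loss():    # shows 3 grapes, always takes 1 away -> the monkey always gets 2
    return 3 - 1

def risky_loss():   # shows 3 grapes, takes away 0 or 2 with equal chance -> 3 or 1
    return 3 - random.choice([0, 2])

trials = 100_000
for name, salesman in [("safe bonus", safe_bonus), ("risky bonus", risky_bonus),
                       ("safe loss", safe_loss), ("risky loss", risky_loss)]:
    average = sum(salesman() for _ in range(trials)) / trials
    print(f"{name}: {average:.3f} grapes on average")

# All four salesmen pay out 2 grapes on average; only the starting display
# (the reference point) differs. Yet monkeys, like people, prefer the safe
# salesman when the frame is a bonus and the risky one when it is a loss.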
16:23
So what does all of this mean? Well, what we've shown is that, first of all, we can actually give the monkeys a financial currency, and they do very similar things with it. They do some of the smart things we do, some of the kind of not-so-nice things we do, like steal it and so on. But they also do some of the irrational things we do. They systematically get things wrong and in the same ways that we do.

16:43
This is the first take-home message of the talk, which is that if you saw the beginning of this and you thought, oh, I'm totally going to go home and hire a capuchin monkey financial adviser -- they're way cuter than the one at ... you know -- don't do that; they're probably going to be just as dumb as the human one you already have. So, you know, a little bad -- sorry, sorry, sorry. A little bad for monkey investors.

17:00
But of course, you know, the reason you're laughing is bad for humans too. Because we've answered the question we started out with. We wanted to know where these kinds of errors came from. And we started with the hope that maybe we can sort of tweak our financial institutions, tweak our technologies to make ourselves better. But what we've learned is that these biases might be a deeper part of us than that. In fact, they might be due to the very nature of our evolutionary history. You know, maybe it's not just humans at the right side of this chain that's duncey. Maybe it's sort of duncey all the way back. And this, if we believe the capuchin monkey results, means that these duncey strategies might be 35 million years old. That's a long time for a strategy to potentially get changed around -- really, really old.
17:40
What do we know about other old strategies like this? Well, one thing we know is that they tend to be really hard to overcome. You know, think of our evolutionary predilection for eating sweet things, fatty things like cheesecake. You can't just shut that off. You can't just look at the dessert cart and say, "No, no, no. That looks disgusting to me." We're just built differently. We're going to perceive it as a good thing to go after. My guess is that the same thing is going to be true when humans are perceiving different financial decisions. When you're watching your stocks plummet into the red, when you're watching your house price go down, you're not going to be able to see that in anything but old evolutionary terms. This means that the biases that lead investors to do badly, that lead to the foreclosure crisis, are going to be really hard to overcome.

18:21
So that's the bad news. The question is: is there any good news? I'm supposed to be up here telling you the good news. Well, the good news, I think, is what I started with at the beginning of the talk, which is that humans are not only smart; we're really inspirationally smart compared to the rest of the animals in the biological kingdom. We're so good at overcoming our biological limitations -- you know, I flew over here in an airplane. I didn't have to try to flap my wings. I'm wearing contact lenses now so that I can see all of you. I don't have to rely on my own near-sightedness. We actually have all of these cases where we overcome our biological limitations through technology and other means, seemingly pretty easily. But we have to recognize that we have those limitations.

19:00
And here's the rub. It was Camus who once said that, "Man is the only species who refuses to be what he really is." But the irony is that it might only be in recognizing our limitations that we can really actually overcome them. The hope is that you all will think about your limitations, not necessarily as unovercomable, but to recognize them, accept them, and then use the world of design to actually figure them out. That might be the only way that we will really be able to achieve our own human potential and really be the noble species we hope to all be.

19:32
Thank you.

(Applause)
ืขืœ ืืชืจ ื–ื”

ืืชืจ ื–ื” ื™ืฆื™ื’ ื‘ืคื ื™ื›ื ืกืจื˜ื•ื ื™ YouTube ื”ืžื•ืขื™ืœื™ื ืœืœื™ืžื•ื“ ืื ื’ืœื™ืช. ืชื•ื›ืœื• ืœืจืื•ืช ืฉื™ืขื•ืจื™ ืื ื’ืœื™ืช ื”ืžื•ืขื‘ืจื™ื ืขืœ ื™ื“ื™ ืžื•ืจื™ื ืžื”ืฉื•ืจื” ื”ืจืืฉื•ื ื” ืžืจื—ื‘ื™ ื”ืขื•ืœื. ืœื—ืฅ ืคืขืžื™ื™ื ืขืœ ื”ื›ืชื•ื‘ื™ื•ืช ื‘ืื ื’ืœื™ืช ื”ืžื•ืฆื’ื•ืช ื‘ื›ืœ ื“ืฃ ื•ื™ื“ืื• ื›ื“ื™ ืœื”ืคืขื™ืœ ืืช ื”ืกืจื˜ื•ืŸ ืžืฉื. ื”ื›ืชื•ื‘ื™ื•ืช ื’ื•ืœืœื•ืช ื‘ืกื ื›ืจื•ืŸ ืขื ื”ืคืขืœืช ื”ื•ื•ื™ื“ืื•. ืื ื™ืฉ ืœืš ื”ืขืจื•ืช ืื• ื‘ืงืฉื•ืช, ืื ื ืฆื•ืจ ืื™ืชื ื• ืงืฉืจ ื‘ืืžืฆืขื•ืช ื˜ื•ืคืก ื™ืฆื™ืจืช ืงืฉืจ ื–ื”.

https://forms.gle/WvT1wiN1qDtmnspy7