What really motivates people to be honest in business | Alexander Wagner

235,808 views ใƒป 2017-09-26

TED



ืžืชืจื’ื: Shlomo Adam ืžื‘ืงืจ: Sigal Tifferet
00:12
How many companies have you interacted with today? Well, you got up in the morning, took a shower, washed your hair, used a hair dryer, ate breakfast -- ate cereals, fruit, yogurt, whatever -- had coffee -- tea. You took public transport to come here, or maybe used your private car. You interacted with the company that you work for or that you own. You interacted with your clients, your customers, and so on and so forth. I'm pretty sure there are at least seven companies you've interacted with today.

00:49
Let me tell you a stunning statistic. One out of seven large, public corporations commits fraud every year. This is a US academic study that looks at US companies -- I have no reason to believe that it's different in Europe. This is a study that looks at both detected and undetected fraud using statistical methods. This is not petty fraud. These frauds cost the shareholders of these companies, and therefore society, on the order of 380 billion dollars per year.

01:24
We can all think of some examples, right? The car industry's secrets aren't quite so secret anymore. Fraud has become a feature, not a bug, of the financial services industry. That's not me who's claiming that, that's the president of the American Finance Association who stated that in his presidential address. That's a huge problem if you think about, especially, an economy like Switzerland, which relies so much on the trust put into its financial industry.

01:56
On the other hand, there are six out of seven companies who actually remain honest despite all temptations to start engaging in fraud. There are whistle-blowers like Michael Woodford, who blew the whistle on Olympus. These whistle-blowers risk their careers, their friendships, to bring out the truth about their companies. There are journalists like Anna Politkovskaya who risk even their lives to report human rights violations. She got killed -- every year, around 100 journalists get killed because of their conviction to bring out the truth.

02:31
So in my talk today, I want to share with you some insights I've obtained and learned in the last 10 years of conducting research in this. I'm a researcher, a scientist working with economists, financial economists, ethicists, neuroscientists, lawyers and others trying to understand what makes humans tick, and how we can address this issue of fraud in corporations and therefore contribute to the improvement of the world.

02:59
I want to start by sharing with you two very distinct visions of how people behave. First, meet Adam Smith, founding father of modern economics. His basic idea was that if everybody behaves in their own self-interest, that's good for everybody in the end. Self-interest isn't a narrowly defined concept just for your immediate utility. It has a long-run implication.

03:24
Let's think about that. Think about this dog here. That might be us. There's this temptation -- I apologize to all vegetarians, but -- (Laughter) Dogs do like the bratwurst. (Laughter) Now, the straight-up, self-interested move here is to go for that. So my friend Adam here might jump up, get the sausage and thereby ruin all this beautiful tableware.

03:51
But that's not what Adam Smith meant. He didn't mean disregard all consequences -- to the contrary. He would have thought, well, there may be negative consequences, for example, the owner might be angry with the dog and the dog, anticipating that, might not behave in this way. That might be us, weighing the benefits and costs of our actions.

04:14
How does that play out? Well, many of you, I'm sure, have in your companies, especially if it's a large company, a code of conduct. And then if you behave according to that code of conduct, that improves your chances of getting a bonus payment. And on the other hand, if you disregard it, then there are higher chances of not getting your bonus or its being diminished. In other words, this is a very economic motivation of trying to get people to be more honest, or more aligned with the corporation's principles.

04:46
Similarly, reputation is a very powerful economic force, right? We try to build a reputation, maybe for being honest, because then people trust us more in the future. Right? Adam Smith talked about the baker who's not producing good bread out of his benevolence for those people who consume the bread, but because he wants to sell more future bread.

05:11
In my research, we find, for example, at the University of Zurich, that Swiss banks who get caught up in media, and in the context, for example, of tax evasion, of tax fraud, have bad media coverage. They lose net new money in the future and therefore make lower profits. That's a very powerful reputational force. Benefits and costs.

05:36
Here's another viewpoint of the world. Meet Immanuel Kant, 18th-century German philosopher superstar. He developed this notion that independent of the consequences, some actions are just right and some are just wrong. It's just wrong to lie, for example. So, meet my friend Immanuel here. He knows that the sausage is very tasty, but he's going to turn away because he's a good dog. He knows it's wrong to jump up and risk ruining all this beautiful tableware.

06:12
If you believe that people are motivated like that, then all the stuff about incentives, all the stuff about code of conduct and bonus systems and so on, doesn't make a whole lot of sense. People are motivated by different values perhaps.

06:27
So, what are people actually motivated by? These two gentlemen here have perfect hairdos, but they give us very different views of the world. What do we do with this? Well, I'm an economist and we conduct so-called experiments to address this issue. We strip away facts which are confusing in reality. Reality is so rich, there is so much going on, it's almost impossible to know what drives people's behavior really.

06:55
So let's do a little experiment together. Imagine the following situation. You're in a room alone, not like here. There's a five-franc coin like the one I'm holding up right now in front of you. Here are your instructions: toss the coin four times, and then on a computer terminal in front of you, enter the number of times tails came up. This is the situation.

07:25
Here's the rub. For every time that you announce that you had a tails throw, you get paid five francs. So if you say I had two tails throws, you get paid 10 francs. If you say you had zero, you get paid zero francs. If you say, "I had four tails throws," then you get paid 20 francs. It's anonymous, nobody's watching what you're doing, and you get paid that money anonymously.
07:49
I've got two questions for you. (Laughter) You know what's coming now, right? First, how would you behave in that situation? The second, look to your left and look to your right -- (Laughter) and think about how the person sitting next to you might behave in that situation.

08:08
We did this experiment for real. We did it at the Manifesta art exhibition that took place here in Zurich recently, not with students in the lab at the university but with the real population, like you guys.

08:21
First, a quick reminder of stats. If I throw the coin four times and it's a fair coin, then the probability that it comes up four times tails is 6.25 percent. And I hope you can intuitively see that the probability that all four of them are tails is much lower than if two of them are tails, right? Here are the specific numbers.
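The 6.25 percent figure follows directly from the binomial distribution, since each fair toss lands tails with probability one half. A quick check (illustrative code, not from the talk):

```python
from math import comb

def tails_probability(k: int, n: int = 4, p: float = 0.5) -> float:
    """P(exactly k tails in n independent fair tosses), binomial formula."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

print(f"P(4 tails) = {tails_probability(4):.4f}")  # 0.0625, i.e. 6.25%
print(f"P(2 tails) = {tails_probability(2):.4f}")  # 0.3750
```

So under honest reporting, only about one participant in sixteen should claim four tails, while two tails is six times as likely.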
08:45
Here's what happened. People did this experiment for real. Around 30 to 35 percent of people said, "Well, I had four tails throws." That's extremely unlikely. (Laughter) But the really amazing thing here, perhaps to an economist, is there are around 65 percent of people who did not say I had four tails throws, even though in that situation, nobody's watching you, the only consequence that's in place is you get more money if you say four than less. You leave 20 francs on the table by announcing zero.

09:25
I don't know whether the other people all were honest or whether they also said a little bit higher or lower than what they did because it's anonymous. We only observed the distribution. But what I can tell you -- and here's another coin toss. There you go, it's tails. (Laughter) Don't check, OK? (Laughter) What I can tell you is that not everybody behaved like Adam Smith would have predicted.

09:52
So what does that leave us with? Well, it seems people are motivated by certain intrinsic values and in our research, we look at this. We look at the idea that people have so-called protected values. A protected value isn't just any value. A protected value is a value where you're willing to pay a price to uphold that value. You're willing to pay a price to withstand the temptation to give in. And the consequence is you feel better if you earn money in a way that's consistent with your values.

10:29
Let me show you this again in the metaphor of our beloved dog here. If we succeed in getting the sausage without violating our values, then the sausage tastes better. That's what our research shows. If, on the other hand, we do so -- if we get the sausage and in doing so we actually violate values, we value the sausage less.

10:50
Quantitatively, that's quite powerful. We can measure these protected values, for example, by a survey measure. Simple, nine-item survey that's quite predictive in these experiments. If you think about the average of the population and then there's a distribution around it -- people are different, we all are different. People who have a set of protected values that's one standard deviation above the average, they discount money they receive by lying by about 25 percent. That means a dollar received when lying is worth to them only 75 cents without any incentives you put in place for them to behave honestly. It's their intrinsic motivation.
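That 25 percent discount can be written as a simple utility adjustment. This is only an illustrative model with made-up names; the talk gives the number but no formula:

```python
def subjective_value(amount: float, earned_by_lying: bool,
                     lying_discount: float = 0.25) -> float:
    """Subjective worth of money, discounted when it was earned dishonestly.

    `lying_discount` is the roughly 25% discount the talk reports for people
    one standard deviation above average in protected values.
    """
    return amount * (1 - lying_discount) if earned_by_lying else amount

print(subjective_value(1.00, earned_by_lying=True))   # 0.75: a lied-for dollar feels like 75 cents
print(subjective_value(1.00, earned_by_lying=False))  # 1.0
```

On this reading, claiming four tails instead of an honest two pays 10 extra francs but, for a high-protected-values person, those dishonest francs are worth a quarter less to them.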
11:38
By the way, I'm not a moral authority. I'm not saying I have all these beautiful values, right? But I'm interested in how people behave and how we can leverage that richness in human nature to actually improve the workings of our organizations.

11:54
So there are two very, very different visions here. On the one hand, you can appeal to benefits and costs and try to get people to behave according to them. On the other hand, you can select people who have the values and the desirable characteristics, of course -- competencies that go in line with your organization.

12:16
I do not yet know where these protected values really come from. Is it nurture or is it nature? What I can tell you is that the distribution looks pretty similar for men and women. It looks pretty similar for those who had studied economics or those who had studied psychology. It looks even pretty similar around different age categories among adults. But I don't know yet how this develops over a lifetime. That will be the subject of future research.

12:49
The idea I want to leave you with is it's all right to appeal to incentives. I'm an economist; I certainly believe in the fact that incentives work. But do think about selecting the right people rather than having people and then putting incentives in place. Selecting the right people with the right values may go a long way to saving a lot of trouble and a lot of money in your organizations. In other words, it will pay off to put people first.

13:21
Thank you. (Applause)
ืขืœ ืืชืจ ื–ื”

ืืชืจ ื–ื” ื™ืฆื™ื’ ื‘ืคื ื™ื›ื ืกืจื˜ื•ื ื™ YouTube ื”ืžื•ืขื™ืœื™ื ืœืœื™ืžื•ื“ ืื ื’ืœื™ืช. ืชื•ื›ืœื• ืœืจืื•ืช ืฉื™ืขื•ืจื™ ืื ื’ืœื™ืช ื”ืžื•ืขื‘ืจื™ื ืขืœ ื™ื“ื™ ืžื•ืจื™ื ืžื”ืฉื•ืจื” ื”ืจืืฉื•ื ื” ืžืจื—ื‘ื™ ื”ืขื•ืœื. ืœื—ืฅ ืคืขืžื™ื™ื ืขืœ ื”ื›ืชื•ื‘ื™ื•ืช ื‘ืื ื’ืœื™ืช ื”ืžื•ืฆื’ื•ืช ื‘ื›ืœ ื“ืฃ ื•ื™ื“ืื• ื›ื“ื™ ืœื”ืคืขื™ืœ ืืช ื”ืกืจื˜ื•ืŸ ืžืฉื. ื”ื›ืชื•ื‘ื™ื•ืช ื’ื•ืœืœื•ืช ื‘ืกื ื›ืจื•ืŸ ืขื ื”ืคืขืœืช ื”ื•ื•ื™ื“ืื•. ืื ื™ืฉ ืœืš ื”ืขืจื•ืช ืื• ื‘ืงืฉื•ืช, ืื ื ืฆื•ืจ ืื™ืชื ื• ืงืฉืจ ื‘ืืžืฆืขื•ืช ื˜ื•ืคืก ื™ืฆื™ืจืช ืงืฉืจ ื–ื”.

https://forms.gle/WvT1wiN1qDtmnspy7