What really motivates people to be honest in business | Alexander Wagner
233,748 views ・ 2017-09-26
Translator: Lilian Chiu
Reviewer: NAN-KUN WU
00:12
How many companies have you interacted with today?

00:17
Well, you got up in the morning, took a shower, washed your hair, used a hair dryer, ate breakfast -- ate cereals, fruit, yogurt, whatever -- had coffee -- tea. You took public transport to come here, or maybe used your private car. You interacted with the company that you work for or that you own. You interacted with your clients, your customers, and so on and so forth. I'm pretty sure there are at least seven companies you've interacted with today.
00:49
Let me tell you a stunning statistic. One out of seven large, public corporations commits fraud every year. This is a US academic study that looks at US companies -- I have no reason to believe that it's different in Europe. This is a study that looks at both detected and undetected fraud using statistical methods. This is not petty fraud. These frauds cost the shareholders of these companies, and therefore society, on the order of 380 billion dollars per year.
01:24
We can all think of some examples, right? The car industry's secrets aren't quite so secret anymore. Fraud has become a feature, not a bug, of the financial services industry. That's not me who's claiming that; that's the president of the American Finance Association, who stated that in his presidential address. That's a huge problem if you think about, especially, an economy like Switzerland, which relies so much on the trust put into its financial industry.
01:56
On the other hand, there are six out of seven companies who actually remain honest despite all temptations to start engaging in fraud. There are whistle-blowers like Michael Woodford, who blew the whistle on Olympus. These whistle-blowers risk their careers, their friendships, to bring out the truth about their companies. There are journalists like Anna Politkovskaya who risk even their lives to report human rights violations. She got killed -- every year, around 100 journalists get killed because of their conviction to bring out the truth.
02:31
So in my talk today, I want to share with you some insights I've obtained and learned in the last 10 years of conducting research in this. I'm a researcher, a scientist working with economists, financial economists, ethicists, neuroscientists, lawyers and others trying to understand what makes humans tick, and how we can address this issue of fraud in corporations and therefore contribute to the improvement of the world.
02:59
I want to start by sharing with you two very distinct visions of how people behave. First, meet Adam Smith, founding father of modern economics. His basic idea was that if everybody behaves in their own self-interest, that's good for everybody in the end. Self-interest isn't a narrowly defined concept just for your immediate utility. It has a long-run implication. Let's think about that. Think about this dog here. That might be us.
03:31
There's this temptation -- I apologize to all vegetarians, but --

(Laughter)

03:35
Dogs do like the bratwurst.

(Laughter)

03:40
Now, the straight-up, self-interested move here is to go for that. So my friend Adam here might jump up, get the sausage and thereby ruin all this beautiful tableware.
03:51
But that's not what Adam Smith meant. He didn't mean disregard all consequences -- to the contrary. He would have thought, well, there may be negative consequences -- for example, the owner might be angry with the dog, and the dog, anticipating that, might not behave in this way. That might be us, weighing the benefits and costs of our actions.
04:14
How does that play out? Well, many of you, I'm sure, have in your companies, especially if it's a large company, a code of conduct. And then if you behave according to that code of conduct, that improves your chances of getting a bonus payment. And on the other hand, if you disregard it, then there are higher chances of not getting your bonus or its being diminished. In other words, this is a very economic motivation of trying to get people to be more honest, or more aligned with the corporation's principles.
04:46
Similarly, reputation is a very powerful economic force, right? We try to build a reputation, maybe for being honest, because then people trust us more in the future.

04:57
Right? Adam Smith talked about the baker who's not producing good bread out of his benevolence for those people who consume the bread, but because he wants to sell more future bread.
05:11
In my research at the University of Zurich, we find, for example, that Swiss banks that get caught up in the media -- in the context, for example, of tax evasion or tax fraud -- have bad media coverage. They lose net new money in the future and therefore make lower profits. That's a very powerful reputational force.
05:34
Benefits and costs.
05:36
Here's another viewpoint of the world. Meet Immanuel Kant, 18th-century German philosopher superstar. He developed this notion that, independent of the consequences, some actions are just right and some are just wrong. It's just wrong to lie, for example. So, meet my friend Immanuel here. He knows that the sausage is very tasty, but he's going to turn away because he's a good dog. He knows it's wrong to jump up and risk ruining all this beautiful tableware.
06:12
If you believe that people are motivated like that, then all the stuff about incentives, all the stuff about code of conduct and bonus systems and so on, doesn't make a whole lot of sense. People are motivated by different values, perhaps.
06:27
So, what are people actually motivated by? These two gentlemen here have perfect hairdos, but they give us very different views of the world.
06:37
What do we do with this? Well, I'm an economist, and we conduct so-called experiments to address this issue. We strip away facts which are confusing in reality. Reality is so rich, there is so much going on, it's almost impossible to know what really drives people's behavior.
06:55
So let's do a little experiment together. Imagine the following situation.
07:02
You're in a room alone, not like here. There's a five-franc coin like the one I'm holding up right now in front of you. Here are your instructions: toss the coin four times, and then, on a computer terminal in front of you, enter the number of times tails came up.
07:23
This is the situation. Here's the rub. For every time that you announce that you had a tails throw, you get paid five francs. So if you say I had two tails throws, you get paid 10 francs. If you say you had zero, you get paid zero francs. If you say, "I had four tails throws," then you get paid 20 francs. It's anonymous, nobody's watching what you're doing, and you get paid that money anonymously.
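A quick calculation, not part of the talk, makes the stakes concrete: reporting $k$ tails pays $5k$ francs, so a fully honest participant tossing a fair coin expects

$$\mathbb{E}[\text{payout}] = 5 \cdot \mathbb{E}[\text{tails}] = 5 \cdot 4 \cdot \tfrac{1}{2} = 10 \text{ francs},$$

while simply claiming four tails regardless of the outcome guarantees 20 francs -- twice the honest expectation.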
07:49
I've got two questions for you.

(Laughter)

You know what's coming now, right?

07:55
First, how would you behave in that situation? The second: look to your left and look to your right --

(Laughter)

and think about how the person sitting next to you might behave in that situation.
08:08
We did this experiment for real. We did it at the Manifesta art exhibition that took place here in Zurich recently, not with students in the lab at the university but with the real population, like you guys.
08:21
First, a quick reminder of stats. If I throw the coin four times and it's a fair coin, then the probability that it comes up four times tails is 6.25 percent. And I hope you can intuitively see that the probability that all four of them are tails is much lower than if two of them are tails, right? Here are the specific numbers.
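Those specific numbers were shown on a slide rather than read aloud; for a fair coin they presumably follow the binomial distribution for four tosses:

$$P(k \text{ tails}) = \binom{4}{k}\left(\tfrac{1}{2}\right)^{4}, \qquad k = 0, 1, 2, 3, 4,$$

which gives 6.25 percent for zero tails, 25 percent for one, 37.5 percent for two, 25 percent for three, and 6.25 percent for four.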
08:45
Here's what happened. People did this experiment for real. Around 30 to 35 percent of people said, "Well, I had four tails throws." That's extremely unlikely.

(Laughter)
09:01
But the really amazing thing here, perhaps to an economist, is that around 65 percent of people did not say, "I had four tails throws," even though in that situation, nobody's watching you, and the only consequence in place is that you get more money if you say four than if you say less. You leave 20 francs on the table by announcing zero.
09:25
I don't know whether the other people all were honest or whether they also said a little bit higher or lower than what they did, because it's anonymous. We only observed the distribution. But what I can tell you -- and here's another coin toss. There you go, it's tails.

(Laughter)

Don't check, OK?

(Laughter)
09:44
What I can tell you is that not everybody behaved like Adam Smith would have predicted.
09:52
So what does that leave us with? Well, it seems people are motivated by certain intrinsic values, and in our research, we look at this.
10:01
We look at the idea that people have so-called protected values. A protected value isn't just any value. A protected value is a value where you're willing to pay a price to uphold that value. You're willing to pay a price to withstand the temptation to give in. And the consequence is you feel better if you earn money in a way that's consistent with your values.
10:29
Let me show you this again in the metaphor of our beloved dog here. If we succeed in getting the sausage without violating our values, then the sausage tastes better. That's what our research shows.
10:42
If, on the other hand, we do so -- if we get the sausage and in doing so we actually violate values -- we value the sausage less. Quantitatively, that's quite powerful.
10:55
We can measure these protected values, for example, by a survey measure -- a simple, nine-item survey that's quite predictive in these experiments.
11:08
If you think about the average of the population and then there's a distribution around it -- people are different, we all are different. People who have a set of protected values that's one standard deviation above the average discount money they receive by lying by about 25 percent. That means a dollar received when lying is worth to them only 75 cents, without any incentives you put in place for them to behave honestly. It's their intrinsic motivation.
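To spell out that discount (my arithmetic, not the speaker's): for a person one standard deviation above average in protected values,

$$\text{subjective value of a lied-for dollar} = (1 - 0.25) \times \$1 = \$0.75,$$

so the 20 francs gained by falsely reporting four tails would feel worth only about $0.75 \times 20 = 15$ francs to them.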
11:38
By the way, I'm not a moral authority. I'm not saying I have all these beautiful values, right? But I'm interested in how people behave and how we can leverage that richness in human nature to actually improve the workings of our organizations.
11:54
So there are two very, very different visions here. On the one hand, you can appeal to benefits and costs and try to get people to behave according to them. On the other hand, you can select people who have the values and the desirable characteristics, of course -- competencies that go in line with your organization.
12:16
I do not yet know where these protected values really come from. Is it nurture or is it nature? What I can tell you is that the distribution looks pretty similar for men and women. It looks pretty similar for those who had studied economics or those who had studied psychology. It looks even pretty similar around different age categories among adults. But I don't know yet how this develops over a lifetime. That will be the subject of future research.
12:49
The idea I want to leave you with is it's all right to appeal to incentives. I'm an economist; I certainly believe in the fact that incentives work. But do think about selecting the right people rather than having people and then putting incentives in place. Selecting the right people with the right values may go a long way to saving a lot of trouble and a lot of money in your organizations. In other words, it will pay off to put people first.
13:21
Thank you.
13:23
(Applause)