What really motivates people to be honest in business | Alexander Wagner

233,748 views · 2017-09-26

TED


์•„๋ž˜ ์˜๋ฌธ์ž๋ง‰์„ ๋”๋ธ”ํด๋ฆญํ•˜์‹œ๋ฉด ์˜์ƒ์ด ์žฌ์ƒ๋ฉ๋‹ˆ๋‹ค.

Translation: Jeong Woo Song    Review: Jihyeon J. Kim
00:12
How many companies have you interacted with today?

00:17
Well, you got up in the morning, took a shower, washed your hair, used a hair dryer, ate breakfast -- ate cereals, fruit, yogurt, whatever -- had coffee -- tea. You took public transport to come here, or maybe used your private car. You interacted with the company that you work for or that you own. You interacted with your clients, your customers, and so on and so forth. I'm pretty sure there are at least seven companies you've interacted with today.
00:49
Let me tell you a stunning statistic. One out of seven large, public corporations commit fraud every year.

01:00
This is a US academic study that looks at US companies -- I have no reason to believe that it's different in Europe. This is a study that looks at both detected and undetected fraud using statistical methods. This is not petty fraud. These frauds cost the shareholders of these companies, and therefore society, on the order of 380 billion dollars per year.

01:24
We can all think of some examples, right? The car industry's secrets aren't quite so secret anymore. Fraud has become a feature, not a bug, of the financial services industry. That's not me who's claiming that, that's the president of the American Finance Association who stated that in his presidential address. That's a huge problem if you think about, especially, an economy like Switzerland, which relies so much on the trust put into its financial industry.
01:56
On the other hand, there are six out of seven companies who actually remain honest despite all temptations to start engaging in fraud.

02:06
There are whistle-blowers like Michael Woodford, who blew the whistle on Olympus. These whistle-blowers risk their careers, their friendships, to bring out the truth about their companies. There are journalists like Anna Politkovskaya who risk even their lives to report human rights violations. She got killed -- every year, around 100 journalists get killed because of their conviction to bring out the truth.
02:31
So in my talk today, I want to share with you some insights I've obtained and learned in the last 10 years of conducting research in this. I'm a researcher, a scientist working with economists, financial economists, ethicists, neuroscientists, lawyers and others trying to understand what makes humans tick, and how can we address this issue of fraud in corporations and therefore contribute to the improvement of the world.
02:59
I want to start by sharing with you two very distinct visions of how people behave.

03:04
First, meet Adam Smith, founding father of modern economics. His basic idea was that if everybody behaves in their own self-interests, that's good for everybody in the end. Self-interest isn't a narrowly defined concept just for your immediate utility. It has a long-run implication.

03:24
Let's think about that. Think about this dog here. That might be us. There's this temptation -- I apologize to all vegetarians, but --

(Laughter)

03:35
Dogs do like the bratwurst.

(Laughter)

03:40
Now, the straight-up, self-interested move here is to go for that. So my friend Adam here might jump up, get the sausage and thereby ruin all this beautiful tableware.
03:51
But that's not what Adam Smith meant. He didn't mean disregard all consequences -- to the contrary. He would have thought, well, there may be negative consequences, for example, the owner might be angry with the dog and the dog, anticipating that, might not behave in this way.

04:09
That might be us, weighing the benefits and costs of our actions. How does that play out? Well, many of you, I'm sure, have in your companies, especially if it's a large company, a code of conduct. And then if you behave according to that code of conduct, that improves your chances of getting a bonus payment. And on the other hand, if you disregard it, then there are higher chances of not getting your bonus or its being diminished.

04:35
In other words, this is a very economic motivation of trying to get people to be more honest, or more aligned with the corporation's principles.
04:46
Similarly, reputation is a very powerful economic force, right? We try to build a reputation, maybe for being honest, because then people trust us more in the future. Right?

04:59
Adam Smith talked about the baker who's not producing good bread out of his benevolence for those people who consume the bread, but because he wants to sell more future bread.

05:11
In my research, we find, for example, at the University of Zurich, that Swiss banks who get caught up in media, and in the context, for example, of tax evasion, of tax fraud, have bad media coverage. They lose net new money in the future and therefore make lower profits. That's a very powerful reputational force.

05:34
Benefits and costs.
05:36
Here's another viewpoint of the world. Meet Immanuel Kant, 18th-century German philosopher superstar. He developed this notion that independent of the consequences, some actions are just right and some are just wrong. It's just wrong to lie, for example.

05:57
So, meet my friend Immanuel here. He knows that the sausage is very tasty, but he's going to turn away because he's a good dog. He knows it's wrong to jump up and risk ruining all this beautiful tableware.

06:12
If you believe that people are motivated like that, then all the stuff about incentives, all the stuff about code of conduct and bonus systems and so on, doesn't make a whole lot of sense. People are motivated by different values perhaps.

06:27
So, what are people actually motivated by? These two gentlemen here have perfect hairdos, but they give us very different views of the world.

06:37
What do we do with this? Well, I'm an economist and we conduct so-called experiments to address this issue. We strip away facts which are confusing in reality. Reality is so rich, there is so much going on, it's almost impossible to know what drives people's behavior really.

06:55
So let's do a little experiment together. Imagine the following situation.
07:02
You're in a room alone, not like here. There's a five-franc coin like the one I'm holding up right now in front of you. Here are your instructions: toss the coin four times, and then on a computer terminal in front of you, enter the number of times tails came up.

07:23
This is the situation. Here's the rub. For every time that you announce that you had a tails throw, you get paid five francs. So if you say I had two tails throws, you get paid 10 francs. If you say you had zero, you get paid zero francs. If you say, "I had four tails throws," then you get paid 20 francs.

07:43
It's anonymous, nobody's watching what you're doing, and you get paid that money anonymously.
07:49
I've got two questions for you.

(Laughter)

You know what's coming now, right?

07:55
First, how would you behave in that situation? The second, look to your left and look to your right --

(Laughter)

and think about how the person sitting next to you might behave in that situation.

08:08
We did this experiment for real. We did it at the Manifesta art exhibition that took place here in Zurich recently, not with students in the lab at the university but with the real population, like you guys.

08:21
First, a quick reminder of stats. If I throw the coin four times and it's a fair coin, then the probability that it comes up four times tails is 6.25 percent. And I hope you can intuitively see that the probability that all four of them are tails is much lower than if two of them are tails, right?

08:42
Here are the specific numbers.
08:45
Here's what happened. People did this experiment for real. Around 30 to 35 percent of people said, "Well, I had four tails throws."

08:57
That's extremely unlikely.

(Laughter)

09:01
But the really amazing thing here, perhaps to an economist, is there are around 65 percent of people who did not say I had four tails throws, even though in that situation, nobody's watching you, the only consequence that's in place is you get more money if you say four than less. You leave 20 francs on the table by announcing zero.

09:25
I don't know whether the other people all were honest or whether they also said a little bit higher or lower than what they did because it's anonymous. We only observed the distribution.

09:34
But what I can tell you -- and here's another coin toss. There you go, it's tails.

(Laughter)

Don't check, OK?

(Laughter)

09:44
What I can tell you is that not everybody behaved like Adam Smith would have predicted.
09:52
So what does that leave us with? Well, it seems people are motivated by certain intrinsic values and in our research, we look at this.

10:01
We look at the idea that people have so-called protected values. A protected value isn't just any value. A protected value is a value where you're willing to pay a price to uphold that value. You're willing to pay a price to withstand the temptation to give in.

10:22
And the consequence is you feel better if you earn money in a way that's consistent with your values.

10:29
Let me show you this again in the metaphor of our beloved dog here. If we succeed in getting the sausage without violating our values, then the sausage tastes better. That's what our research shows. If, on the other hand, we do so -- if we get the sausage and in doing so we actually violate values, we value the sausage less.

10:53
Quantitatively, that's quite powerful. We can measure these protected values, for example, by a survey measure. Simple, nine-item survey that's quite predictive in these experiments.

11:08
If you think about the average of the population and then there's a distribution around it -- people are different, we all are different. People who have a set of protected values that's one standard deviation above the average, they discount money they receive by lying by about 25 percent.

11:27
That means a dollar received when lying is worth to them only 75 cents without any incentives you put in place for them to behave honestly. It's their intrinsic motivation.
11:38
By the way, I'm not a moral authority. I'm not saying I have all these beautiful values, right? But I'm interested in how people behave and how we can leverage that richness in human nature to actually improve the workings of our organizations.

11:54
So there are two very, very different visions here. On the one hand, you can appeal to benefits and costs and try to get people to behave according to them. On the other hand, you can select people who have the values and the desirable characteristics, of course -- competencies that go in line with your organization.

12:16
I do not yet know where these protected values really come from. Is it nurture or is it nature? What I can tell you is that the distribution looks pretty similar for men and women. It looks pretty similar for those who had studied economics or those who had studied psychology. It looks even pretty similar around different age categories among adults. But I don't know yet how this develops over a lifetime. That will be the subject of future research.

12:49
The idea I want to leave you with is it's all right to appeal to incentives. I'm an economist; I certainly believe in the fact that incentives work.

12:59
But do think about selecting the right people rather than having people and then putting incentives in place. Selecting the right people with the right values may go a long way to saving a lot of trouble and a lot of money in your organizations.

13:16
In other words, it will pay off to put people first.

13:21
Thank you.

13:23
(Applause)
์ด ์›น์‚ฌ์ดํŠธ ์ •๋ณด

์ด ์‚ฌ์ดํŠธ๋Š” ์˜์–ด ํ•™์Šต์— ์œ ์šฉํ•œ YouTube ๋™์˜์ƒ์„ ์†Œ๊ฐœํ•ฉ๋‹ˆ๋‹ค. ์ „ ์„ธ๊ณ„ ์ตœ๊ณ ์˜ ์„ ์ƒ๋‹˜๋“ค์ด ๊ฐ€๋ฅด์น˜๋Š” ์˜์–ด ์ˆ˜์—…์„ ๋ณด๊ฒŒ ๋  ๊ฒƒ์ž…๋‹ˆ๋‹ค. ๊ฐ ๋™์˜์ƒ ํŽ˜์ด์ง€์— ํ‘œ์‹œ๋˜๋Š” ์˜์–ด ์ž๋ง‰์„ ๋”๋ธ” ํด๋ฆญํ•˜๋ฉด ๊ทธ๊ณณ์—์„œ ๋™์˜์ƒ์ด ์žฌ์ƒ๋ฉ๋‹ˆ๋‹ค. ๋น„๋””์˜ค ์žฌ์ƒ์— ๋งž์ถฐ ์ž๋ง‰์ด ์Šคํฌ๋กค๋ฉ๋‹ˆ๋‹ค. ์˜๊ฒฌ์ด๋‚˜ ์š”์ฒญ์ด ์žˆ๋Š” ๊ฒฝ์šฐ ์ด ๋ฌธ์˜ ์–‘์‹์„ ์‚ฌ์šฉํ•˜์—ฌ ๋ฌธ์˜ํ•˜์‹ญ์‹œ์˜ค.

https://forms.gle/WvT1wiN1qDtmnspy7