We've stopped trusting institutions and started trusting strangers | Rachel Botsman

195,941 views ・ 2016-11-07

TED


μ•„λž˜ μ˜λ¬Έμžλ§‰μ„ λ”λΈ”ν΄λ¦­ν•˜μ‹œλ©΄ μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€.

Translated by Kim Jung hye Β· Reviewed by Park Haesik
00:12 Let's talk about trust. We all know trust is fundamental, but when it comes to trusting people, something profound is happening.

00:25 Please raise your hand if you have ever been a host or a guest on Airbnb. Wow. That's a lot of you. Who owns Bitcoin? Still a lot of you. OK. And please raise your hand if you've ever used Tinder to help you find a mate.

00:44 (Laughter)

00:46 This one's really hard to count because you're kind of going like this.

00:49 (Laughter)

00:51 These are all examples of how technology is creating new mechanisms that are enabling us to trust unknown people, companies and ideas. And yet at the same time, trust in institutions -- banks, governments and even churches -- is collapsing. So what's happening here, and who do you trust?
01:14 Let's start in France with a platform -- with a company, I should say -- with a rather funny-sounding name, BlaBlaCar. It's a platform that matches drivers and passengers who want to share long-distance journeys together. The average ride taken is 320 kilometers. So it's a good idea to choose your fellow travelers wisely.

01:39 Social profiles and reviews help people make a choice. You can see if someone's a smoker, you can see what kind of music they like, you can see if they're going to bring their dog along for the ride. But it turns out that the key social identifier is how much you're going to talk in the car.

01:58 (Laughter)

02:00 Bla, not a lot, bla bla, you want a nice bit of chitchat, and bla bla bla, you're not going to stop talking the entire way from London to Paris.

02:09 (Laughter)

02:12 It's remarkable, right, that this idea works at all, because it's counter to the lesson most of us were taught as a child: never get in a car with a stranger. And yet, BlaBlaCar transports more than four million people every single month. To put that in context, that's more passengers than the Eurostar or JetBlue airlines carry.

02:35 BlaBlaCar is a beautiful illustration of how technology is enabling millions of people across the world to take a trust leap. A trust leap happens when we take the risk to do something new or different to the way that we've always done it.
02:52 Let's try to visualize this together. OK. I want you to close your eyes. There is a man staring at me with his eyes wide open. I'm on this big red circle. I can see. So close your eyes.

03:06 (Laughter) (Applause)

03:09 I'll do it with you. And I want you to imagine there exists a gap between you and something unknown. That unknown can be someone you've just met. It can be a place you've never been to. It can be something you've never tried before. You got it? OK. You can open your eyes now.

03:28 For you to leap from a place of certainty, to take a chance on that someone or something unknown, you need a force to pull you over the gap, and that remarkable force is trust.

03:42 Trust is an elusive concept, and yet we depend on it for our lives to function. I trust my children when they say they're going to turn the lights out at night. I trusted the pilot who flew me here to keep me safe. It's a word we use a lot, without always thinking about what it really means and how it works in different contexts of our lives.

04:06 There are, in fact, hundreds of definitions of trust, and most can be reduced to some kind of risk assessment of how likely it is that things will go right. But I don't like this definition of trust, because it makes trust sound rational and predictable, and it doesn't really get to the human essence of what it enables us to do and how it empowers us to connect with other people.

04:33 So I define trust a little differently. I define trust as a confident relationship to the unknown. Now, when you view trust through this lens, it starts to explain why it has the unique capacity to enable us to cope with uncertainty, to place our faith in strangers, to keep moving forward.

04:56 Human beings are remarkable at taking trust leaps. Do you remember the first time you put your credit card details into a website? That's a trust leap.
05:07 I distinctly remember telling my dad that I wanted to buy a navy blue secondhand Peugeot on eBay, and he rightfully pointed out that the seller's name was "Invisible Wizard" and that this probably was not such a good idea.

05:22 (Laughter)

05:24 So my work, my research focuses on how technology is transforming the social glue of society, trust between people, and it's a fascinating area to study, because there's still so much we do not know. For instance, do men and women trust differently in digital environments? Does the way we build trust face-to-face translate online? Does trust transfer? So if you trust finding a mate on Tinder, are you more likely to trust finding a ride on BlaBlaCar?

05:56 But from studying hundreds of networks and marketplaces, there is a common pattern that people follow, and I call it "climbing the trust stack." Let me use BlaBlaCar as an example to bring it to life.

06:09 On the first level, you have to trust the idea. So you have to trust the idea of ride-sharing is safe and worth trying. The second level is about having confidence in the platform, that BlaBlaCar will help you if something goes wrong. And the third level is about using little bits of information to decide whether the other person is trustworthy.
06:34 Now, the first time we climb the trust stack, it feels weird, even risky, but we get to a point where these ideas seem totally normal. Our behaviors transform, often relatively quickly. In other words, trust enables change and innovation.

06:55 So an idea that intrigued me, and I'd like you to consider, is whether we can better understand major waves of disruption and change in individuals in society through the lens of trust. Well, it turns out that trust has only evolved in three significant chapters throughout the course of human history: local, institutional and what we're now entering, distributed.

07:20 So for a long time, until the mid-1800s, trust was built around tight-knit relationships. So say I lived in a village with the first five rows of this audience, and we all knew one another, and say I wanted to borrow money. The man who had his eyes wide open, he might lend it to me, and if I didn't pay him back, you'd all know I was dodgy. I would get a bad reputation, and you would refuse to do business with me in the future. Trust was mostly local and accountability-based.

07:53 In the mid-19th century, society went through a tremendous amount of change. People moved to fast-growing cities such as London and San Francisco, and a local banker here was replaced by large corporations that didn't know us as individuals. We started to place our trust into black box systems of authority, things like legal contracts and regulation and insurance, and less trust directly in other people. Trust became institutional and commission-based.

08:27 It's widely talked about how trust in institutions and many corporate brands has been steadily declining and continues to do so. I am constantly stunned by major breaches of trust: the News Corp phone hacking, the Volkswagen emissions scandal, the widespread abuse in the Catholic Church, the fact that only one measly banker went to jail after the great financial crisis, or more recently the Panama Papers that revealed how the rich can exploit offshore tax regimes.

09:04 And the thing that really surprises me is why do leaders find it so hard to apologize, I mean sincerely apologize, when our trust is broken?
09:17 It would be easy to conclude that institutional trust isn't working because we are fed up with the sheer audacity of dishonest elites, but what's happening now runs deeper than the rampant questioning of the size and structure of institutions. We're starting to realize that institutional trust wasn't designed for the digital age. Conventions of how trust is built, managed, lost and repaired -- in brands, leaders and entire systems -- are being turned upside down.

09:51 Now, this is exciting, but it's frightening, because it forces many of us to have to rethink how trust is built and destroyed with our customers, with our employees, even our loved ones.

10:05 The other day, I was talking to the CEO of a leading international hotel brand, and as is often the case, we got onto the topic of Airbnb. And he admitted to me that he was perplexed by their success. He was perplexed at how a company that depends on the willingness of strangers to trust one another could work so well across 191 countries.

10:31 So I said to him that I had a confession to make, and he looked at me a bit strangely, and I said -- and I'm sure many of you do this as well -- I don't always bother to hang my towels up when I'm finished in the hotel, but I would never do this as a guest on Airbnb. And the reason why I would never do this as a guest on Airbnb is because guests know that they'll be rated by hosts, and that those ratings are likely to impact their ability to transact in the future.

11:02 It's a simple illustration of how online trust will change our behaviors in the real world, make us more accountable in ways we cannot yet even imagine.
11:14 I am not saying we do not need hotels or traditional forms of authority. But what we cannot deny is that the way trust flows through society is changing, and it's creating this big shift away from the 20th century that was defined by institutional trust towards the 21st century that will be fueled by distributed trust.

11:39 Trust is no longer top-down. It's being unbundled and inverted. It's no longer opaque and linear. A new recipe for trust is emerging that once again is distributed amongst people and is accountability-based.

11:58 And this shift is only going to accelerate with the emergence of the blockchain, the innovative ledger technology underpinning Bitcoin.

12:08 Now let's be honest, getting our heads around the way blockchain works is mind-blowing. And one of the reasons why is it involves processing some pretty complicated concepts with terrible names. I mean, cryptographic algorithms and hash functions, and people called miners, who verify transactions -- all that was created by this mysterious person or persons called Satoshi Nakamoto. Now, that is a massive trust leap that hasn't happened yet.

12:44 (Applause)

12:47 But let's try to imagine this. So "The Economist" eloquently described the blockchain as the great chain of being sure about things. The easiest way I can describe it is imagine the blocks as spreadsheets, and they are filled with assets. So that could be a property title. It could be a stock trade. It could be a creative asset, such as the rights to a song. Every time something moves from one place on the register to somewhere else, that asset transfer is time-stamped and publicly recorded on the blockchain. It's that simple. Right.
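As a rough illustration of that description -- and only an illustration under assumed names, not how Bitcoin or any production blockchain is actually implemented -- here is a minimal sketch of an append-only register in which every asset transfer is time-stamped and each block carries the hash of the block before it. The names (Block, Ledger, record_transfer, is_consistent) are hypothetical.

    # Illustrative sketch of "blocks filled with assets": each block records a
    # time-stamped transfer plus the hash of the previous block, so the whole
    # register can be checked by anyone.
    import hashlib
    import json
    import time
    from dataclasses import dataclass, field

    @dataclass
    class Block:
        asset: str          # e.g. a property title, a stock trade, the rights to a song
        sender: str
        receiver: str
        timestamp: float    # when the transfer was recorded
        prev_hash: str      # hash of the previous block: this is the "chain"

        def hash(self) -> str:
            payload = json.dumps(self.__dict__, sort_keys=True).encode()
            return hashlib.sha256(payload).hexdigest()

    @dataclass
    class Ledger:
        blocks: list = field(default_factory=list)

        def record_transfer(self, asset: str, sender: str, receiver: str) -> Block:
            prev_hash = self.blocks[-1].hash() if self.blocks else "0" * 64
            block = Block(asset, sender, receiver, time.time(), prev_hash)
            self.blocks.append(block)
            return block

        def is_consistent(self) -> bool:
            # Every block must point at the hash of the block before it,
            # so past transfers cannot be quietly rewritten.
            return all(self.blocks[i].prev_hash == self.blocks[i - 1].hash()
                       for i in range(1, len(self.blocks)))

    ledger = Ledger()
    ledger.record_transfer("rights to a song", "alice", "bob")
    ledger.record_transfer("property title", "bob", "carol")
    print(ledger.is_consistent())  # True -- until someone tries to rewrite history

In the real system, as the talk notes, it is miners and cryptographic verification rather than any single record-keeper that maintain this register, which is what removes the need for a trusted third party.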
13:28 So the real implication of the blockchain is that it removes the need for any kind of third party, such as a lawyer, or a trusted intermediary, or maybe not a government intermediary to facilitate the exchange. So if we go back to the trust stack, you still have to trust the idea, you have to trust the platform, but you don't have to trust the other person in the traditional sense. The implications are huge. In the same way the internet blew open the doors to an age of information available to everyone, the blockchain will revolutionize trust on a global scale.

14:08 Now, I've waited to the end intentionally to mention Uber, because I recognize that it is a contentious and widely overused example, but in the context of a new era of trust, it's a great case study. Now, we will see cases of abuse of distributed trust. We've already seen this, and it can go horribly wrong. I am not surprised that we are seeing protests from taxi associations all around the world trying to get governments to ban Uber based on claims that it is unsafe.

14:42 I happened to be in London the day that these protests took place, and I happened to notice a tweet from Matt Hancock, who is a British minister for business. And he wrote, "Does anyone have details of this #Uber app everyone's talking about?

14:57 (Laughter)

14:59 I'd never heard of it until today."

15:03 Now, the taxi associations, they legitimized the first layer of the trust stack. They legitimized the idea that they were trying to eliminate, and sign-ups increased by 850 percent in 24 hours.

15:19 Now, this is a really strong illustration of how once a trust shift has happened around a behavior or an entire sector, you cannot reverse the story.

15:31 Every day, five million people will take a trust leap and ride with Uber. In China, on Didi, the ride-sharing platform, 11 million rides taken every day. That's 127 rides per second, showing that this is a cross-cultural phenomenon.
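As a quick check of that figure, using only the numbers quoted in the talk:

    \[
      \frac{11{,}000{,}000 \ \text{rides per day}}{24 \times 60 \times 60 \ \text{seconds per day}}
      = \frac{11{,}000{,}000}{86{,}400}
      \approx 127 \ \text{rides per second.}
    \]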
15:49 And the fascinating thing is that both drivers and passengers report that seeing a name and seeing someone's photo and their rating makes them feel safer, and as you may have experienced, even behave a little more nicely in the taxi cab. Uber and Didi are early but powerful examples of how technology is creating trust between people in ways and on a scale never possible before.

16:19 Today, many of us are comfortable getting into cars driven by strangers. We meet up with someone we swiped right to be matched with. We share our homes with people we do not know. This is just the beginning, because the real disruption happening isn't technological. It's the trust shift it creates, and for my part, I want to help people understand this new era of trust so that we can get it right and we can embrace the opportunities to redesign systems that are more transparent, inclusive and accountable.

16:58 Thank you very much.

17:00 (Applause)

17:02 Thank you.

17:03 (Applause)
이 μ›Ήμ‚¬μ΄νŠΈ 정보

이 μ‚¬μ΄νŠΈλŠ” μ˜μ–΄ ν•™μŠ΅μ— μœ μš©ν•œ YouTube λ™μ˜μƒμ„ μ†Œκ°œν•©λ‹ˆλ‹€. μ „ 세계 졜고의 μ„ μƒλ‹˜λ“€μ΄ κ°€λ₯΄μΉ˜λŠ” μ˜μ–΄ μˆ˜μ—…μ„ 보게 될 κ²ƒμž…λ‹ˆλ‹€. 각 λ™μ˜μƒ νŽ˜μ΄μ§€μ— ν‘œμ‹œλ˜λŠ” μ˜μ–΄ μžλ§‰μ„ 더블 ν΄λ¦­ν•˜λ©΄ κ·Έκ³³μ—μ„œ λ™μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€. λΉ„λ””μ˜€ μž¬μƒμ— 맞좰 μžλ§‰μ΄ μŠ€ν¬λ‘€λ©λ‹ˆλ‹€. μ˜κ²¬μ΄λ‚˜ μš”μ²­μ΄ μžˆλŠ” 경우 이 문의 양식을 μ‚¬μš©ν•˜μ—¬ λ¬Έμ˜ν•˜μ‹­μ‹œμ˜€.

https://forms.gle/WvT1wiN1qDtmnspy7