We've stopped trusting institutions and started trusting strangers | Rachel Botsman

208,994 views ・ 2016-11-07

TED



Translator: Hsun Ying Tsai | Reviewer: Winston Szeto

Let's talk about trust. We all know trust is fundamental, but when it comes to trusting people, something profound is happening. Please raise your hand if you have ever been a host or a guest on Airbnb. Wow. That's a lot of you. Who owns Bitcoin? Still a lot of you. OK. And please raise your hand if you've ever used Tinder to help you find a mate. (Laughter) This one's really hard to count because you're kind of going like this. (Laughter)

These are all examples of how technology is creating new mechanisms that are enabling us to trust unknown people, companies and ideas. And yet at the same time, trust in institutions -- banks, governments and even churches -- is collapsing. So what's happening here, and who do you trust?

Let's start in France with a platform -- with a company, I should say -- with a rather funny-sounding name, BlaBlaCar. It's a platform that matches drivers and passengers who want to share long-distance journeys together. The average ride taken is 320 kilometers. So it's a good idea to choose your fellow travelers wisely. Social profiles and reviews help people make a choice. You can see if someone's a smoker, you can see what kind of music they like, you can see if they're going to bring their dog along for the ride. But it turns out that the key social identifier is how much you're going to talk in the car. (Laughter) Bla, not a lot, bla bla, you want a nice bit of chitchat, and bla bla bla, you're not going to stop talking the entire way from London to Paris. (Laughter)

It's remarkable, right, that this idea works at all, because it's counter to the lesson most of us were taught as a child: never get in a car with a stranger. And yet, BlaBlaCar transports more than four million people every single month. To put that in context, that's more passengers than the Eurostar or JetBlue airlines carry.

BlaBlaCar is a beautiful illustration of how technology is enabling millions of people across the world to take a trust leap. A trust leap happens when we take the risk to do something new or different to the way that we've always done it.

Let's try to visualize this together. OK. I want you to close your eyes. There is a man staring at me with his eyes wide open. I'm on this big red circle. I can see. So close your eyes. (Laughter) (Applause) I'll do it with you. And I want you to imagine there exists a gap between you and something unknown. That unknown can be someone you've just met. It can be a place you've never been to. It can be something you've never tried before. You got it? OK. You can open your eyes now. For you to leap from a place of certainty, to take a chance on that someone or something unknown, you need a force to pull you over the gap, and that remarkable force is trust.

Trust is an elusive concept, and yet we depend on it for our lives to function. I trust my children when they say they're going to turn the lights out at night. I trusted the pilot who flew me here to keep me safe. It's a word we use a lot, without always thinking about what it really means and how it works in different contexts of our lives. There are, in fact, hundreds of definitions of trust, and most can be reduced to some kind of risk assessment of how likely it is that things will go right. But I don't like this definition of trust, because it makes trust sound rational and predictable, and it doesn't really get to the human essence of what it enables us to do and how it empowers us to connect with other people.

So I define trust a little differently. I define trust as a confident relationship to the unknown. Now, when you view trust through this lens, it starts to explain why it has the unique capacity to enable us to cope with uncertainty, to place our faith in strangers, to keep moving forward.

Human beings are remarkable at taking trust leaps. Do you remember the first time you put your credit card details into a website? That's a trust leap. I distinctly remember telling my dad that I wanted to buy a navy blue secondhand Peugeot on eBay, and he rightfully pointed out that the seller's name was "Invisible Wizard" and that this probably was not such a good idea. (Laughter)

So my work, my research focuses on how technology is transforming the social glue of society, trust between people, and it's a fascinating area to study, because there's still so much we do not know. For instance, do men and women trust differently in digital environments? Does the way we build trust face-to-face translate online? Does trust transfer? So if you trust finding a mate on Tinder, are you more likely to trust finding a ride on BlaBlaCar?

But from studying hundreds of networks and marketplaces, there is a common pattern that people follow, and I call it "climbing the trust stack." Let me use BlaBlaCar as an example to bring it to life. On the first level, you have to trust the idea. So you have to trust the idea of ride-sharing is safe and worth trying. The second level is about having confidence in the platform, that BlaBlaCar will help you if something goes wrong. And the third level is about using little bits of information to decide whether the other person is trustworthy.

Now, the first time we climb the trust stack, it feels weird, even risky, but we get to a point where these ideas seem totally normal. Our behaviors transform, often relatively quickly. In other words, trust enables change and innovation.

So an idea that intrigued me, and I'd like you to consider, is whether we can better understand major waves of disruption and change in individuals in society through the lens of trust. Well, it turns out that trust has only evolved in three significant chapters throughout the course of human history: local, institutional and what we're now entering, distributed.

So for a long time, until the mid-1800s, trust was built around tight-knit relationships. So say I lived in a village with the first five rows of this audience, and we all knew one another, and say I wanted to borrow money. The man who had his eyes wide open, he might lend it to me, and if I didn't pay him back, you'd all know I was dodgy. I would get a bad reputation, and you would refuse to do business with me in the future. Trust was mostly local and accountability-based.

In the mid-19th century, society went through a tremendous amount of change. People moved to fast-growing cities such as London and San Francisco, and a local banker here was replaced by large corporations that didn't know us as individuals. We started to place our trust into black box systems of authority, things like legal contracts and regulation and insurance, and less trust directly in other people. Trust became institutional and commission-based.

It's widely talked about how trust in institutions and many corporate brands has been steadily declining and continues to do so. I am constantly stunned by major breaches of trust: the News Corp phone hacking, the Volkswagen emissions scandal, the widespread abuse in the Catholic Church, the fact that only one measly banker went to jail after the great financial crisis, or more recently the Panama Papers that revealed how the rich can exploit offshore tax regimes. And the thing that really surprises me is why do leaders find it so hard to apologize, I mean sincerely apologize, when our trust is broken?

It would be easy to conclude that institutional trust isn't working because we are fed up with the sheer audacity of dishonest elites, but what's happening now runs deeper than the rampant questioning of the size and structure of institutions. We're starting to realize that institutional trust wasn't designed for the digital age. Conventions of how trust is built, managed, lost and repaired -- in brands, leaders and entire systems -- are being turned upside down.

Now, this is exciting, but it's frightening, because it forces many of us to have to rethink how trust is built and destroyed with our customers, with our employees, even our loved ones.

The other day, I was talking to the CEO of a leading international hotel brand, and as is often the case, we got onto the topic of Airbnb. And he admitted to me that he was perplexed by their success. He was perplexed at how a company that depends on the willingness of strangers to trust one another could work so well across 191 countries. So I said to him that I had a confession to make, and he looked at me a bit strangely, and I said -- and I'm sure many of you do this as well -- I don't always bother to hang my towels up when I'm finished in the hotel, but I would never do this as a guest on Airbnb. And the reason why I would never do this as a guest on Airbnb is because guests know that they'll be rated by hosts, and that those ratings are likely to impact their ability to transact in the future. It's a simple illustration of how online trust will change our behaviors in the real world, make us more accountable in ways we cannot yet even imagine.

I am not saying we do not need hotels or traditional forms of authority. But what we cannot deny is that the way trust flows through society is changing, and it's creating this big shift away from the 20th century that was defined by institutional trust towards the 21st century that will be fueled by distributed trust. Trust is no longer top-down. It's being unbundled and inverted. It's no longer opaque and linear. A new recipe for trust is emerging that once again is distributed amongst people and is accountability-based.

And this shift is only going to accelerate with the emergence of the blockchain, the innovative ledger technology underpinning Bitcoin. Now let's be honest, getting our heads around the way blockchain works is mind-blowing. And one of the reasons why is it involves processing some pretty complicated concepts with terrible names. I mean, cryptographic algorithms and hash functions, and people called miners, who verify transactions -- all that was created by this mysterious person or persons called Satoshi Nakamoto. Now, that is a massive trust leap that hasn't happened yet. (Applause)

But let's try to imagine this. So "The Economist" eloquently described the blockchain as the great chain of being sure about things. The easiest way I can describe it is imagine the blocks as spreadsheets, and they are filled with assets. So that could be a property title. It could be a stock trade. It could be a creative asset, such as the rights to a song. Every time something moves from one place on the register to somewhere else, that asset transfer is time-stamped and publicly recorded on the blockchain. It's that simple. Right.

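[Editor's aside, not part of the talk: the "public register" analogy can be sketched in a few lines of code. The toy ledger below is a hypothetical illustration in Python, with made-up names such as record_transfer; it only demonstrates the two properties mentioned above, time-stamping and public, chained recording, and is nothing like Bitcoin's actual protocol.]

```python
# A minimal, illustrative sketch of the "blocks as a public register" analogy.
# Hypothetical names; NOT how Bitcoin actually works.
import hashlib
import json
import time


def record_transfer(ledger, asset, sender, receiver):
    """Append a time-stamped asset transfer, hash-linked to the previous entry."""
    previous_hash = ledger[-1]["hash"] if ledger else "0" * 64
    entry = {
        "asset": asset,            # e.g. a property title, a stock trade, song rights
        "from": sender,
        "to": receiver,
        "timestamp": time.time(),  # the transfer is time-stamped...
        "previous_hash": previous_hash,
    }
    # ...and chained to what came before, so the public record is hard to rewrite.
    entry["hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()
    ).hexdigest()
    ledger.append(entry)
    return entry


ledger = []
record_transfer(ledger, "rights to a song", "alice", "bob")
record_transfer(ledger, "property title", "bob", "carol")
print(json.dumps(ledger, indent=2))  # anyone can read the whole register
```
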
So the real implication of the blockchain is that it removes the need for any kind of third party, such as a lawyer, or a trusted intermediary, or maybe not a government intermediary to facilitate the exchange. So if we go back to the trust stack, you still have to trust the idea, you have to trust the platform, but you don't have to trust the other person in the traditional sense.

The implications are huge. In the same way the internet blew open the doors to an age of information available to everyone, the blockchain will revolutionize trust on a global scale.

Now, I've waited to the end intentionally to mention Uber, because I recognize that it is a contentious and widely overused example, but in the context of a new era of trust, it's a great case study. Now, we will see cases of abuse of distributed trust. We've already seen this, and it can go horribly wrong. I am not surprised that we are seeing protests from taxi associations all around the world trying to get governments to ban Uber based on claims that it is unsafe.

I happened to be in London the day that these protests took place, and I happened to notice a tweet from Matt Hancock, who is a British minister for business. And he wrote, "Does anyone have details of this #Uber app everyone's talking about? (Laughter) I'd never heard of it until today."

Now, the taxi associations, they legitimized the first layer of the trust stack. They legitimized the idea that they were trying to eliminate, and sign-ups increased by 850 percent in 24 hours. Now, this is a really strong illustration of how once a trust shift has happened around a behavior or an entire sector, you cannot reverse the story.

Every day, five million people will take a trust leap and ride with Uber. In China, on Didi, the ride-sharing platform, 11 million rides taken every day. That's 127 rides per second, showing that this is a cross-cultural phenomenon. And the fascinating thing is that both drivers and passengers report that seeing a name and seeing someone's photo and their rating makes them feel safer, and as you may have experienced, even behave a little more nicely in the taxi cab.

Uber and Didi are early but powerful examples of how technology is creating trust between people in ways and on a scale never possible before.

Today, many of us are comfortable getting into cars driven by strangers. We meet up with someone we swiped right to be matched with. We share our homes with people we do not know. This is just the beginning, because the real disruption happening isn't technological. It's the trust shift it creates, and for my part, I want to help people understand this new era of trust so that we can get it right and we can embrace the opportunities to redesign systems that are more transparent, inclusive and accountable.

Thank you very much. (Applause) Thank you. (Applause)