How tech companies deceive you into giving up your data and privacy | Finn Lützow-Holm Myrstad
133,693 views ・ 2018-11-21
00:13
Do you remember when you were a child, you probably had a favorite toy that was a constant companion, like Christopher Robin had Winnie the Pooh, and your imagination fueled endless adventures? What could be more innocent than that?
00:28
Well, let me introduce you to my friend Cayla.
00:34
Cayla was voted toy of the year in countries around the world. She connects to the internet and uses speech recognition technology to answer your child's questions and respond just like a friend.
00:46
But the power doesn't lie with your child's imagination. It actually lies with the company harvesting masses of personal information while your family is innocently chatting away in the safety of their home, a dangerously false sense of security.
01:04
This case sounded alarm bells for me, as it is my job to protect consumers' rights in my country.
01:11
And with billions of devices such as cars, energy meters and even vacuum cleaners expected to come online by 2020, we thought this was a case worth investigating further. Because what was Cayla doing with all the interesting things she was learning? Did she have another friend she was loyal to and shared her information with?
01:33
Yes, you guessed right. She did.
01:36
In order to play with Cayla, you need to download an app to access all her features. Parents must consent to the terms being changed without notice. The recordings of the child, her friends and family can be used for targeted advertising. And all this information can be shared with unnamed third parties.
01:59
Enough? Not quite.
02:02
Anyone with a smartphone can connect to Cayla within a certain distance.
02:09
When we confronted the company that made and programmed Cayla, they issued a series of statements saying that one had to be an IT expert in order to breach the security.
02:22
Shall we fact-check that statement and live hack Cayla together?

02:29
Here she is.
02:32
Cayla is equipped with a Bluetooth device which can transmit up to 60 feet, a bit less if there's a wall between. That means I, or any stranger, can connect to the doll while being outside the room where Cayla and her friends are.
02:49
And to illustrate this, I'm going to turn Cayla on now. Let's see: one, two, three.
02:57
There. She's on. And I asked a colleague to stand outside with his smartphone, and he's connected. And to make this a bit creepier ...

03:05
(Laughter)

03:09
let's see what kids could hear Cayla say in the safety of their room.
03:15
Man: Hi. My name is Cayla. What is yours?

03:18
Finn Myrstad: Uh, Finn.

03:20
Man: Is your mom close by?

03:22
FM: Uh, no, she's in the store.

03:24
Man: Ah. Do you want to come out and play with me?

03:27
FM: That's a great idea.

03:29
Man: Ah, great.

03:32
FM: I'm going to turn Cayla off now.

03:34
(Laughter)
03:39
We needed no password or to circumvent any other type of security to do this.
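The "no password, no security to circumvent" point above boils down to: any nearby device can see the toy advertising itself and simply connect. A minimal, hypothetical sketch of that first step (the helper name, the advertised device names, and the optional `bleak` scan shown in comments are all assumptions for illustration, not part of the original demonstration):

```python
# Hypothetical sketch: spotting a nearby, unauthenticated toy by its
# advertised Bluetooth name. All device names below are made up.

def find_insecure_toys(device_names, known_toys=("Cayla",)):
    """Return advertised names that match a list of known insecure toys."""
    return [
        name for name in device_names
        if name and any(toy.lower() in name.lower() for toy in known_toys)
    ]

# With real hardware, the names could come from a BLE scan, for example
# with the third-party `bleak` library (an assumption; the talk's demo
# simply paired a stock smartphone with the doll):
#     devices = await BleakScanner.discover()
#     hits = find_insecure_toys([d.name for d in devices])

if __name__ == "__main__":
    advertised = ["MyFriend Cayla", "JBL Flip 4", None]
    print(find_insecure_toys(advertised))  # ['MyFriend Cayla']
```

No pairing code, PIN, or authentication step appears anywhere in this flow, which is exactly the flaw the report documented.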
03:46
We published a report in 20 countries around the world, exposing this significant security flaw and many other problematic issues.

03:56
So what happened?
03:57
Cayla was banned in Germany, taken off the shelves by Amazon and Wal-Mart, and she's now peacefully resting at the German Spy Museum in Berlin.

04:10
(Laughter)
04:13
However, Cayla was also for sale in stores around the world for more than a year after we published our report. What we uncovered is that there are few rules to protect us, and the ones we have are not being properly enforced.
04:30
We need to get the security and privacy of these devices right before they enter the market, because what is the point of locking a house with a key if anyone can enter it through a connected device?
04:45
You may well think, "This will not happen to me. I will just stay away from these flawed devices." But that won't keep you safe, because simply by connecting to the internet, you are put in an impossible take-it-or-leave-it position.
05:02
Let me show you.
05:04
Like most of you, I have dozens of apps on my phone, and used properly, they can make our lives easier, more convenient and maybe even healthier. But have we been lulled into a false sense of security?
05:18
It starts simply by ticking a box. Yes, we say, I've read the terms.

05:27
But have you really read the terms?
05:31
Are you sure they didn't look too long and your phone was running out of battery, and the last time you tried they were impossible to understand, and you needed to use the service now?
05:41
And now, the power imbalance is established, because we have agreed to our personal information being gathered and used on a scale we could never imagine.
05:53
This is why my colleagues and I decided to take a deeper look at this. We set out to read the terms of popular apps on an average phone. And to show the world how unrealistic it is to expect consumers to actually read the terms, we printed them, more than 900 pages, and sat down in our office and read them out loud ourselves, streaming the experiment live on our websites.
06:22
As you can see, it took quite a long time. It took us 31 hours, 49 minutes and 11 seconds to read the terms on an average phone. That is longer than a movie marathon of the "Harry Potter" movies and the "Godfather" movies combined.

06:38
(Laughter)
06:41
And reading is one thing. Understanding is another story. That would have taken us much, much longer. And this is a real problem, because companies have argued for 20 to 30 years against regulating the internet better, because users have consented to the terms and conditions.
07:02
As we've shown with this experiment, achieving informed consent is close to impossible.
07:09
Do you think it's fair to put the burden of responsibility on the consumer? I don't.
07:15
I think we should demand less take-it-or-leave-it and more understandable terms before we agree to them.

07:22
(Applause)

07:23
Thank you.
07:28
Now, I would like to tell you a story about love.
07:34
Some of the world's most popular apps are dating apps, an industry now worth more than, or close to, three billion dollars a year.
07:43
And of course, we're OK sharing our intimate details with our other half. But who else is snooping, saving and sharing our information while we are baring our souls?

07:56
My team and I decided to investigate this.
08:00
And in order to understand the issue from all angles and to truly do a thorough job, I realized I had to download one of the world's most popular dating apps myself.
08:14
So I went home to my wife ...

08:16
(Laughter)

08:18
who I had just married.

08:20
"Is it OK if I establish a profile on a very popular dating app for purely scientific purposes?"

08:26
(Laughter)

08:28
This is what we found.
08:30
Hidden behind the main menu was a preticked box that gave the dating company access to all my personal pictures on Facebook, in my case more than 2,000 of them, and some were quite personal.
08:46
And to make matters worse, when we read the terms and conditions, we discovered the following, and I'm going to need to take out my reading glasses for this one. And I'm going to read it for you, because this is complicated.

08:59
All right.
09:01
"By posting content" -- and content refers to your pictures, chat and other interactions in the dating service -- "as a part of the service, you automatically grant to the company, its affiliates, licensees and successors an irrevocable" -- which means you can't change your mind -- "perpetual" -- which means forever -- "nonexclusive, transferrable, sublicensable, fully paid-up, worldwide right and license to use, copy, store, perform, display, reproduce, record, play, adapt, modify and distribute the content, prepare derivative works of the content, or incorporate the content into other works and grant and authorize sublicenses of the foregoing in any media now known or hereafter created."
09:40
That basically means that all your dating history and everything related to it can be used for any purpose for all time.
09:50
Just imagine your children seeing your sassy dating photos in a birth control ad 20 years from now.
10:00
But seriously, though --

10:01
(Laughter)

10:04
what might these commercial practices mean to you?
10:08
For example, financial loss: based on your web browsing history, algorithms might decide whether you will get a mortgage or not.
10:16
Subconscious manipulation: companies can analyze your emotions based on your photos and chats, targeting you with ads when you are at your most vulnerable.
10:26
Discrimination: a fitness app can sell your data to a health insurance company, preventing you from getting coverage in the future.
10:34
All of this is happening in the world today.
10:37
But of course, not all uses of data are malign. Some are just flawed or need more work, and some are truly great.
10:47
And there is some good news as well. The dating companies changed their policies globally after we filed a legal complaint.
10:57
But organizations such as mine that fight for consumers' rights can't be everywhere. Nor can consumers fix this on their own, because if we know that something innocent we said will come back to haunt us, we will stop speaking. If we know that we are being watched and monitored, we will change our behavior. And if we can't control who has our data and how it is being used, we have lost the control of our lives.
11:26
The stories I have told you today are not random examples. They are everywhere, and they are a sign that things need to change. And how can we achieve that change?
11:36
Well, companies need to realize that by prioritizing privacy and security, they can build trust and loyalty with their users.
11:46
Governments must create a safer internet by ensuring enforcement and up-to-date rules.
11:53
And us, the citizens? We can use our voice to remind the world that technology can only truly benefit society if it respects basic rights.
12:05
Thank you so much.

12:07
(Applause)