How tech companies deceive you into giving up your data and privacy | Finn Lützow-Holm Myrstad

133,693 views

2018-11-21 ・ TED

00:13
Do you remember when you were a child, you probably had a favorite toy that was a constant companion, like Christopher Robin had Winnie the Pooh, and your imagination fueled endless adventures? What could be more innocent than that?

00:28
Well, let me introduce you to my friend Cayla. Cayla was voted toy of the year in countries around the world. She connects to the internet and uses speech recognition technology to answer your child's questions and respond just like a friend. But the power doesn't lie with your child's imagination. It actually lies with the company harvesting masses of personal information while your family is innocently chatting away in the safety of their home, a dangerously false sense of security.

01:04
This case sounded alarm bells for me, as it is my job to protect consumers' rights in my country. And with billions of devices such as cars, energy meters and even vacuum cleaners expected to come online by 2020, we thought this was a case worth investigating further. Because what was Cayla doing with all the interesting things she was learning? Did she have another friend she was loyal to and shared her information with? Yes, you guessed right. She did.

01:36
In order to play with Cayla, you need to download an app to access all her features. Parents must consent to the terms being changed without notice. The recordings of the child, her friends and family, can be used for targeted advertising. And all this information can be shared with unnamed third parties. Enough? Not quite. Anyone with a smartphone can connect to Cayla within a certain distance.

02:09
When we confronted the company that made and programmed Cayla, they issued a series of statements claiming that one had to be an IT expert in order to breach the security. Shall we fact-check that statement and live hack Cayla together?

02:29
Here she is. Cayla is equipped with a Bluetooth device which can transmit up to 60 feet, a bit less if there's a wall between. That means I, or any stranger, can connect to the doll while being outside the room where Cayla and her friends are. And to illustrate this, I'm going to turn Cayla on now. Let's see, one, two, three. There. She's on. And I asked a colleague to stand outside with his smartphone, and he's connected, and to make this a bit creepier ... (Laughter) let's see what kids could hear Cayla say in the safety of their room.

03:15
Man: Hi. My name is Cayla. What is yours?
Finn Myrstad: Uh, Finn.
Man: Is your mom close by?
FM: Uh, no, she's in the store.
Man: Ah. Do you want to come out and play with me?
FM: That's a great idea.
Man: Ah, great.
FM: I'm going to turn Cayla off now.
(Laughter)

03:39
We needed no password, nor did we have to circumvent any other type of security, to do this. We published a report in 20 countries around the world, exposing this significant security flaw and many other problematic issues. So what happened? Cayla was banned in Germany, taken off the shelves by Amazon and Wal-Mart, and she's now peacefully resting at the German Spy Museum in Berlin. (Laughter) However, Cayla was also for sale in stores around the world for more than a year after we published our report. What we uncovered is that there are few rules to protect us, and the ones we have are not being properly enforced.

04:30
We need to get the security and privacy of these devices right before they enter the market, because what is the point of locking a house with a key if anyone can enter it through a connected device? You may well think, "This will not happen to me. I will just stay away from these flawed devices." But that won't keep you safe, because simply by connecting to the internet, you are put in an impossible take-it-or-leave-it position.

05:02
Let me show you. Like most of you, I have dozens of apps on my phone, and used properly, they can make our lives easier, more convenient and maybe even healthier. But have we been lulled into a false sense of security? It starts simply by ticking a box. "Yes," we say, "I've read the terms." But have you really read the terms? Are you sure they didn't look too long and your phone was running out of battery, and the last time you tried they were impossible to understand, and you needed to use the service now? And now, the power imbalance is established, because we have agreed to our personal information being gathered and used on a scale we could never imagine.

05:53
This is why my colleagues and I decided to take a deeper look at this. We set out to read the terms of popular apps on an average phone. And to show the world how unrealistic it is to expect consumers to actually read the terms, we printed them, more than 900 pages, and sat down in our office and read them out loud ourselves, streaming the experiment live on our websites. As you can see, it took quite a long time. It took us 31 hours, 49 minutes and 11 seconds to read the terms on an average phone. That is longer than a movie marathon of the "Harry Potter" movies and the "Godfather" movies combined.

(Laughter)

06:41
And reading is one thing. Understanding is another story. That would have taken us much, much longer. And this is a real problem, because companies have argued for 20 to 30 years against regulating the internet better, on the grounds that users have consented to the terms and conditions. As we've shown with this experiment, achieving informed consent is close to impossible. Do you think it's fair to put the burden of responsibility on the consumer? I don't. I think we should demand less take-it-or-leave-it and more understandable terms before we agree to them.

(Applause)

Thank you.

07:28
Now, I would like to tell you a story about love. Some of the world's most popular apps are dating apps, an industry now worth more than, or close to, three billion dollars a year. And of course, we're OK sharing our intimate details with our other half. But who else is snooping, saving and sharing our information while we are baring our souls? My team and I decided to investigate this. And in order to understand the issue from all angles and to truly do a thorough job, I realized I had to download one of the world's most popular dating apps myself.

08:14
So I went home to my wife ... (Laughter) who I had just married. "Is it OK if I establish a profile on a very popular dating app for purely scientific purposes?" (Laughter) This is what we found. Hidden behind the main menu was a preticked box that gave the dating company access to all my personal pictures on Facebook, in my case more than 2,000 of them, and some were quite personal. And to make matters worse, when we read the terms and conditions, we discovered the following, and I'm going to need to take out my reading glasses for this one. And I'm going to read it for you, because this is complicated. All right.

09:01
"By posting content" --
160
541440
1536
09:03
and content refers to your pictures, chat
161
543000
1976
09:05
and other interactions in the dating service --
162
545000
2216
09:07
"as a part of the service,
163
547240
1256
09:08
you automatically grant to the company,
164
548520
1976
09:10
its affiliates, licensees and successors
165
550520
2176
09:12
an irrevocable" -- which means you can't change your mind --
166
552720
3616
09:16
"perpetual" -- which means forever --
167
556360
2776
09:19
"nonexclusive, transferrable, sublicensable, fully paid-up,
168
559160
2896
09:22
worldwide right and license to use, copy, store, perform,
169
562080
2696
09:24
display, reproduce, record,
170
564800
1336
09:26
play, adapt, modify and distribute the content,
171
566160
2216
09:28
prepare derivative works of the content,
172
568400
1936
09:30
or incorporate the content into other works
173
570360
2016
09:32
and grant and authorize sublicenses of the foregoing in any media
174
572400
3056
09:35
now known or hereafter created."
175
575480
1560
09:40
That basically means that all your dating history and everything related to it can be used for any purpose for all time. Just imagine your children seeing your sassy dating photos in a birth control ad 20 years from now. But seriously, though -- (Laughter) what might these commercial practices mean to you?

10:08
For example, financial loss: based on your web browsing history, algorithms might decide whether you will get a mortgage or not. Subconscious manipulation: companies can analyze your emotions based on your photos and chats, targeting you with ads when you are at your most vulnerable. Discrimination: a fitness app can sell your data to a health insurance company, preventing you from getting coverage in the future. All of this is happening in the world today.

10:37
But of course, not all uses of data are malign. Some are just flawed or need more work, and some are truly great. And there is some good news as well. The dating companies changed their policies globally after we filed a legal complaint.

10:57
But organizations such as mine that fight for consumers' rights can't be everywhere. Nor can consumers fix this on their own, because if we know that something innocent we said will come back to haunt us, we will stop speaking. If we know that we are being watched and monitored, we will change our behavior. And if we can't control who has our data and how it is being used, we have lost control of our lives.

11:26
The stories I have told you today are not random examples. They are everywhere, and they are a sign that things need to change. And how can we achieve that change? Well, companies need to realize that by prioritizing privacy and security, they can build trust and loyalty with their users. Governments must create a safer internet by ensuring enforcement and up-to-date rules. And what about us, the citizens? We can use our voice to remind the world that technology can only truly benefit society if it respects basic rights.

12:05
Thank you so much.

(Applause)