How tech companies deceive you into giving up your data and privacy | Finn Lützow-Holm Myrstad

133,693 views ・ 2018-11-21

TED


Translator: Aleksandar Bošnjak  Reviewer: Ivana Krivokuća
00:13
Do you remember when you were a child, you probably had a favorite toy that was a constant companion, like Christopher Robin had Winnie the Pooh, and your imagination fueled endless adventures? What could be more innocent than that?

00:28
Well, let me introduce you to my friend Cayla.

00:34
Cayla was voted toy of the year in countries around the world. She connects to the internet and uses speech recognition technology to answer your child's questions, respond just like a friend.

00:46
But the power doesn't lie with your child's imagination. It actually lies with the company harvesting masses of personal information while your family is innocently chatting away in the safety of their home, a dangerously false sense of security.

01:04
This case sounded alarm bells for me, as it is my job to protect consumers' rights in my country. And with billions of devices such as cars, energy meters and even vacuum cleaners expected to come online by 2020, we thought this was a case worth investigating further.

01:24
Because what was Cayla doing with all the interesting things she was learning? Did she have another friend she was loyal to and shared her information with? Yes, you guessed right. She did.

01:36
In order to play with Cayla, you need to download an app to access all her features. Parents must consent to the terms being changed without notice. The recordings of the child, her friends and family, can be used for targeted advertising. And all this information can be shared with unnamed third parties. Enough? Not quite.

02:02
Anyone with a smartphone can connect to Cayla within a certain distance. When we confronted the company that made and programmed Cayla, they issued a series of statements that one had to be an IT expert in order to breach the security.

02:22
Shall we fact-check that statement and live hack Cayla together? Here she is.

02:32
Cayla is equipped with a Bluetooth device which can transmit up to 60 feet, a bit less if there's a wall between.
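The range figures quoted here are plausible on simple physics grounds. As a rough illustration (not part of the talk), the free-space path loss formula shows how much a 2.4 GHz Bluetooth signal attenuates over 60 feet, and why a wall, which adds a few dB of loss, trims the usable range a bit:

```python
import math

C = 299_792_458.0  # speed of light, m/s


def free_space_path_loss_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB: 20 * log10(4 * pi * d * f / c)."""
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / C)


d = 60 * 0.3048  # 60 feet converted to meters (~18.3 m)
loss = free_space_path_loss_db(d, 2.4e9)  # Bluetooth operates around 2.4 GHz
print(f"{d:.1f} m -> ~{loss:.1f} dB free-space loss")
# An interior wall typically adds several more dB on top of this,
# which is why the range drops "a bit" with a wall in between.
```

This is a back-of-the-envelope sketch only; real Bluetooth range also depends on transmit power class, antenna quality and interference.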
02:40
That means I, or any stranger, can connect to the doll while being outside the room where Cayla and her friends are. And to illustrate this, I'm going to turn Cayla on now. Let's see, one, two, three. There. She's on. And I asked a colleague to stand outside with his smartphone, and he's connected, and to make this a bit creepier ...

03:05
(Laughter)

03:09
let's see what kids could hear Cayla say in the safety of their room.

03:15
Man: Hi. My name is Cayla. What is yours?
Finn Myrstad: Uh, Finn.
Man: Is your mom close by?
FM: Uh, no, she's in the store.
Man: Ah. Do you want to come out and play with me?
FM: That's a great idea.
Man: Ah, great.

03:32
FM: I'm going to turn Cayla off now.

03:34
(Laughter)

03:39
We needed no password or to circumvent any other type of security to do this. We published a report in 20 countries around the world, exposing this significant security flaw and many other problematic issues.

03:56
So what happened? Cayla was banned in Germany, taken off the shelves by Amazon and Wal-Mart, and she's now peacefully resting at the German Spy Museum in Berlin.

04:10
(Laughter)

04:13
However, Cayla was also for sale in stores around the world for more than a year after we published our report. What we uncovered is that there are few rules to protect us and the ones we have are not being properly enforced.

04:30
We need to get the security and privacy of these devices right before they enter the market, because what is the point of locking a house with a key if anyone can enter it through a connected device?

04:45
You may well think, "This will not happen to me. I will just stay away from these flawed devices." But that won't keep you safe, because simply by connecting to the internet, you are put in an impossible take-it-or-leave-it position.

05:02
Let me show you. Like most of you, I have dozens of apps on my phone, and used properly, they can make our lives easier, more convenient and maybe even healthier. But have we been lulled into a false sense of security?

05:18
It starts simply by ticking a box. Yes, we say, I've read the terms. But have you really read the terms? Are you sure they didn't look too long and your phone was running out of battery, and the last time you tried they were impossible to understand, and you needed to use the service now?

05:41
And now, the power imbalance is established, because we have agreed to our personal information being gathered and used on a scale we could never imagine.

05:53
This is why my colleagues and I decided to take a deeper look at this. We set out to read the terms of popular apps on an average phone. And to show the world how unrealistic it is to expect consumers to actually read the terms, we printed them, more than 900 pages, and sat down in our office and read them out loud ourselves, streaming the experiment live on our websites.

06:22
As you can see, it took quite a long time. It took us 31 hours, 49 minutes and 11 seconds to read the terms on an average phone.
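For scale, those figures imply a sustained read-aloud rate of under 30 pages an hour. A quick back-of-the-envelope check (assuming exactly 900 pages, though the talk says "more than 900"):

```python
PAGES = 900              # "more than 900 pages"; 900 assumed for the estimate
h, m, s = 31, 49, 11     # total read-aloud time from the experiment

total_hours = h + m / 60 + s / 3600
pages_per_hour = PAGES / total_hours

print(f"total: {total_hours:.2f} hours")          # ~31.82 hours
print(f"rate:  {pages_per_hour:.1f} pages/hour")  # ~28.3 pages/hour
```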
06:31
That is longer than a movie marathon of the "Harry Potter" movies and the "Godfather" movies combined.

06:38
(Laughter)

06:41
And reading is one thing. Understanding is another story. That would have taken us much, much longer. And this is a real problem, because companies have argued for 20 to 30 years against regulating the internet better, because users have consented to the terms and conditions.

07:02
As we've shown with this experiment, achieving informed consent is close to impossible. Do you think it's fair to put the burden of responsibility on the consumer? I don't. I think we should demand less take-it-or-leave-it and more understandable terms before we agree to them.

07:22
(Applause)

07:23
Thank you.

07:28
Now, I would like to tell you a story about love. Some of the world's most popular apps are dating apps, an industry now worth more than, or close to, three billion dollars a year.

07:43
And of course, we're OK sharing our intimate details with our other half. But who else is snooping, saving and sharing our information while we are baring our souls? My team and I decided to investigate this.

08:00
And in order to understand the issue from all angles and to truly do a thorough job, I realized I had to download one of the world's most popular dating apps myself.

08:14
So I went home to my wife ...

08:16
(Laughter)

08:18
who I had just married. "Is it OK if I establish a profile on a very popular dating app for purely scientific purposes?"

08:26
(Laughter)

08:28
This is what we found. Hidden behind the main menu was a preticked box that gave the dating company access to all my personal pictures on Facebook, in my case more than 2,000 of them, and some were quite personal.

08:46
And to make matters worse, when we read the terms and conditions, we discovered the following, and I'm going to need to take out my reading glasses for this one. And I'm going to read it for you, because this is complicated. All right.

09:01
"By posting content" -- and content refers to your pictures, chat and other interactions in the dating service -- "as a part of the service, you automatically grant to the company, its affiliates, licensees and successors an irrevocable" -- which means you can't change your mind -- "perpetual" -- which means forever -- "nonexclusive, transferrable, sublicensable, fully paid-up, worldwide right and license to use, copy, store, perform, display, reproduce, record, play, adapt, modify and distribute the content, prepare derivative works of the content, or incorporate the content into other works and grant and authorize sublicenses of the foregoing in any media now known or hereafter created."

09:40
That basically means that all your dating history and everything related to it can be used for any purpose for all time. Just imagine your children seeing your sassy dating photos in a birth control ad 20 years from now.

10:00
But seriously, though --

10:01
(Laughter)

10:04
what might these commercial practices mean to you? For example, financial loss: based on your web browsing history, algorithms might decide whether you will get a mortgage or not. Subconscious manipulation: companies can analyze your emotions based on your photos and chats, targeting you with ads when you are at your most vulnerable. Discrimination: a fitness app can sell your data to a health insurance company, preventing you from getting coverage in the future. All of this is happening in the world today.

10:37
But of course, not all uses of data are malign. Some are just flawed or need more work, and some are truly great. And there is some good news as well. The dating companies changed their policies globally after we filed a legal complaint.

10:57
But organizations such as mine that fight for consumers' rights can't be everywhere. Nor can consumers fix this on their own, because if we know that something innocent we said will come back to haunt us, we will stop speaking. If we know that we are being watched and monitored, we will change our behavior. And if we can't control who has our data and how it is being used, we have lost the control of our lives.

11:26
The stories I have told you today are not random examples. They are everywhere, and they are a sign that things need to change. And how can we achieve that change? Well, companies need to realize that by prioritizing privacy and security, they can build trust and loyalty to their users.

11:46
Governments must create a safer internet by ensuring enforcement and up-to-date rules. And us, the citizens? We can use our voice to remind the world that technology can only truly benefit society if it respects basic rights.

12:05
Thank you so much.

12:07
(Applause)