How you can help transform the internet into a place of trust | Claire Wardle

53,461 views

2019-11-15 ・ TED


00:13

No matter who you are or where you live, I'm guessing that you have at least one relative that likes to forward those emails. You know the ones I'm talking about -- the ones with dubious claims or conspiracy videos. And you've probably already muted them on Facebook for sharing social posts like this one. It's an image of a banana with a strange red cross running through the center. And the text around it is warning people not to eat fruits that look like this, suggesting they've been injected with blood contaminated with the HIV virus. And the social share message above it simply says, "Please forward to save lives."

00:49

Now, fact-checkers have been debunking this one for years, but it's one of those rumors that just won't die. A zombie rumor. And, of course, it's entirely false.
01:00

It might be tempting to laugh at an example like this, to say, "Well, who would believe this, anyway?" But the reason it's a zombie rumor is because it taps into people's deepest fears about their own safety and that of the people they love. And if you spend as much time as I have looking at misinformation, you know that this is just one example of many that taps into people's deepest fears and vulnerabilities.
01:23

Every day, across the world, we see scores of new memes on Instagram encouraging parents not to vaccinate their children. We see new videos on YouTube explaining that climate change is a hoax. And across all platforms, we see endless posts designed to demonize others on the basis of their race, religion or sexuality.

01:44

Welcome to one of the central challenges of our time. How can we maintain an internet with freedom of expression at the core, while also ensuring that the content that's being disseminated doesn't cause irreparable harms to our democracies, our communities and to our physical and mental well-being?
02:01

Because we live in the information age, yet the central currency upon which we all depend -- information -- is no longer deemed entirely trustworthy and, at times, can appear downright dangerous. This is thanks in part to the runaway growth of social sharing platforms that allow us to scroll through, where lies and facts sit side by side, but with none of the traditional signals of trustworthiness.
02:24

And goodness -- our language around this is horribly muddled. People are still obsessed with the phrase "fake news," despite the fact that it's extraordinarily unhelpful and used to describe a number of things that are actually very different: lies, rumors, hoaxes, conspiracies, propaganda. And I really wish we could stop using a phrase that's been co-opted by politicians right around the world, from the left and the right, used as a weapon to attack a free and independent press.

02:52

(Applause)

Because we need our professional news media now more than ever.
03:00

And besides, most of this content doesn't even masquerade as news. It's memes, videos, social posts. And most of it is not fake; it's misleading. We tend to fixate on what's true or false. But the biggest concern is actually the weaponization of context. Because the most effective disinformation has always been that which has a kernel of truth to it.
03:23

Let's take this example from London, from March 2017, a tweet that circulated widely in the aftermath of a terrorist incident on Westminster Bridge. This is a genuine image, not fake. The woman who appears in the photograph was interviewed afterwards, and she explained that she was utterly traumatized. She was on the phone to a loved one, and she wasn't looking at the victim out of respect. But it still was circulated widely with this Islamophobic framing, with multiple hashtags, including: #BanIslam.

03:52

Now, if you worked at Twitter, what would you do? Would you take that down, or would you leave it up?
03:58

My gut reaction, my emotional reaction, is to take this down. I hate the framing of this image. But freedom of expression is a human right, and if we start taking down speech that makes us feel uncomfortable, we're in trouble. And this might look like a clear-cut case, but, actually, most speech isn't. These lines are incredibly difficult to draw. What's a well-meaning decision by one person is outright censorship to the next.
04:22

What we now know is that this account, Texas Lone Star, was part of a wider Russian disinformation campaign, one that has since been taken down. Would that change your view? It would mine, because now it's a case of a coordinated campaign to sow discord.

04:38

And for those of you who'd like to think that artificial intelligence will solve all of our problems, I think we can agree that we're a long way away from AI that's able to make sense of posts like this.
04:48

So I'd like to explain three interlocking issues that make this so complex and then think about some ways we can consider these challenges.
04:57

First, we just don't have a rational relationship to information, we have an emotional one. It's just not true that more facts will make everything OK, because the algorithms that determine what content we see, well, they're designed to reward our emotional responses. And when we're fearful, oversimplified narratives, conspiratorial explanations and language that demonizes others is far more effective. And besides, many of these companies, their business model is attached to attention, which means these algorithms will always be skewed towards emotion.
05:30

Second, most of the speech I'm talking about here is legal. It would be a different matter if I was talking about child sexual abuse imagery or content that incites violence. It can be perfectly legal to post an outright lie.

05:45

But people keep talking about taking down "problematic" or "harmful" content, but with no clear definition of what they mean by that, including Mark Zuckerberg, who recently called for global regulation to moderate speech. And my concern is that we're seeing governments right around the world rolling out hasty policy decisions that might actually trigger much more serious consequences when it comes to our speech.
06:08

And even if we could decide which speech to take up or take down, we've never had so much speech. Every second, millions of pieces of content are uploaded by people right around the world in different languages, drawing on thousands of different cultural contexts. We've simply never had effective mechanisms to moderate speech at this scale, whether powered by humans or by technology.
06:30

And third, these companies -- Google, Twitter, Facebook, WhatsApp -- they're part of a wider information ecosystem. We like to lay all the blame at their feet, but the truth is, the mass media and elected officials can also play an equal role in amplifying rumors and conspiracies when they want to. As can we, when we mindlessly forward divisive or misleading content without trying. We're adding to the pollution.
06:57

I know we're all looking for an easy fix. But there just isn't one. Any solution will have to be rolled out at a massive scale, internet scale, and yes, the platforms, they're used to operating at that level. But can and should we allow them to fix these problems?
07:13

They're certainly trying. But most of us would agree that, actually, we don't want global corporations to be the guardians of truth and fairness online. And I also think the platforms would agree with that. And at the moment, they're marking their own homework. They like to tell us that the interventions they're rolling out are working, but because they write their own transparency reports, there's no way for us to independently verify what's actually happening.

07:38

(Applause)
07:41

And let's also be clear that most of the changes we see only happen after journalists undertake an investigation and find evidence of bias or content that breaks their community guidelines. So yes, these companies have to play a really important role in this process, but they can't control it.
07:59

So what about governments? Many people believe that global regulation is our last hope in terms of cleaning up our information ecosystem. But what I see are lawmakers who are struggling to keep up to date with the rapid changes in technology. And worse, they're working in the dark, because they don't have access to data to understand what's happening on these platforms. And anyway, which governments would we trust to do this? We need a global response, not a national one.
08:27

So the missing link is us. It's those people who use these technologies every day. Can we design a new infrastructure to support quality information? Well, I believe we can, and I've got a few ideas about what we might be able to actually do.
08:43

So firstly, if we're serious about bringing the public into this, can we take some inspiration from Wikipedia? They've shown us what's possible. Yes, it's not perfect, but they've demonstrated that with the right structures, with a global outlook and lots and lots of transparency, you can build something that will earn the trust of most people. Because we have to find a way to tap into the collective wisdom and experience of all users. This is particularly the case for women, people of color and underrepresented groups. Because guess what? They are experts when it comes to hate and disinformation, because they have been the targets of these campaigns for so long. And over the years, they've been raising flags, and they haven't been listened to. This has got to change.
09:22

So could we build a Wikipedia for trust? Could we find a way that users can actually provide insights? They could offer insights around difficult content-moderation decisions. They could provide feedback when platforms decide they want to roll out new changes.
09:40

Second, people's experience with information is personalized. My Facebook news feed is very different to yours. Your YouTube recommendations are very different to mine. That makes it impossible for us to actually examine what information people are seeing. So could we imagine developing some kind of centralized open repository for anonymized data, with privacy and ethical concerns built in? Because imagine what we would learn if we built out a global network of concerned citizens who wanted to donate their social data to science.
10:13

Because we actually know very little about the long-term consequences of hate and disinformation on people's attitudes and behaviors. And what we do know, most of that has been carried out in the US, despite the fact that this is a global problem. We need to work on that, too.
10:28

And third, can we find a way to connect the dots? No one sector, let alone nonprofit, start-up or government, is going to solve this. But there are very smart people right around the world working on these challenges, from newsrooms, civil society, academia, activist groups. And you can see some of them here.
10:46

Some are building out indicators of content credibility. Others are fact-checking, so that false claims, videos and images can be down-ranked by the platforms. A nonprofit I helped to found, First Draft, is working with normally competitive newsrooms around the world to help them build out investigative, collaborative programs. And Danny Hillis, a software architect, is designing a new system called The Underlay, which will be a record of all public statements of fact connected to their sources, so that people and algorithms can better judge what is credible. And educators around the world are testing different techniques for finding ways to make people critical of the content they consume.
11:24

All of these efforts are wonderful, but they're working in silos, and many of them are woefully underfunded. There are also hundreds of very smart people working inside these companies, but again, these efforts can feel disjointed, because they're actually developing different solutions to the same problems. How can we find a way to bring people together in one physical location for days or weeks at a time, so they can actually tackle these problems together but from their different perspectives?
11:51

So can we do this? Can we build out a coordinated, ambitious response, one that matches the scale and the complexity of the problem? I really think we can. Together, let's rebuild our information commons.

12:04

Thank you.

(Applause)