How technology can fight extremism and online harassment | Yasmin Green

75,440 views ・ 2018-06-27

TED



00:13
My relationship with the internet reminds me of the setup to a clichéd horror movie. You know, the blissfully happy family moves into their perfect new home, excited about their perfect future, and it's sunny outside and the birds are chirping ... And then it gets dark. And there are noises from the attic. And we realize that that perfect new house isn't so perfect.
00:40
When I started working at Google in 2006, Facebook was just a two-year-old, and Twitter hadn't yet been born. And I was in absolute awe of the internet and all of its promise to make us closer and smarter and more free. But as we were doing the inspiring work of building search engines and video-sharing sites and social networks, criminals, dictators and terrorists were figuring out how to use those same platforms against us. And we didn't have the foresight to stop them.
01:16
Over the last few years, geopolitical forces have come online to wreak havoc. And in response, Google supported a few colleagues and me to set up a new group called Jigsaw, with a mandate to make people safer from threats like violent extremism, censorship, persecution -- threats that feel very personal to me because I was born in Iran, and I left in the aftermath of a violent revolution. But I've come to realize that even if we had all of the resources of all of the technology companies in the world, we'd still fail if we overlooked one critical ingredient: the human experiences of the victims and perpetrators of those threats.
02:04
There are many challenges I could talk to you about today. I'm going to focus on just two. The first is terrorism. So in order to understand the radicalization process, we met with dozens of former members of violent extremist groups. One was a British schoolgirl, who had been taken off of a plane at London Heathrow as she was trying to make her way to Syria to join ISIS. And she was 13 years old.
02:37
So I sat down with her and her father, and I said, "Why?" And she said, "I was looking at pictures of what life is like in Syria, and I thought I was going to go and live in the Islamic Disney World." That's what she saw in ISIS. She thought she'd meet and marry a jihadi Brad Pitt and go shopping in the mall all day and live happily ever after.
03:02
ISIS understands what drives people, and they carefully craft a message for each audience. Just look at how many languages they translate their marketing material into. They make pamphlets, radio shows and videos in not just English and Arabic, but German, Russian, French, Turkish, Kurdish, Hebrew, Mandarin Chinese. I've even seen an ISIS-produced video in sign language. Just think about that for a second: ISIS took the time and made the effort to ensure their message is reaching the deaf and hard of hearing.
03:45
It's actually not tech-savviness that is the reason why ISIS wins hearts and minds. It's their insight into the prejudices, the vulnerabilities, the desires of the people they're trying to reach that does that. That's why it's not enough for the online platforms to focus on removing recruiting material. If we want to have a shot at building meaningful technology that's going to counter radicalization, we have to start with the human journey at its core.
04:13
So we went to Iraq to speak to young men who'd bought into ISIS's promise of heroism and righteousness, who'd taken up arms to fight for them and then who'd defected after they witnessed the brutality of ISIS's rule. And I'm sitting there in this makeshift prison in the north of Iraq with this 23-year-old who had actually trained as a suicide bomber before defecting. And he says, "I arrived in Syria full of hope, and immediately, I had two of my prized possessions confiscated: my passport and my mobile phone." The symbols of his physical and digital liberty were taken away from him on arrival.
04:57
And then this is the way he described that moment of loss to me. He said, "You know in 'Tom and Jerry,' when Jerry wants to escape, and then Tom locks the door and swallows the key and you see it bulging out of his throat as it travels down?" And of course, I really could see the image that he was describing, and I really did connect with the feeling that he was trying to convey, which was one of doom, when you know there's no way out.
05:26
And I was wondering: What, if anything, could have changed his mind the day that he left home? So I asked, "If you knew everything that you know now about the suffering and the corruption, the brutality -- that day you left home, would you still have gone?" And he said, "Yes." And I thought, "Holy crap, he said 'Yes.'" And then he said, "At that point, I was so brainwashed, I wasn't taking in any contradictory information. I couldn't have been swayed." "Well, what if you knew everything that you know now six months before the day that you left?" "At that point, I think it probably would have changed my mind."
06:10
Radicalization isn't this yes-or-no choice. It's a process, during which people have questions -- about ideology, religion, the living conditions. And they're coming online for answers, which is an opportunity to reach them. And there are videos online from people who have answers -- defectors, for example, telling the story of their journey into and out of violence; stories like the one from that man I met in the Iraqi prison. There are locals who've uploaded cell phone footage of what life is really like in the caliphate under ISIS's rule. There are clerics who are sharing peaceful interpretations of Islam. But you know what? These people don't generally have the marketing prowess of ISIS. They risk their lives to speak up and confront terrorist propaganda, and then they tragically don't reach the people who most need to hear from them.
07:03
And we wanted to see if technology could change that. So in 2016, we partnered with Moonshot CVE to pilot a new approach to countering radicalization called the "Redirect Method." It uses the power of online advertising to bridge the gap between those susceptible to ISIS's messaging and those credible voices that are debunking that messaging. And it works like this: someone looking for extremist material -- say they search for "How do I join ISIS?" -- will see an ad appear that invites them to watch a YouTube video of a cleric, of a defector -- someone who has an authentic answer. And that targeting is based not on a profile of who they are, but on determining something that's directly relevant to their query or question.
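To make the mechanics concrete, here is a minimal sketch of that query-based targeting idea, in Python. The indicator phrases, the ad inventory and the video URLs are hypothetical placeholders for illustration; the talk does not describe the actual tooling behind the Redirect Method.

```python
from typing import Optional

# Hypothetical indicator phrases that suggest someone is searching for
# extremist recruiting material (placeholders, not a real list).
RISK_PHRASES = {
    "how do i join isis",
    "travel to syria to fight",
}

# Hypothetical inventory of counter-narrative ads pointing to credible voices.
COUNTER_ADS = [
    {"title": "A defector tells his story", "url": "https://youtube.com/watch?v=DEFECTOR_VIDEO"},
    {"title": "A cleric on peaceful Islam", "url": "https://youtube.com/watch?v=CLERIC_VIDEO"},
]

def ad_for_query(query: str) -> Optional[dict]:
    """Match the query itself (not a profile of the person) against
    indicator phrases and, on a hit, return a counter-narrative ad."""
    normalized = query.lower().strip(" ?!.")
    if any(phrase in normalized for phrase in RISK_PHRASES):
        # Rotate deterministically through the inventory.
        return COUNTER_ADS[len(normalized) % len(COUNTER_ADS)]
    return None  # ordinary query: no redirect ad is shown

print(ad_for_query("How do I join ISIS?"))
```

The point the sketch tries to capture is the one made above: the trigger is the query someone typed, not a stored profile of who they are.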
07:54
During our eight-week pilot in English and Arabic, we reached over 300,000 people who had expressed an interest in or sympathy towards a jihadi group. These people were now watching videos that could prevent them from making devastating choices.
08:13
And because violent extremism isn't confined to any one language, religion or ideology, the Redirect Method is now being deployed globally to protect people being courted online by violent ideologues, whether they're Islamists, white supremacists or other violent extremists, with the goal of giving them the chance to hear from someone on the other side of that journey; to give them the chance to choose a different path.
08:40
It turns out that often the bad guys are good at exploiting the internet, not because they're some kind of technological geniuses, but because they understand what makes people tick. I want to give you a second example: online harassment. Online harassers also work to figure out what will resonate with another human being. Not to recruit them like ISIS does, but to cause them pain.
09:11
Imagine this: you're a woman, you're married, you have a kid. You post something on social media, and in a reply, you're told that you'll be raped, that your son will be watching, details of when and where. In fact, your home address is put online for everyone to see. That feels like a pretty real threat. Do you think you'd go home? Do you think you'd continue doing the thing that you were doing? Would you continue doing that thing that's irritating your attacker?
09:48
Online abuse has been this perverse art of figuring out what makes people angry, what makes people afraid, what makes people insecure, and then pushing those pressure points until they're silenced. When online harassment goes unchecked, free speech is stifled. And even the people hosting the conversation throw up their arms and call it quits, closing their comment sections and their forums altogether. That means we're actually losing spaces online to meet and exchange ideas. And where online spaces remain, we descend into echo chambers with people who think just like us. But that enables the spread of disinformation; that facilitates polarization.
10:34
What if technology instead could enable empathy at scale? This was the question that motivated our partnership with Google's Counter Abuse team, Wikipedia and newspapers like the New York Times. We wanted to see if we could build machine-learning models that could understand the emotional impact of language. Could we predict which comments were likely to make someone else leave the online conversation? And that's no mean feat. That's no trivial accomplishment for AI to be able to do something like that.
11:08
I mean, just consider these two examples of messages that could have been sent to me last week. "Break a leg at TED!" ... and "I'll break your legs at TED." (Laughter) You are human, that's why that's an obvious difference to you, even though the words are pretty much the same. But for AI, it takes some training to teach the models to recognize that difference. The beauty of building AI that can tell the difference is that AI can then scale to the size of the online toxicity phenomenon, and that was our goal in building our technology called Perspective.
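As a rough illustration, the two messages above could be scored with the Perspective API along the lines of the sketch below; the endpoint and request shape follow Jigsaw's public documentation, but the API key is a placeholder and no specific scores are quoted in the talk.

```python
import requests

API_KEY = "YOUR_API_KEY"  # placeholder; issued through the Perspective API signup
URL = ("https://commentanalyzer.googleapis.com/v1alpha1/"
       f"comments:analyze?key={API_KEY}")

def toxicity(text: str) -> float:
    """Return Perspective's TOXICITY summary score (0.0 to 1.0) for a comment."""
    payload = {
        "comment": {"text": text},
        "languages": ["en"],
        "requestedAttributes": {"TOXICITY": {}},
    }
    response = requests.post(URL, json=payload, timeout=10)
    response.raise_for_status()
    return response.json()["attributeScores"]["TOXICITY"]["summaryScore"]["value"]

for message in ("Break a leg at TED!", "I'll break your legs at TED."):
    print(f"{message!r}: toxicity {toxicity(message):.2f}")
```

Because the model returns a probability-like score rather than a yes/no verdict, each host can decide for itself where to draw the line for its own community.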
11:45
With the help of Perspective, the New York Times, for example, has increased spaces online for conversation. Before our collaboration, they only had comments enabled on just 10 percent of their articles. With the help of machine learning, they have that number up to 30 percent. So they've tripled it, and we're still just getting started.
12:04
But this is about way more than just making moderators more efficient. Right now I can see you, and I can gauge how what I'm saying is landing with you. You don't have that opportunity online. Imagine if machine learning could give commenters, as they're typing, real-time feedback about how their words might land, just like facial expressions do in a face-to-face conversation.
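A toy sketch of that as-you-type feedback loop might look like the snippet below; the threshold and the stand-in scorer are assumptions for illustration, not part of the talk or of any shipped product.

```python
from typing import Callable

NUDGE_THRESHOLD = 0.8  # hypothetical cut-off; a real product would tune this per community

def compose_feedback(draft: str, score_fn: Callable[[str], float]) -> str:
    """Return a real-time hint about how a draft comment might land."""
    if score_fn(draft) >= NUDGE_THRESHOLD:
        return "This reply may come across as abusive -- consider rephrasing."
    return "Looks fine."

# Stand-in scorers; in practice the score could come from a model such as Perspective.
print(compose_feedback("I'll break your legs at TED.", lambda text: 0.9))
print(compose_feedback("Break a leg at TED!", lambda text: 0.1))
```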
12:32
Machine learning isn't perfect, and it still makes plenty of mistakes. But if we can build technology that understands the emotional impact of language, we can build empathy. That means that we can have dialogue between people with different politics, different worldviews, different values. And we can reinvigorate the spaces online that most of us have given up on.
12:57
When people use technology to exploit and harm others, they're preying on our human fears and vulnerabilities. If we ever thought that we could build an internet insulated from the dark side of humanity, we were wrong. If we want today to build technology that can overcome the challenges that we face, we have to throw our entire selves into understanding the issues and into building solutions that are as human as the problems they aim to solve. Let's make that happen. Thank you.

(Applause)