How we can build AI to help humans, not hurt us | Margaret Mitchell

81,177 views ・ 2018-03-12

TED


00:13
I work on helping computers communicate about the world around us. There are a lot of ways to do this, and I like to focus on helping computers to talk about what they see and understand.

00:25
Given a scene like this, a modern computer-vision algorithm can tell you that there's a woman and there's a dog. It can tell you that the woman is smiling. It might even be able to tell you that the dog is incredibly cute.
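
To make that concrete, here is a minimal sketch of this kind of scene description in Python, using the open-source Hugging Face transformers library with the BLIP captioning model. Both are illustrative choices of ours; the talk does not name a specific system, and the image file is hypothetical.

# A minimal captioning sketch; library and model are illustrative choices,
# not the system described in the talk.
from transformers import pipeline

captioner = pipeline("image-to-text",
                     model="Salesforce/blip-image-captioning-base")

# "beach.jpg" is a hypothetical local photo of a scene like the one described.
result = captioner("beach.jpg")
print(result[0]["generated_text"])  # e.g. "a woman playing with a dog on the beach"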

00:38
I work on this problem thinking about how humans understand and process the world. The thoughts, memories and stories that a scene like this might evoke for humans. All the interconnections of related situations. Maybe you've seen a dog like this one before, or you've spent time running on a beach like this one, and that further evokes thoughts and memories of a past vacation, past trips to the beach, times spent running around with other dogs.

01:11
One of my guiding principles is that by helping computers to understand what it's like to have these experiences, to understand what we share and believe and feel, then we're in a great position to start evolving computer technology in a way that's complementary with our own experiences.

01:35
So, digging more deeply into this, a few years ago I began working on helping computers to generate human-like stories from sequences of images.
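
A crude approximation of that idea is to caption each image in order and stitch the captions together. The sketch below does exactly that; it is far simpler than the research system described here, and the file names are hypothetical.

# Naive "story" from an image sequence: caption each frame, join the captions.
# An illustrative approximation only.
from transformers import pipeline

captioner = pipeline("image-to-text",
                     model="Salesforce/blip-image-captioning-base")

# A hypothetical sequence of trip photos, in the order they were taken.
frames = ["airport.jpg", "koala.jpg", "beach_sunset.jpg"]

captions = [captioner(frame)[0]["generated_text"] for frame in frames]
story = ". ".join(caption.capitalize() for caption in captions) + "."
print(story)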

01:47
So, one day, I was working with my computer to ask it what it thought about a trip to Australia. It took a look at the pictures, and it saw a koala. It didn't know what the koala was, but it said it thought it was an interesting-looking creature.

02:04
Then I shared with it a sequence of images about a house burning down. It took a look at the images and it said, "This is an amazing view! This is spectacular!" It sent chills down my spine. It saw a horrible, life-changing and life-destroying event and thought it was something positive.

02:27
I realized that it recognized the contrast, the reds, the yellows, and thought it was something worth remarking on positively. And part of why it was doing this was because most of the images I had given it were positive images. That's because people tend to share positive images when they talk about their experiences. When was the last time you saw a selfie at a funeral?
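
That kind of skew is easy to surface with a simple audit of the training data. A minimal sketch, assuming a hypothetical CSV of captions annotated with sentiment labels:

# Count the sentiment labels of a (hypothetical) caption dataset.
from collections import Counter
import csv

with open("training_captions.csv") as f:  # hypothetical columns: caption, sentiment
    counts = Counter(row["sentiment"] for row in csv.DictReader(f))

total = sum(counts.values())
for label, n in counts.most_common():
    print(f"{label}: {n} ({n / total:.1%})")

# If "positive" dominates, the model has little evidence that some scenes,
# like a burning house, should not be praised.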

02:55
I realized that, as I worked on improving AI task by task, dataset by dataset, I was creating massive gaps, holes and blind spots in what it could understand. And while doing so, I was encoding all kinds of biases. Biases that reflect a limited viewpoint, limited to a single dataset -- biases that can reflect human biases found in the data, such as prejudice and stereotyping.

03:29
I thought back to the evolution of the technology that brought me to where I was that day -- how the first color images were calibrated against a white woman's skin, meaning that color photography was biased against black faces. And that same bias, that same blind spot continued well into the '90s. And the same blind spot continues even today in how well we can recognize different people's faces in facial recognition technology.
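
One standard way to expose such a blind spot is a disaggregated evaluation: report recognition accuracy per demographic subgroup instead of a single overall number. A minimal sketch over hypothetical placeholder data:

# Per-subgroup accuracy instead of one aggregate score.
from collections import defaultdict

# Hypothetical records: (subgroup, predicted identity, true identity).
results = [
    ("group_a", "alice", "alice"),
    ("group_a", "bob", "bob"),
    ("group_b", "carol", "dana"),
    ("group_b", "erin", "erin"),
]

correct, total = defaultdict(int), defaultdict(int)
for subgroup, predicted, actual in results:
    total[subgroup] += 1
    correct[subgroup] += int(predicted == actual)

for subgroup in sorted(total):
    print(f"{subgroup}: {correct[subgroup] / total[subgroup]:.0%} accuracy")
# A large gap between subgroups is exactly the blind spot described above.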

04:01
I thought about the state of the art in research today, where we tend to limit our thinking to one dataset and one problem. And that in doing so, we were creating more blind spots and biases that the AI could further amplify.

04:17
I realized then that we had to think deeply about how the technology we work on today looks in five years, in 10 years. Humans evolve slowly, with time to correct for issues in the interaction of humans and their environment. In contrast, artificial intelligence is evolving at an incredibly fast rate. And that means that it really matters that we think about this carefully right now -- that we reflect on our own blind spots, our own biases, and think about how that's informing the technology we're creating and discuss what the technology of today will mean for tomorrow.

04:58
CEOs and scientists have weighed in on what they think the artificial intelligence technology of the future will be. Stephen Hawking warns that "Artificial intelligence could end mankind." Elon Musk warns that it's an existential risk and one of the greatest risks that we face as a civilization. Bill Gates has made the point, "I don't understand why people aren't more concerned."

05:23
But these views -- they're part of the story. The math, the models, the basic building blocks of artificial intelligence are something that we can all access and work with. We have open-source tools for machine learning and intelligence that we can contribute to. And beyond that, we can share our experience.
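
As one illustration of how accessible those building blocks are (our example; the talk names no specific tool), a few lines of scikit-learn, an open-source library anyone can use and contribute to, train and evaluate a working classifier:

# A complete train-and-evaluate loop using open-source tools.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"test accuracy: {model.score(X_test, y_test):.2f}")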

05:46
We can share our experiences with technology and how it concerns us and how it excites us. We can discuss what we love. We can communicate with foresight about the aspects of technology that could be more beneficial or could be more problematic over time.

06:05
If we all focus on opening up the discussion on AI with foresight towards the future, this will help create a general conversation and awareness about what AI is now, what it can become and all the things that we need to do in order to enable that outcome that best suits us.

06:29
We already see and know this in the technology that we use today. We use smart phones and digital assistants and Roombas. Are they evil? Maybe sometimes. Are they beneficial? Yes, they're that, too. And they're not all the same. And there you already see a light shining on what the future holds.

06:54
The future continues on from what we build and create right now. We set into motion that domino effect that carves out AI's evolutionary path. In our time right now, we shape the AI of tomorrow.

07:08
Technology that immerses us in augmented realities, bringing to life past worlds. Technology that helps people to share their experiences when they have difficulty communicating. Technology built on understanding the streaming visual worlds used as technology for self-driving cars. Technology built on understanding images and generating language, evolving into technology that helps people who are visually impaired be better able to access the visual world.

07:42
And we also see how technology can lead to problems. We have technology today that analyzes physical characteristics we're born with -- such as the color of our skin or the look of our face -- in order to determine whether or not we might be criminals or terrorists. We have technology that crunches through our data, even data relating to our gender or our race, in order to determine whether or not we might get a loan.
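
A first-pass audit for that kind of harm is to compare decision rates across groups, often called the demographic parity difference. A minimal sketch over hypothetical loan decisions:

# Compare approval rates across groups (demographic parity difference).
# Decisions and group labels are hypothetical placeholders.
approvals = [  # (group, approved)
    ("group_a", True), ("group_a", True), ("group_a", False),
    ("group_b", True), ("group_b", False), ("group_b", False),
]

def approval_rate(group):
    decisions = [approved for g, approved in approvals if g == group]
    return sum(decisions) / len(decisions)

rate_a, rate_b = approval_rate("group_a"), approval_rate("group_b")
print(f"group_a: {rate_a:.0%}  group_b: {rate_b:.0%}")
print(f"demographic parity difference: {abs(rate_a - rate_b):.0%}")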

08:09
All that we see now is a snapshot in the evolution of artificial intelligence. Because where we are right now is within a moment of that evolution. That means that what we do now will affect what happens down the line and in the future.

08:26
If we want AI to evolve in a way that helps humans, then we need to define the goals and strategies that enable that path now.

08:35
What I'd like to see is something that fits well with humans, with our culture and with the environment. Technology that aids and assists those of us with neurological conditions or other disabilities in order to make life equally challenging for everyone. Technology that works regardless of your demographics or the color of your skin.

09:00
And so today, what I focus on is the technology for tomorrow and for 10 years from now.

09:08
AI can turn out in many different ways. But in this case, it isn't a self-driving car without any destination. This is the car that we are driving. We choose when to speed up and when to slow down. We choose if we need to make a turn. We choose what the AI of the future will be.

09:31
There's a vast playing field of all the things that artificial intelligence can become. It will become many things. And it's up to us now to figure out what we need to put in place to make sure the outcomes of artificial intelligence are the ones that will be better for all of us.

09:51
Thank you.

09:52
(Applause)