How we can teach computers to make sense of our emotions | Raphael Arar

64,579 views

2018-04-24 ・ TED



00:00
Translator: Ivana Korom  Reviewer: Joanna Pietrulewicz
Croatian subtitles: Ivana Varga  Reviewer: Sanda L
00:13
I consider myself one part artist and one part designer. And I work at an artificial intelligence research lab. We're trying to create technology that you'll want to interact with in the far future. Not just six months from now, but try years and decades from now. And we're taking a moonshot that we'll want to be interacting with computers in deeply emotional ways.

00:40
So in order to do that, the technology has to be just as much human as it is artificial. It has to get you. You know, like that inside joke that'll have you and your best friend on the floor, cracking up. Or that look of disappointment that you can just smell from miles away.

01:00
I view art as the gateway to help us bridge this gap between human and machine: to figure out what it means to get each other so that we can train AI to get us. See, to me, art is a way to put tangible experiences to intangible ideas, feelings and emotions. And I think it's one of the most human things about us.

01:25
See, we're a complicated and complex bunch. We have what feels like an infinite range of emotions, and to top it off, we're all different. We have different family backgrounds, different experiences and different psychologies. And this is what makes life really interesting. But this is also what makes working on intelligent technology extremely difficult.

01:49
And right now, AI research, well, it's a bit lopsided on the tech side. And that makes a lot of sense. See, for every qualitative thing about us -- you know, those parts of us that are emotional, dynamic and subjective -- we have to convert it to a quantitative metric: something that can be represented with facts, figures and computer code. The issue is, there are many qualitative things that we just can't put our finger on.

02:20
So, think about hearing your favorite song for the first time. What were you doing? How did you feel? Did you get goosebumps? Or did you get fired up? Hard to describe, right? See, parts of us feel so simple, but under the surface, there's really a ton of complexity. And translating that complexity to machines is what makes them modern-day moonshots. And I'm not convinced that we can answer these deeper questions with just ones and zeros alone.

02:57
So, in the lab, I've been creating art as a way to help me design better experiences for bleeding-edge technology. And it's been serving as a catalyst to beef up the more human ways that computers can relate to us. Through art, we're tackling some of the hardest questions, like what does it really mean to feel? Or how do we engage and know how to be present with each other? And how does intuition affect the way that we interact?

03:26
So, take for example human emotion. Right now, computers can make sense of our most basic ones, like joy, sadness, anger, fear and disgust, by converting those characteristics to math. But what about the more complex emotions? You know, those emotions that we have a hard time describing to each other? Like nostalgia.

03:47
So, to explore this, I created a piece of art, an experience, that asked people to share a memory, and I teamed up with some data scientists to figure out how to take an emotion that's so highly subjective and convert it into something mathematically precise.

04:03
So, we created what we call a nostalgia score and it's the heart of this installation. To do that, the installation asks you to share a story, the computer then analyzes it for its simpler emotions, it checks for your tendency to use past-tense wording and also looks for words that we tend to associate with nostalgia, like "home," "childhood" and "the past." It then creates a nostalgia score to indicate how nostalgic your story is. And that score is the driving force behind these light-based sculptures that serve as physical embodiments of your contribution. And the higher the score, the rosier the hue. You know, like looking at the world through rose-colored glasses.
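The talk names the signals behind the score (simpler emotions, past-tense wording, nostalgia-associated words) but not the formula, so here is a minimal sketch with invented keyword lists, weights, and color mapping, not the installation's actual code:

```python
import re

# Hypothetical word list -- the talk gives "home," "childhood," "the past"
# as examples; the rest are guesses.
NOSTALGIA_WORDS = {"home", "childhood", "past", "remember", "summer"}

def nostalgia_score(story: str) -> float:
    """Score a story from 0.0 (not nostalgic) to 1.0 (very nostalgic)."""
    words = re.findall(r"[a-z']+", story.lower())
    if not words:
        return 0.0
    # Signal 1: words we tend to associate with nostalgia.
    keyword_hits = sum(w in NOSTALGIA_WORDS for w in words)
    # Signal 2: a crude past-tense proxy -- regular "-ed" verb endings.
    past_tense_hits = sum(w.endswith("ed") for w in words)
    # Invented weighting: keywords count double, normalized by length.
    score = (2.0 * keyword_hits + past_tense_hits) / len(words)
    return min(score, 1.0)

def hue(score: float) -> tuple[int, int, int]:
    """The higher the score, the rosier the RGB hue of the sculpture."""
    return (255, int(255 - 155 * score), int(255 - 100 * score))

story = "I remember the home I grew up in; we played outside every summer."
print(round(nostalgia_score(story), 2), hue(nostalgia_score(story)))
```

A real system would use a trained emotion classifier and a part-of-speech tagger for tense rather than an `-ed` heuristic; the point is only that each qualitative signal is reduced to a number before they are combined.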
04:44
So, when you see your score and the physical representation of it, sometimes you'd agree and sometimes you wouldn't. It's as if it really understood how that experience made you feel. But other times it gets tripped up and has you thinking it doesn't understand you at all. But the piece really serves to show that if we have a hard time explaining the emotions that we have to each other, how can we teach a computer to make sense of them?

05:12
So, even the more objective parts about being human are hard to describe. Like, conversation. Have you ever really tried to break down the steps? So think about sitting with your friend at a coffee shop and just having small talk. How do you know when to take a turn? How do you know when to shift topics? And how do you even know what topics to discuss? See, most of us don't really think about it, because it's almost second nature. And when we get to know someone, we learn more about what makes them tick, and then we learn what topics we can discuss. But when it comes to teaching AI systems how to interact with people, we have to teach them step by step what to do. And right now, it feels clunky.

05:53
If you've ever tried to talk with Alexa, Siri or Google Assistant, you can tell that it or they can still sound cold. And have you ever gotten annoyed when they didn't understand what you were saying and you had to rephrase what you wanted 20 times just to play a song? Alright, to the credit of the designers, realistic communication is really hard. And there's a whole branch of sociology, called conversation analysis, that tries to make blueprints for different types of conversation. Types like customer service or counseling, teaching and others.

06:28
I've been collaborating with a conversation analyst at the lab to try to help our AI systems hold more human-sounding conversations. This way, when you have an interaction with a chatbot on your phone or a voice-based system in the car, it sounds a little more human and less cold and disjointed.

06:46
So I created a piece of art that tries to highlight the robotic, clunky interaction to help us understand, as designers, why it doesn't sound human yet and, well, what we can do about it. The piece is called Bot to Bot and it puts one conversational system against another and then exposes it to the general public. And what ends up happening is that you get something that tries to mimic human conversation, but falls short. Sometimes it works and sometimes it gets into these, well, loops of misunderstanding. So even though the machine-to-machine conversation can make sense, grammatically and colloquially, it can still end up feeling cold and robotic. And despite checking all the boxes, the dialogue lacks soul and those one-off quirks that make each of us who we are. So while it might be grammatically correct and uses all the right hashtags and emojis, it can end up sounding mechanical and, well, a little creepy. And we call this the uncanny valley. You know, that creepiness factor of tech where it's close to human but just slightly off. And the piece will start being one way that we test for the humanness of a conversation and the parts that get lost in translation.
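The Bot to Bot setup — one conversational system answering another until the exchange degrades — can be sketched with two toy rule-based bots. The canned script and fallback below are invented for illustration; the installation's actual systems are not published here:

```python
# A tiny scripted chatbot: each utterance maps to a canned reply.
# Anything off-script triggers the fallback -- the seed of a
# "loop of misunderstanding."
RULES = {
    "hi": "hey! how are you?",
    "hey! how are you?": "i'm good. what's up?",
}
FALLBACK = "sorry, i didn't catch that."

def reply(utterance: str) -> str:
    return RULES.get(utterance, FALLBACK)

def converse(opening: str, turns: int) -> list[str]:
    """Bounce one conversational system against another for `turns` turns."""
    transcript, line = [opening], opening
    for _ in range(turns - 1):
        line = reply(line)       # each bot's output is the other's input
        transcript.append(line)
    return transcript

for line in converse("hi", 6):
    print(line)
```

Once either bot steps outside the script, the fallback answers the fallback forever: grammatical on every turn, yet cold, robotic, and stuck — exactly the failure mode the piece puts on display.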
07:54
So there are other things that get lost in translation, too, like human intuition. Right now, computers are gaining more autonomy. They can take care of things for us, like change the temperature of our houses based on our preferences and even help us drive on the freeway. But there are things that you and I do in person that are really difficult to translate to AI. So think about the last time that you saw an old classmate or coworker. Did you give them a hug or go in for a handshake? You probably didn't think twice because you've had so many built up experiences that had you do one or the other. And as an artist, I feel that access to one's intuition, your unconscious knowing, is what helps us create amazing things. Big ideas, from that abstract, nonlinear place in our consciousness that is the culmination of all of our experiences.

08:47
And if we want computers to relate to us and help amplify our creative abilities, I feel that we'll need to start thinking about how to make computers be intuitive. So I wanted to explore how something like human intuition could be directly translated to artificial intelligence. And I created a piece that explores computer-based intuition in a physical space.

09:08
The piece is called Wayfinding, and it's set up as a symbolic compass that has four kinetic sculptures. Each one represents a direction, north, east, south and west. And there are sensors set up on the top of each sculpture that capture how far away you are from them. And the data that gets collected ends up changing the way that sculptures move and the direction of the compass.

09:31
The thing is, the piece doesn't work like the automatic door sensor that just opens when you walk in front of it. See, your contribution is only a part of its collection of lived experiences. And all of those experiences affect the way that it moves. So when you walk in front of it, it starts to use all of the data that it's captured throughout its exhibition history -- or its intuition -- to mechanically respond to you based on what it's learned from others.
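The contrast with an automatic door can be made concrete: the response depends not just on the current sensor reading but on everything the piece has sensed before. The motion rule below is an assumption for illustration; Arar describes the behavior, not the math:

```python
from statistics import mean

class WayfindingSculpture:
    """One kinetic sculpture of the compass (a sketch -- the sensor API,
    units, and motion rule are assumptions, not the installation's code)."""

    def __init__(self, direction: str):
        self.direction = direction
        self.history: list[float] = []  # distances from every past visitor

    def sense(self, distance_m: float) -> float:
        """Record a visitor's distance; return a motion intensity in [0, 1].

        Unlike a door sensor, the current reading is weighed against the
        whole exhibition history -- the piece's "intuition" -- so the same
        visitor gets a different response as the piece accumulates data.
        """
        self.history.append(distance_m)
        typical = mean(self.history)
        # Closer than the historical average -> move more; farther -> less.
        return round(typical / (typical + distance_m), 2)

north = WayfindingSculpture("north")
for d in (3.0, 2.0, 1.0):        # three visitors, each closer than the last
    print(north.sense(d))        # intensity drifts as experience builds up
```

A first visitor always gets the neutral response 0.5 (the history is only their own reading); every later response is shaped by what the sculpture "learned from others."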
09:59
And what ends up happening is that as participants we start to learn the level of detail that we need in order to manage expectations from both humans and machines. We can almost see our intuition being played out on the computer, picturing all of that data being processed in our mind's eye. My hope is that this type of art will help us think differently about intuition and how to apply that to AI in the future.

10:24
So these are just a few examples of how I'm using art to feed into my work as a designer and researcher of artificial intelligence. And I see it as a crucial way to move innovation forward. Because right now, there are a lot of extremes when it comes to AI. Popular movies show it as this destructive force while commercials are showing it as a savior to solve some of the world's most complex problems. But regardless of where you stand, it's hard to deny that we're living in a world that's becoming more and more digital by the second. Our lives revolve around our devices, smart appliances and more. And I don't think this will let up any time soon.

11:04
So, I'm trying to embed more humanness from the start. And I have a hunch that bringing art into an AI research process is a way to do just that.

11:15
Thank you.

11:16
(Applause)