Rana el Kaliouby: This app knows how you feel — from the look on your face

137,190 views ・ 2015-06-15

TED



Translator: Marija Kojić  Reviewer: Tijana Mihajlović
Our emotions influence every aspect of our lives, from our health and how we learn, to how we do business and make decisions, big ones and small. Our emotions also influence how we connect with one another. We've evolved to live in a world like this, but instead, we're living more and more of our lives like this -- this is the text message from my daughter last night -- in a world that's devoid of emotion. So I'm on a mission to change that. I want to bring emotions back into our digital experiences.

I started on this path 15 years ago. I was a computer scientist in Egypt, and I had just gotten accepted to a Ph.D. program at Cambridge University. So I did something quite unusual for a young newlywed Muslim Egyptian wife:

(Laughter)

With the support of my husband, who had to stay in Egypt, I packed my bags and I moved to England. At Cambridge, thousands of miles away from home, I realized I was spending more hours with my laptop than I did with any other human. Yet despite this intimacy, my laptop had absolutely no idea how I was feeling. It had no idea if I was happy, having a bad day, or stressed, confused, and so that got frustrating.

Even worse, as I communicated online with my family back home, I felt that all my emotions disappeared in cyberspace. I was homesick, I was lonely, and on some days I was actually crying, but all I had to communicate these emotions was this.

(Laughter)

Today's technology has lots of I.Q., but no E.Q.; lots of cognitive intelligence, but no emotional intelligence. So that got me thinking, what if our technology could sense our emotions? What if our devices could sense how we felt and reacted accordingly, just the way an emotionally intelligent friend would?

Those questions led me and my team to create technologies that can read and respond to our emotions, and our starting point was the human face. So our human face happens to be one of the most powerful channels that we all use to communicate social and emotional states, everything from enjoyment, surprise, empathy and curiosity. In emotion science, we call each facial muscle movement an action unit. So for example, action unit 12, it's not a Hollywood blockbuster, it is actually a lip corner pull, which is the main component of a smile. Try it everybody. Let's get some smiles going on. Another example is action unit 4. It's the brow furrow. It's when you draw your eyebrows together and you create all these textures and wrinkles. We don't like them, but it's a strong indicator of a negative emotion. So we have about 45 of these action units, and they combine to express hundreds of emotions.
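The idea of numbered muscle movements combining into expressions can be sketched as a small lookup. AU 12 (lip corner pull) and AU 4 (brow furrow) come from the talk; the other entries and the expression sets are illustrative examples in the style of facial action coding, not the speaker's actual catalog.

```python
# Minimal sketch: expressions as combinations of facial action units (AUs).
# AU 12 (lip corner pull) and AU 4 (brow furrow) are named in the talk;
# the remaining entries are illustrative, FACS-style examples.
ACTION_UNITS = {
    4: "brow furrow",
    6: "cheek raise",
    9: "nose wrinkle",
    12: "lip corner pull",
}

# A hypothetical expression is just a set of co-occurring action units.
EXPRESSIONS = {
    "smile": {6, 12},   # lip corner pull plus cheek raise
    "smirk": {12},      # lip corner pull alone
    "confusion": {4},   # brow furrow
}

def describe(expression: str) -> list[str]:
    """List the muscle movements behind a named expression."""
    return sorted(ACTION_UNITS[au] for au in EXPRESSIONS[expression])
```

With roughly 45 such units, even small sets combining in different ways yield the hundreds of distinct expressions the talk mentions.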
Teaching a computer to read these facial emotions is hard, because these action units, they can be fast, they're subtle, and they combine in many different ways. So take, for example, the smile and the smirk. They look somewhat similar, but they mean very different things.

(Laughter)

So the smile is positive, a smirk is often negative. Sometimes a smirk can make you become famous. But seriously, it's important for a computer to be able to tell the difference between the two expressions. So how do we do that? We give our algorithms tens of thousands of examples of people we know to be smiling, from different ethnicities, ages, genders, and we do the same for smirks. And then, using deep learning, the algorithm looks for all these textures and wrinkles and shape changes on our face, and basically learns that all smiles have common characteristics, all smirks have subtly different characteristics. And the next time it sees a new face, it essentially learns that this face has the same characteristics of a smile, and it says, "Aha, I recognize this. This is a smile expression."
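The pipeline described above — learn what labeled examples of each expression have in common, then match a new face against that — can be sketched in miniature. A nearest-centroid rule over made-up two-number features stands in here for the deep network and real face videos the actual system uses; the feature values are invented for illustration.

```python
# Toy sketch of the smile-vs-smirk pipeline: summarize labeled examples,
# then classify a new face by which summary it most resembles.
# Features are hypothetical (lip_corner_pull, cheek_raise) intensities.
from statistics import mean

TRAINING = {
    "smile": [(0.9, 0.8), (0.8, 0.7), (1.0, 0.9)],
    "smirk": [(0.7, 0.1), (0.8, 0.2), (0.6, 0.0)],
}

# "Learning": each label is summarized by the centroid of its examples,
# i.e. the characteristics its examples have in common.
CENTROIDS = {
    label: tuple(mean(x[i] for x in examples) for i in range(2))
    for label, examples in TRAINING.items()
}

def classify(face: tuple[float, float]) -> str:
    """Return the label whose learned centroid is closest to the new face."""
    def dist2(centroid):
        return sum((a - b) ** 2 for a, b in zip(face, centroid))
    return min(CENTROIDS, key=lambda label: dist2(CENTROIDS[label]))
```

A face with a strong lip-corner pull but no cheek raise, e.g. `classify((0.75, 0.05))`, lands nearer the smirk centroid, which is the distinction the talk says the computer must learn.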
So the best way to demonstrate how this technology works is to try a live demo, so I need a volunteer, preferably somebody with a face.

(Laughter)

Cloe's going to be our volunteer today. So over the past five years, we've moved from being a research project at MIT to a company, where my team has worked really hard to make this technology work, as we like to say, in the wild. And we've also shrunk it so that the core emotion engine works on any mobile device with a camera, like this iPad. So let's give this a try.

As you can see, the algorithm has essentially found Cloe's face, so it's this white bounding box, and it's tracking the main feature points on her face, so her eyebrows, her eyes, her mouth and her nose. The question is, can it recognize her expression? So we're going to test the machine. So first of all, give me your poker face. Yep, awesome.

(Laughter)

And then as she smiles, this is a genuine smile, it's great. So you can see the green bar go up as she smiles. Now that was a big smile. Can you try a subtle smile to see if the computer can recognize? It does recognize subtle smiles as well. We've worked really hard to make that happen. And then eyebrow raised, indicator of surprise. Brow furrow, which is an indicator of confusion. Frown. Yes, perfect. So these are all the different action units. There's many more of them. This is just a slimmed-down demo. But we call each reading an emotion data point, and then they can fire together to portray different emotions. So on the right side of the demo -- look like you're happy. So that's joy. Joy fires up. And then give me a disgust face. Try to remember what it was like when Zayn left One Direction.

(Laughter)

Yeah, wrinkle your nose. Awesome. And the valence is actually quite negative, so you must have been a big fan. So valence is how positive or negative an experience is, and engagement is how expressive she is as well.
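One way those two summary numbers could be derived from the per-expression readings is sketched below. The weights and readings are invented for the sketch, not the actual scoring the demo uses: valence sums the readings with positive or negative signs, while engagement ignores sign and measures expressiveness overall.

```python
# Illustrative only: scoring valence ("how positive or negative") and
# engagement ("how expressive") from emotion data points. Weights and
# readings are made up for this sketch.
VALENCE_WEIGHTS = {"joy": +1.0, "surprise": +0.2, "disgust": -1.0}

def valence(readings: dict[str, float]) -> float:
    """Signed sum: positive expressions push up, negative push down."""
    return sum(VALENCE_WEIGHTS.get(k, 0.0) * v for k, v in readings.items())

def engagement(readings: dict[str, float]) -> float:
    """Overall expressiveness, regardless of sign."""
    return sum(abs(v) for v in readings.values())

# Like the demo's disgust face: strongly negative valence, high engagement.
disgust_face = {"joy": 0.05, "disgust": 0.9}
print(valence(disgust_face))     # well below zero
print(engagement(disgust_face))  # close to 1.0
```

This split explains the demo's reading: the disgust face is quite negative in valence yet highly engaged, because engagement only asks how much is being expressed, not whether it is pleasant.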
So imagine if Cloe had access to this real-time emotion stream, and she could share it with anybody she wanted to. Thank you.

(Applause)

So, so far, we have amassed 12 billion of these emotion data points. It's the largest emotion database in the world. We've collected it from 2.9 million face videos, people who have agreed to share their emotions with us, and from 75 countries around the world. It's growing every day. It blows my mind away that we can now quantify something as personal as our emotions, and we can do it at this scale.

So what have we learned to date? Gender: Our data confirms something that you might suspect. Women are more expressive than men. Not only do they smile more, their smiles last longer, and we can now really quantify what it is that men and women respond to differently. Let's do culture: So in the United States, women are 40 percent more expressive than men, but curiously, we don't see any difference in the U.K. between men and women.

(Laughter)

Age: People who are 50 years and older are 25 percent more emotive than younger people. Women in their 20s smile a lot more than men the same age, perhaps a necessity for dating. But perhaps what surprised us the most about this data is that we happen to be expressive all the time, even when we are sitting in front of our devices alone, and it's not just when we're watching cat videos on Facebook. We are expressive when we're emailing, texting, shopping online, or even doing our taxes.

Where is this data used today? In understanding how we engage with media, so understanding virality and voting behavior; and also empowering or emotion-enabling technology, and I want to share some examples that are especially close to my heart.

Emotion-enabled wearable glasses can help individuals who are visually impaired read the faces of others, and it can help individuals on the autism spectrum interpret emotion, something that they really struggle with.

In education, imagine if your learning apps sense that you're confused and slow down, or that you're bored, so it's sped up, just like a great teacher would in a classroom. What if your wristwatch tracked your mood, or your car sensed that you're tired, or perhaps your fridge knows that you're stressed, so it auto-locks to prevent you from binge eating.

(Laughter)

I would like that, yeah. What if, when I was in Cambridge, I had access to my real-time emotion stream, and I could share that with my family back home in a very natural way, just like I would've if we were all in the same room together? I think five years down the line, all our devices are going to have an emotion chip, and we won't remember what it was like when we couldn't just frown at our device and our device would say, "Hmm, you didn't like that, did you?"

Our biggest challenge is that there are so many applications of this technology, my team and I realize that we can't build them all ourselves, so we've made this technology available so that other developers can get building and get creative. We recognize that there are potential risks and potential for abuse, but personally, having spent many years doing this, I believe that the benefits to humanity from having emotionally intelligent technology far outweigh the potential for misuse. And I invite you all to be part of the conversation. The more people who know about this technology, the more we can all have a voice in how it's being used.

So as more and more of our lives become digital, we are fighting a losing battle trying to curb our usage of devices in order to reclaim our emotions. So what I'm trying to do instead is to bring emotions into our technology and make our technologies more responsive. So I want those devices that have separated us to bring us back together. And by humanizing technology, we have this golden opportunity to reimagine how we connect with machines, and therefore, how we, as human beings, connect with one another. Thank you.

(Applause)