Rana el Kaliouby: This app knows how you feel — from the look on your face

138,524 views

2015-06-15 ・ TED





Translator: Anja Kolobarić · Reviewer: Vanja Kovac
00:12
Our emotions influence every aspect of our lives,
00:16
from our health and how we learn, to how we do business and make decisions,
00:20
big ones and small.
00:22
Our emotions also influence how we connect with one another.
00:27
We've evolved to live in a world like this,
00:31
but instead, we're living more and more of our lives like this --
00:35
this is the text message from my daughter last night --
00:38
in a world that's devoid of emotion.
00:41
So I'm on a mission to change that.
00:43
I want to bring emotions back into our digital experiences.
00:48
I started on this path 15 years ago.
00:51
I was a computer scientist in Egypt,
00:53
and I had just gotten accepted to a Ph.D. program at Cambridge University.
00:57
So I did something quite unusual
00:59
for a young newlywed Muslim Egyptian wife:
01:05
With the support of my husband, who had to stay in Egypt,
01:08
I packed my bags and I moved to England.
01:11
At Cambridge, thousands of miles away from home,
01:14
I realized I was spending more hours with my laptop
01:18
than I did with any other human.
01:20
Yet despite this intimacy, my laptop had absolutely no idea how I was feeling.
01:25
It had no idea if I was happy,
01:28
having a bad day, or stressed, confused,
01:31
and so that got frustrating.
01:35
Even worse, as I communicated online with my family back home,
01:41
I felt that all my emotions disappeared in cyberspace.
01:44
I was homesick, I was lonely, and on some days I was actually crying,
01:49
but all I had to communicate these emotions was this.
01:54
(Laughter)
01:56
Today's technology has lots of I.Q., but no E.Q.;
02:01
lots of cognitive intelligence, but no emotional intelligence.
02:04
So that got me thinking,
02:07
what if our technology could sense our emotions?
02:10
What if our devices could sense how we felt and reacted accordingly,
02:14
just the way an emotionally intelligent friend would?
02:18
Those questions led me and my team
02:22
to create technologies that can read and respond to our emotions,
02:26
and our starting point was the human face.
02:30
So our human face happens to be one of the most powerful channels
02:33
that we all use to communicate social and emotional states,
02:37
everything from enjoyment, surprise,
02:40
empathy and curiosity.
02:44
In emotion science, we call each facial muscle movement an action unit.
02:49
So for example, action unit 12,
02:52
it's not a Hollywood blockbuster,
02:54
it is actually a lip corner pull, which is the main component of a smile.
02:58
Try it everybody. Let's get some smiles going on.
03:01
Another example is action unit 4. It's the brow furrow.
03:03
It's when you draw your eyebrows together
03:06
and you create all these textures and wrinkles.
03:08
We don't like them, but it's a strong indicator of a negative emotion.
03:12
So we have about 45 of these action units,
03:14
and they combine to express hundreds of emotions.
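The action-unit idea above can be sketched in code. The talk names two real units (12, the lip corner pull; 4, the brow furrow); the combination rules below are simplified examples for illustration, not the actual system's logic.

```python
# Illustrative sketch: action units (AUs) are individual facial muscle
# movements, and combinations of active AUs map to expressions. The AU
# numbers follow the talk (12 = lip corner pull, 4 = brow furrow); the
# rules here are hypothetical simplifications.

ACTION_UNITS = {
    4: "brow furrow",
    12: "lip corner pull",
}

def classify_aus(active_aus):
    """Map a set of active action units to a coarse expression label."""
    active = set(active_aus)
    if 12 in active and 4 not in active:
        return "smile"       # lip corner pull is the main component of a smile
    if 4 in active:
        return "negative"    # a brow furrow is a strong negative indicator
    return "neutral"

print(classify_aus({12}))  # "smile"
print(classify_aus({4}))   # "negative"
```

With about 45 units combining in many ways, a real system learns these mappings from data rather than hand-writing rules, which is what the next part of the talk describes.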
03:18
Teaching a computer to read these facial emotions is hard,
03:22
because these action units, they can be fast, they're subtle,
03:25
and they combine in many different ways.
03:27
So take, for example, the smile and the smirk.
03:31
They look somewhat similar, but they mean very different things.
03:35
(Laughter)
03:36
So the smile is positive,
03:39
a smirk is often negative.
03:41
Sometimes a smirk can make you become famous.
03:45
But seriously, it's important for a computer to be able
03:47
to tell the difference between the two expressions.
03:50
So how do we do that?
03:52
We give our algorithms
03:54
tens of thousands of examples of people we know to be smiling,
03:58
from different ethnicities, ages, genders,
04:01
and we do the same for smirks.
04:04
And then, using deep learning,
04:05
the algorithm looks for all these textures and wrinkles
04:08
and shape changes on our face,
04:11
and basically learns that all smiles have common characteristics,
04:14
all smirks have subtly different characteristics.
04:17
And the next time it sees a new face,
04:20
it essentially learns that
04:22
this face has the same characteristics of a smile,
04:25
and it says, "Aha, I recognize this. This is a smile expression."
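The supervised-learning recipe just described can be illustrated with a toy: show the algorithm labeled examples of smiles and smirks, let it learn their differing characteristics, then classify a new face. The real system applies deep learning to textures, wrinkles, and shape changes in video; here a tiny perceptron over two made-up features (how far each lip corner is pulled) stands in for it.

```python
# Toy stand-in for the training loop described in the talk. Features and
# data are invented: a smile pulls both lip corners symmetrically, while
# a smirk pulls only one side.

def train(examples, labels, epochs=200):
    """Perceptron: learn weights separating smiles (1) from smirks (0)."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(examples, labels):
            pred = 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
            err = y - pred  # -1, 0, or +1
            w[0] += err * x[0]
            w[1] += err * x[1]
            b += err
    return w, b

smiles = [(0.9, 0.8), (0.7, 0.9), (0.8, 0.8)]  # both corners pulled
smirks = [(0.9, 0.1), (0.1, 0.8), (0.8, 0.0)]  # one corner only
w, b = train(smiles + smirks, [1, 1, 1, 0, 0, 0])

def classify_face(face):
    """Label a new (left, right) lip-corner reading."""
    return "smile" if w[0] * face[0] + w[1] * face[1] + b > 0 else "smirk"
```

After training, `classify_face((0.8, 0.8))` returns `"smile"` while `classify_face((0.9, 0.1))` returns `"smirk"`: the model has picked up that smiles are symmetric. Deep networks do the same thing at vastly larger scale, learning the features themselves instead of being handed them.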
04:30
So the best way to demonstrate how this technology works
04:33
is to try a live demo,
04:35
so I need a volunteer, preferably somebody with a face.
04:39
(Laughter)
04:41
Cloe's going to be our volunteer today.
04:45
So over the past five years, we've moved from being a research project at MIT
04:49
to a company,
04:50
where my team has worked really hard to make this technology work,
04:54
as we like to say, in the wild.
04:56
And we've also shrunk it so that the core emotion engine
04:59
works on any mobile device with a camera, like this iPad.
05:02
So let's give this a try.
05:06
As you can see, the algorithm has essentially found Cloe's face,
05:10
so it's this white bounding box,
05:12
and it's tracking the main feature points on her face,
05:14
so her eyebrows, her eyes, her mouth and her nose.
05:17
The question is, can it recognize her expression?
05:20
So we're going to test the machine.
05:22
So first of all, give me your poker face. Yep, awesome. (Laughter)
05:26
And then as she smiles, this is a genuine smile, it's great.
05:29
So you can see the green bar go up as she smiles.
05:31
Now that was a big smile.
05:32
Can you try a subtle smile to see if the computer can recognize?
05:36
It does recognize subtle smiles as well.
05:38
We've worked really hard to make that happen.
05:40
And then eyebrow raised, indicator of surprise.
05:43
Brow furrow, which is an indicator of confusion.
05:47
Frown. Yes, perfect.
05:51
So these are all the different action units. There's many more of them.
05:55
This is just a slimmed-down demo.
05:57
But we call each reading an emotion data point,
06:00
and then they can fire together to portray different emotions.
06:03
So on the right side of the demo -- look like you're happy.
06:07
So that's joy. Joy fires up.
06:09
And then give me a disgust face.
06:11
Try to remember what it was like when Zayn left One Direction.
06:15
(Laughter)
06:17
Yeah, wrinkle your nose. Awesome.
06:21
And the valence is actually quite negative, so you must have been a big fan.
06:25
So valence is how positive or negative an experience is,
06:27
and engagement is how expressive she is as well.
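The two summary metrics from the demo can be sketched as simple aggregates over emotion data points. The talk does not specify how the real engine computes them; as a plausible illustration, valence weighs positive expressions against negative ones, and engagement measures overall expressiveness regardless of sign.

```python
# Hypothetical illustration of valence and engagement. Expression names,
# signs, and formulas are assumptions, not the real system's definitions.
# Each reading is an expression score in [0, 1].

VALENCE_SIGN = {"smile": +1, "brow_furrow": -1, "nose_wrinkle": -1}

def valence(readings):
    """Signed average: > 0 for a positive experience, < 0 for negative."""
    return sum(VALENCE_SIGN[name] * score
               for name, score in readings.items()) / len(readings)

def engagement(readings):
    """Unsigned average: how expressive the face is overall."""
    return sum(readings.values()) / len(readings)

# A "disgust face": wrinkled nose and furrowed brow, no smile.
disgust_face = {"smile": 0.0, "brow_furrow": 0.6, "nose_wrinkle": 0.9}
print(valence(disgust_face))     # negative: an unpleasant experience
print(engagement(disgust_face))  # well above zero: clearly expressive
```

Splitting the signal this way means a face can be strongly engaged yet strongly negative at the same time, which is exactly what the demo shows for the disgust expression.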
06:30
So imagine if Cloe had access to this real-time emotion stream,
06:34
and she could share it with anybody she wanted to.
06:36
Thank you.
06:39
(Applause)
06:45
So, so far, we have amassed 12 billion of these emotion data points.
06:51
It's the largest emotion database in the world.
06:53
We've collected it from 2.9 million face videos,
06:56
people who have agreed to share their emotions with us,
06:59
and from 75 countries around the world.
07:02
It's growing every day.
07:04
It blows my mind away
07:06
that we can now quantify something as personal as our emotions,
07:09
and we can do it at this scale.
07:12
So what have we learned to date?
07:15
Gender.
07:17
Our data confirms something that you might suspect.
07:21
Women are more expressive than men.
07:22
Not only do they smile more, their smiles last longer,
07:25
and we can now really quantify what it is that men and women
07:28
respond to differently.
07:30
Let's do culture: So in the United States,
07:32
women are 40 percent more expressive than men,
07:36
but curiously, we don't see any difference in the U.K. between men and women.
07:39
(Laughter)
07:43
Age: People who are 50 years and older
07:47
are 25 percent more emotive than younger people.
07:51
Women in their 20s smile a lot more than men the same age,
07:55
perhaps a necessity for dating.
07:59
But perhaps what surprised us the most about this data
08:02
is that we happen to be expressive all the time,
08:05
even when we are sitting in front of our devices alone,
08:08
and it's not just when we're watching cat videos on Facebook.
08:12
We are expressive when we're emailing, texting, shopping online,
08:15
or even doing our taxes.
08:17
Where is this data used today?
08:19
In understanding how we engage with media,
08:22
so understanding virality and voting behavior;
08:25
and also empowering or emotion-enabling technology,
08:27
and I want to share some examples that are especially close to my heart.
08:33
Emotion-enabled wearable glasses can help individuals
08:36
who are visually impaired read the faces of others,
08:39
and it can help individuals on the autism spectrum interpret emotion,
08:43
something that they really struggle with.
08:47
In education, imagine if your learning apps
08:50
sense that you're confused and slow down,
08:53
or that you're bored, so it's sped up,
08:55
just like a great teacher would in a classroom.
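The adaptive tutoring idea above amounts to a simple control rule: ease off when confusion is detected, pick up the pace when boredom is. The emotion labels and thresholds below are hypothetical; a real app would read the scores from an emotion-sensing SDK.

```python
# Sketch of emotion-driven lesson pacing. Labels ("confusion", "boredom"),
# the 0.5 threshold, the 0.25 step, and the 0.5x-2x speed range are all
# invented for illustration.

def adjust_speed(speed, emotions, threshold=0.5):
    """Return a new playback speed given per-emotion scores in [0, 1]."""
    if emotions.get("confusion", 0.0) > threshold:
        return max(0.5, speed - 0.25)  # slow down for a confused learner
    if emotions.get("boredom", 0.0) > threshold:
        return min(2.0, speed + 0.25)  # speed up for a bored one
    return speed                       # otherwise leave the pace alone

print(adjust_speed(1.0, {"confusion": 0.8}))  # 0.75
print(adjust_speed(1.0, {"boredom": 0.9}))    # 1.25
```

Clamping the speed keeps one noisy reading from swinging the lesson to an extreme, much as a teacher adjusts gradually rather than all at once.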
08:59
What if your wristwatch tracked your mood,
09:01
or your car sensed that you're tired,
09:04
or perhaps your fridge knows that you're stressed,
09:06
so it auto-locks to prevent you from binge eating. (Laughter)
09:12
I would like that, yeah.
09:15
What if, when I was in Cambridge,
09:17
I had access to my real-time emotion stream,
09:19
and I could share that with my family back home in a very natural way,
09:23
just like I would've if we were all in the same room together?
09:27
I think five years down the line,
09:30
all our devices are going to have an emotion chip,
09:32
and we won't remember what it was like when we couldn't just frown at our device
09:36
and our device would say, "Hmm, you didn't like that, did you?"
09:41
Our biggest challenge is that there are so many applications of this technology,
09:44
my team and I realize that we can't build them all ourselves,
09:47
so we've made this technology available so that other developers
09:51
can get building and get creative.
09:53
We recognize that there are potential risks
09:57
and potential for abuse,
09:59
but personally, having spent many years doing this,
10:02
I believe that the benefits to humanity
10:05
from having emotionally intelligent technology
10:07
far outweigh the potential for misuse.
10:11
And I invite you all to be part of the conversation.
10:13
The more people who know about this technology,
10:16
the more we can all have a voice in how it's being used.
10:21
So as more and more of our lives become digital,
10:25
we are fighting a losing battle trying to curb our usage of devices
10:29
in order to reclaim our emotions.
10:32
So what I'm trying to do instead is to bring emotions into our technology
10:36
and make our technologies more responsive.
10:38
So I want those devices that have separated us
10:41
to bring us back together.
10:43
And by humanizing technology, we have this golden opportunity
10:48
to reimagine how we connect with machines,
10:51
and therefore, how we, as human beings,
10:56
connect with one another.
10:58
Thank you.
11:00
(Applause)