The Rise of Personal Robots | Cynthia Breazeal | TED Talks

159,306 views ・ 2011-02-08




Translator: Davorin Jelačić · Reviewer: Tilen Pigac - EFZG
00:15
Ever since I was a little girl seeing "Star Wars" for the first time, I've been fascinated by this idea of personal robots. And as a little girl, I loved the idea of a robot that interacted with us much more like a helpful, trusted sidekick -- something that would delight us, enrich our lives and help us save a galaxy or two. I knew robots like that didn't really exist, but I knew I wanted to build them.

00:42
So 20 years pass -- I am now a graduate student at MIT studying artificial intelligence, the year is 1997, and NASA has just landed the first robot on Mars. But robots are still not in our home, ironically. And I remember thinking about all the reasons why that was the case. But one really struck me. Robotics had really been about interacting with things, not with people -- certainly not in a social way that would be natural for us and would really help people accept robots into our daily lives. For me, that was the white space; that's what robots could not do yet. And so that year, I started to build this robot, Kismet, the world's first social robot. Three years later -- a lot of programming, working with other graduate students in the lab -- Kismet was ready to start interacting with people.

01:30
(Video) Scientist: I want to show you something.
Kismet: (Nonsense)
Scientist: This is a watch that my girlfriend gave me.
Kismet: (Nonsense)
Scientist: Yeah, look, it's got a little blue light in it too. I almost lost it this week.

01:44
Cynthia Breazeal: So Kismet interacted with people like kind of a non-verbal child or pre-verbal child, which I assume was fitting because it was really the first of its kind. It didn't speak language, but it didn't matter. This little robot was somehow able to tap into something deeply social within us -- and with that, the promise of an entirely new way we could interact with robots.

02:04
So over the past several years I've been continuing to explore this interpersonal dimension of robots, now at the media lab with my own team of incredibly talented students. And one of my favorite robots is Leonardo. We developed Leonardo in collaboration with Stan Winston Studio. And so I want to show you a special moment for me of Leo. This is Matt Berlin interacting with Leo, introducing Leo to a new object. And because it's new, Leo doesn't really know what to make of it. But sort of like us, he can actually learn about it from watching Matt's reaction.

02:33
(Video) Matt Berlin: Hello, Leo. Leo, this is Cookie Monster. Can you find Cookie Monster? Leo, Cookie Monster is very bad. He's very bad, Leo. Cookie Monster is very, very bad. He's a scary monster. He wants to get your cookies.
(Laughter)

03:14
CB: All right, so Leo and Cookie might have gotten off to a little bit of a rough start, but they get along great now. So what I've learned through building these systems is that robots are actually a really intriguing social technology, where it's actually their ability to push our social buttons and to interact with us like a partner that is a core part of their functionality. And with that shift in thinking, we can now start to imagine new questions, new possibilities for robots that we might not have thought about otherwise.

03:47
But what do I mean when I say "push our social buttons"? Well, one of the things that we've learned is that, if we design these robots to communicate with us using the same body language, the same sort of non-verbal cues that people use -- like Nexi, our humanoid robot, is doing here -- what we find is that people respond to robots a lot like they respond to people. People use these cues to determine things like how persuasive someone is, how likable, how engaging, how trustworthy. It turns out it's the same for robots. It's turning out now that robots are actually becoming a really interesting new scientific tool to understand human behavior. To answer questions like, how is it that, from a brief encounter, we're able to make an estimate of how trustworthy another person is? Mimicry's believed to play a role, but how? Is it the mimicking of particular gestures that matters? It turns out it's really hard to learn this or understand this from watching people, because when we interact we do all of these cues automatically. We can't carefully control them because they're subconscious for us. But with the robot, you can. And so in this video here -- this is a video taken from David DeSteno's lab at Northeastern University. He's a psychologist we've been collaborating with. There's actually a scientist carefully controlling Nexi's cues to be able to study this question. And the bottom line is -- the reason why this works is because it turns out people just behave like people even when interacting with a robot.

05:03
So given that key insight, we can now start to imagine new kinds of applications for robots. For instance, if robots do respond to our non-verbal cues, maybe they would be a cool, new communication technology. So imagine this: What about a robot accessory for your cellphone? You call your friend, she puts her handset in a robot, and, bam! You're a MeBot -- you can make eye contact, you can talk with your friends, you can move around, you can gesture -- maybe the next best thing to really being there, or is it?

05:33
To explore this question, my student, Siggy Adalgeirsson, did a study where we brought human participants, people, into our lab to do a collaborative task with a remote collaborator. The task involved things like looking at a set of objects on the table, discussing them in terms of their importance and relevance to performing a certain task -- this ended up being a survival task -- and then rating them in terms of how valuable and important they thought they were. The remote collaborator was an experimenter from our group who used one of three different technologies to interact with the participants. The first was just the screen. This is just like video conferencing today. The next was to add mobility -- so, have the screen on a mobile base. This is like, if you're familiar with any of the telepresence robots today -- this is mirroring that situation. And then the fully expressive MeBot.

06:21
So after the interaction, we asked people to rate their quality of interaction with the technology, with the remote collaborator through this technology, in a number of different ways. We looked at psychological involvement -- how much empathy did you feel for the other person? We looked at overall engagement. We looked at their desire to cooperate. And this is what we see when they use just the screen. It turns out, when you add mobility -- the ability to roll around the table -- you get a little more of a boost. And you get even more of a boost when you add the full expression. So it seems like this physical, social embodiment actually really makes a difference.

06:54
Now let's try to put this into a little bit of context. Today we know that families are living further and further apart, and that definitely takes a toll on family relationships and family bonds over distance. For me, I have three young boys, and I want them to have a really good relationship with their grandparents. But my parents live thousands of miles away, so they just don't get to see each other that often. We try Skype, we try phone calls, but my boys are little -- they don't really want to talk; they want to play. So I love the idea of thinking about robots as a new kind of distance-play technology. I imagine a time not too far from now -- my mom can go to her computer, open up a browser and jack into a little robot. And as grandma-bot, she can now play, really play, with my sons, with her grandsons, in the real world with his real toys. I could imagine grandmothers being able to do social-plays with their granddaughters, with their friends, and to be able to share all kinds of other activities around the house, like sharing a bedtime story. And through this technology, being able to be an active participant in their grandchildren's lives in a way that's not possible today.

07:58
Let's think about some other domains, like maybe health. So in the United States today, over 65 percent of people are either overweight or obese, and now it's a big problem with our children as well. And we know that as you get older in life, if you're obese when you're younger, that can lead to chronic diseases that not only reduce your quality of life, but are a tremendous economic burden on our health care system. But if robots can be engaging, if we like to cooperate with robots, if robots are persuasive, maybe a robot can help you maintain a diet and exercise program, maybe they can help you manage your weight. Sort of like a digital Jiminy -- as in the well-known fairy tale -- a kind of friendly, supportive presence that's always there to be able to help you make the right decision in the right way at the right time to help you form healthy habits.

08:44
So we actually explored this idea in our lab. This is a robot, Autom. Cory Kidd developed this robot for his doctoral work. And it was designed to be a robot diet-and-exercise coach. It had a couple of simple non-verbal skills it could do. It could make eye contact with you. It could share information looking down at a screen. You'd use a screen interface to enter information, like how many calories you ate that day, how much exercise you got. And then it could help track that for you. And the robot spoke with a synthetic voice to engage you in a coaching dialogue modeled after trainers and patients and so forth. And it would build a working alliance with you through that dialogue. It could help you set goals and track your progress, and it would help motivate you.

09:24
So an interesting question is, does the social embodiment really matter? Does it matter that it's a robot? Is it really just the quality of advice and information that matters? To explore that question, we did a study in the Boston area where we put one of three interventions in people's homes for a period of several weeks. One case was the robot you saw there, Autom. Another was a computer that ran the same touch-screen interface, ran exactly the same dialogues. The quality of advice was identical. And the third was just a pen and paper log, because that's the standard intervention you typically get when you start a diet-and-exercise program.

09:58
So one of the things we really wanted to look at was not how much weight people lost, but really how long they interacted with the robot. Because the challenge is not losing weight, it's actually keeping it off. And the longer you could interact with one of these interventions, well that's indicative, potentially, of longer-term success. So the first thing I want to look at is how long, how long did people interact with these systems. It turns out that people interacted with the robot significantly more, even though the quality of the advice was identical to the computer. When it asked people to rate it in terms of the quality of the working alliance, people rated the robot higher and they trusted the robot more.
(Laughter)
And when you look at emotional engagement, it was completely different. People would name the robots. They would dress the robots.
(Laughter)
And even when we would come up to pick up the robots at the end of the study, they would come out to the car and say good-bye to the robots. They didn't do this with a computer.

10:54
The last thing I want to talk about today is the future of children's media. We know that kids spend a lot of time behind screens today, whether it's television or computer games or whatnot. My sons, they love the screen. They love the screen. But I want them to play; as a mom, I want them to play, like, real-world play. And so I have a new project in my group I wanted to present to you today called Playtime Computing that's really trying to think about how we can take what's so engaging about digital media and literally bring it off the screen into the real world of the child, where it can take on many of the properties of real-world play.

11:29
So here's the first exploration of this idea, where characters can be physical or virtual, and where the digital content can literally come off the screen into the world and back. I like to think of this as the Atari Pong of this blended-reality play. But we can push this idea further. What if --
(Game) Nathan: Here it comes. Yay!
CB: -- the character itself could come into your world? It turns out that kids love it when the character becomes real and enters into their world. And when it's in their world, they can relate to it and play with it in a way that's fundamentally different from how they play with it on the screen. Another important idea is this notion of persistence of character across realities. So changes that children make in the real world need to translate to the virtual world. So here, Nathan has changed the letter A to the number 2. You can imagine maybe these symbols give the characters special powers when it goes into the virtual world. So they are now sending the character back into that world. And now it's got number power.

12:32
And then finally, what I've been trying to do here is create a really immersive experience for kids, where they really feel like they are part of that story, a part of that experience. And I really want to spark their imaginations the way mine was sparked as a little girl watching "Star Wars." But I want to do more than that. I actually want them to create those experiences. I want them to be able to literally build their imagination into these experiences and make them their own. So we've been exploring a lot of ideas in telepresence and mixed reality to literally allow kids to project their ideas into this space where other kids can interact with them and build upon them. I really want to come up with new ways of children's media that foster creativity and learning and innovation. I think that's very, very important.

13:16
So this is a new project. We've invited a lot of kids into this space, and they think it's pretty cool. But I can tell you, the thing that they love the most is the robot. What they care about is the robot. Robots touch something deeply human within us. And so whether they're helping us to become creative and innovative, or whether they're helping us to feel more deeply connected despite distance, or whether they are our trusted sidekick who's helping us attain our personal goals in becoming our highest and best selves, for me, robots are all about people.

13:50
Thank you.
(Applause)