What's it like to be a robot? | Leila Takayama

TED ・ 2018-02-16

00:12
You only get one chance to make a first impression, and that's true if you're a robot as well as if you're a person.

00:18
The first time that I met one of these robots was at a place called Willow Garage in 2008. When I went to visit there, my host walked me into the building and we met this little guy. He was rolling into the hallway, came up to me, sat there, stared blankly past me, did nothing for a while, rapidly spun his head around 180 degrees and then ran away. And that was not a great first impression.

00:42
The thing that I learned about robots that day is that they kind of do their own thing, and they're not totally aware of us. And I think as we're experimenting with these possible robot futures, we actually end up learning a lot more about ourselves, as opposed to just these machines.

00:56
And what I learned that day was that I had pretty high expectations for this little dude. He was not only supposed to be able to navigate the physical world, but also be able to navigate my social world -- he's in my space; it's a personal robot. Why didn't it understand me?

01:11
My host explained to me, "Well, the robot is trying to get from point A to point B, and you were an obstacle in his way, so he had to replan his path, figure out where to go, and then get there some other way," which was actually not a very efficient thing to do.

01:25
If that robot had figured out that I was a person, not a chair, and that I was willing to get out of its way if it was trying to get somewhere, then it actually would have been more efficient at getting its job done if it had bothered to notice that I was a human and that I have different affordances than things like chairs and walls do.

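Her efficiency point can be made concrete with a toy planner. The sketch below (Python, entirely invented for illustration; it is not Willow Garage's navigation stack) runs Dijkstra over a small grid twice: once treating the person as an impassable wall, and once giving "person" cells a small waiting cost on the assumption that they will step aside. The grids, costs, and function names are all hypothetical; the only thing that changes between the two runs is the cost model, and that is what makes the direct path win.

    # Minimal sketch: how modeling a person's affordances changes the plan.
    # Hypothetical grids and costs; not Willow Garage's actual planner.
    import heapq

    def path_cost(grid, start, goal):
        """Dijkstra over a grid; a cell's value is the cost of entering it,
        and None marks an impassable cell (wall, chair)."""
        frontier, best = [(0, start)], {start: 0}
        while frontier:
            cost, cell = heapq.heappop(frontier)
            if cell == goal:
                return cost
            r, c = cell
            for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
                if 0 <= nr < len(grid) and 0 <= nc < len(grid[0]):
                    step = grid[nr][nc]
                    if step is None:
                        continue  # never passable
                    new_cost = cost + step
                    if new_cost < best.get((nr, nc), float("inf")):
                        best[(nr, nc)] = new_cost
                        heapq.heappush(frontier, (new_cost, (nr, nc)))
        return None  # goal unreachable

    WALL = None  # chairs and walls: the robot can never drive through them
    PERSON = 3   # a person: passable, at the cost of waiting for them to move

    # The robot starts at the top-left corner and wants the top-right corner;
    # something occupies the middle of the top row of the hallway.
    as_obstacle = [[1, WALL,   1],
                   [1, WALL,   1],
                   [1,    1,   1]]
    as_person   = [[1, PERSON, 1],
                   [1, WALL,   1],
                   [1,    1,   1]]

    print(path_cost(as_obstacle, (0, 0), (0, 2)))  # 6: forced all the way around
    print(path_cost(as_person,   (0, 0), (0, 2)))  # 4: wait briefly, go straight
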
01:41
You know, we tend to think of these robots as being from outer space and from the future and from science fiction, and while that could be true, I'd actually like to argue that robots are here today, and they live and work amongst us right now.

01:54
These are two robots that live in my home. They vacuum the floors and they cut the grass every single day, which is more than I would do if I actually had time to do these tasks, and they probably do it better than I would, too. This one actually takes care of my kitty. Every single time he uses the box, it cleans it, which is not something I'm willing to do, and it actually makes his life better as well as mine.

02:16
And while we call these robot products -- it's a "robot vacuum cleaner," it's a "robot lawnmower," it's a "robot litter box" -- I think there's actually a bunch of other robots hiding in plain sight that have just become so darn useful and so darn mundane that we call them things like "dishwasher," right? They get new names. They don't get called robot anymore because they actually serve a purpose in our lives.

02:38
Similarly, a thermostat, right? I know my robotics friends out there are probably cringing at me calling this a robot, but it has a goal. Its goal is to make my house 66 degrees Fahrenheit, and it senses the world. It knows it's a little bit cold, it makes a plan and then it acts on the physical world. It's robotics. Even if it might not look like Rosie the Robot, it's doing something that's really useful in my life so I don't have to take care of turning the temperature up and down myself.

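That goal-sense-plan-act loop is small enough to sketch. Here is a minimal toy version in Python; the sensor and furnace functions are invented stand-ins for real hardware, the 66-degree goal is the one she mentions, and the deadband is a common thermostat trick (an assumption here) that keeps a real furnace from rapidly switching on and off.

    # Minimal sense-plan-act sketch of a thermostat; fake hardware throughout.
    GOAL_F = 66.0      # the goal: hold the house at 66 degrees Fahrenheit
    DEADBAND_F = 1.0   # tolerance so the furnace doesn't chatter on and off

    def control_step(read_temp_f, set_furnace):
        temp = read_temp_f()               # sense the world
        if temp < GOAL_F - DEADBAND_F:     # plan: too cold?
            set_furnace(True)              # act on the physical world
        elif temp > GOAL_F + DEADBAND_F:   # plan: too warm?
            set_furnace(False)

    # Toy "house" standing in for a real sensor and a real furnace switch:
    state = {"temp_f": 63.0, "furnace_on": False}

    def read_temp_f():
        return state["temp_f"]

    def set_furnace(on):
        state["furnace_on"] = on

    for _ in range(5):
        control_step(read_temp_f, set_furnace)
        # crude physics: the furnace warms the house, the winter cools it
        state["temp_f"] += 1.5 if state["furnace_on"] else -0.5
        print(state)
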
03:04
And I think these systems live and work amongst us now, and not only are these systems living amongst us but you are probably a robot operator, too. When you drive your car, it feels like you are operating machinery. You are also going from point A to point B, but your car probably has power steering, it probably has automatic braking systems, it might have an automatic transmission and maybe even adaptive cruise control. And while it might not be a fully autonomous car, it has bits of autonomy, and they're so useful and they make us drive safer, and we just sort of feel like they're invisible-in-use, right?

03:39
So when you're driving your car, you should just feel like you're going from one place to another. It doesn't feel like it's this big thing that you have to deal with and operate and use these controls, because we spent so long learning how to drive that they've become extensions of ourselves. When you park that car in that tight little garage space, you know where your corners are. And when you drive a rental car that maybe you haven't driven before, it takes some time to get used to your new robot body.

04:05
And this is also true for people who operate other types of robots, so I'd like to share with you a few stories about that. Dealing with the problem of remote collaboration. So, at Willow Garage I had a coworker named Dallas, and Dallas looked like this. He worked from his home in Indiana in our company in California. He was a voice in a box on the table in most of our meetings, which was kind of OK except that, you know, if we had a really heated debate and we didn't like what he was saying, we might just hang up on him. (Laughter) Then we might have a meeting after that meeting and actually make the decisions in the hallway afterwards when he wasn't there anymore. So that wasn't so great for him.

04:41
And as a robotics company at Willow, we had some extra robot body parts laying around, so Dallas and his buddy Curt put together this thing, which looks kind of like Skype on a stick on wheels, which seems like a techy, silly toy, but really it's probably one of the most powerful tools that I've seen ever made for remote collaboration.

04:59
So now, if I didn't answer Dallas' email question, he could literally roll into my office, block my doorway and ask me the question again -- (Laughter) -- until I answered it. And I'm not going to turn him off, right? That's kind of rude.

05:12
Not only was it good for these one-on-one communications, but also for just showing up at the company all-hands meeting. Getting your butt in that chair and showing people that you're present and committed to your project is a big deal and can help remote collaboration a ton. We saw this over the period of months and then years, not only at our company but at others, too.

05:32
The best thing that can happen with these systems is that it starts to feel like you're just there. It's just you, it's just your body, and so people actually start to give these things personal space. So when you're having a stand-up meeting, people will stand around the space just as they would if you were there in person. That's great until there's breakdowns and it's not.

05:50
People, when they first see these robots, are like, "Wow, where's the components? There must be a camera over there," and they start poking your face. "You're talking too softly, I'm going to turn up your volume," which is like having a coworker walk up to you and say, "You're speaking too softly, I'm going to turn up your face." That's awkward and not OK, and so we end up having to build these new social norms around using these systems.

06:12
Similarly, as you start feeling like it's your body, you start noticing things like, "Oh, my robot is kind of short." Dallas would say things to me -- he was six-foot tall -- and we would take him via robot to cocktail parties and things like that, as you do, and the robot was about five-foot tall, which is close to my height. And he would tell me, "You know, people are not really looking at me. I feel like I'm just looking at this sea of shoulders, and it's just -- we need a taller robot."

06:39
And I told him, "Um, no. You get to walk in my shoes for today. You get to see what it's like to be on the shorter end of the spectrum." And he actually ended up building a lot of empathy for that experience, which was kind of great. So when he'd come visit in person, he no longer stood over me as he was talking to me, he would sit down and talk to me eye to eye, which was kind of a beautiful thing.

06:59
So we actually decided to look at this in the laboratory and see what other kinds of differences things like robot height would make. And so half of the people in our study used a shorter robot, half of the people in our study used a taller robot, and we actually found that the exact same person, who has the exact same body and says the exact same things as someone, is more persuasive and perceived as being more credible if they're in a taller robot form. It makes no rational sense, but that's why we study psychology.

07:25
And really, you know, the way that Cliff Nass would put this is that we're having to deal with these new technologies despite the fact that we have very old brains. Human psychology is not changing at the same speed that tech is, and so we're always playing catch-up, trying to make sense of this world where these autonomous things are running around. Usually, things that talk are people, not machines, right? And so we breathe a lot of meaning into things like just height of a machine, not a person, and attribute that to the person using the system.

07:55
You know, this, I think, is really important when you're thinking about robotics. It's not so much about reinventing humans, it's more about figuring out how we extend ourselves, right? And we end up using things in ways that are sort of surprising.

08:07
So these guys can't play pool because the robots don't have arms, but they can heckle the guys who are playing pool, and that can be an important thing for team bonding, which is kind of neat. People who get really good at operating these systems will even do things like make up new games, like robot soccer in the middle of the night, pushing the trash cans around. But not everyone's good.

08:28
A lot of people have trouble operating these systems. This is actually a guy who logged into the robot and his eyeball was turned 90 degrees to the left. He didn't know that, so he ended up just bashing around the office, running into people's desks, getting super embarrassed, laughing about it -- his volume was way too high. And this guy here in the image is telling me, "We need a robot mute button." And by that what he really meant was we don't want it to be so disruptive.

08:51
So as a robotics company, we added some obstacle avoidance to the system. It got a little laser range finder that could see the obstacles, and if I as a robot operator try to, say, run into a chair, it wouldn't let me, it would just plan a path around, which seems like a good idea.

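One way to picture that safety layer is as a guard sitting between the operator's command and the motors. The sketch below is a generic "guarded teleoperation" pattern, not the actual Willow Garage code, with thresholds invented for illustration; it only shows the slow-down-and-veto part, while the real system, as she says, would also plan a path around the obstacle.

    # Minimal guarded-teleoperation sketch: the operator commands a forward
    # speed, the laser range finder reports the distance to the nearest
    # obstacle ahead, and this layer scales or vetoes the command.
    # Thresholds are invented for illustration.
    STOP_AT_M = 0.4  # closer than this: refuse to drive forward
    SLOW_AT_M = 1.2  # closer than this: start ramping the speed down

    def guarded_speed(operator_mps, range_ahead_m):
        """Forward speed actually sent to the motors."""
        if operator_mps <= 0.0 or range_ahead_m >= SLOW_AT_M:
            return operator_mps  # backing up, or nothing ahead: pass through
        if range_ahead_m <= STOP_AT_M:
            return 0.0           # about to hit the chair: veto the command
        # linear ramp between the two thresholds
        scale = (range_ahead_m - STOP_AT_M) / (SLOW_AT_M - STOP_AT_M)
        return operator_mps * scale

    print(guarded_speed(0.5, 3.0))  # open hallway      -> 0.5
    print(guarded_speed(0.5, 0.8))  # chair coming up   -> 0.25
    print(guarded_speed(0.5, 0.3))  # chair right there -> 0.0
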
09:06
People did hit fewer obstacles using that system, obviously, but actually, for some of the people, it took them a lot longer to get through our obstacle course, and we wanted to know why. It turns out that there's this important human dimension -- a personality dimension called locus of control -- and people who have a strong internal locus of control -- they need to be the masters of their own destiny -- really don't like giving up control to an autonomous system, so much so that they will fight the autonomy: "If I want to hit that chair, I'm going to hit that chair." And so they would actually suffer from having that autonomous assistance, which is an important thing for us to know as we're building increasingly autonomous, say, cars, right? How are different people going to grapple with that loss of control?

09:50
It's going to be different depending on human dimensions. We can't treat humans as if we're just one monolithic thing. We vary by personality, by culture, we even vary by emotional state moment to moment, and to be able to design these systems, these human-robot interaction systems, we need to take into account the human dimensions, not just the technological ones.

10:11
Along with a sense of control also comes a sense of responsibility. And if you were a robot operator using one of these systems, this is what the interface would look like. It looks a little bit like a video game, which can be good because that's very familiar to people, but it can also be bad because it makes people feel like it's a video game. We had a bunch of kids over at Stanford play with the system and drive the robot around our office in Menlo Park, and the kids started saying things like, "10 points if you hit that guy over there. 20 points for that one." And they would chase them down the hallway. (Laughter)

10:43
I told them, "Um, those are real people. They're actually going to bleed and feel pain if you hit them." And they'd be like, "OK, got it." But five minutes later, they would be like, "20 points for that guy over there, he just looks like he needs to get hit."

10:55
It's a little bit like "Ender's Game," right? There is a real world on that other side, and I think it's our responsibility as people designing these interfaces to help people remember that there's real consequences to their actions and to feel a sense of responsibility when they're operating these increasingly autonomous things.

11:13
These are kind of a great example of experimenting with one possible robotic future, and I think it's pretty cool that we can extend ourselves and learn about the ways that we extend ourselves into these machines while at the same time being able to express our humanity and our personality. We also build empathy for others in terms of being shorter, taller, faster, slower, and maybe even armless, which is kind of neat.

11:38
We also build empathy for the robots themselves. This is one of my favorite robots. It's called the Tweenbot. And this guy has a little flag that says, "I'm trying to get to this intersection in Manhattan," and it's cute and rolls forward, that's it. It doesn't know how to build a map, it doesn't know how to see the world, it just asks for help. The nice thing about people is that it can actually depend upon the kindness of strangers. It did make it across the park to the other side of Manhattan -- which is pretty great -- just because people would pick it up and point it in the right direction. (Laughter) And that's great, right?

12:11
We're trying to build this human-robot world in which we can coexist and collaborate with one another, and we don't need to be fully autonomous and just do things on our own. We actually do things together. And to make that happen, we actually need help from people like the artists and the designers, the policy makers, the legal scholars, psychologists, sociologists, anthropologists -- we need more perspectives in the room if we're going to do the thing that Stu Card says we should do, which is invent the future that we actually want to live in.

12:40
And I think we can continue to experiment with these different robotic futures together, and in doing so, we will end up learning a lot more about ourselves.

12:50
Thank you.

12:51
(Applause)