How to be "Team Human" in the digital future | Douglas Rushkoff

114,556 views · 2019-01-14

TED


00:13
I got invited to an exclusive resort
00:17
to deliver a talk about the digital future
00:19
to what I assumed would be a couple of hundred tech executives.
00:23
And I was there in the green room, waiting to go on,
00:26
and instead of bringing me to the stage, they brought five men into the green room
00:31
who sat around this little table with me.
00:33
They were tech billionaires.
00:35
And they started peppering me with these really binary questions,
00:40
like: Bitcoin or Ethereum?
00:43
Virtual reality or augmented reality?
00:45
I don't know if they were taking bets or what.
00:48
And as they got more comfortable with me,
00:51
they edged towards their real question of concern.
00:54
Alaska or New Zealand?
00:57
That's right.
00:59
These tech billionaires were asking a media theorist for advice
01:02
on where to put their doomsday bunkers.
01:04
We spent the rest of the hour on the single question:
01:07
"How do I maintain control of my security staff
01:11
after the event?"
01:13
By "the event" they mean the thermonuclear war
01:16
or climate catastrophe or social unrest that ends the world as we know it,
01:21
and more importantly, makes their money obsolete.
01:26
And I couldn't help but think:
01:28
these are the wealthiest, most powerful men in the world,
01:33
yet they see themselves as utterly powerless to influence the future.
01:37
The best they can do is hang on for the inevitable catastrophe
01:42
and then use their technology and money to get away from the rest of us.
01:47
And these are the winners of the digital economy.
01:50
(Laughter)
01:53
The digital renaissance
01:56
was about the unbridled potential
02:00
of the collective human imagination.
02:03
It spanned everything from chaos math and quantum physics
02:08
to fantasy role-playing and the Gaia hypothesis, right?
02:12
We believed that human beings connected could create any future we could imagine.
02:20
And then came the dot-com boom.
02:24
And the digital future became stock futures.
02:28
And we used all that energy of the digital age
02:31
to pump steroids into the already dying NASDAQ stock exchange.
02:35
The tech magazines told us a tsunami was coming.
02:39
And only the investors who hired the best scenario-planners and futurists
02:43
would be able to survive the wave.
02:47
And so the future changed from this thing we create together in the present
02:53
to something we bet on
02:54
in some kind of a zero-sum, winner-takes-all competition.
03:00
And when things get that competitive about the future,
03:03
humans are no longer valued for our creativity.
03:06
No, now we're just valued for our data.
03:09
Because they can use the data to make predictions.
03:12
Creativity, if anything, creates noise.
03:14
That makes it harder to predict.
03:17
So we ended up with a digital landscape
03:19
that really repressed creativity, repressed novelty,
03:22
it repressed what makes us most human.
03:26
We ended up with social media.
03:28
Does social media really connect people in new, interesting ways?
03:31
No, social media is about using our data to predict our future behavior.
03:36
Or, when necessary, to influence our future behavior
03:39
so that we act more in accordance with our statistical profiles.
03:45
The digital economy -- does it like people?
03:47
No. If you have a business plan, what are you supposed to do?
03:50
Get rid of all the people.
03:51
Human beings, they want health care, they want money, they want meaning.
03:56
You can't scale with people.
03:59
(Laughter)
04:00
Even our digital apps --
04:02
they don't help us form any rapport or solidarity.
04:05
I mean, where's the button on the ride-hailing app
04:07
for the drivers to talk to one another about their working conditions
04:11
or to unionize?
04:13
Even our videoconferencing tools,
04:15
they don't allow us to establish real rapport.
04:18
However good the resolution of the video,
04:21
you still can't see if somebody's irises are opening to really take you in.
04:25
All of the things that we've done to establish rapport,
04:28
that we've developed over hundreds of thousands of years of evolution,
04:31
they don't work;
04:32
you can't see if someone's breath is syncing up with yours.
04:35
So the mirror neurons never fire, the oxytocin never goes through your body,
04:39
you never have that experience of bonding with the other human being.
04:43
And instead, you're left like,
04:44
"Well, they agreed with me, but did they really,
04:47
did they really get me?"
04:48
And we don't blame the technology for that lack of fidelity.
04:52
We blame the other person.
04:55
You know, even the technologies and the digital initiatives that we have
04:59
to promote humans
05:01
are intensely anti-human at the core.
05:05
Think about the blockchain.
05:08
The blockchain is here to help us have a great humanized economy? No.
05:12
The blockchain does not engender trust between users,
05:14
the blockchain simply substitutes for trust in a new,
05:18
even less transparent way.
05:21
Or the code movement.
05:23
I mean, education is great, we love education,
05:25
and it's a wonderful idea
05:27
that we want kids to be able to get jobs in the digital future,
05:30
so we'll teach them code now.
05:32
But since when is education about getting jobs?
05:35
Education wasn't about getting jobs.
05:37
Education was compensation for a job well done.
05:41
The idea of public education
05:43
was for coal miners, who would work in the coal mines all day,
05:46
then they'd come home and they should have the dignity
05:49
to be able to read a novel and understand it.
05:51
Or the intelligence to be able to participate in democracy.
05:55
When we make it an extension of the job, what are we really doing?
05:58
We're just letting corporations really
06:01
externalize the cost of training their workers.
06:05
And the worst of all really is the humane technology movement.
06:09
I mean, I love these guys, the former guys who used to take
06:12
the algorithms from Las Vegas slot machines
06:15
and put them in our social media feed so that we get addicted.
06:18
Now they've seen the error of their ways
06:20
and they want to make technology more humane.
06:22
But when I hear the expression "humane technology,"
06:25
I think about cage-free chickens or something.
06:28
We're going to be as humane as possible to them,
06:30
until we take them to the slaughter.
06:33
So now they're going to let these technologies be as humane as possible,
06:36
as long as they extract enough data and extract enough money from us
06:39
to please their shareholders.
06:42
Meanwhile, the shareholders, for their part, they're just thinking,
06:45
"I need to earn enough money now, so I can insulate myself
06:48
from the world I'm creating by earning money in this way."
06:51
(Laughter)
06:54
No matter how many VR goggles they slap on their faces
06:58
and whatever fantasy world they go into,
07:00
they can't externalize the slavery and pollution that was caused
07:04
through the manufacture of the very device.
07:07
It reminds me of Thomas Jefferson's dumbwaiter.
07:10
Now, we like to think that he made the dumbwaiter
07:12
in order to spare his slaves all that labor of carrying the food
07:16
up to the dining room for the people to eat.
07:19
That's not what it was for, it wasn't for the slaves,
07:21
it was for Thomas Jefferson and his dinner guests,
07:24
so they didn't have to see the slave bringing the food up.
07:27
The food just arrived magically,
07:28
like it was coming out of a "Star Trek" replicator.
07:32
It's part of an ethos that says
07:34
human beings are the problem and technology is the solution.
07:40
We can't think that way anymore.
07:42
We have to stop using technology to optimize human beings for the market
07:48
and start optimizing technology for the human future.
07:55
But that's a really hard argument to make these days,
07:57
because humans are not popular beings.
08:01
I talked about this in front of an environmentalist just the other day,
08:05
and she said, "Why are you defending humans?
08:07
Humans destroyed the planet. They deserve to go extinct."
08:10
(Laughter)
08:13
Even our popular media hates humans.
08:16
Watch television,
08:17
all the sci-fi shows are about how robots are better and nicer than people.
08:21
Even zombie shows -- what is every zombie show about?
08:24
Some person, looking at the horizon at some zombie going by,
08:27
and they zoom in on the person and you see the person's face,
08:30
and you know what they're thinking:
08:32
"What's really the difference between that zombie and me?
08:34
He walks, I walk.
08:36
He eats, I eat.
08:38
He kills, I kill."
08:42
But he's a zombie.
08:43
At least you're aware of it.
08:45
If we are actually having trouble distinguishing ourselves from zombies,
08:49
we have a pretty big problem going on.
08:51
(Laughter)
08:52
And don't even get me started on the transhumanists.
08:55
I was on a panel with a transhumanist, and he's going on about the singularity.
08:59
"Oh, the day is going to come really soon when computers are smarter than people.
09:03
And the only option for people at that point
09:05
is to pass the evolutionary torch to our successor
09:08
and fade into the background.
09:10
Maybe at best, upload your consciousness to a silicon chip.
09:13
And accept your extinction."
09:16
(Laughter)
09:18
And I said, "No, human beings are special.
09:21
We can embrace ambiguity, we understand paradox,
09:25
we're conscious, we're weird, we're quirky.
09:27
There should be a place for humans in the digital future."
09:31
And he said, "Oh, Rushkoff,
09:32
you're just saying that because you're a human."
09:34
(Laughter)
09:36
As if it's hubris.
09:39
OK, I'm on "Team Human."
09:43
That was the original insight of the digital age.
09:47
That being human is a team sport,
09:49
evolution's a collaborative act.
09:52
Even the trees in the forest,
09:53
they're not all in competition with each other,
09:55
they're connected with the vast network of roots and mushrooms
09:59
that let them communicate with one another and pass nutrients back and forth.
10:03
If human beings are the most evolved species,
10:05
it's because we have the most evolved ways of collaborating and communicating.
10:09
We have language.
10:11
We have technology.
10:14
It's funny, I used to be the guy who talked about the digital future
10:18
for people who hadn't yet experienced anything digital.
10:22
And now I feel like I'm the last guy
10:24
who remembers what life was like before digital technology.
10:28
It's not a matter of rejecting the digital or rejecting the technological.
10:32
It's a matter of retrieving the values that we're in danger of leaving behind
10:37
and then embedding them in the digital infrastructure for the future.
10:41
And that's not rocket science.
10:44
It's as simple as making a social network
10:46
that, instead of teaching us to see people as adversaries,
10:49
teaches us to see our adversaries as people.
10:54
It means creating an economy that doesn't favor a platform monopoly
10:58
that wants to extract all the value out of people and places,
11:01
but one that promotes the circulation of value through a community
11:05
and allows us to establish platform cooperatives
11:08
that distribute ownership as widely as possible.
11:12
It means building platforms
11:13
that don't repress our creativity and novelty in the name of prediction
11:18
but actually promote creativity and novelty,
11:21
so that we can come up with some of the solutions
11:23
to actually get ourselves out of the mess that we're in.
11:27
No, instead of trying to earn enough money to insulate ourselves
11:30
from the world we're creating,
11:32
why don't we spend that time and energy making the world a place
11:35
that we don't feel the need to escape from?
11:38
There is no escape, there is only one thing going on here.
11:42
Please, don't leave.
11:45
Join us.
11:47
We may not be perfect,
11:49
but whatever happens, at least you won't be alone.
11:52
Join "Team Human."
11:55
Find the others.
11:57
Together, let's make the future that we always wanted.
12:01
Oh, and those tech billionaires who wanted to know
12:04
how to maintain control of their security force after the apocalypse,
12:07
you know what I told them?
12:09
"Start treating those people with love and respect right now.
12:13
Maybe you won't have an apocalypse to worry about."
12:16
Thank you.
12:18
(Applause)