How to be "Team Human" in the digital future | Douglas Rushkoff

114,556 views ・ 2019-01-14

TED



Translator: Milenka Okuka Reviewer: Ivana Krivokuća
00:13
I got invited to an exclusive resort
00:17
to deliver a talk about the digital future
00:19
to what I assumed would be a couple of hundred tech executives.
00:23
And I was there in the green room, waiting to go on,
00:26
and instead of bringing me to the stage, they brought five men into the green room
00:31
who sat around this little table with me.
00:33
They were tech billionaires.
00:35
And they started peppering me with these really binary questions,
00:40
like: Bitcoin or Ethereum?
00:43
Virtual reality or augmented reality?
00:45
I don't know if they were taking bets or what.
00:48
And as they got more comfortable with me,
00:51
they edged towards their real question of concern.
00:54
Alaska or New Zealand?
00:57
That's right.
00:59
These tech billionaires were asking a media theorist for advice
01:02
on where to put their doomsday bunkers.
01:04
We spent the rest of the hour on the single question:
01:07
"How do I maintain control of my security staff
01:11
after the event?"
01:13
By "the event" they mean the thermonuclear war
01:16
or climate catastrophe or social unrest that ends the world as we know it,
01:21
and more importantly, makes their money obsolete.
01:26
And I couldn't help but think:
01:28
these are the wealthiest, most powerful men in the world,
01:33
yet they see themselves as utterly powerless to influence the future.
01:37
The best they can do is hang on for the inevitable catastrophe
01:42
and then use their technology and money to get away from the rest of us.
01:47
And these are the winners of the digital economy.
01:50
(Laughter)
01:53
The digital renaissance
01:56
was about the unbridled potential
02:00
of the collective human imagination.
02:03
It spanned everything from chaos math and quantum physics
02:08
to fantasy role-playing and the Gaia hypothesis, right?
02:12
We believed that human beings connected could create any future we could imagine.
02:20
And then came the dot com boom.
02:24
And the digital future became stock futures.
02:28
And we used all that energy of the digital age
02:31
to pump steroids into the already dying NASDAQ stock exchange.
02:35
The tech magazines told us a tsunami was coming.
02:39
And only the investors who hired the best scenario-planners and futurists
02:43
would be able to survive the wave.
02:47
And so the future changed from this thing we create together in the present
02:53
to something we bet on
02:54
in some kind of a zero-sum winner-takes-all competition.
03:00
And when things get that competitive about the future,
03:03
humans are no longer valued for our creativity.
03:06
No, now we're just valued for our data.
03:09
Because they can use the data to make predictions.
03:12
Creativity, if anything, that creates noise.
03:14
That makes it harder to predict.
03:17
So we ended up with a digital landscape
03:19
that really repressed creativity, repressed novelty,
03:22
it repressed what makes us most human.
03:26
We ended up with social media.
03:28
Does social media really connect people in new, interesting ways?
03:31
No, social media is about using our data to predict our future behavior.
03:36
Or when necessary, to influence our future behavior
03:39
so that we act more in accordance with our statistical profiles.
03:45
The digital economy -- does it like people?
03:47
No, if you have a business plan, what are you supposed to do?
03:50
Get rid of all the people.
03:51
Human beings, they want health care, they want money, they want meaning.
03:56
You can't scale with people.
03:59
(Laughter)
04:00
Even our digital apps --
04:02
they don't help us form any rapport or solidarity.
04:05
I mean, where's the button on the ride hailing app
04:07
for the drivers to talk to one another about their working conditions
04:11
or to unionize?
04:13
Even our videoconferencing tools,
04:15
they don't allow us to establish real rapport.
04:18
However good the resolution of the video,
04:21
you still can't see if somebody's irises are opening to really take you in.
04:25
All of the things that we've done to establish rapport
04:28
that we've developed over hundreds of thousands of years of evolution,
04:31
they don't work,
04:32
you can't see if someone's breath is syncing up with yours.
04:35
So the mirror neurons never fire, the oxytocin never goes through your body,
04:39
you never have that experience of bonding with the other human being.
04:43
And instead, you're left like,
04:44
"Well, they agreed with me, but did they really,
04:47
did they really get me?"
04:48
And we don't blame the technology for that lack of fidelity.
04:52
We blame the other person.
04:55
You know, even the technologies and the digital initiatives that we have
04:59
to promote humans,
05:01
are intensely anti-human at the core.
05:05
Think about the blockchain.
05:08
The blockchain is here to help us have a great humanized economy? No.
05:12
The blockchain does not engender trust between users,
05:14
the blockchain simply substitutes for trust in a new,
05:18
even less transparent way.
05:21
Or the code movement.
05:23
I mean, education is great, we love education,
05:25
and it's a wonderful idea
05:27
that we want kids to be able to get jobs in the digital future,
05:30
so we'll teach them code now.
05:32
But since when is education about getting jobs?
05:35
Education wasn't about getting jobs.
05:37
Education was compensation for a job well done.
05:41
The idea of public education
05:43
was for coal miners, who would work in the coal mines all day,
05:46
then they'd come home and they should have the dignity
05:49
to be able to read a novel and understand it.
05:51
Or the intelligence to be able to participate in democracy.
05:55
When we make it an extension of the job, what are we really doing?
05:58
We're just letting corporations really
06:01
externalize the cost of training their workers.
06:05
And the worst of all really is the humane technology movement.
06:09
I mean, I love these guys, the former guys who used to take
06:12
the algorithms from Las Vegas slot machines
06:15
and put them in our social media feed so that we get addicted.
06:18
Now they've seen the error of their ways
06:20
and they want to make technology more humane.
06:22
But when I hear the expression "humane technology,"
06:25
I think about cage-free chickens or something.
06:28
We're going to be as humane as possible to them,
06:30
until we take them to the slaughter.
06:33
So now they're going to let these technologies be as humane as possible,
06:36
as long as they extract enough data and extract enough money from us
06:39
to please their shareholders.
06:42
Meanwhile, the shareholders, for their part, they're just thinking,
06:45
"I need to earn enough money now, so I can insulate myself
06:48
from the world I'm creating by earning money in this way."
06:51
(Laughter)
06:54
No matter how many VR goggles they slap on their faces
06:58
and whatever fantasy world they go into,
07:00
they can't externalize the slavery and pollution that was caused
07:04
through the manufacture of the very device.
07:07
It reminds me of Thomas Jefferson's dumbwaiter.
07:10
Now, we like to think that he made the dumbwaiter
07:12
in order to spare his slaves all that labor of carrying the food
07:16
up to the dining room for the people to eat.
07:19
That's not what it was for, it wasn't for the slaves,
07:21
it was for Thomas Jefferson and his dinner guests,
07:24
so they didn't have to see the slave bringing the food up.
07:27
The food just arrived magically,
07:28
like it was coming out of a "Star Trek" replicator.
07:32
It's part of an ethos that says,
07:34
human beings are the problem and technology is the solution.
07:40
We can't think that way anymore.
07:42
We have to stop using technology to optimize human beings for the market
07:48
and start optimizing technology for the human future.
07:55
But that's a really hard argument to make these days,
07:57
because humans are not popular beings.
08:01
I talked about this in front of an environmentalist just the other day,
08:05
and she said, "Why are you defending humans?
08:07
Humans destroyed the planet. They deserve to go extinct."
08:10
(Laughter)
08:13
Even our popular media hates humans.
08:16
Watch television,
08:17
all the sci-fi shows are about how robots are better and nicer than people.
08:21
Even zombie shows -- what is every zombie show about?
08:24
Some person, looking at the horizon at some zombie going by,
08:27
and they zoom in on the person and you see the person's face,
08:30
and you know what they're thinking:
08:32
"What's really the difference between that zombie and me?
08:34
He walks, I walk.
08:36
He eats, I eat.
08:38
He kills, I kill."
08:42
But he's a zombie.
08:43
At least you're aware of it.
08:45
If we are actually having trouble distinguishing ourselves from zombies,
08:49
we have a pretty big problem going on.
08:51
(Laughter)
08:52
And don't even get me started on the transhumanists.
08:55
I was on a panel with a transhumanist, and he's going on about the singularity.
08:59
"Oh, the day is going to come really soon when computers are smarter than people.
09:03
And the only option for people at that point
09:05
is to pass the evolutionary torch to our successor
09:08
and fade into the background.
09:10
Maybe at best, upload your consciousness to a silicon chip.
09:13
And accept your extinction."
09:16
(Laughter)
09:18
And I said, "No, human beings are special.
09:21
We can embrace ambiguity, we understand paradox,
09:25
we're conscious, we're weird, we're quirky.
09:27
There should be a place for humans in the digital future."
09:31
And he said, "Oh, Rushkoff,
09:32
you're just saying that because you're a human."
09:34
(Laughter)
09:36
As if it's hubris.
09:39
OK, I'm on "Team Human."
09:43
That was the original insight of the digital age.
09:47
That being human is a team sport,
09:49
evolution's a collaborative act.
09:52
Even the trees in the forest,
09:53
they're not all in competition with each other,
09:55
they're connected with the vast network of roots and mushrooms
09:59
that let them communicate with one another and pass nutrients back and forth.
10:03
If human beings are the most evolved species,
10:05
it's because we have the most evolved ways of collaborating and communicating.
10:09
We have language.
10:11
We have technology.
10:14
It's funny, I used to be the guy who talked about the digital future
10:18
for people who hadn't yet experienced anything digital.
10:22
And now I feel like I'm the last guy
10:24
who remembers what life was like before digital technology.
10:28
It's not a matter of rejecting the digital or rejecting the technological.
10:32
It's a matter of retrieving the values that we're in danger of leaving behind
10:37
and then embedding them in the digital infrastructure for the future.
10:41
And that's not rocket science.
10:44
It's as simple as making a social network
10:46
that instead of teaching us to see people as adversaries,
10:49
it teaches us to see our adversaries as people.
10:54
It means creating an economy that doesn't favor a platform monopoly
10:58
that wants to extract all the value out of people and places,
11:01
but one that promotes the circulation of value through a community
11:05
and allows us to establish platform cooperatives
11:08
that distribute ownership as wide as possible.
11:12
It means building platforms
11:13
that don't repress our creativity and novelty in the name of prediction
11:18
but actually promote creativity and novelty,
11:21
so that we can come up with some of the solutions
11:23
to actually get ourselves out of the mess that we're in.
11:27
No, instead of trying to earn enough money to insulate ourselves
11:30
from the world we're creating,
11:32
why don't we spend that time and energy making the world a place
11:35
that we don't feel the need to escape from.
11:38
There is no escape, there is only one thing going on here.
11:42
Please, don't leave.
11:45
Join us.
11:47
We may not be perfect,
11:49
but whatever happens, at least you won't be alone.
11:52
Join "Team Human."
11:55
Find the others.
11:57
Together, let's make the future that we always wanted.
12:01
Oh, and those tech billionaires who wanted to know
12:04
how to maintain control of their security force after the apocalypse,
12:07
you know what I told them?
12:09
"Start treating those people with love and respect right now.
12:13
Maybe you won't have an apocalypse to worry about."
12:16
Thank you.
12:18
(Applause)