Clay Shirky: How cognitive surplus will change the world

120,602 views · 2010-06-29

TED



Translator: Vesna Munić · Reviewer: Tilen Pigac - EFZG
00:16
The story starts in Kenya in December of 2007, when there was a disputed presidential election, and in the immediate aftermath of that election, there was an outbreak of ethnic violence. And there was a lawyer in Nairobi, Ory Okolloh -- who some of you may know from her TEDTalk -- who began blogging about it on her site, Kenyan Pundit. And shortly after the election and the outbreak of violence, the government suddenly imposed a significant media blackout. And so weblogs went from being commentary as part of the media landscape to being a critical part of the media landscape in trying to understand where the violence was.
00:53
And Okolloh solicited from her commenters more information about what was going on. The comments began pouring in, and Okolloh would collate them. She would post them. And she quickly said, "It's too much. I could do this all day every day and I can't keep up. There is more information about what's going on in Kenya right now than any one person can manage. If only there was a way to automate this." And two programmers who read her blog held their hands up and said, "We could do that," and in 72 hours, they launched Ushahidi.
01:25
Ushahidi -- the name means "witness" or "testimony" in Swahili -- is a very simple way of taking reports from the field, whether it's from the web or, critically, via mobile phones and SMS, aggregating it and putting it on a map. That's all it is, but that's all that's needed because what it does is it takes the tacit information available to the whole population -- everybody knows where the violence is, but no one person knows what everyone knows -- and it takes that tacit information and it aggregates it, and it maps it and it makes it public. And that, that maneuver called "crisis mapping," was kicked off in Kenya in January of 2008.
02:07
And enough people looked at it and found it valuable enough that the programmers who created Ushahidi decided they were going to make it open source and turn it into a platform. It's since been deployed in Mexico to track electoral fraud. It's been deployed in Washington D.C. to track snow cleanup. And it's been used most famously in Haiti in the aftermath of the earthquake. And when you look at the map now posted on the Ushahidi front page, you can see that the number of deployments in Ushahidi has gone worldwide, all right? This went from a single idea and a single implementation in East Africa in the beginning of 2008 to a global deployment in less than three years.
02:49
Now what Okolloh did would not have been possible without digital technology. What Okolloh did would not have been possible without human generosity. And the interesting moment now, the number of environments where the social design challenge relies on both of those things being true. That is the resource that I'm talking about. I call it cognitive surplus. And it represents the ability of the world's population to volunteer and to contribute and collaborate on large, sometimes global, projects.
03:26
Cognitive surplus is made up of two things. The first, obviously, is the world's free time and talents. The world has over a trillion hours a year of free time to commit to shared projects. Now, that free time existed in the 20th century, but we didn't get Ushahidi in the 20th century. That's the second half of cognitive surplus. The media landscape in the 20th century was very good at helping people consume, and we got, as a result, very good at consuming. But now that we've been given media tools -- the Internet, mobile phones -- that let us do more than consume, what we're seeing is that people weren't couch potatoes because we liked to be. We were couch potatoes because that was the only opportunity given to us. We still like to consume, of course. But it turns out we also like to create, and we like to share.
04:17
And it's those two things together -- ancient human motivation and the modern tools to allow that motivation to be joined up in large-scale efforts -- that are the new design resource. And using cognitive surplus, we're starting to see truly incredible experiments in scientific, literary, artistic, political efforts. Designing. We're also getting, of course, a lot of LOLcats. LOLcats are cute pictures of cats made cuter with the addition of cute captions. And they are also part of the abundant media landscape we're getting now. This is one of the participatory -- one of the participatory models we see coming out of that, along with Ushahidi.
05:01
Now I want to stipulate, as the lawyers say, that LOLcats are the stupidest possible creative act. There are other candidates of course, but LOLcats will do as a general case. But here's the thing: The stupidest possible creative act is still a creative act. Someone who has done something like this, however mediocre and throwaway, has tried something, has put something forward in public. And once they've done it, they can do it again, and they could work on getting it better. There is a spectrum between mediocre work and good work, and as anybody who's worked as an artist or a creator knows, it's a spectrum you're constantly struggling to get on top of. The gap is between doing anything and doing nothing. And someone who makes a LOLcat has already crossed over that gap.
05:53
Now it's tempting to want to get the Ushahidis without the LOLcats, right, to get the serious stuff without the throwaway stuff. But media abundance never works that way. Freedom to experiment means freedom to experiment with anything. Even with the sacred printing press, we got erotic novels 150 years before we got scientific journals.
06:14
So before I talk about what is, I think, the critical difference between LOLcats and Ushahidi, I want to talk about their shared source. And that source is design for generosity. It is one of the curiosities of our historical era that even as cognitive surplus is becoming a resource we can design around, social sciences are also starting to explain how important our intrinsic motivations are to us, how much we do things because we like to do them rather than because our boss told us to do them, or because we're being paid to do them.
06:50
This is a graph from a paper by Uri Gneezy and Aldo Rustichini, who set out to test, at the beginning of this decade, what they called "deterrence theory." And deterrence theory is a very simple theory of human behavior: If you want somebody to do less of something, add a punishment and they'll do less of it. Simple, straightforward, commonsensical -- also, largely untested. And so they went and studied 10 daycare centers in Haifa, Israel. They studied those daycare centers at the time of highest tension, which is pick-up time. At pick-up time the teachers, who have been with your children all day, would like you to be there at the appointed hour to take your children back. Meanwhile, the parents -- perhaps a little busy at work, running late, running errands -- want a little slack to pick the kids up late.
07:34
So Gneezy and Rustichini said, "How many instances of late pick-ups are there at these 10 daycare centers?" Now they saw -- and this is what the graph is, these are the number of weeks and these are the number of late arrivals -- that there were between six and 10 instances of late pick-ups on average in these 10 daycare centers. So they divided the daycare centers into two groups. The white group there is the control group; they change nothing. But the group of daycare centers represented by the black line, they said, "We are changing this bargain as of right now. If you pick your kid up more than 10 minutes late, we're going to add a 10 shekel fine to your bill. Boom. No ifs, ands or buts."
08:13
And the minute they did that, the behavior in those daycare centers changed. Late pick-ups went up every week for the next four weeks until they topped out at triple the pre-fine average, and then they fluctuated at between double and triple the pre-fine average for the life of the fine. And you can see immediately what happened, right? The fine broke the culture of the daycare center. By adding a fine, what they did was communicate to the parents that their entire debt to the teachers had been discharged with the payment of 10 shekels, and that there was no residue of guilt or social concern that the parents owed the teachers. And so the parents, quite sensibly, said, "10 shekels to pick my kid up late? What could be bad?"
09:00
(Laughter)
09:04
The explanation of human behavior that we inherited in the 20th century was that we are all rational, self-maximizing actors, and in that explanation -- the daycare center had no contract -- should have been operating without any constraints. But that's not right. They were operating with social constraints rather than contractual ones. And critically, the social constraints created a culture that was more generous than the contractual constraints did.
09:33
So Gneezy and Rustichini run this experiment for a dozen weeks -- run the fine for a dozen weeks -- and then they say, "Okay, that's it. All done; fine." And then a really interesting thing happens: Nothing changes. The culture that got broken by the fine stayed broken when the fine was removed. Not only are economic motivations and intrinsic motivations incompatible, that incompatibility can persist over long periods.
10:04
So the trick in designing these kinds of situations is to understand where you're relying on the economic part of the bargain -- as with the parents paying the teachers -- and when you're relying on the social part of the bargain, when you're really designing for generosity.
10:20
This brings me back to the LOLcats and to Ushahidi. This is, I think, the range that matters. Both of these rely on cognitive surplus. Both of these design for the assumption that people like to create and we want to share. Here is the critical difference between these: LOLcats is communal value. It's value created by the participants for each other. Communal value on the networks we have is everywhere -- every time you see a large aggregate of shared, publicly available data, whether it's photos on Flickr or videos on Youtube or whatever. This is good. I like LOLcats as much as the next guy, maybe a little more even, but this is also a largely solved problem. I have a hard time envisioning a future in which someone is saying, "Where, oh where, can I find a picture of a cute cat?"
11:17
Ushahidi, by contrast, is civic value. It's value created by the participants but enjoyed by society as a whole. The goals set out by Ushahidi are not just to make life better for the participants, but to make life better for everyone in the society in which Ushahidi is operating. And that kind of civic value is not just a side effect of opening up to human motivation. It really is going to be a side effect of what we, collectively, make of these kinds of efforts.
11:51
There are a trillion hours a year of participatory value up for grabs. That will be true year-in and year-out. The number of people who are going to be able to participate in these kinds of projects is going to grow, and we can see that organizations designed around a culture of generosity can achieve incredible effects without an enormous amount of contractual overhead -- a very different model than our default model for large-scale group action in the 20th century.
12:24
What's going to make the difference here is what Dean Kamen said, the inventor and entrepreneur. Kamen said, "Free cultures get what they celebrate." We've got a choice before us. We've got this trillion hours a year. We can use it to crack each other up, and we're going to do that. That, we get for free. But we can also celebrate and support and reward the people trying to use cognitive surplus to create civic value. And to the degree we're going to do that, to the degree we're able to do that, we'll be able to change society. Thank you very much.