A bold idea to replace politicians | César Hidalgo

400,966 views ・ 2019-04-03

TED


00:13
Is it just me, or are there other people here that are a little bit disappointed with democracy? (Applause)

00:24
So let's look at a few numbers. If we look across the world, the median turnout in presidential elections over the last 30 years has been just 67 percent. Now, if we go to Europe and we look at people that participated in EU parliamentary elections, the median turnout in those elections is just 42 percent. Now let's go to New York, and let's see how many people voted in the last election for mayor. We will find that only 24 percent of people showed up to vote.

01:01
What that means is that, if "Friends" was still running, Joey and maybe Phoebe would have shown up to vote. (Laughter)

01:09
And you cannot blame them, because people are tired of politicians. And people are tired of other people using the data that they have generated to communicate with their friends and family, to target political propaganda at them.

01:22
But the thing about this is that it is not new. Nowadays, people use likes to target propaganda at you, where before they would use your zip code, your gender, or your age, because the idea of targeting people with propaganda for political purposes is as old as politics.

01:37
And the reason why that idea is there is because democracy has a basic vulnerability. This is the idea of a representative. In principle, democracy is the ability of people to exert power. But in practice, we have to delegate that power to a representative that can exert that power for us. That representative is a bottleneck, or a weak spot. It is the place that you want to target if you want to attack democracy, because you can capture democracy by either capturing that representative or capturing the way that people choose it.

02:10
So the big question is: Is this the end of history? Is this the best that we can do, or are there actually alternatives?

02:22
Some people have been thinking about alternatives, and one of the ideas that is out there is the idea of direct democracy. This is the idea of bypassing politicians completely and having people vote directly on issues, having people vote directly on bills. But this idea is naive, because there are too many things that we would need to choose.

02:40
If you look at the 114th US Congress, you will see that the House of Representatives considered more than 6,000 bills, the Senate considered more than 3,000 bills, and they approved more than 300 laws. Those would be many decisions that each person would have to make every week, on topics that they know little about. So there's a big cognitive bandwidth problem if we're going to try to think about direct democracy as a viable alternative.

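To make the arithmetic behind that claim concrete, here is a minimal back-of-the-envelope sketch. The bill counts are the talk's own figures; the roughly 104-week length of a two-year Congress is my assumption:

```python
# Decisions each citizen would face per week under direct democracy,
# using the talk's 114th Congress figures and an assumed two-year term.
house_bills = 6_000    # bills considered by the House (talk's figure)
senate_bills = 3_000   # bills considered by the Senate (talk's figure)
weeks_in_term = 104    # assumed: two years of congressional sessions

per_week = (house_bills + senate_bills) / weeks_in_term
print(f"~{per_week:.0f} decisions per week")  # ~87 decisions per week
```
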
03:08
So some people think about the idea of liquid democracy, or fluid democracy, which is the idea that you delegate your political power to someone, who can delegate it to someone else, and, eventually, you create a large follower network in which, at the end, there are a few people that are making decisions on behalf of all of their followers and their followers' followers. But this idea also doesn't solve the problem of the cognitive bandwidth and, to be honest, it's also quite similar to the idea of having a representative.

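To see why the people at the end of those chains end up looking like representatives, here is a minimal sketch of how liquid-democracy delegation resolves. The voters and the delegation graph are invented for illustration:

```python
# A minimal sketch of liquid-democracy delegation: each vote flows
# along its chain until it reaches someone who decides directly.
def resolve_weights(delegates):
    """Follow each voter's delegation chain to its end and tally
    how many people each terminal decision-maker speaks for."""
    weights = {}
    for voter in delegates:
        current, seen = voter, set()
        while delegates[current] is not None and current not in seen:
            seen.add(current)          # guard against delegation cycles
            current = delegates[current]
        weights[current] = weights.get(current, 0) + 1
    return weights

# None means "I vote myself"; a name means "I pass my vote along".
chain = {"ana": "boris", "boris": "dana", "cleo": "dana", "dana": None, "ema": None}
print(resolve_weights(chain))  # {'dana': 4, 'ema': 1}
```

Four of the five votes concentrate in one person, which is exactly the resemblance to a representative that the talk points out.
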
03:36
So what I'm going to do today is be a little bit provocative, and ask you: What if, instead of trying to bypass politicians, we tried to automate them?

03:57
The idea of automation is not new. It started more than 300 years ago, when French weavers decided to automate the loom. The winner of that industrial war was Joseph-Marie Jacquard. He was a French weaver and merchant who married the loom with the steam engine to create autonomous looms. And in those autonomous looms, he gained control. He could now make fabrics that were more complex and more sophisticated than the ones that could be made by hand. But also, by winning that industrial war, he laid out what has become the blueprint of automation.

04:34
The way that we have automated things for the last 300 years has always been the same: we first identify a need, then we create a tool to satisfy that need, like the loom in this case, and then we study how people use that tool, to automate that user. That's how we came from the mechanical loom to the autonomous loom, and that took us a thousand years. Now, it has taken us only a hundred years to use the same script to automate the car.

05:05
But the thing is that, this time around, automation is kind of for real. This is a video that a colleague of mine from Toshiba shared with me that shows the factory that manufactures solid state drives. The entire factory is a robot. There are no humans in that factory. And the robots are soon to leave the factories and become part of our world, become part of our workforce.

05:27
So what I do in my day job is actually create tools that integrate data for entire countries, so that we can ultimately have the foundations that we need for a future in which we also need to manage those machines. But today, I'm not here to talk to you about these tools that integrate data for countries. Instead, I'm here to talk to you about another idea that might help us think about how to use artificial intelligence in democracy, because the tools that I build are designed for executive decisions.

05:58
These are decisions that can be cast in some sort of objective terms -- public investment decisions. But there are decisions that are legislative, and these legislative decisions require communication among people that have different points of view, require participation, require debate, require deliberation.

06:18
And for a long time, we have thought that what we need to improve democracy is actually more communication. So all of the technologies that we have advanced in the context of democracy, whether they are newspapers or social media, have tried to provide us with more communication. But we've been down that rabbit hole, and we know that's not what's going to solve the problem. Because it's not a communication problem, it's a cognitive bandwidth problem. So if the problem is one of cognitive bandwidth, adding more communication to people is not going to be what solves it.

06:50
What we are going to need instead are other technologies that help us deal with some of the communication that we are overloaded with. Think of a little avatar, a software agent, a digital Jiminy Cricket -- (Laughter) that basically is able to answer things on your behalf. And if we had that technology, we would be able to offload some of the communication and help, maybe, make better decisions or decisions at a larger scale.

07:18
And the thing is that the idea of software agents is also not new. We already use them all the time. We use software agents to choose the way that we're going to drive to a certain location, the music that we're going to listen to, or to get suggestions for the next books that we should read.

07:37
So there is an obvious idea in the 21st century, one that is as obvious as putting together a steam engine with a loom was at the time of Jacquard. And that idea is combining direct democracy with software agents.

07:54
Imagine, for a second, a world in which, instead of having a representative that represents you and millions of other people, you can have a representative that represents only you, with your nuanced political views -- that weird combination of libertarian and liberal, and maybe a little bit conservative on some issues and very progressive on others. Politicians nowadays are packages, and they're full of compromises. But you might have someone that can represent only you, if you are willing to give up the idea that that representative is a human.

08:27
If that representative is a software agent, we could have a senate that has as many senators as we have citizens. And those senators are going to be able to read every bill, and they're going to be able to vote on each one of them.

08:39
So there's an obvious idea that maybe we want to consider. But I understand that in this day and age, this idea might be quite scary. In fact, thinking of a robot coming from the future to help us run our governments sounds terrifying. But we've been there before. (Laughter) And actually he was quite a nice guy.

09:03
So what would the Jacquard loom version of this idea look like? It would be a very simple system. Imagine a system that you log in to and create your avatar, and then you start training that avatar. You can provide your avatar with your reading habits, or connect it to your social media, or you can connect it to other data, for example by taking psychological tests.

09:27
And the nice thing about this is that there's no deception. You are not providing data to communicate with your friends and family that then gets used in a political system. You are providing data to a system that is designed to be used to make political decisions on your behalf.

09:43
Then you take that data and choose a training algorithm, because it's an open marketplace in which different people can submit different algorithms to predict how you're going to vote, based on the data you have provided. And the system is open, so nobody controls the algorithms; there are algorithms that become more popular and others that become less popular.

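What might one of those marketplace algorithms look like? Here is a minimal sketch, assuming a text classifier trained on positions you have already stated. The bills, the labels, and the TF-IDF plus logistic regression choice are all my illustrative assumptions, not anything the talk prescribes:

```python
# A toy "avatar": learns your recorded positions on past bills and
# predicts your vote on a new one. All data here is invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

past_bills = [
    "expand public transit funding in urban areas",
    "cut taxes on industrial carbon emissions",
    "increase the public library acquisitions budget",
    "subsidize coal extraction on federal land",
]
past_votes = ["yes", "no", "yes", "no"]  # how you said you would vote

avatar = make_pipeline(TfidfVectorizer(), LogisticRegression())
avatar.fit(past_bills, past_votes)

new_bill = "fund electric bus fleets for public transit"
print(avatar.predict([new_bill])[0])           # the avatar's predicted vote
print(avatar.predict_proba([new_bill]).max())  # and its confidence in it
```
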
10:03
Eventually, you can audit the system. You can see how your avatar is working. If you like it, you can leave it on autopilot. If you want to be a little more controlling, you can choose to have it ask you every time it's going to make a decision, or you can be anywhere in between.

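That sliding scale between full autopilot and being asked every time could be as simple as a confidence threshold. Here is a sketch reusing the toy avatar above; the threshold mechanism is my assumption, not the talk's:

```python
# Autonomy dial for the avatar: 0.0 means "always ask me",
# 1.0 means "full autopilot". The avatar only votes on its own
# when its prediction confidence clears the implied threshold.
def cast_or_defer(avatar, bill, autonomy):
    confidence = avatar.predict_proba([bill]).max()
    if confidence >= 1.0 - autonomy:
        return avatar.predict([bill])[0]                       # vote automatically
    return input(f"How should I vote on {bill!r}? (yes/no) ")  # defer to you
```
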
10:17
One of the reasons why we use democracy so little may be because democracy has a very bad user interface. And if we improve the user interface of democracy, we might be able to use it more. Of course, there are a lot of questions that you might have.

10:32
Well, how do you train these avatars? How do you keep the data secure? How do you keep the systems distributed and auditable? How about my grandmother, who's 80 years old and doesn't know how to use the internet? Trust me, I've heard them all. So when you think about an idea like this, you have to beware of pessimists, because they are known to have a problem for every solution. (Laughter)

10:57
So I want to invite you to think about the bigger ideas. The questions I just showed you are little ideas, because they are questions about how this would not work. The big ideas are: What else could you do with this, if it happened to work? And one of those ideas is, well, who writes the laws?

11:17
In the beginning, we could have the avatars that we already have voting on laws that are written by the senators or politicians that we already have. But if this were to work, you could write an algorithm that could try to write a law that would get a certain percentage of approval, and you could reverse the process.

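Here is a sketch of what that reversed process might look like: searching candidate drafts for one whose predicted approval among the avatars clears a target. The drafts, the 60 percent target, and the avatar interface are all assumptions carried over from the sketches above:

```python
# "Reversing the process": instead of predicting votes on a fixed bill,
# search over candidate drafts for one the avatars would approve.
def draft_that_passes(candidate_drafts, avatars, target=0.60):
    """Return the first draft predicted to reach the approval target."""
    for draft in candidate_drafts:
        approvals = sum(a.predict([draft])[0] == "yes" for a in avatars)
        if approvals / len(avatars) >= target:
            return draft
    return None  # no draft clears the bar; generate new candidates
```
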
11:38
Now, you might think that this idea is ludicrous and we should not do it, but you cannot deny that it's an idea that is only possible in a world in which direct democracy and software agents are a viable form of participation.

11:52
So how do we start the revolution? We don't start this revolution with picket fences or protests, or by demanding that our current politicians be changed into robots. That's not going to work. This is much more simple, much slower and much more humble.

12:09
We start this revolution by creating simple systems like this in grad schools, in libraries, in nonprofits. And we try to figure out all of those little questions and those little problems that we're going to have to figure out to make this idea something viable, to make this idea something that we can trust.

12:26
And as we create those systems that have a hundred people, a thousand people, a hundred thousand people voting in ways that are not politically binding, we're going to develop trust in this idea, the world is going to change, and those that are as little as my daughter is right now are going to grow up.

12:42
And by the time my daughter is my age, maybe this idea, that I know today is very crazy, might not be crazy to her and to her friends. And at that point, we will be at the end of our history, but they will be at the beginning of theirs.

13:01
Thank you. (Applause)