Martin Rees: Can we prevent the end of the world?

151,675 views ・ 2014-08-25

TED


00:12
Ten years ago, I wrote a book which I entitled "Our Final Century?" Question mark. My publishers cut out the question mark. (Laughter) The American publishers changed our title to "Our Final Hour." Americans like instant gratification and the reverse. (Laughter)

00:32
And my theme was this: Our Earth has existed for 45 million centuries, but this one is special — it's the first where one species, ours, has the planet's future in its hands. Over nearly all of Earth's history, threats have come from nature — disease, earthquakes, asteroids and so forth — but from now on, the worst dangers come from us.

00:59
And it's now not just the nuclear threat; in our interconnected world, network breakdowns can cascade globally; air travel can spread pandemics worldwide within days; and social media can spread panic and rumor literally at the speed of light.

01:17
We fret too much about minor hazards — improbable air crashes, carcinogens in food, low radiation doses, and so forth — but we and our political masters are in denial about catastrophic scenarios. The worst have thankfully not yet happened. Indeed, they probably won't. But if an event is potentially devastating, it's worth paying a substantial premium to safeguard against it, even if it's unlikely, just as we take out fire insurance on our house.

01:54
And as science offers greater power and promise, the downside gets scarier too. We get ever more vulnerable. Within a few decades, millions will have the capability to misuse rapidly advancing biotech, just as they misuse cybertech today. Freeman Dyson, in a TED Talk, foresaw that children will design and create new organisms just as routinely as his generation played with chemistry sets. Well, this may be on the science fiction fringe, but were even part of his scenario to come about, our ecology and even our species would surely not survive long unscathed.

02:39
For instance, there are some eco-extremists who think that it would be better for the planet, for Gaia, if there were far fewer humans. What happens when such people have mastered synthetic biology techniques that will be widespread by 2050? And by then, other science fiction nightmares may transition to reality: dumb robots going rogue, or a network that develops a mind of its own threatens us all.

03:08
Well, can we guard against such risks by regulation? We must surely try, but these enterprises are so competitive, so globalized, and so driven by commercial pressure, that anything that can be done will be done somewhere, whatever the regulations say. It's like the drug laws — we try to regulate, but can't. And the global village will have its village idiots, and they'll have a global range.

03:35
So as I said in my book, we'll have a bumpy ride through this century. There may be setbacks to our society — indeed, a 50 percent chance of a severe setback. But are there conceivable events that could be even worse, events that could snuff out all life?

03:56
When a new particle accelerator came online, some people anxiously asked, could it destroy the Earth or, even worse, rip apart the fabric of space? Well luckily, reassurance could be offered. I and others pointed out that nature has done the same experiments zillions of times already, via cosmic ray collisions. But scientists should surely be precautionary about experiments that generate conditions without precedent in the natural world. Biologists should avoid release of potentially devastating genetically modified pathogens.

04:32
And by the way, our special aversion to the risk of truly existential disasters depends on a philosophical and ethical question, and it's this: Consider two scenarios. Scenario A wipes out 90 percent of humanity. Scenario B wipes out 100 percent. How much worse is B than A? Some would say 10 percent worse. The body count is 10 percent higher. But I claim that B is incomparably worse. As an astronomer, I can't believe that humans are the end of the story. It is five billion years before the sun flares up, and the universe may go on forever, so post-human evolution, here on Earth and far beyond, could be as prolonged as the Darwinian process that's led to us, and even more wonderful. And indeed, future evolution will happen much faster, on a technological timescale, not a natural selection timescale. So we surely, in view of those immense stakes, shouldn't accept even a one in a billion risk that human extinction would foreclose this immense potential.

05:48
Some scenarios that have been envisaged may indeed be science fiction, but others may be disquietingly real. It's an important maxim that the unfamiliar is not the same as the improbable, and in fact, that's why we at Cambridge University are setting up a center to study how to mitigate these existential risks. It seems it's worthwhile just for a few people to think about these potential disasters. And we need all the help we can get from others, because we are stewards of a precious pale blue dot in a vast cosmos, a planet with 50 million centuries ahead of it. And so let's not jeopardize that future.

06:29
And I'd like to finish with a quote from a great scientist called Peter Medawar. I quote, "The bells that toll for mankind are like the bells of Alpine cattle. They are attached to our own necks, and it must be our fault if they do not make a tuneful and melodious sound."

06:47
Thank you very much.

06:49
(Applause)