How close are we to uploading our minds? - Michael S.A. Graziano

539,741 views ・ 2019-10-28

TED-Ed



Translator: Katarina Čolić   Reviewer: Ivana Krivokuća

Imagine a future where nobody dies; instead, our minds are uploaded to a digital world. They might live on in a realistic, simulated environment with avatar bodies, and could still call in and contribute to the biological world. Mind uploading has powerful appeal, but what would it actually take to scan a person's brain and upload their mind?

The main challenges are scanning a brain in enough detail to capture the mind and perfectly recreating that detail artificially. But first, we have to know what to scan. The human brain contains about 86 billion neurons, connected by at least a hundred trillion synapses. The pattern of connectivity among the brain's neurons, that is, all of the neurons and all their connections to each other, is called the connectome.

We haven't yet mapped the connectome, and there's also a lot more to neural signaling. There are hundreds, possibly thousands, of different kinds of connections, or synapses. Each functions in a slightly different way. Some work faster, some slower. Some grow or shrink rapidly in the process of learning; some are more stable over time. And beyond the trillions of precise, 1-to-1 connections between neurons, some neurons also spray out neurotransmitters that affect many other neurons at once. All of these different kinds of interactions would need to be mapped in order to copy a person's mind.

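To make the mapping task concrete, here is a minimal sketch, not from the talk, of the kind of data a mapped connectome implies: neurons as nodes and synapses as directed connections that each carry a type and a strength. The class names and fields are illustrative assumptions, not an established format.

from dataclasses import dataclass, field

@dataclass
class Synapse:
    target: int       # index of the postsynaptic neuron
    kind: str         # e.g. "fast_excitatory" or "slow_inhibitory"; hypothetical labels
    strength: float   # how strongly a signal here drives the target

@dataclass
class Neuron:
    synapses: list[Synapse] = field(default_factory=list)

# A toy "connectome" of three neurons; a real one would need roughly
# 86 billion neurons and at least 100 trillion synapses, per the talk.
connectome = [Neuron() for _ in range(3)]
connectome[0].synapses.append(Synapse(target=1, kind="fast_excitatory", strength=0.8))
connectome[1].synapses.append(Synapse(target=2, kind="slow_inhibitory", strength=0.3))

Even this toy record leaves out the slower, diffuse neurotransmitter effects just mentioned, and the glial signaling described next, neither of which fits neatly into a simple node-and-edge list.
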
There are also a lot of influences on neural signaling that are poorly understood or undiscovered. To name just one example, patterns of activity between neurons are likely influenced by a type of cell called glia. Glia surround neurons and, according to some scientists, may even outnumber them by as many as ten to one. Glia were once thought to be purely for structural support, and their functions are still poorly understood, but at least some of them can generate their own signals that influence information processing. Our understanding of the brain isn't good enough to determine what we'd need to scan in order to replicate the mind, but assuming our knowledge does advance to that point, how would we scan it?

Currently, we can accurately scan a living human brain with resolutions of about half a millimeter using our best non-invasive scanning method, MRI. To detect a synapse, we'll need to scan at a resolution of about a micron, a thousandth of a millimeter. To distinguish the kind of synapse and precisely how strong each synapse is, we'll need even better resolution.

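A rough back-of-envelope calculation, not from the talk, gives a sense of that gap: half a millimeter is 500 microns, so micron-level scanning is about 500 times finer along each axis, which works out to roughly 125 million times more volume elements to resolve in the same piece of tissue.

# Back-of-envelope scale of the resolution gap (illustrative only).
mri_resolution_um = 500    # ~0.5 mm: today's best non-invasive MRI, in microns
target_resolution_um = 1   # ~1 micron: needed to detect individual synapses

linear_gap = mri_resolution_um / target_resolution_um   # 500x finer along each axis
voxel_gap = linear_gap ** 3                              # ~1.25e8x more voxels per unit volume
print(f"{linear_gap:.0f}x per axis, ~{voxel_gap:.2e}x more voxels per unit volume")
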
MRI depends on powerful magnetic fields. Scanning at the resolution required to determine the details of individual synapses would require a field strength high enough to cook a person's tissues. So this kind of leap in resolution would require fundamentally new scanning technology. It would be more feasible to scan a dead brain using an electron microscope, but even that technology is nowhere near good enough, and it requires killing the subject first.

Assuming we eventually understand the brain well enough to know what to scan and develop the technology to safely scan at that resolution, the next challenge would be to recreate that information digitally. The main obstacles to doing so are computing power and storage space, both of which are improving every year. We're actually much closer to attaining this technological capacity than we are to understanding or scanning our own minds. Artificial neural networks already run our internet search engines, digital assistants, self-driving cars, Wall Street trading algorithms, and smart phones. Nobody has yet built an artificial network with 86 billion neurons, but as computing technology improves, it may be possible to keep track of such massive data sets.

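For a sense of scale, here is an assumption-laden estimate, not a figure from the talk: if each of the at least 100 trillion synapses were stored with even a few bytes for its type and strength, a single synapse-level snapshot would already run to hundreds of terabytes, before accounting for the subtler signaling described earlier.

# Rough storage estimate for one connectome snapshot (assumptions are illustrative).
num_synapses = 100e12      # "at least a hundred trillion synapses", per the talk
bytes_per_synapse = 4      # assumed: a compact type-plus-strength record; real needs are unknown

total_bytes = num_synapses * bytes_per_synapse
print(f"~{total_bytes / 1e12:.0f} TB for a single synapse-level snapshot")   # ~400 TB
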
At every step in the scanning and uploading process, we'd have to be certain we were capturing all the necessary information accurately, or there's no telling what ruined version of a mind might emerge.

While mind uploading is theoretically possible, we're likely hundreds of years away from the technology and scientific understanding that would make it a reality. And that reality would come with ethical and philosophical considerations: who would have access to mind uploading? What rights would be accorded to uploaded minds? How could this technology be abused? Even if we can eventually upload our minds, whether we should remains an open question.