Read Montague: What we're learning from 5,000 brains

46,909 views ・ 2012-09-24

TED



00:00
Translator: Joseph Geni Reviewer: Morton Bast
00:15
Other people. Everyone is interested in other people. Everyone has relationships with other people, and they're interested in these relationships for a variety of reasons. Good relationships, bad relationships, annoying relationships, agnostic relationships, and what I'm going to do is focus on the central piece of an interaction that goes on in a relationship.

00:35
So I'm going to take as inspiration the fact that we're all interested in interacting with other people, I'm going to completely strip it of all its complicating features, and I'm going to turn that object, that simplified object, into a scientific probe, and provide the early stages, embryonic stages of new insights into what happens in two brains while they simultaneously interact.

00:58
But before I do that, let me tell you a couple of things that made this possible. The first is we can now eavesdrop safely on healthy brain activity. Without needles and radioactivity, without any kind of clinical reason, we can go down the street and record from your friends' and neighbors' brains while they do a variety of cognitive tasks, and we use a method called functional magnetic resonance imaging. You've probably all read about it or heard about it in some incarnation. Let me give you a two-sentence version of it.

01:29
So we've all heard of MRIs. MRIs use magnetic fields and radio waves and they take snapshots of your brain or your knee or your stomach, grayscale images that are frozen in time. In the 1990s, it was discovered you could use the same machines in a different mode, and in that mode you could make microscopic blood flow movies from hundreds of thousands of sites independently in the brain. Okay, so what? In fact, the so what is: in the brain, changes in neural activity, the things that make your brain work, the things that make your software work in your brain, are tightly correlated with changes in blood flow. You make a blood flow movie, you have an independent proxy of brain activity.
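The neural-activity-to-blood-flow link is the basis of standard fMRI analysis: fast neural events are smeared through a slow hemodynamic response, so the blood-flow signal is a delayed, blurred copy of activity. A minimal sketch of that idea (the response-function parameters, event rate, and noise level are illustrative choices, not from the talk):

```python
import math

import numpy as np

def hrf(t, peak=6.0, undershoot=16.0, ratio=6.0):
    """Double-gamma hemodynamic response (illustrative parameters):
    a positive lobe peaking ~5 s after a neural event, then a small undershoot."""
    pos = t ** (peak - 1) * np.exp(-t) / math.gamma(peak)
    neg = t ** (undershoot - 1) * np.exp(-t) / math.gamma(undershoot)
    return pos - neg / ratio

dt = 0.1                                    # 10 samples per second
kernel = hrf(np.arange(0, 30, dt))          # 30 s of response per event

rng = np.random.default_rng(0)
neural = (rng.random(600) < 0.05).astype(float)        # sparse neural events over 60 s
prediction = np.convolve(neural, kernel)[:neural.size] * dt
bold = prediction + rng.normal(0, 0.002, neural.size)  # noisy blood-flow measurement

# the measured blood-flow movie tracks the neural-driven prediction closely,
# which is why it can serve as a proxy for activity
r = np.corrcoef(prediction, bold)[0, 1]
print(f"correlation: {r:.2f}")
```

Correlating a convolved activity regressor against the measured signal, as in the last two lines, is essentially what a voxelwise fMRI analysis does at scale.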
02:06
This has literally revolutionized cognitive science. Take any cognitive domain you want: memory, motor planning, thinking about your mother-in-law, getting angry at people, emotional response, it goes on and on. Put people into functional MRI devices, and image how these kinds of variables map onto brain activity. It's in its early stages, and it's crude by some measures, but in fact, 20 years ago, we were at nothing. You couldn't do people like this. You couldn't do healthy people. That's caused a literal revolution, and it's opened us up to a new experimental preparation. Neurobiologists, as you well know, have lots of experimental preps: worms and rodents and fruit flies and things like this. And now we have a new experimental prep: human beings. We can now use human beings to study and model the software in human beings, and we have a few burgeoning biological measures.

02:56
Okay, let me give you one example of the kinds of experiments that people do, and it's in the area of what you'd call valuation. Valuation is just what you think it is, you know? If you went and you were valuing two companies against one another, you'd want to know which was more valuable. Cultures discovered the key feature of valuation thousands of years ago. If you want to compare oranges to windshields, what do you do? Well, you can't compare oranges to windshields. They're immiscible. They don't mix with one another. So instead, you convert them to a common currency scale, put them on that scale, and value them accordingly.
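The common-currency trick is easy to make concrete: once everything sits on one scale, comparison is just arithmetic. A toy sketch, with invented dollar prices standing in for whatever scale the brain actually uses:

```python
# Invented common-currency values (dollars) for otherwise
# incommensurable goods; comparison becomes simple arithmetic.
value_usd = {"orange": 0.50, "windshield": 250.00}

def more_valuable(a, b):
    """Return whichever good ranks higher on the shared scale."""
    return a if value_usd[a] >= value_usd[b] else b

print(more_valuable("orange", "windshield"))   # -> windshield
```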
03:26
Well, your brain has to do something just like that as well, and we're now beginning to understand and identify brain systems involved in valuation, and one of them includes a neurotransmitter system whose cells are located in your brainstem and deliver the chemical dopamine to the rest of your brain. I won't go through the details of it, but that's an important discovery, and we know a good bit about that now, and it's just a small piece of it, but it's important because those are the neurons that you would lose if you had Parkinson's disease, and they're also the neurons that are hijacked by literally every drug of abuse, and that makes sense. Drugs of abuse would come in, and they would change the way you value the world. They change the way you value the symbols associated with your drug of choice, and they make you value that over everything else.
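One influential account of this dopamine system (background context, not something the talk spells out) treats its firing as a reward prediction error that trains value onto cues, which is also why drugs that drive dopamine directly can inflate the value of drug-associated symbols. A minimal learning-rule sketch with invented numbers:

```python
def update_value(v, reward, alpha=0.1):
    """One prediction-error step: delta = reward - v is the 'surprise'
    that, on this account, the dopamine signal carries
    (alpha is a made-up learning rate)."""
    return v + alpha * (reward - v)

v_cue = 0.0
for _ in range(100):                  # repeated cue-reward pairings
    v_cue = update_value(v_cue, reward=1.0)
print(round(v_cue, 3))                # the cue's value converges on the reward

# a drug that drives dopamine directly acts like a prediction error that
# learning can never cancel, so drug-associated cues keep gaining value
```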
04:07
Here's the key feature though. These neurons are also involved in the way you can assign value to literally abstract ideas, and I put some symbols up here that we assign value to for various reasons. We have a behavioral superpower in our brain, and it at least in part involves dopamine. We can deny every instinct we have for survival for an idea, for a mere idea. No other species can do that. In 1997, the cult Heaven's Gate committed mass suicide predicated on the idea that there was a spaceship hiding in the tail of the then-visible comet Hale-Bopp waiting to take them to the next level. It was an incredibly tragic event. More than two thirds of them had college degrees. But the point here is they were able to deny their instincts for survival using exactly the same systems that were put there to make them survive. That's a lot of control, okay?

04:59
One thing that I've left out of this narrative is the obvious thing, which is the focus of the rest of my little talk, and that is other people. These same valuation systems are redeployed when we're valuing interactions with other people. So this same dopamine system that gets addicted to drugs, that makes you freeze when you get Parkinson's disease, that contributes to various forms of psychosis, is also redeployed to value interactions with other people and to assign value to gestures that you do when you're interacting with somebody else. Let me give you an example of this.

05:32
You bring to the table such enormous processing power in this domain that you hardly even notice it. Let me just give you a few examples. So here's a baby. She's three months old. She still poops in her diapers and she can't do calculus. She's related to me. Somebody will be very glad that she's up here on the screen. You can cover up one of her eyes, and you can still read something in the other eye, and I see sort of curiosity in one eye, I see maybe a little bit of surprise in the other. Here's a couple. They're sharing a moment together, and we've even done an experiment where you can cut out different pieces of this frame and you can still see that they're sharing it. They're sharing it sort of in parallel. Now, the elements of the scene also communicate this to us, but you can read it straight off their faces, and if you compare their faces to normal faces, it would be a very subtle cue. Here's another couple. He's projecting out at us, and she's clearly projecting, you know, love and admiration at him. Here's another couple. (Laughter) And I'm thinking I'm not seeing love and admiration on the left. (Laughter) In fact, I know this is his sister, and you can just see him saying, "Okay, we're doing this for the camera, and then afterwards you steal my candy and you punch me in the face." (Laughter) He'll kill me for showing that.

06:43
All right, so what does this mean? It means we bring an enormous amount of processing power to the problem. It engages deep systems in our brain, in dopaminergic systems that are there to make you chase sex, food and salt. They keep you alive. It gives them the pie, it gives that kind of a behavioral punch which we've called a superpower.

07:01
So how can we take that and arrange a kind of staged social interaction and turn that into a scientific probe? And the short answer is games. Economic games. So what we do is we go into two areas. One area is called experimental economics. The other area is called behavioral economics. And we steal their games. And we contrive them to our own purposes.

07:22
So this shows you one particular game called an ultimatum game. Red person is given a hundred dollars and can offer a split to blue. Let's say red wants to keep 70, and offers blue 30. So he offers a 70-30 split with blue. Control passes to blue, and blue says, "I accept it," in which case he'd get the money, or blue says, "I reject it," in which case no one gets anything. Okay?

07:44
So a rational choice economist would say, well, you should take all non-zero offers. What do people do? People are indifferent at an 80-20 split. At 80-20, it's a coin flip whether you accept that or not. Why is that? You know, because you're pissed off. You're mad. That's an unfair offer, and you know what an unfair offer is.
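The "coin flip at 80-20" pattern suggests a soft acceptance threshold rather than the rational-choice rule. A sketch of a responder with that behavior; the logistic form and its sharpness are my assumptions, pinned only to the 50% point the talk reports:

```python
import math
import random

def p_accept(offer, indifference=20, sharpness=0.25):
    """Probability blue accepts `offer` out of $100: 50% exactly at the
    80-20 split, near 1 for fair splits, near 0 for insulting ones.
    Logistic shape and sharpness invented for illustration."""
    return 1 / (1 + math.exp(-sharpness * (offer - indifference)))

def play_round(keep_for_red, rng=random):
    offer = 100 - keep_for_red
    if rng.random() < p_accept(offer):
        return keep_for_red, offer        # accepted: split as proposed
    return 0, 0                           # rejected: no one gets anything

print(f"P(accept $50 of a 50-50 split): {p_accept(50):.2f}")
print(f"P(accept $20 of an 80-20 split): {p_accept(20):.2f}")
print(f"P(accept $1  of a 99-1  split): {p_accept(1):.2f}")
```

Note that a purely rational responder would have `p_accept(offer) == 1` for every non-zero offer; the curve above is what the observed behavior looks like instead.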
08:03
This is the kind of game done by my lab and many around the world. That just gives you an example of the kind of thing that these games probe. The interesting thing is, these games require that you have a lot of cognitive apparatus on line. You have to be able to come to the table with a proper model of another person. You have to be able to remember what you've done. You have to stand up in the moment to do that. Then you have to update your model based on the signals coming back, and you have to do something that is interesting, which is you have to do a kind of depth-of-thought assay. That is, you have to decide what that other person expects of you. You have to send signals to manage your image in their mind. Like a job interview: you sit across the desk from somebody, they have some prior image of you, you send signals across the desk to move their image of you from one place to a place where you want it to be. We're so good at this we don't really even notice it. These kinds of probes exploit it. Okay?
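The "update your model based on the signals coming back" step is, in spirit, Bayesian: a prior image, revised as each signal arrives. A toy sketch of an interviewer revising their image of you (all probabilities invented, and this is one simple way to formalize it, not the lab's model):

```python
def update(belief, signal, p_signal_if_competent=0.8, p_signal_if_not=0.3):
    """One Bayes-rule revision of the belief that you're 'competent',
    given one observed signal (likelihoods are made-up numbers)."""
    likelihood = p_signal_if_competent if signal else 1 - p_signal_if_competent
    alt = p_signal_if_not if signal else 1 - p_signal_if_not
    return belief * likelihood / (belief * likelihood + (1 - belief) * alt)

image = 0.5                                 # their prior image of you
for signal in [True, True, False, True]:    # signals you send across the desk
    image = update(image, signal)
print(round(image, 3))                      # mostly good signals move the image up
```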
08:57
In doing this, what we've discovered is that humans are literal canaries in social exchanges. Canaries used to be used as kind of biosensors in mines. When methane built up, or carbon dioxide built up, or oxygen was diminished, the birds would swoon before people would -- so it acted as an early warning system: hey, get out of the mine, things aren't going so well. People come to the table in even these very blunt, staged social interactions, where there's just numbers going back and forth between the people, and they bring enormous sensitivities to it.

09:29
So we realized we could exploit this, and in fact we've done this now in many thousands of people, I think on the order of five or six thousand. (We actually, to make this a biological probe, need bigger numbers than that, remarkably so.) But anyway, patterns have emerged, and we've been able to take those patterns, convert them into mathematical models, and use those mathematical models to gain new insights into these exchanges. Okay, so what? Well, the so what is that this is a really nice behavioral measure: the economic games bring to us notions of optimal play. We can compute that during the game. And we can use that to sort of carve up the behavior.
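"Convert patterns into mathematical models" can be made concrete with ultimatum-game data: given a responder's accept/reject record, recover their indifference point by maximum likelihood. A self-contained sketch on simulated choices (the logistic choice model and every parameter here are invented for illustration):

```python
import math
import random

def p_accept(offer, theta, k=0.25):
    """Probability a responder accepts `offer` (out of $100) given an
    indifference point theta; an illustrative logistic choice model."""
    return 1 / (1 + math.exp(-k * (offer - theta)))

# simulate one responder whose true indifference point is $20
rng = random.Random(0)
offers = [rng.randrange(1, 60) for _ in range(500)]
choices = [rng.random() < p_accept(o, theta=20) for o in offers]

def log_lik(theta):
    """Log-likelihood of the observed accept/reject record under theta."""
    return sum(math.log(p_accept(o, theta) if accepted else 1 - p_accept(o, theta))
               for o, accepted in zip(offers, choices))

theta_hat = max(range(1, 60), key=log_lik)   # maximum-likelihood grid search
print("recovered indifference point:", theta_hat)
```

The fitted parameter is the kind of quantity that can then be compared across people, or against the optimal-play benchmark the games provide.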
10:07
Here's the cool thing. Six or seven years ago, we developed a team. It was at the time in Houston, Texas. It's now in Virginia and London. And we built software that'll link functional magnetic resonance imaging devices up over the Internet. I guess we've done up to six machines at a time, but let's just focus on two. So it synchronizes machines anywhere in the world. We synchronize the machines, set them into these staged social interactions, and we eavesdrop on both of the interacting brains. So for the first time, we don't have to look at just averages over single individuals, or have individuals playing computers, or try to make inferences that way. We can study individual dyads. We can study the way that one person interacts with another person, turn the numbers up, and start to gain new insights into the boundaries of normal cognition, but more importantly, we can put people with classically defined mental illnesses, or brain damage, into these social interactions, and use these as probes of that.
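The talk doesn't describe the linking software itself, but one core requirement, starting both scanner sessions in lockstep, is a classic synchronization barrier. A toy sketch over local sockets (a stand-in for the idea, not the lab's protocol):

```python
import socket
import threading

N_SCANNERS = 2
srv = socket.create_server(("127.0.0.1", 0))   # hub listens first; OS picks a port
HOST, PORT = srv.getsockname()

def hub():
    """Hold each connected scanner until all have checked in, then release
    them together: a stand-in for cross-site session synchronization."""
    conns = [srv.accept()[0] for _ in range(N_SCANNERS)]
    for c in conns:
        c.sendall(b"go")                  # everyone starts at (nearly) the same moment
        c.close()
    srv.close()

def scanner():
    with socket.create_connection((HOST, PORT)) as s:
        data = b""
        while len(data) < 2:              # blocks until the hub says b"go"
            chunk = s.recv(2 - len(data))
            if not chunk:
                break
            data += chunk
        return data

results = []
hub_thread = threading.Thread(target=hub)
hub_thread.start()
scanner_threads = [threading.Thread(target=lambda: results.append(scanner()))
                   for _ in range(N_SCANNERS)]
for t in scanner_threads:
    t.start()
for t in scanner_threads:
    t.join()
hub_thread.join()
print(results)                            # both sessions released: [b'go', b'go']
```

In a real deployment the hard part is not the barrier but clock alignment and jitter across the wide-area network; this sketch only shows the release-together idea.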
11:03
So we've started this effort. We've made a few hits, a few, I think, embryonic discoveries. We think there's a future to this. But it's our way of going in and redefining, with a new lexicon, a mathematical one actually, as opposed to the standard ways that we think about mental illness, characterizing these diseases, by using the people as birds in the exchanges. That is, we exploit the fact that the healthy partner, playing somebody with major depression, or playing somebody with autism spectrum disorder, or playing somebody with attention deficit hyperactivity disorder, can serve as a kind of biosensor, and then we use computer programs to model that person, and it gives us a kind of assay of this.

11:45
Early days, and we're just beginning. We're setting up sites around the world. Here are a few of our collaborating sites. The hub, ironically enough, is centered in little Roanoke, Virginia. There's another hub in London, now, and the rest are getting set up. We hope to give the data away at some stage. That's a complicated issue about making it available to the rest of the world. But we're also studying just a small part of what makes us interesting as human beings, and so I would invite other people who are interested in this to ask us for the software, or even for guidance on how to move forward with that.

12:19
Let me leave you with one thought in closing. The interesting thing about studying cognition has been that we've been limited, in a way. We just haven't had the tools to look at interacting brains simultaneously. The fact is, though, that even when we're alone, we're a profoundly social creature. We're not a solitary mind built out of properties that kept it alive in the world independent of other people. In fact, our minds depend on other people. They depend on other people, and they're expressed in other people, so as for the notion of who you are, you often don't know who you are until you see yourself in interaction with people that are close to you, people that are enemies of you, people that are agnostic to you. So this is the first sort of step into using that insight into what makes us human beings, turning it into a tool, and trying to gain new insights into mental illness.

13:11
Thanks for having me. (Applause)

(Applause)