Robert Wright: The evolution of compassion

37,410 views ・ 2015-07-17

TED



Translator: preda silvana · Reviewer: Ariana Bleau Lugo
00:13
I'm going to talk about compassion and the golden rule from a secular perspective and even from a kind of scientific perspective. I'm going to try to give you a little bit of a natural history of compassion and the golden rule. So, I'm going to be sometimes using kind of clinical language, and so it's not going to sound as warm and fuzzy as your average compassion talk. I want to warn you about that.

00:40
So, I do want to say, at the outset, that I think compassion's great. The golden rule is great. I'm a big supporter of both. And I think it's great that the leaders of the religions of the world are affirming compassion and the golden rule as fundamental principles that are integral to their faiths. At the same time, I think religions don't deserve all the credit. I think nature gave them a helping hand here.

01:07
I'm going to argue tonight that compassion and the golden rule are, in a certain sense, built into human nature. But I'm also going to argue that once you understand the sense in which they are built into human nature, you realize that just affirming compassion and affirming the golden rule is really not enough. There's a lot of work to be done after that.
01:31
OK so, a quick natural history, first of compassion. In the beginning, there was compassion, and I mean not just when human beings first showed up, but actually even before that. I think it's probably the case that, in the human evolutionary lineage, even before there were Homo sapiens, feelings like compassion and love and sympathy had earned their way into the gene pool, and biologists have a pretty clear idea of how this first happened. It happened through a principle known as kin selection. And the basic idea of kin selection is that, if an animal feels compassion for a close relative, and this compassion leads the animal to help the relative, then, in the end, the compassion actually winds up helping the genes underlying the compassion itself. So, from a biologist's point of view, compassion is actually a gene's way of helping itself. OK.
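The gene's-eye logic Wright sketches here is usually formalized as Hamilton's rule. The talk itself never states the formula; it is added here only as a standard reference point for the kin-selection argument:

```latex
% Hamilton's rule: a gene promoting compassionate helping spreads when
% (relatedness) x (benefit to recipient) > (cost to helper)
\[
  r \, b \;>\; c
\]
% r: genetic relatedness between helper and relative (e.g., 1/2 for full siblings)
% b: reproductive benefit the helped relative receives
% c: reproductive cost the helper pays
```

Because r falls off quickly outside the immediate family, the rule also captures the "bad news" Wright raises next: this kind of compassion is naturally confined to close kin.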
02:33
I warned you this was not going to be very warm and fuzzy. I'll get there -- I hope to get a little fuzzier. This doesn't bother me so much, that the underlying Darwinian rationale of compassion is kind of self-serving at the genetic level. Actually, I think the bad news about kin selection is just that it means that this kind of compassion is naturally deployed only within the family. That's the bad news. The good news is compassion is natural. The bad news is that this kin-selected compassion is naturally confined to the family.

03:06
Now, there's more good news that came along later in evolution, a second kind of evolutionary logic. Biologists call that "reciprocal altruism." OK. And there, the basic idea is that compassion leads you to do good things for people who then will return the favor. Again, I know this is not as inspiring a notion of compassion as you may have heard in the past, but from a biologist's point of view, this reciprocal-altruism kind of compassion is ultimately self-serving too. It's not that people think that when they feel the compassion. It's not consciously self-serving, but to a biologist, that's the logic.

03:49
And so, you wind up most easily extending compassion to friends and allies. I'm sure a lot of you, if a close friend has something really terrible happen to them, you feel really bad. But if you read in the newspaper that something really horrible happened to somebody you've never heard of, you can probably live with that. That's just human nature. So, it's another good news/bad news story. It's good that compassion was extended beyond the family by this kind of evolutionary logic. The bad news is this doesn't bring us universal compassion by itself. So, there's still work to be done.
04:24
Now, there's one other result of this dynamic called reciprocal altruism, which I think is kind of good news, which is that the way this has played out in the human species has given people an intuitive appreciation of the golden rule. I don't quite mean that the golden rule itself is written in our genes, but you can go to a hunter-gatherer society that has had no exposure to any of the great religious traditions, no exposure to ethical philosophy, and you'll find, if you spend time with these people, that, basically, they believe that one good turn deserves another, and that bad deeds should be punished. And evolutionary psychologists think that these intuitions have a basis in the genes. So, they do understand that if you want to be treated well, you treat other people well. And it's good to treat other people well. That's close to being a kind of built-in intuition.
05:17
So, that's good news. Now, if you've been paying attention, you're probably anticipating that there's bad news here; we still aren't to universal love, and it's true because, although an appreciation of the golden rule is natural, it's also natural to carve out exceptions to the golden rule. I mean, for example, none of us, probably, want to go to prison, but we all think that there are some people who should go to prison. Right? So, we think we should treat them differently than we would want to be treated. Now, we have a rationale for that. We say they did these bad things that make it just that they should go to prison. None of us really extends the golden rule in truly diffuse and universal fashion. We have the capacity to carve out exceptions, put people in a special category.
06:02
And the problem is that -- although in the case of sending people to prison, you have this impartial judiciary determining who gets excluded from the golden rule -- in everyday life, the way we all make these decisions about who we're not going to extend the golden rule to is we use a much rougher and readier formula. Basically it's just like, if you're my enemy, if you're my rival -- if you're not my friend, if you're not in my family -- I'm much less inclined to apply the golden rule to you.

06:34
We all do that, and you see it all over the world. You see it in the Middle East: people who, from Gaza, are firing missiles at Israel. They wouldn't want to have missiles fired at them, but they say, "Well, but the Israelis, or some of them, have done things that put them in a special category." The Israelis would not want to have an economic blockade imposed on them, but they impose one on Gaza, and they say, "Well, the Palestinians, or some of them, have brought this on themselves." So, it's these exclusions to the golden rule that amount to a lot of the world's trouble. And it's natural to do that. So, the fact that the golden rule is in some sense built in to us is not, by itself, going to bring us universal love. It's not going to save the world.
07:21
Now, there's one piece of good news I have that may save the world. Okay. Are you on the edges of your seats here? Good, because before I tell you about that good news, I'm going to have to take a little excursion through some academic terrain. So, I hope I've got your attention with this promise of good news that may save the world.
07:42
It's this non-zero-sumness stuff you just heard a little bit about. It's just a quick introduction to game theory. This won't hurt. Okay. It's about zero-sum and non-zero-sum games. If you ask what kind of a situation is conducive to people becoming friends and allies, the technical answer is a non-zero-sum situation. And if you ask what kind of situation is conducive to people defining people as enemies, it's a zero-sum situation. So, what do those terms mean? Basically, a zero-sum game is the kind you're used to in sports, where there's a winner and a loser. So, their fortunes add up to zero. So, in tennis, every point is either good for you and bad for the other person, or good for them, bad for you. Either way, your fortunes add up to zero. That's a zero-sum game.
08:28
Now, if you're playing doubles, then the person on your side of the net is in a non-zero-sum relationship with you, because every point is either good for both of you -- positive, win-win -- or bad for both of you, it's lose-lose. That's a non-zero-sum game. And in real life, there are lots of non-zero-sum games. In the realm of economics, say, if you buy something: that means you'd rather have the merchandise than the money, but the merchant would rather have the money than the merchandise. You both feel you've won. In a war, two allies are playing a non-zero-sum game. It's going to either be win-win or lose-lose for them. So, there are lots of non-zero-sum games in real life.
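The distinction Wright draws can be stated as a simple property of payoffs. A minimal sketch (not from the talk; the outcome lists for singles, doubles, and trade are illustrative assumptions):

```python
# In a zero-sum game, the two players' payoffs always sum to zero;
# in a non-zero-sum game, both can win or both can lose.

def is_zero_sum(payoffs):
    """payoffs: list of (payoff_a, payoff_b) tuples, one per possible outcome."""
    return all(a + b == 0 for a, b in payoffs)

# Singles tennis: each point is +1 for one player and -1 for the other.
singles = [(+1, -1), (-1, +1)]

# Doubles partners: each point is good for both (win-win) or bad for both (lose-lose).
doubles = [(+1, +1), (-1, -1)]

# A voluntary trade: the buyer values the goods more than the money,
# the seller values the money more than the goods -- both gain.
trade = [(+1, +1)]

print(is_zero_sum(singles))  # True  -> fortunes add up to zero
print(is_zero_sum(doubles))  # False -> win-win / lose-lose is possible
print(is_zero_sum(trade))    # False
```

The check mirrors the talk's test exactly: a relationship is non-zero-sum as soon as any outcome leaves the two sides' fortunes summing to something other than zero.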
09:09
And you could basically reformulate what I said earlier, about how compassion is deployed and the golden rule is deployed, by just saying, well, compassion most naturally flows along non-zero-sum channels where people perceive themselves as being in a potentially win-win situation with some of their friends or allies. The deployment of the golden rule most naturally happens along these non-zero-sum channels. So, kind of webs of non-zero-sumness are where you would expect compassion and the golden rule to kind of work their magic. With zero-sum channels you would expect something else.
09:43
Okay. So, now you're ready for the good news that I said might save the world. And now I can admit that it might not, too, now that I've held your attention for three minutes of technical stuff. But it may. And the good news is that history has naturally expanded these webs of non-zero-sumness, these webs that can be channels for compassion.
10:09
You can go back all the way to the stone age: technological evolution -- roads, the wheel, writing, a lot of transportation and communication technologies -- has just inexorably made it so that more people can be in more non-zero-sum relationships with more and more people at greater and greater distances. That's the story of civilization. It's why social organization has grown from the hunter-gatherer village to the ancient state, the empire, and now here we are in a globalized world. And the story of globalization is largely a story of non-zero-sumness.
10:47
You've probably heard the term "interdependence" applied to the modern world. Well, that's just another term for non-zero-sum. If your fortunes are interdependent with somebody, then you live in a non-zero-sum relationship with them. And you see this all the time in the modern world. You saw it with the recent economic crash, where bad things happen in the economy -- bad for everybody, for much of the world. Good things happen, and it's good for much of the world.
11:12
And, you know, I'm happy to say, I think there's really evidence that this non-zero-sum kind of connection can expand the moral compass. I mean, if you look at the American attitudes toward Japanese during World War II -- look at the depictions of Japanese in the American media as just about subhuman, and look at the fact that we dropped atomic bombs, really without giving it much of a thought -- and you compare that to the attitude now, I think part of that is due to a kind of economic interdependence. Any form of interdependence, or non-zero-sum relationship, forces you to acknowledge the humanity of people. So, I think that's good.
11:50
And the world is full of non-zero-sum dynamics. Environmental problems, in many ways, put us all in the same boat. And there are non-zero-sum relationships that maybe people aren't aware of. For example, probably a lot of American Christians don't think of themselves as being in a non-zero-sum relationship with Muslims halfway around the world, but they really are, because if these Muslims become happier and happier with their place in the world and feel that they have a place in it, that's good for Americans, because there will be fewer terrorists to threaten American security. If they get less and less happy, that will be bad for Americans. So, there's plenty of non-zero-sumness.
12:32
And so, the question is: If there's so much non-zero-sumness, why has the world not yet been suffused in love, peace, and understanding? The answer's complicated. It's the occasion for a whole other talk. Certainly, a couple of things are that, first of all, there are a lot of zero-sum situations in the world. And also, sometimes people don't recognize the non-zero-sum dynamics in the world. In both of these areas, I think politicians can play a role. This isn't only about religion. I think politicians can help foster non-zero-sum relationships. Economic engagement is generally better than blockades and so on, in this regard.
13:18
And politicians can be aware, and should be aware, that when people around the world are looking at them, are looking at their nation and picking up their cues for whether they are in a zero-sum or a non-zero-sum relationship with a nation -- like, say, America, or any other nation -- human psychology is such that they use cues like: Do we feel we're being respected? Because, you know, historically, if you're not being respected, you're probably not going to wind up in a non-zero-sum, mutually profitable relationship with people. So, we need to be aware of what kind of signals we're sending out. And some of this, again, is in the realm of political work.
13:58
If there's one thing I can encourage everyone to do, politicians, religious leaders, and us, it would be what I call "expanding the moral imagination" -- that is to say, your ability to put yourself in the shoes of people in very different circumstances. This is not the same as compassion, but it's conducive to compassion. It opens the channels for compassion.
14:22
And I'm afraid we have another good news/bad news story,
259
862000
3000
Si mi-e teama ca aici avem o alta situatie cu vesti bune si vesti rele,
14:25
which is that the moral imagination is part of human nature.
260
865000
3000
which is that the moral imagination is part of human nature.
14:28
That's good, but again we tend to deploy it selectively.
261
868000
5000
That's good, but again we tend to deploy it selectively.
14:33
Once we define somebody as an enemy,
262
873000
2000
Once we define somebody as an enemy,
14:35
we have trouble putting ourselves in their shoes, just naturally.
263
875000
5000
we naturally have trouble putting ourselves in their shoes.
14:40
So, if you want to take a particularly hard case for an American:
264
880000
4000
If you want to take a particularly hard case for an American:
14:44
somebody in Iran who is burning an American flag, and you see them on TV.
265
884000
4000
somebody in Iran burning an American flag, whom you see on TV.
14:48
Well, the average American is going to resist
266
888000
3000
The average American will resist
14:51
the moral exercise of putting themselves in that person's head
267
891000
4000
the moral exercise of putting themselves in that person's head
14:55
and is going to resist the idea that they have much in common with that person.
268
895000
3000
and will resist the idea that they have much in common with that person.
14:58
And if you tell them, "Well, they think America disrespects them
269
898000
4000
And if you tell them, "Well, they think America disrespects them
15:02
and even wants to dominate them, and they hate America.
270
902000
3000
and even wants to dominate them, and they hate America.
15:05
Has there ever been somebody who disrespected you so much
271
905000
2000
Has there ever been somebody who disrespected you so much
15:07
that you kind of hated them briefly"?
272
907000
2000
that you kind of hated them briefly?"
15:09
You know, they'll resist that comparison and that's natural, that's human.
273
909000
3000
They'll resist that comparison, and that's natural, that's human.
15:12
And, similarly, the person in Iran:
274
912000
2000
And, similarly, the person in Iran:
15:14
when you try to humanize somebody in America who said that Islam is evil,
275
914000
4000
when you try to humanize somebody in America who said that Islam is evil,
15:18
they'll have trouble with that.
276
918000
2000
they'll have trouble with that.
15:20
So, it's a very difficult thing to get people to expand the moral imagination
277
920000
5000
So, it's very difficult to get people to expand their moral imagination
15:25
to a place it doesn't naturally go.
278
925000
3000
to a place it doesn't naturally go.
15:28
I think it's worth the trouble because,
279
928000
3000
I think it's worth the trouble because,
15:31
again, it just helps us to understand.
280
931000
2000
again, it helps us to understand.
15:33
If you want to reduce the number of people who are burning flags,
281
933000
2000
If you want to reduce the number of people who are burning flags,
15:35
it helps to understand what makes them do it.
282
935000
2000
it helps to understand what makes them do it.
15:37
And I think it's good moral exercise.
283
937000
3000
And I think it's a good moral exercise.
15:40
I would say here is where religious leaders come in,
284
940000
3000
This is where religious leaders can come in,
15:43
because religious leaders are good at reframing issues for people,
285
943000
7000
because religious leaders are good at reframing issues for people,
15:50
at harnessing the emotional centers of the brain
286
950000
2000
at harnessing the emotional centers of the brain
15:52
to get people to alter their awareness and reframe the way they think.
287
952000
5000
to get people to alter their awareness and reframe the way they think.
15:57
I mean, religious leaders are kind of in the inspiration business.
288
957000
4000
I mean, religious leaders are kind of in the inspiration business.
16:01
It's their great calling right now,
289
961000
2000
It's their great calling right now,
16:03
to get people all around the world better at expanding their moral imaginations,
290
963000
4000
to get people all around the world better at expanding their moral imaginations,
16:07
appreciating that in so many ways they're in the same boat.
291
967000
4000
appreciating that in so many ways they're in the same boat.
16:11
I would just sum up the way things look, at least from this secular perspective,
292
971000
6000
I would sum up the way things look, at least from this secular perspective,
16:17
as far as compassion and the golden rule go,
293
977000
4000
as far as compassion and the golden rule go,
16:21
by saying that it's good news that compassion and the golden rule
294
981000
6000
by saying that it's good news that compassion and the golden rule
16:27
are in some sense built into human nature.
295
987000
5000
are in some sense built into human nature.
16:32
It's unfortunate that they tend to be selectively deployed.
296
992000
5000
It's unfortunate that they tend to be selectively deployed.
16:37
And it's going to take real work to change that.
297
997000
4000
And it's going to take real work to change that.
16:41
But, nobody ever said that doing God's work was going to be easy. Thanks.
298
1001000
5000
But nobody ever said that doing God's work was going to be easy. Thanks.
16:46
(Applause)
299
1006000
2000
(Applause)