How to have better political conversations | Robb Willer

245,848 views ・ 2017-02-09

TED



00:00
Translator: Joseph Geni Reviewer: Joanna Pietrulewicz
Korean translation: Kyo young Chu Reviewer: KI young Jang
00:12
So you probably have the sense, as most people do,
00:15
that polarization is getting worse in our country,
00:19
that the divide between the left and the right
00:22
is as bad as it's been in really any of our lifetimes.
00:26
But you might also reasonably wonder if research backs up your intuition.
00:32
And in a nutshell, the answer is sadly yes.
00:38
In study after study, we find
00:40
that liberals and conservatives have grown further apart.
00:45
They increasingly wall themselves off in these ideological silos,
00:50
consuming different news, talking only to like-minded others
00:54
and more and more choosing to live in different parts of the country.
00:58
And I think that most alarming of all of it
01:01
is seeing this rising animosity on both sides.
01:06
Liberals and conservatives,
01:07
Democrats and Republicans,
01:09
more and more they just don't like one another.
01:14
You see it in many different ways.
01:16
They don't want to befriend one another. They don't want to date one another.
01:19
If they do, if they find out, they find each other less attractive,
01:23
and they more and more don't want their children to marry someone
01:26
who supports the other party,
01:28
a particularly shocking statistic.
01:31
You know, in my lab, the students that I work with,
01:34
we're talking about some sort of social pattern --
01:37
I'm a movie buff, and so I'm often like,
01:41
what kind of movie are we in here with this pattern?
01:44
So what kind of movie are we in with political polarization?
01:48
Well, it could be a disaster movie.
01:52
It certainly seems like a disaster.
01:54
Could be a war movie.
01:57
Also fits.
01:59
But what I keep thinking is that we're in a zombie apocalypse movie.
02:03
(Laughter)
02:04
Right? You know the kind.
02:06
There's people wandering around in packs,
02:09
not thinking for themselves,
02:11
seized by this mob mentality
02:12
trying to spread their disease and destroy society.
02:17
And you probably think, as I do,
02:19
that you're the good guy in the zombie apocalypse movie,
02:23
and all this hate and polarization, it's being propagated by the other people,
02:26
because we're Brad Pitt, right?
02:29
Free-thinking, righteous,
02:32
just trying to hold on to what we hold dear,
02:34
you know, not foot soldiers in the army of the undead.
02:38
Not that.
02:39
Never that.
02:41
But here's the thing:
02:43
what movie do you suppose they think they're in?
02:47
Right?
02:48
Well, they absolutely think that they're the good guys
02:51
in the zombie apocalypse movie. Right?
02:52
And you'd better believe that they think that they're Brad Pitt
02:55
and that we, we are the zombies.
03:00
And who's to say that they're wrong?
03:04
I think that the truth is that we're all a part of this.
03:08
And the good side of that is that we can be a part of the solution.
03:12
So what are we going to do?
03:15
What can we do to chip away at polarization in everyday life?
03:19
What could we do to connect with and communicate with
03:23
our political counterparts?
03:25
Well, these were exactly the questions that I and my colleague, Matt Feinberg,
03:29
became fascinated with a few years ago,
03:31
and we started doing research on this topic.
03:34
And one of the first things that we discovered
03:37
that I think is really helpful for understanding polarization
03:41
is to understand
03:42
that the political divide in our country is undergirded by a deeper moral divide.
03:46
So one of the most robust findings in the history of political psychology
03:51
is this pattern identified by Jon Haidt and Jesse Graham,
03:55
psychologists,
03:56
that liberals and conservatives tend to endorse different values
04:00
to different degrees.
04:02
So for example, we find that liberals tend to endorse values like equality
04:07
and fairness and care and protection from harm
04:11
more than conservatives do.
04:13
And conservatives tend to endorse values like loyalty, patriotism,
04:19
respect for authority and moral purity
04:22
more than liberals do.
04:25
And Matt and I were thinking that maybe this moral divide
04:29
might be helpful for understanding how it is
04:32
that liberals and conservatives talk to one another
04:35
and why they so often seem to talk past one another
04:37
when they do.
04:39
So we conducted a study
04:41
where we recruited liberals to a study
04:44
where they were supposed to write a persuasive essay
04:46
that would be compelling to a conservative in support of same-sex marriage.
04:51
And what we found was that liberals tended to make arguments
04:54
in terms of the liberal moral values of equality and fairness.
04:59
So they said things like,
05:00
"Everyone should have the right to love whoever they choose,"
05:04
and, "They" -- they being gay Americans --
05:06
"deserve the same equal rights as other Americans."
05:10
Overall, we found that 69 percent of liberals
05:13
invoked one of the more liberal moral values in constructing their essay,
05:18
and only nine percent invoked one of the more conservative moral values,
05:22
even though they were supposed to be trying to persuade conservatives.
05:26
And when we studied conservatives and had them make persuasive arguments
05:30
in support of making English the official language of the US,
05:33
a classically conservative political position,
05:35
we found that they weren't much better at this.
05:38
59 percent of them made arguments
05:39
in terms of one of the more conservative moral values,
05:42
and just eight percent invoked a liberal moral value,
05:44
even though they were supposed to be targeting liberals for persuasion.
05:49
Now, you can see right away why we're in trouble here. Right?
05:54
People's moral values, they're their most deeply held beliefs.
05:57
People are willing to fight and die for their values.
06:01
Why are they going to give that up just to agree with you
06:04
on something that they don't particularly want to agree with you on anyway?
06:07
If that persuasive appeal that you're making to your Republican uncle
06:11
means that he doesn't just have to change his view,
06:13
he's got to change his underlying values, too,
06:15
that's not going to go very far.
06:17
So what would work better?
06:20
Well, we believe it's a technique that we call moral reframing,
06:24
and we've studied it in a series of experiments.
06:26
In one of these experiments,
06:28
we recruited liberals and conservatives to a study
06:31
where they read one of three essays
06:33
before having their environmental attitudes surveyed.
06:37
And the first of these essays
06:38
was a relatively conventional pro-environmental essay
06:42
that invoked the liberal values of care and protection from harm.
06:46
It said things like, "In many important ways
06:48
we are causing real harm to the places we live in,"
06:51
and, "It is essential that we take steps now
06:54
to prevent further destruction from being done to our Earth."
06:58
Another group of participants
07:00
were assigned to read a really different essay
07:02
that was designed to tap into the conservative value of moral purity.
07:08
It was a pro-environmental essay as well,
07:10
and it said things like,
07:11
"Keeping our forests, drinking water, and skies pure is of vital importance."
07:16
"We should regard the pollution
07:18
of the places we live in to be disgusting."
07:20
And, "Reducing pollution can help us preserve
07:23
what is pure and beautiful about the places we live."
07:27
And then we had a third group
07:29
that were assigned to read just a nonpolitical essay.
07:31
It was just a comparison group so we could get a baseline.
07:34
And what we found when we surveyed people
07:36
about their environmental attitudes afterwards,
07:38
we found that liberals, it didn't matter what essay they read.
07:41
They tended to have highly pro-environmental attitudes regardless.
07:44
Liberals are on board for environmental protection.
07:47
Conservatives, however,
07:48
were significantly more supportive of progressive environmental policies
07:52
and environmental protection
07:54
if they had read the moral purity essay
07:56
than if they read one of the other two essays.
07:59
We even found that conservatives who read the moral purity essay
08:03
were significantly more likely to say that they believed in global warming
08:06
and were concerned about global warming,
08:08
even though this essay didn't even mention global warming.
08:11
That's just a related environmental issue.
08:13
But that's how robust this moral reframing effect was.
08:17
And we've studied this on a whole slew of different political issues.
08:21
So if you want to move conservatives
08:25
on issues like same-sex marriage or national health insurance,
08:28
it helps to tie these liberal political issues to conservative values
08:31
like patriotism and moral purity.
08:35
And we studied it the other way, too.
08:37
If you want to move liberals to the right on conservative policy issues
08:41
like military spending and making English the official language of the US,
08:46
you're going to be more persuasive
08:47
if you tie those conservative policy issues to liberal moral values
08:51
like equality and fairness.
08:54
All these studies have the same clear message:
08:57
if you want to persuade someone on some policy,
09:00
it's helpful to connect that policy to their underlying moral values.
09:05
And when you say it like that
09:07
it seems really obvious. Right?
09:09
Like, why did we come here tonight?
09:10
Why --
09:12
(Laughter)
09:13
It's incredibly intuitive.
09:17
And even though it is, it's something we really struggle to do.
09:20
You know, it turns out that when we go to persuade somebody on a political issue,
09:24
we talk like we're speaking into a mirror.
09:27
We don't persuade so much as we rehearse our own reasons
09:31
for why we believe some sort of political position.
09:35
We kept saying when we were designing these reframed moral arguments,
09:39
"Empathy and respect, empathy and respect."
09:42
If you can tap into that,
09:44
you can connect
09:46
and you might be able to persuade somebody in this country.
09:49
So thinking again
09:51
about what movie we're in,
09:55
maybe I got carried away before.
09:56
Maybe it's not a zombie apocalypse movie.
09:59
Maybe instead it's a buddy cop movie.
10:01
(Laughter)
10:03
Just roll with it, just go with it please.
10:05
(Laughter)
10:08
You know the kind: there's a white cop and a black cop,
10:11
or maybe a messy cop and an organized cop.
10:13
Whatever it is, they don't get along
10:15
because of this difference.
10:17
But in the end, when they have to come together and they cooperate,
10:20
the solidarity that they feel,
10:22
it's greater because of that gulf that they had to cross. Right?
10:27
And remember that in these movies,
10:29
it's usually worst in the second act
10:32
when our leads are further apart than ever before.
10:35
And so maybe that's where we are in this country,
10:37
late in the second act of a buddy cop movie --
10:39
(Laughter)
10:42
torn apart but about to come back together.
10:47
It sounds good,
10:48
but if we want it to happen,
10:50
I think the responsibility is going to start with us.
10:54
So this is my call to you:
10:57
let's put this country back together.
11:00
Let's do it despite the politicians
11:03
and the media and Facebook and Twitter
11:06
and Congressional redistricting
11:08
and all of it, all the things that divide us.
11:12
Let's do it because it's right.
11:15
And let's do it because this hate and contempt
11:20
that flows through all of us every day
11:23
makes us ugly and it corrupts us,
11:26
and it threatens the very fabric of our society.
11:31
We owe it to one another and our country
11:34
to reach out and try to connect.
11:37
We can't afford to hate them any longer,
11:42
and we can't afford to let them hate us either.
11:45
Empathy and respect.
11:47
Empathy and respect.
11:49
If you think about it, it's the very least that we owe our fellow citizens.
11:54
Thank you.
11:55
(Applause)
이 μ›Ήμ‚¬μ΄νŠΈ 정보

이 μ‚¬μ΄νŠΈλŠ” μ˜μ–΄ ν•™μŠ΅μ— μœ μš©ν•œ YouTube λ™μ˜μƒμ„ μ†Œκ°œν•©λ‹ˆλ‹€. μ „ 세계 졜고의 μ„ μƒλ‹˜λ“€μ΄ κ°€λ₯΄μΉ˜λŠ” μ˜μ–΄ μˆ˜μ—…μ„ 보게 될 κ²ƒμž…λ‹ˆλ‹€. 각 λ™μ˜μƒ νŽ˜μ΄μ§€μ— ν‘œμ‹œλ˜λŠ” μ˜μ–΄ μžλ§‰μ„ 더블 ν΄λ¦­ν•˜λ©΄ κ·Έκ³³μ—μ„œ λ™μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€. λΉ„λ””μ˜€ μž¬μƒμ— 맞좰 μžλ§‰μ΄ μŠ€ν¬λ‘€λ©λ‹ˆλ‹€. μ˜κ²¬μ΄λ‚˜ μš”μ²­μ΄ μžˆλŠ” 경우 이 문의 양식을 μ‚¬μš©ν•˜μ—¬ λ¬Έμ˜ν•˜μ‹­μ‹œμ˜€.

https://forms.gle/WvT1wiN1qDtmnspy7