How technology changes our sense of right and wrong | Juan Enriquez

112,242 views ・ 2021-02-24

TED


μ•„λž˜ μ˜λ¬Έμžλ§‰μ„ λ”λΈ”ν΄λ¦­ν•˜μ‹œλ©΄ μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€.

Translation: DK Kim   Review: Jihyeon J. Kim
00:13
In an era of extreme polarization,
00:15
it's really dangerous to talk about right and wrong.
00:20
You can be targeted, judged for something you said 10 years ago, 10 months ago,
00:25
10 hours ago, 10 seconds ago.
00:28
And that means that those who think you're wrong
00:30
may burn you at the stake
00:31
or those who are on your side
00:34
that think you're not sufficiently orthodox
00:36
may try and cancel you.
00:37
As you're thinking about right and wrong, I want you to consider three ideas.
00:41
What if right and wrong is something that changes over time?
00:46
What if right and wrong is something that can change because of technology?
00:50
What if technology is moving exponentially?
00:53
So as you're thinking about this concept,
00:56
remember human sacrifice used to be normal and natural.
00:59
It was a way of appeasing the gods.
01:00
Otherwise the rain wouldn't come,
01:02
the sun wouldn't shine.
01:04
Public executions.
01:06
They were common, normal, legal.
01:08
You used to take your kids to watch beheadings in the streets of Paris.
01:12
One of the greatest wrongs, slavery,
01:15
indentured servitude,
01:17
that was something that was practiced for millennia.
01:21
It was practiced across the Incas, the Mayas, the Chinese,
01:25
the Indians in North and South America.
01:29
And as you're thinking about this,
01:31
one question is why did something so wrong last for so long?
01:36
And a second question is: why did it go away?
01:39
And why did it go away in a few short decades in legal terms?
01:43
Certainly there was work
01:45
by extraordinary abolitionists who risked their lives,
01:49
but there may be something else happening alongside these brave abolitionists.
01:54
Consider energy and the industrial revolution.
01:58
A single barrel of oil contains the energy equivalent
02:02
of the work of five to 10 people.
02:05
Add that to machines,
02:07
and suddenly you've got millions of people's equivalent labor
02:12
at your disposal.
02:14
You can quit oppressing people and have a doubling in lifespan
02:19
after a flattened lifespan for millennia.
02:22
The world economy, which had been flat for millennia,
02:25
all of a sudden explodes.
02:27
And you get enormous amounts of wealth and food and other things
02:32
produced by far fewer hands.
02:35
Technology changes the way we interact with each other in fundamental ways.
02:40
New technologies like the machine gun
02:42
completely changed the nature of warfare in World War I.
02:46
It drove people into trenches.
02:48
You were in the British trench, or you were in the German trench.
02:51
Anything in between was no man's land.
02:53
You entered no man's land.
02:55
You were shot. You were killed.
02:57
You tried to leave the trench in the other direction.
03:00
Then your own side would shoot you
03:01
because you were a deserter.
03:03
In a weird way, today's machine guns are narrowcast social media.
03:09
We're shooting at each other.
03:10
We're shooting at those we think are wrong
03:12
with posts, with tweets, with photographs, with accusations, with comments.
03:17
And what it's done is it's created these two trenches
03:20
where you have to be either in this trench or that trench.
03:24
And there's almost no middle ground to meet each other,
03:27
to try and find some sort of a discussion between right and wrong.
03:33
As you drive around the United States, you see signs on lawns.
03:37
Some say, "Black Lives Matter."
03:40
Others say, "We support the police."
03:42
You very rarely see both signs on the same lawn.
03:47
And yet if you ask people,
03:49
most people would probably support Black Lives Matter
03:52
and they would also support their police.
03:54
So as you think of these polarized times,
03:57
as you think of right and wrong,
03:58
you have to understand that right and wrong changes
04:01
and is now changing in exponential ways.
04:04
Take the issue of gay marriage.
04:06
In 1996, two-thirds of the US population was against gay marriage.
04:11
Today two-thirds is for.
04:13
It's almost a 180-degree shift in opinion.
04:18
In part, this is because of protests,
04:20
because people came out of the closet,
04:22
because of AIDS,
04:24
but a great deal of it has to do with social media.
04:27
A great deal of it has to do with people out in our homes,
04:31
in our living rooms, through television, through film, through posts,
04:36
through people being comfortable enough,
04:38
our friends, our neighbors, our family,
04:41
to say, "I'm gay."
04:43
And this has shifted opinion
04:45
even in some of the most conservative of places.
04:48
Take the Pope.
04:50
As Cardinal in 2010,
04:52
he was completely against gay marriage.
04:54
He becomes Pope.
04:55
And three years after the last sentence
04:59
he comes out with "Who am I to judge?"
05:01
And then today, he's in favor of civil unions.
05:05
As you're thinking about technology changing ethics,
05:08
you also have to consider that technology is now moving exponentially.
05:12
As right and wrong changes,
05:14
if you take the position, "I know right.
05:17
And if you completely disagree with me, if you partially disagree with me,
05:21
if you even quibble with me, then you're wrong,"
05:23
then there's no discussion,
05:25
no tolerance, no evolution, and certainly no learning.
05:28
Most of us are not vegetarians yet.
05:31
Then again, we haven't had
05:33
a whole lot of faster, better, cheaper alternatives to meat.
05:37
But now that we're getting synthetic meats,
05:39
as the price drops from 380,000 dollars in 2013
05:44
to 9 dollars today,
05:46
a great big chunk of people
05:47
are going to start becoming vegetarian or quasi-vegetarian.
05:52
And then in retrospect, these pictures
05:55
of walking into the fanciest, most expensive restaurants in town
05:59
and walking past racks of bloody steaks
06:03
is going to look very different in 10 years, in 20 years and 30 years.
06:08
In these polarized times,
06:10
I'd like to revive two words you rarely hear today:
06:14
humility and forgiveness.
06:17
When you judge the past, your ancestors, your forefathers,
06:21
do so with a little bit more humility,
06:23
because perhaps if you'd been educated in that time,
06:26
if you'd lived in that time,
06:28
you would've done a lot of things wrong.
06:30
Not because they're right.
06:32
Not because we don't see they're wrong today,
06:35
but simply because our notions,
06:37
our understanding of right and wrong change across time.
06:41
The second word, forgiveness.
06:44
Forgiveness is incredibly important these days.
06:47
You cannot cancel somebody for saying the wrong word,
06:51
for having done something 10 years ago,
06:53
for having triggered you and not being a hundred percent right.
06:57
To build a community, you have to build it and talk to people
07:01
and learn from people
07:02
who may have very different points of view from yours.
07:05
You have to allow them a space
07:08
instead of creating a no man's land.
07:10
A middle ground, a ??? and a space of empathy.
07:15
This is a time to build community.
07:16
This is not a time to continue ripping nations apart.
07:20
Thank you very much.
이 μ›Ήμ‚¬μ΄νŠΈ 정보

이 μ‚¬μ΄νŠΈλŠ” μ˜μ–΄ ν•™μŠ΅μ— μœ μš©ν•œ YouTube λ™μ˜μƒμ„ μ†Œκ°œν•©λ‹ˆλ‹€. μ „ 세계 졜고의 μ„ μƒλ‹˜λ“€μ΄ κ°€λ₯΄μΉ˜λŠ” μ˜μ–΄ μˆ˜μ—…μ„ 보게 될 κ²ƒμž…λ‹ˆλ‹€. 각 λ™μ˜μƒ νŽ˜μ΄μ§€μ— ν‘œμ‹œλ˜λŠ” μ˜μ–΄ μžλ§‰μ„ 더블 ν΄λ¦­ν•˜λ©΄ κ·Έκ³³μ—μ„œ λ™μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€. λΉ„λ””μ˜€ μž¬μƒμ— 맞좰 μžλ§‰μ΄ μŠ€ν¬λ‘€λ©λ‹ˆλ‹€. μ˜κ²¬μ΄λ‚˜ μš”μ²­μ΄ μžˆλŠ” 경우 이 문의 양식을 μ‚¬μš©ν•˜μ—¬ λ¬Έμ˜ν•˜μ‹­μ‹œμ˜€.

https://forms.gle/WvT1wiN1qDtmnspy7