Damon Horowitz calls for a "moral operating system"

95,064 views ・ 2011-06-06

TED


μ•„λž˜ μ˜λ¬Έμžλ§‰μ„ λ”λΈ”ν΄λ¦­ν•˜μ‹œλ©΄ μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€.

Translation: Jeong-Lan Kinser / Review: Jimin Lee
00:15
Power. That is the word that comes to mind. We're the new technologists. We have a lot of data, so we have a lot of power. How much power do we have?

00:26
Scene from a movie: "Apocalypse Now" -- great movie. We've got to get our hero, Captain Willard, to the mouth of the Nung River so he can go pursue Colonel Kurtz. The way we're going to do this is fly him in and drop him off. So the scene: the sky is filled with this fleet of helicopters carrying him in. And there's this loud, thrilling music in the background, this wild music.

00:45
β™« Dum da ta da dum β™«
β™« Dum da ta da dum β™«
β™« Da ta da da β™«

00:52
That's a lot of power. That's the kind of power I feel in this room. That's the kind of power we have because of all of the data that we have.

01:00
Let's take an example. What can we do with just one person's data? What can we do with that guy's data?

01:11
I can look at your financial records. I can tell if you pay your bills on time. I know if you're good to give a loan to. I can look at your medical records; I can see if your pump is still pumping -- see if you're good to offer insurance to. I can look at your clicking patterns. When you come to my website, I actually know what you're going to do already, because I've seen you visit millions of websites before. And I'm sorry to tell you, you're like a poker player, you have a tell. I can tell with data analysis what you're going to do before you even do it. I know what you like. I know who you are, and that's even before I look at your mail or your phone.

01:45
Those are the kinds of things we can do with the data that we have.

01:50
But I'm not actually here to talk about what we can do. I'm here to talk about what we should do. What's the right thing to do?

02:04
Now I see some puzzled looks, like, "Why are you asking us what's the right thing to do? We're just building this stuff. Somebody else is using it." Fair enough.

02:15
But it brings me back. I think about World War II -- some of our great technologists then, some of our great physicists, studying nuclear fission and fusion -- just nuclear stuff. We gather together these physicists in Los Alamos to see what they'll build. We want the people building the technology thinking about what we should be doing with the technology.

02:41
So what should we be doing with that guy's data? Should we be collecting it, gathering it, so we can make his online experience better? So we can make money? So we can protect ourselves if he was up to no good? Or should we respect his privacy, protect his dignity and leave him alone? Which one is it?

03:05
How should we figure it out? I know: crowdsource. Let's crowdsource this.

03:11
So to get people warmed up, let's start with an easy question -- something I'm sure everybody here has an opinion about: iPhone versus Android. Let's do a show of hands -- iPhone. Uh huh. Android.

03:29
You'd think with a bunch of smart people we wouldn't be such suckers just for the pretty phones.

03:33
(Laughter)

03:35
Next question, a little bit harder. Should we be collecting all of that guy's data to make his experiences better and to protect ourselves in case he's up to no good? Or should we leave him alone? Collect his data. Leave him alone. You're safe. It's fine.

03:58
(Laughter)

04:00
Okay, last question -- harder question -- when trying to evaluate what we should do in this case, should we use a Kantian deontological moral framework, or should we use a Millian consequentialist one? Kant. Mill. Not as many votes.

04:27
(Laughter)

04:30
Yeah, that's a terrifying result.

04:34
Terrifying, because we have stronger opinions about our hand-held devices than about the moral framework we should use to guide our decisions. How do we know what to do with all the power we have if we don't have a moral framework? We know more about mobile operating systems, but what we really need is a moral operating system.

04:58
What's a moral operating system? We all know right and wrong, right? You feel good when you do something right, you feel bad when you do something wrong. Our parents teach us that: praise with the good, scold with the bad. But how do we figure out what's right and wrong?

05:12
And from day to day, we have the techniques that we use. Maybe we just follow our gut. Maybe we take a vote -- we crowdsource. Or maybe we punt -- ask the legal department, see what they say. In other words, it's kind of random, kind of ad hoc, how we figure out what we should do.

05:33
And maybe, if we want to be on surer footing, what we really want is a moral framework that will help guide us there, that will tell us what kinds of things are right and wrong in the first place, and how would we know in a given situation what to do.

05:46
So let's get a moral framework. We're numbers people, living by numbers. How can we use numbers as the basis for a moral framework?

05:56
I know a guy who did exactly that. A brilliant guy -- he's been dead 2,500 years. Plato, that's right. Remember him -- old philosopher? You were sleeping during that class.

06:12
And Plato, he had a lot of the same concerns that we did. He was worried about right and wrong. He wanted to know what is just. But he was worried that all we seem to be doing is trading opinions about this. He says something's just. She says something else is just. It's kind of convincing when he talks and when she talks too. I'm just going back and forth; I'm not getting anywhere. I don't want opinions; I want knowledge. I want to know the truth about justice -- like we have truths in math.

06:38
In math, we know the objective facts. Take a number, any number -- two. Favorite number. I love that number. There are truths about two. If you've got two of something, you add two more, you get four. That's true no matter what thing you're talking about. It's an objective truth about the form of two, the abstract form. When you have two of anything -- two eyes, two ears, two noses, just two protrusions -- those all partake of the form of two. They all participate in the truths that two has. They all have two-ness in them. And therefore, it's not a matter of opinion.

07:13
What if, Plato thought, ethics was like math? What if there were a pure form of justice? What if there are truths about justice, and you could just look around in this world and see which things participated, partook of that form of justice? Then you would know what was really just and what wasn't. It wouldn't be a matter of just opinion or just appearances.

07:37
That's a stunning vision. I mean, think about that. How grand. How ambitious. That's as ambitious as we are. He wants to solve ethics. He wants objective truths. If you think that way, you have a Platonist moral framework.

07:54
If you don't think that way, well, you have a lot of company in the history of Western philosophy, because the tidy idea, you know, people criticized it. Aristotle, in particular, he was not amused. He thought it was impractical. Aristotle said, "We should seek only so much precision in each subject as that subject allows." Aristotle thought ethics wasn't a lot like math. He thought ethics was a matter of making decisions in the here-and-now, using our best judgment to find the right path. If you think that, Plato's not your guy. But don't give up.

08:27
Maybe there's another way that we can use numbers as the basis of our moral framework. How about this: What if in any situation you could just calculate, look at the choices, measure out which one's better and know what to do? That sound familiar? That's a utilitarian moral framework.

08:48
John Stuart Mill was a great advocate of this -- nice guy besides -- and only been dead 200 years. So basis of utilitarianism -- I'm sure you're familiar at least. The three people who voted for Mill before are familiar with this. But here's the way it works. What if morals, what if what makes something moral is just a matter of if it maximizes pleasure and minimizes pain? It does something intrinsic to the act. It's not like its relation to some abstract form. It's just a matter of the consequences. You just look at the consequences and see if, overall, it's for the good or for the worse. That would be simple. Then we know what to do.

09:22
Let's take an example. Suppose I go up and I say, "I'm going to take your phone." Not just because it rang earlier, but I'm going to take it because I made a little calculation. I thought, that guy looks suspicious. And what if he's been sending little messages to Bin Laden's hideout -- or whoever took over after Bin Laden -- and he's actually like a terrorist, a sleeper cell. I'm going to find that out, and when I find that out, I'm going to prevent a huge amount of damage that he could cause. That has a very high utility to prevent that damage. And compared to the little pain that it's going to cause -- because it's going to be embarrassing when I'm looking on his phone and seeing that he has a Farmville problem and that whole bit -- that's overwhelmed by the value of looking at the phone. If you feel that way, that's a utilitarian choice.

10:10
But maybe you don't feel that way either. Maybe you think, it's his phone. It's wrong to take his phone because he's a person and he has rights and he has dignity, and we can't just interfere with that. He has autonomy. It doesn't matter what the calculations are. There are things that are intrinsically wrong -- like lying is wrong, like torturing innocent children is wrong. Kant was very good on this point, and he said it a little better than I'll say it. He said we should use our reason to figure out the rules by which we should guide our conduct, and then it is our duty to follow those rules. It's not a matter of calculation.

10:51
So let's stop. We're right in the thick of it, this philosophical thicket. And this goes on for thousands of years, because these are hard questions, and I've only got 15 minutes. So let's cut to the chase. How should we be making our decisions? Is it Plato, is it Aristotle, is it Kant, is it Mill? What should we be doing? What's the answer? What's the formula that we can use in any situation to determine what we should do, whether we should use that guy's data or not? What's the formula?

11:25
There's not a formula. There's not a simple answer. Ethics is hard. Ethics requires thinking. And that's uncomfortable.

11:40
I know; I spent a lot of my career in artificial intelligence, trying to build machines that could do some of this thinking for us, that could give us answers. But they can't. You can't just take human thinking and put it into a machine. We're the ones who have to do it. Happily, we're not machines, and we can do it. Not only can we think, we must.

12:05
Hannah Arendt said, "The sad truth is that most evil done in this world is not done by people who choose to be evil. It arises from not thinking." That's what she called the "banality of evil."

12:22
And the response to that is that we demand the exercise of thinking from every sane person. So let's do that. Let's think. In fact, let's start right now.

12:34
Every person in this room do this: think of the last time you had a decision to make where you were worried to do the right thing, where you wondered, "What should I be doing?" Bring that to mind, and now reflect on that and say, "How did I come up with that decision? What did I do? Did I follow my gut? Did I have somebody vote on it? Or did I punt to legal?" Or now we have a few more choices. "Did I evaluate what would be the highest pleasure like Mill would? Or like Kant, did I use reason to figure out what was intrinsically right?"

13:06
Think about it. Really bring it to mind. This is important. It is so important we are going to spend 30 seconds of valuable TEDTalk time doing nothing but thinking about this. Are you ready? Go.

13:33
Stop. Good work. What you just did, that's the first step towards taking responsibility for what we should do with all of our power.

13:45
Now the next step -- try this. Go find a friend and explain to them how you made that decision. Not right now. Wait till I finish talking. Do it over lunch. And don't just find another technologist friend; find somebody different than you. Find an artist or a writer -- or, heaven forbid, find a philosopher and talk to them. In fact, find somebody from the humanities. Why? Because they think about problems differently than we do as technologists.

14:13
Just a few days ago, right across the street from here, there were hundreds of people gathered together. It was technologists and humanists at that big BiblioTech Conference. And they gathered together because the technologists wanted to learn what it would be like to think from a humanities perspective. You have someone from Google talking to someone who does comparative literature. You're thinking about the relevance of 17th century French theater -- how does that bear upon venture capital? Well that's interesting. That's a different way of thinking.

14:41
And when you think in that way, you become more sensitive to the human considerations, which are crucial to making ethical decisions.

14:49
So imagine that right now you went and you found your musician friend. And you're telling him what we're talking about, about our whole data revolution and all this -- maybe even hum a few bars of our theme music.

15:00
β™« Dum ta da da dum dum ta da da dum β™«

15:03
Well, your musician friend will stop you and say, "You know, the theme music for your data revolution, that's an opera, that's Wagner. It's based on Norse legend. It's Gods and mythical creatures fighting over magical jewelry."

15:19
That's interesting. Now it's also a beautiful opera, and we're moved by that opera. We're moved because it's about the battle between good and evil, about right and wrong. And we care about right and wrong. We care what happens in that opera. We care what happens in "Apocalypse Now." And we certainly care what happens with our technologies.

15:46
We have so much power today, it is up to us to figure out what to do, and that's the good news. We're the ones writing this opera. This is our movie. We figure out what will happen with this technology. We determine how this will all end.

16:04
Thank you.

16:06
(Applause)
이 μ›Ήμ‚¬μ΄νŠΈ 정보

이 μ‚¬μ΄νŠΈλŠ” μ˜μ–΄ ν•™μŠ΅μ— μœ μš©ν•œ YouTube λ™μ˜μƒμ„ μ†Œκ°œν•©λ‹ˆλ‹€. μ „ 세계 졜고의 μ„ μƒλ‹˜λ“€μ΄ κ°€λ₯΄μΉ˜λŠ” μ˜μ–΄ μˆ˜μ—…μ„ 보게 될 κ²ƒμž…λ‹ˆλ‹€. 각 λ™μ˜μƒ νŽ˜μ΄μ§€μ— ν‘œμ‹œλ˜λŠ” μ˜μ–΄ μžλ§‰μ„ 더블 ν΄λ¦­ν•˜λ©΄ κ·Έκ³³μ—μ„œ λ™μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€. λΉ„λ””μ˜€ μž¬μƒμ— 맞좰 μžλ§‰μ΄ μŠ€ν¬λ‘€λ©λ‹ˆλ‹€. μ˜κ²¬μ΄λ‚˜ μš”μ²­μ΄ μžˆλŠ” 경우 이 문의 양식을 μ‚¬μš©ν•˜μ—¬ λ¬Έμ˜ν•˜μ‹­μ‹œμ˜€.

https://forms.gle/WvT1wiN1qDtmnspy7