Susan Blackmore: Memes and "temes"

157,186 views · 2008-06-04

TED



Translated by: Jongwook Kim · Reviewed by: ChungKyu Park
00:18
Cultural evolution is a dangerous child for any species to let loose on its planet. By the time you realize what's happening, the child is a toddler, up and causing havoc, and it's too late to put it back. We humans are Earth's Pandoran species. We're the ones who let the second replicator out of its box, and we can't push it back in. We're seeing the consequences all around us.

00:48
Now that, I suggest, is the view that comes out of taking memetics seriously. And it gives us a new way of thinking about not only what's going on on our planet, but what might be going on elsewhere in the cosmos. So first of all, I'd like to say something about memetics and the theory of memes, and secondly, how this might answer questions about who's out there, if indeed anyone is.
01:14
So, memetics: memetics is founded on the principle of Universal Darwinism. Darwin had this amazing idea. Indeed, some people say it's the best idea anybody ever had. Isn't that a wonderful thought, that there could be such a thing as a best idea anybody ever had? Do you think there could?

01:35
Audience: No.

(Laughter)

01:37
Susan Blackmore: Someone says no, very loudly, from over there. Well, I say yes, and if there is, I give the prize to Darwin. Why? Because the idea was so simple, and yet it explains all design in the universe. I would say not just biological design, but all of the design that we think of as human design. It's all just the same thing happening.
02:00
What did Darwin say? I know you know the idea, natural selection, but let me just paraphrase "The Origin of Species," 1859, in a few sentences. What Darwin said was something like this: if you have creatures that vary, and that can't be doubted -- I've been to the Galapagos, and I've measured the size of the beaks and the size of the turtle shells and so on, and so on. And 100 pages later.

(Laughter)

02:27
And if there is a struggle for life, such that nearly all of these creatures die -- and this can't be doubted, I've read Malthus and I've calculated how long it would take for elephants to cover the whole world if they bred unrestricted, and so on and so on. And another 100 pages later.

02:46
And if the very few that survive pass on to their offspring whatever it was that helped them survive, then those offspring must be better adapted to the circumstances in which all this happened than their parents were.
03:01
You see the idea? If, if, if, then. He had no concept of the idea of an algorithm, but that's what he described in that book, and this is what we now know as the evolutionary algorithm. The principle is you just need those three things -- variation, selection and heredity. And as Dan Dennett puts it, if you have those, then you must get evolution. Or design out of chaos, without the aid of mind.
03:31
There's one word I love on that slide. What do you think my favorite word is?

03:35
Audience: Chaos.

03:36
SB: Chaos? No. What? Mind? No.

03:39
Audience: Without.

03:40
SB: No, not without.

(Laughter)

03:42
You try them all in order: Mmm...?

03:44
Audience: Must.

03:45
SB: Must, at must. Must, must. This is what makes it so amazing. You don't need a designer, or a plan, or foresight, or anything else. If there's something that is copied with variation and it's selected, then you must get design appearing out of nowhere. You can't stop it. Must is my favorite word there.
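The "if, if, if, then" recipe -- variation, selection, heredity -- can be sketched in a few lines of code. This is a toy illustration only, not anything from the talk; the target string, population size, and mutation rate are arbitrary assumptions:

```python
import random

random.seed(0)

TARGET = "MUST"  # hypothetical target; any string over the alphabet works
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"

def fitness(s):
    # selection criterion: how many characters already match the target
    return sum(a == b for a, b in zip(s, TARGET))

def mutate(s, rate=0.1):
    # variation: copying is imperfect, so each character may change
    return "".join(random.choice(ALPHABET) if random.random() < rate else c
                   for c in s)

# start from pure noise: 50 random strings
population = ["".join(random.choice(ALPHABET) for _ in TARGET) for _ in range(50)]
best = max(population, key=fitness)
generations = 0
while fitness(best) < len(TARGET):
    # heredity: each new generation consists of (mutated) copies of the fittest
    parents = sorted(population, key=fitness, reverse=True)[:10]
    population = [mutate(random.choice(parents)) for _ in range(50)]
    best = max(population, key=fitness)
    generations += 1

print(f"reached {best!r} after {generations} generations")
```

No designer, plan, or foresight appears anywhere in the loop, yet the target "design" must eventually emerge -- exactly the sense of "must" the slide is pointing at.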
04:11
Now, what's this to do with memes? Well, the principle here applies to anything that is copied with variation and selection. We're so used to thinking in terms of biology, we think about genes this way. Darwin didn't, of course; he didn't know about genes. He talked mostly about animals and plants, but also about languages evolving and becoming extinct. But the principle of Universal Darwinism is that any information that is varied and selected will produce design.

04:40
And this is what Richard Dawkins was on about in his 1976 bestseller, "The Selfish Gene." The information that is copied, he called the replicator. It selfishly copies. Not meaning it kind of sits around inside cells going, "I want to get copied." But that it will get copied if it can, regardless of the consequences. It doesn't care about the consequences because it can't, because it's just information being copied.
05:06
And he wanted to get away from everybody thinking all the time about genes, and so he said, "Is there another replicator out there on the planet?" Ah, yes, there is. Look around you -- here will do, in this room. All around us, still clumsily drifting about in its primeval soup of culture, is another replicator. Information that we copy from person to person, by imitation, by language, by talking, by telling stories, by wearing clothes, by doing things. This is information copied with variation and selection. This is a design process going on.

05:42
He wanted a name for the new replicator. So, he took the Greek word "mimeme," which means that which is imitated. Remember that, that's the core definition: that which is imitated. And abbreviated it to meme, just because it sounds good and made a good meme, an effective spreading meme. So that's how the idea came about.
06:03
It's important to stick with that definition. The whole science of memetics is much maligned, much misunderstood, much feared. But a lot of these problems can be avoided by remembering the definition. A meme is not equivalent to an idea. It's not an idea. It's not equivalent to anything else, really. Stick with the definition. It's that which is imitated, or information which is copied from person to person.

06:30
So, let's see some memes.
06:31
Well, you sir, you've got those glasses hung around your neck in that particularly fetching way. I wonder whether you invented that idea for yourself, or copied it from someone else? If you copied it from someone else, it's a meme. And what about, oh, I can't see any interesting memes here. All right everyone, who's got some interesting memes for me?

06:49
Oh, well, your earrings, I don't suppose you invented the idea of earrings. You probably went out and bought them. There are plenty more in the shops. That's something that's passed on from person to person. All the stories that we're telling -- well, of course, TED is a great meme-fest, masses of memes.
07:06
The way to think about memes, though, is to think, why do they spread? They're selfish information, they will get copied, if they can. But some of them will be copied because they're good, or true, or useful, or beautiful. Some of them will be copied even though they're not. Some, it's quite hard to tell why.

07:24
There's one particular curious meme which I rather enjoy.
07:27
And I'm glad to say, as I expected, I found it when I came here, and I'm sure all of you found it, too. You go to your nice, posh, international hotel somewhere, and you come in and you put down your clothes and you go to the bathroom, and what do you see?

07:41
Audience: Bathroom soap.

07:42
SB: Pardon?

07:43
Audience: Soap.

07:44
SB: Soap, yeah. What else do you see?

07:46
Audience: (Inaudible)

07:47
SB: Mmm mmm.

07:48
Audience: Sink, toilet!

07:49
SB: Sink, toilet, yes, these are all memes, they're all memes, but they're sort of useful ones, and then there's this one.

(Laughter)

07:58
What is this one doing?

(Laughter)

08:01
This has spread all over the world. It's not surprising that you all found it when you arrived in your bathrooms here. But I took this photograph in a toilet at the back of a tent in the eco-camp in the jungle in Assam.

(Laughter)

08:16
Who folded that thing up there, and why?

(Laughter)

08:20
Some people get carried away.

(Laughter)

08:26
Other people are just lazy and make mistakes. Some hotels exploit the opportunity to put even more memes with a little sticker.

(Laughter)

08:35
What is this all about? I suppose it's there to tell you that somebody's cleaned the place, and it's all lovely. And you know, actually, all it tells you is that another person has potentially spread germs from place to place.

(Laughter)
08:48
So, think of it this way. Imagine a world full of brains and far more memes than can possibly find homes. The memes are all trying to get copied -- trying, in inverted commas -- i.e., that's the shorthand for, if they can get copied, they will. They're using you and me as their propagating, copying machinery, and we are the meme machines.
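The picture of more memes than available brains can be made concrete with a toy simulation. Everything here -- the meme names, the "stickiness" scores, the number of brain slots -- is an invented assumption for illustration, not anything from the talk:

```python
import random

random.seed(1)

# hypothetical memes with a copying "stickiness" that need not track truth or usefulness
MEMES = {"useful": 0.8, "true": 0.7, "catchy-but-false": 0.75, "dull": 0.1}
BRAIN_SLOTS = 100  # far fewer homes than copying attempts

# one copy of each meme to start; the remaining slots are empty brains
hosts = list(MEMES) + [None] * (BRAIN_SLOTS - len(MEMES))

for step in range(200):
    target = random.randrange(BRAIN_SLOTS)
    donor = random.choice([m for m in hosts if m is not None])
    # a meme gets copied whenever it can, regardless of the consequences
    if random.random() < MEMES[donor]:
        hosts[target] = donor

counts = {m: hosts.count(m) for m in MEMES}
print(counts)
```

Sticky memes crowd out dull ones even when stickiness has nothing to do with being good, true, useful, or beautiful -- the selfish-information point the talk is making.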
09:13
Now, why is this important? Why is this useful, or what does it tell us? It gives us a completely new view of human origins and what it means to be human, all conventional theories of cultural evolution, of the origin of humans, and what makes us so different from other species. All other theories explaining the big brain, and language, and tool use and all these things that make us unique, are based upon genes. Language must have been useful for the genes. Tool use must have enhanced our survival, mating and so on. It always comes back, as Richard Dawkins complained all that long time ago, it always comes back to genes. The point of memetics is to say, "Oh no, it doesn't."
09:55
There are two replicators now on this planet. From the moment that our ancestors, perhaps two and a half million years ago or so, began imitating, there was a new copying process. Copying with variation and selection. A new replicator was let loose, and it could never be -- right from the start -- it could never be that human beings who let loose this new creature, could just copy the useful, beautiful, true things, and not copy the other things.
10:25
While their brains were having an advantage from being able to copy -- lighting fires, keeping fires going, new techniques of hunting, these kinds of things -- inevitably they were also copying putting feathers in their hair, or wearing strange clothes, or painting their faces, or whatever.

10:41
So, you get an arms race between the genes, which are trying to get the humans to have small economical brains and not waste their time copying all this stuff, and the memes themselves, like the sounds that people made and copied -- in other words, what turned out to be language -- competing to get the brains to get bigger and bigger. So, the big brain, on this theory, is driven by the memes.
11:05
This is why, in "The Meme Machine," I called it memetic drive. As the memes evolve, as they inevitably must, they drive a bigger brain that is better at copying the memes that are doing the driving. This is why we've ended up with such peculiar brains, that we like religion, and music, and art. Language is a parasite that we've adapted to, not something that was there originally for our genes, on this view. And like most parasites, it can begin dangerous, but then it coevolves and adapts, and we end up with a symbiotic relationship with this new parasite. And so, from our perspective, we don't realize that that's how it began.
11:46
So, this is a view of what humans are. All other species on this planet are gene machines only, they don't imitate at all well, hardly at all. We alone are gene machines and meme machines as well. The memes took a gene machine and turned it into a meme machine. But that's not all.
12:06
We have a new kind of memes now. I've been wondering for a long time, since I've been thinking about memes a lot, is there a difference between the memes that we copy -- the words we speak to each other, the gestures we copy, the human things -- and all these technological things around us? I have always, until now, called them all memes, but I do honestly think now we need a new word for technological memes. Let's call them techno-memes or temes. Because the processes are getting different.
12:37
We began, perhaps 5,000 years ago, with writing.
271
757330
3000
์šฐ๋ฆฌ๋Š” 5,000๋…„ ์ „์— ์“ฐ๊ธฐ๋ฅผ ์‹œ์ž‘ํ–ˆ์Šต๋‹ˆ๋‹ค.
12:40
We put the storage of memes out there on a clay tablet,
272
760330
7000
์šฐ๋ฆฌ๋Š” ์ ํ† ํŒ ์œ„์— ๋ฐˆ์˜ ์ €์žฅ์†Œ๋ฅผ ๋งŒ๋“ค์—ˆ๋Š”๋ฐ,
12:48
but in order to get true temes and true teme machines,
273
768330
2000
์ง„์ •ํ•œ ํŒ€๊ณผ ํŒ€ ๊ธฐ๊ณ„๋ฅผ ์–ป๊ธฐ ์œ„ํ•ด์„œ๋Š”
12:50
you need to get the variation, the selection and the copying,
274
770330
3000
๋ชจ๋“  ๊ฒƒ์ด ์ธ๊ฐ„ ๋ฐ–์—์„œ ์ด๋ฃจ์–ด์ง€๋Š”
12:53
all done outside of humans.
275
773330
2000
๋‹ค์–‘์„ฑ, ์„ ํƒ, ๋ณต์ œ๊ฐ€ ํ•„์š”ํ•ฉ๋‹ˆ๋‹ค.
12:55
And we're getting there.
276
775330
2000
์šฐ๋ฆฌ๋Š” ๊ทธ๊ณณ์œผ๋กœ ๊ฐ€๊ณ  ์žˆ์Šต๋‹ˆ๋‹ค.
12:57
We're at this extraordinary point where we're nearly there,
277
777330
2000
์šฐ๋ฆฌ๋Š” ์ด๋Ÿฐ ๊ธฐ๊ณ„๊ฐ€ ์กด์žฌํ•˜๋Š” ๊ณณ์— ๊ฑฐ์˜ ๋„๋‹ฌํ•œ,
12:59
that there are machines like that.
278
779330
2000
ํŠน์ดํ•œ ์‹œ์ ์— ์™€ ์žˆ์Šต๋‹ˆ๋‹ค.
13:01
And indeed, in the short time I've already been at TED,
279
781330
2000
๊ทธ๋ฆฌ๊ณ  ์ฐธ์œผ๋กœ, ์ œ๊ฐ€ ์งง์€ ์‹œ๊ฐ„ ๋™์•ˆ TED์— ์ฐธ์—ฌํ•ด๋ณด๋‹ˆ,
13:03
I see we're even closer than I thought we were before.
280
783330
2000
์ œ ์ƒ๊ฐ๋ณด๋‹ค ํ›จ์”ฌ ๊ฐ€๊นŒ์›Œ์กŒ๋‹ค๋Š” ๊ฑธ ์•Œ๊ฒŒ ๋์Šต๋‹ˆ๋‹ค.
13:05
So actually, now the temes are forcing our brains
281
785330
6000
๊ทธ๋ž˜์„œ ์‚ฌ์‹ค์ƒ, ์ด์ œ ํŒ€์€ ์šฐ๋ฆฌ ๋‡Œ๊ฐ€ ๊ฑฐ์˜ ํŒ€ ๊ธฐ๊ณ„์ฒ˜๋Ÿผ
13:11
to become more like teme machines.
282
791330
2000
๋˜๋„๋ก ๋ชฐ์•„๊ฐ€๊ณ  ์žˆ์Šต๋‹ˆ๋‹ค.
13:13
Our children are growing up very quickly learning to read,
283
793330
3000
์šฐ๋ฆฌ ์•„์ด๋“ค์€ ์ž๋ผ๋ฉด์„œ ๋งค์šฐ ๋นจ๋ฆฌ, ์ฝ๋Š” ๊ฒƒ๊ณผ
13:16
learning to use machinery.
284
796330
2000
๊ธฐ๊ณ„ ์‚ฌ์šฉ๋ฒ•์„ ๋ฐฐ์šฐ๊ฒŒ ๋ฉ๋‹ˆ๋‹ค.
13:18
We're going to have all kinds of implants,
285
798330
1000
์šฐ๋ฆฌ๋ฅผ ํ•ญ์ƒ ๊ฐ•์ œ๋กœ ๊นจ์–ด ์žˆ๊ฒŒ ๋งŒ๋“ค
13:19
drugs that force us to stay awake all the time.
286
799330
3000
์˜จ๊ฐ– ์ด์‹๋ฌผ๊ณผ ์•ฝ๋ฌผ์ด ์ƒ๊ธธ ๊ฒƒ์ž…๋‹ˆ๋‹ค.
13:22
We'll think we're choosing these things,
287
802330
2000
์šฐ๋ฆฌ๋Š” ์šฐ๋ฆฌ๊ฐ€ ์ด๊ฒƒ๋“ค์„ ์„ ํƒํ•œ๋‹ค๊ณ  ์ƒ๊ฐํ•˜์ง€๋งŒ,
13:24
but the temes are making us do it.
288
804330
3000
ํŒ€์ด ์šฐ๋ฆฌ๊ฐ€ ๊ทธ๋ ‡๊ฒŒ ํ•˜๋„๋ก ๋งŒ๋“œ๋Š” ๊ฒƒ์ž…๋‹ˆ๋‹ค.
13:28
So, we're at this cusp now
289
808330
1000
๊ทธ๋ž˜์„œ ์šฐ๋ฆฌ๋Š” ์ด์ œ
13:29
of having a third replicator on our planet.
290
809330
4000
์ง€๊ตฌ์ƒ์—์„œ ์„ธ๋ฒˆ์งธ ๋ณต์ œ์ž๋ฅผ ๊ฐ–๋Š” ๋ฐ”๋กœ ๊ทธ ์‹œ์ž‘์ ์— ์žˆ๋Š” ๊ฒƒ์ž…๋‹ˆ๋‹ค.
13:34
Now, what about what else is going on out there in the universe?
291
814330
5000
๊ทธ๋ ‡๋‹ค๋ฉด ์ € ๋ฐ”๊นฅ ์šฐ์ฃผ ์–ด๋”˜๊ฐ€์—์„œ๋Š” ๋ฌด์—‡์ด ์ผ์–ด๋‚˜๊ณ  ์žˆ์„๊นŒ์š”?
13:39
Is there anyone else out there?
292
819330
2000
๋ˆ„๊ตฐ๊ฐ€๊ฐ€ ์ €๊ณณ์— ์žˆ์„๊นŒ์š”?
13:41
People have been asking this question for a long time.
293
821330
3000
์‚ฌ๋žŒ๋“ค์€ ์˜ค๋žซ๋™์•ˆ ์ด ์งˆ๋ฌธ์„ ํ•ด์˜ค๊ณ  ์žˆ์Šต๋‹ˆ๋‹ค.
13:44
We've been asking it here at TED already.
294
824330
2000
์ด๊ณณ TED์—์„œ๋„ ์šฐ๋ฆฌ๋Š” ์ด๋ฏธ ์ด ์งˆ๋ฌธ์„ ํ•ด์™”์ฃ .
13:46
In 1961, Frank Drake made his famous equation,
295
826330
4000
1961๋…„ ํ”„๋žญํฌ ๋“œ๋ ˆ์ดํฌ๋Š” ์œ ๋ช…ํ•œ ๋ฐฉ์ •์‹์„ ๋งŒ๋“ค์—ˆ์ง€๋งŒ
13:50
but I think he concentrated on the wrong things.
296
830330
2000
์ €๋Š” ๊ทธ๊ฐ€ ์—‰๋šฑํ•œ ๊ฒƒ๋“ค์— ์ง‘์ค‘ํ–ˆ๋‹ค๊ณ  ์ƒ๊ฐํ•ฉ๋‹ˆ๋‹ค.
13:52
It's been very productive, that equation.
297
832330
2000
์ด ๊ณต์‹์€ ๋งค์šฐ ์ƒ์‚ฐ์ ์ด์˜€์Šต๋‹ˆ๋‹ค.
13:54
He wanted to estimate N,
298
834330
2000
๊ทธ๋Š” ์šฐ๋ฆฌ ์€ํ•˜๊ณ„์— ์กด์žฌํ•˜๋Š”,
13:56
the number of communicative civilizations out there in our galaxy,
299
836330
4000
์˜์‚ฌ์†Œํ†ตํ•  ์ˆ˜ ์žˆ๋Š” ๋ฌธ๋ช…์˜ ์ˆ˜ N์„ ์ถ”์‚ฐํ•˜๊ธธ ์›ํ–ˆ์Šต๋‹ˆ๋‹ค.
14:00
and he included in there the rate of star formation,
300
840330
4000
๊ทธ๋ฆฌ๊ณ  ๊ทธ๋Š” ์ด ๊ณต์‹์— ๋ณ„์˜ ์ƒ์„ฑ๋ฅ ๊ณผ ํ–‰์„ฑ์˜ ๋น„์œจ์„
14:04
the rate of planets, but crucially, intelligence.
301
844330
4000
ํฌํ•จ์‹œ์ผฐ๊ณ , ๊ฒฐ์ •์ ์œผ๋กœ ์ง€๋Šฅ์„ ํฌํ•จํ–ˆ์Šต๋‹ˆ๋‹ค.
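The talk names only a few of the factors Drake included; for reference, the conventional form of his 1961 equation (the factor labels below are the standard textbook ones, not spelled out in the talk) is:

```latex
% Drake equation (1961), conventional form:
% N = number of communicative civilizations in our galaxy
N = R_{*} \cdot f_{p} \cdot n_{e} \cdot f_{l} \cdot f_{i} \cdot f_{c} \cdot L
% R_* : rate of star formation in the galaxy
% f_p : fraction of stars with planets
% n_e : habitable planets per star with planets
% f_l : fraction of those on which life appears
% f_i : fraction of those developing intelligence
% f_c : fraction of those that become communicative
% L   : lifetime of a communicative civilization
```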
14:08
I think that's the wrong way to think about it.
302
848330
4000
์ €๋Š” ์ด๊ฒƒ์ด ์ž˜๋ชป๋œ ์ƒ๊ฐ์ด๋ผ๊ณ  ์ƒ๊ฐํ•ฉ๋‹ˆ๋‹ค.
14:12
Intelligence appears all over the place, in all kinds of guises.
303
852330
3000
์ง€๋Šฅ์€ ๋ชจ๋“  ๊ณณ์—์„œ ์–ด๋–ค ๋ชจ์Šต์œผ๋กœ๋“  ๋‚˜ํƒ€๋‚ฉ๋‹ˆ๋‹ค.
14:15
Human intelligence is only one kind of a thing.
304
855330
2000
์ธ๊ฐ„ ์ง€๋Šฅ์€ ๋‹จ์ง€ ํ•œ ์ข…๋ฅ˜์ผ ๋ฟ์ด์ฃ .
14:17
But what's really important is the replicators you have
305
857330
3000
๊ทธ๋Ÿฌ๋‚˜ ์ •๋ง ์ค‘์š”ํ•œ ๊ฒƒ์€ ๋‹น์‹ ์ด ๊ฐ€์ง„ ๋ณต์ œ์ž์™€
14:20
and the levels of replicators, one feeding on the one before.
306
860330
4000
๋ณต์ œ์ž์˜ ์ˆ˜์ค€, ๊ทธ๋ฆฌ๊ณ  ๊ทธ์ „์— ๊ทธ ๋ณต์ œ์ž๋ฅผ ํ‚ค์šด ๋ณต์ œ์ž์ž…๋‹ˆ๋‹ค.
14:24
So, I would suggest that we don't think intelligence,
307
864330
5000
๊ทธ๋ž˜์„œ ์ €๋Š” ์šฐ๋ฆฌ๊ฐ€ ์ง€๋Šฅ์ด ์•„๋‹ˆ๋ผ ๋ณต์ œ์ž๋ฅผ ์ƒ๊ฐํ•ด์•ผ ํ•œ๋‹ค๊ณ 
14:29
we think replicators.
308
869330
2000
์ œ์•ˆํ•ฉ๋‹ˆ๋‹ค.
14:31
And on that basis, I've suggested a different kind of equation.
309
871330
3000
๊ทธ๋ฆฌ๊ณ  ์ด๋ฅผ ๊ทผ๊ฑฐ๋กœ ์ €๋Š” ๋‹ค๋ฅธ ๊ณต์‹์„ ์ œ์•ˆํ–ˆ์Šต๋‹ˆ๋‹ค.
14:34
A very simple equation.
310
874330
2000
๋งค์šฐ ๋‹จ์ˆœํ•œ ๊ณต์‹์ด์ฃ .
14:36
N, the same thing,
311
876330
2000
์šฐ๋ฆฌ์˜ ์€ํ•˜๊ณ„์—์„œ ๊ธฐ๋Œ€ํ•  ์ˆ˜ ์žˆ๋Š”
14:38
the number of communicative civilizations out there
312
878330
3000
์˜์‚ฌ์†Œํ†ต์ด ๊ฐ€๋Šฅํ•œ ๋ฌธ๋ช…์˜ ์ˆ˜
14:41
[that] we might expect in our galaxy.
313
881330
2000
๋™์ผํ•œ ๊ทธ N์ž…๋‹ˆ๋‹ค.
14:43
Just start with the number of planets there are in our galaxy.
314
883330
4000
์šฐ์„  ์šฐ๋ฆฌ ์€ํ•˜๊ณ„์— ์žˆ๋Š” ํ–‰์„ฑ์˜ ์ˆ˜๋กœ ์‹œ์ž‘ํ•ฉ๋‹ˆ๋‹ค.
14:47
The fraction of those which get a first replicator.
315
887330
4000
์ฒซ๋ฒˆ์งธ ๋ณต์ œ์ž๋ฅผ ๊ฐ€์ง€๊ฒŒ ๋˜๋Š” ์ž‘์€ ๊ฒฝ์šฐ
14:51
The fraction of those that get the second replicator.
316
891330
4000
๋‘๋ฒˆ์งธ ๋ณต์ œ์ž๋ฅผ ๊ฐ€์ง€๊ฒŒ ๋˜๋Š” ์ž‘์€ ๊ฒฝ์šฐ
14:55
The fraction of those that get the third replicator.
317
895330
2000
์„ธ๋ฒˆ์งธ ๋ณต์ œ์ž๋ฅผ ๊ฐ€์ง€๊ฒŒ ๋˜๋Š” ์ž‘์€ ๊ฒฝ์šฐ
14:58
Because it's only the third replicator that's going to reach out --
318
898330
3000
์ด๊ฒƒ์€ ์˜ค์ง ์„ธ๋ฒˆ์งธ ๋ณต์ œ์ž๊ฐ€ ๋ผ์•ผ ๋น„๋กœ์†Œ
15:01
sending information, sending probes, getting out there,
319
901330
3000
์ •๋ณด๋ฅผ ๋ณด๋‚ด๊ณ , ํƒ์‚ฌ์„ ์„ ๋ณด๋‚ด๊ณ , ์ง์ ‘ ๊ฐ€์„œ,
15:04
and communicating with anywhere else.
320
904330
2000
๋‹ค๋ฅธ ๊ณณ๊ณผ ์˜์‚ฌ์†Œํ†ต์„ ํ•˜๊ธฐ ๋•Œ๋ฌธ์ž…๋‹ˆ๋‹ค.
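Blackmore's replicator-based alternative, as described in the lines above, can be sketched as follows (the symbols are editorial labels for illustration, not hers):

```latex
% Replicator-based version of the estimate described above:
N = N_{\text{planets}} \cdot f_{1} \cdot f_{2} \cdot f_{3}
% N_planets     : number of planets in our galaxy
% f_1, f_2, f_3 : fractions of those gaining the first,
%                 second, and third replicator, respectively
```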
15:06
OK, so if we take that equation,
321
906330
3000
์ข‹์•„์š”, ์šฐ๋ฆฌ๊ฐ€ ์ด ๊ณต์‹์„ ๋ฐ›์•„๋“ค์ธ๋‹ค๋ฉด
15:09
why haven't we heard from anybody out there?
322
909330
5000
์ € ๋ฐ”๊นฅ์˜ ๊ทธ ๋ˆ„๊ตฌ์—๊ฒŒ์„œ๋„ ์™œ ์•„๋ฌด ์†Œ์‹์„ ๋“ฃ์ง€ ๋ชปํ–ˆ์„๊นŒ์š”?
15:14
Because every step is dangerous.
323
914330
4000
๊ทธ๊ฒƒ์€ ๋ชจ๋“  ๋‹จ๊ณ„๊ฐ€ ์œ„ํ—˜ํ•˜๊ธฐ ๋•Œ๋ฌธ์ž…๋‹ˆ๋‹ค.
15:18
Getting a new replicator is dangerous.
324
918330
3000
์ƒˆ๋กœ์šด ๋ณต์ œ์ž๋ฅผ ๊ฐ–๋Š” ๊ฒƒ์€ ์œ„ํ—˜ํ•ฉ๋‹ˆ๋‹ค.
15:21
You can pull through, we have pulled through,
325
921330
2000
์šฐ๋ฆฌ๋Š” ์• ์จ ๋‚˜์•„๊ฐˆ ์ˆ˜ ์žˆ๊ณ , ๋‚˜์•„๊ฐ€ ์™”์ง€๋งŒ
15:23
but it's dangerous.
326
923330
2000
๊ทธ๊ฒƒ์€ ์œ„ํ—˜ํ•ฉ๋‹ˆ๋‹ค.
15:25
Take the first step, as soon as life appeared on this earth.
327
925330
3000
์ง€๊ตฌ์ƒ์— ์ƒ๋ช…์ฒด๊ฐ€ ๋‚˜ํƒ€๋‚˜์ž๋งˆ์ž ์ฒซ๋ฒˆ์งธ ๋‹จ๊ณ„๊ฐ€ ์‹œ์ž‘๋์Šต๋‹ˆ๋‹ค.
15:28
We may take the Gaian view.
328
928330
2000
๊ฐ€์ด์•„ ์ด๋ก ์„ ์ƒ๊ฐํ•ด ๋ณผ ์ˆ˜๋„ ์žˆ์Šต๋‹ˆ๋‹ค.
15:30
I loved Peter Ward's talk yesterday -- it's not Gaian all the time.
329
930330
3000
์ „ ํ”ผํ„ฐ ์›Œ๋“œ์˜ ์–ด์ œ ๋ฐœํ‘œ๊ฐ€ ์ข‹์•˜๋Š”๋ฐ, ๋ชจ๋‘ ๊ฐ€์ด์•„ ์–˜๊ธฐ๋Š” ์•„๋‹ˆ์ฃ .
15:33
Actually, life forms produce things that kill themselves.
330
933330
3000
์‚ฌ์‹ค, ์ƒ๋ช…์ฒด๋“ค์€ ๊ทธ ์ž์‹ ์„ ์ฃฝ์ด๋Š” ๊ฒƒ์„ ๋งŒ๋“ค์–ด๋ƒ…๋‹ˆ๋‹ค.
15:36
Well, we did pull through on this planet.
331
936330
3000
์ž, ์šฐ๋ฆฌ๋Š” ์ง€๊ตฌ์ƒ์— ์• ์จ ๋‚˜ํƒ€๋‚ฌ์ฃ .
15:39
But then, a long time later, billions of years later,
332
939330
2000
๊ทธ๋Ÿฌ๋‚˜ ์˜ค๋žœ ํ›„์—, ์ˆ˜์‹ญ์–ต ๋…„ ํ›„์—,
15:41
we got the second replicator, the memes.
333
941330
3000
์šฐ๋ฆฌ๋Š” ๋‘๋ฒˆ์งธ ๋ณต์ œ์ž์ธ ๋ฐˆ์„ ๊ฐ–๊ฒŒ ๋์Šต๋‹ˆ๋‹ค.
15:44
That was dangerous, all right.
334
944330
2000
๊ทธ๋ ‡์ฃ . ๊ทธ๊ฑด ์ •๋ง ์œ„ํ—˜ํ–ˆ์Šต๋‹ˆ๋‹ค.
15:46
Think of the big brain.
335
946330
2000
ํฐ ๋‡Œ์— ๋Œ€ํ•ด์„œ ์ƒ๊ฐํ•ด ๋ณด์„ธ์š”.
15:48
How many mothers do we have here?
336
948330
3000
์—ฌ๊ธฐ ์—„๋งˆ๊ฐ€ ๋ช‡ ๋ช…์ด๋‚˜ ์žˆ๋‚˜์š”?
15:51
You know all about big brains.
337
951330
2000
๋‹น์‹ ๋“ค์€ ํฐ ๋‡Œ์— ๋Œ€ํ•ด์„œ ์ž˜ ์•Œ๊ณ  ์žˆ์Šต๋‹ˆ๋‹ค.
15:53
They are dangerous to give birth to,
338
953330
2000
ํฐ ๋‡Œ๋Š” ๋‚ณ๊ธฐ์— ์œ„ํ—˜ํ•˜๊ณ ,
15:55
are agonizing to give birth to.
339
955330
2000
๋‚ณ์„ ๋•Œ ๊ทน์‹ฌํ•œ ๊ณ ํ†ต์ด ๋”ฐ๋ฅด์ฃ .
15:57
(Laughter)
340
957330
1000
(์›ƒ์Œ)
15:59
My cat gave birth to four kittens, purring all the time.
341
959330
2000
์ œ ๊ณ ์–‘์ด๋Š” ๊ทธ๋ฅด๋ ๊ฑฐ๋ฆฌ๋ฉฐ ๋„ค๋งˆ๋ฆฌ๋ฅผ ๋‚ณ์•˜์Šต๋‹ˆ๋‹ค.
16:01
Ah, mm -- slightly different.
342
961330
2000
์•„, ์Œ, ์•ฝ๊ฐ„ ๋‹ค๋ฅด์ฃ .
16:03
(Laughter)
343
963330
2000
(์›ƒ์Œ)
16:05
But not only is it painful, it kills lots of babies,
344
965330
3000
๊ณ ํ†ต์Šค๋Ÿฌ์šธ ๋ฟ๋งŒ ์•„๋‹ˆ๋ผ ๋งŽ์€ ์•„์ด๊ฐ€ ์ฃฝ๊ณ 
16:08
it kills lots of mothers,
345
968330
2000
๋งŽ์€ ์‚ฐ๋ชจ๊ฐ€ ์ฃฝ๋Š”
16:10
and it's very expensive to produce.
346
970330
2000
๋งค์šฐ ๊ฐ’๋น„์‹ผ ํƒ„์ƒ์ž…๋‹ˆ๋‹ค.
16:12
The genes are forced into producing all this myelin,
347
972330
2000
์œ ์ „์ž๋Š” ๋ชจ๋“  ๋ฏธ์—˜๋ฆฐ์„ ์ƒ์‚ฐํ•˜๋„๋ก ๊ฐ•์š”๋ฐ›๊ณ 
16:14
all the fat to myelinate the brain.
348
974330
2000
๋ชจ๋“  ์ง€๋ฐฉ์ด ๋‡Œ๋ฅผ ๋ฏธ์—˜๋ฆฐํ™”ํ•˜๋„๋ก ํ•ฉ๋‹ˆ๋‹ค.
16:16
Do you know, sitting here,
349
976330
2000
์—ฌ๊ธฐ ๊ณ„์‹  ์—ฌ๋Ÿฌ๋ถ„์€
16:18
your brain is using about 20 percent of your body's energy output
350
978330
4000
๋ชธ๋ฌด๊ฒŒ 2ํผ์„ผํŠธ์ธ ๋‡Œ๊ฐ€ ์ „์ฒด ๋ชธ ์—๋„ˆ์ง€์˜ 20ํผ์„ผํŠธ๋ฅผ
16:22
for two percent of your body weight?
351
982330
2000
์‚ฌ์šฉํ•˜๊ณ  ์žˆ๋‹ค๋Š” ์‚ฌ์‹ค์„ ์•Œ๊ณ  ์žˆ๋‚˜์š”?
16:24
It's a really expensive organ to run.
352
984330
2000
์ด๊ฑด ์ •๋ง ์œ ์ง€๋น„๊ฐ€ ๋น„์‹ผ ์žฅ๊ธฐ์ž…๋‹ˆ๋‹ค.
16:26
Why? Because it's producing the memes.
353
986330
2000
์™œ์ฃ ? ๋‡Œ๊ฐ€ ๋ฐˆ์„ ๋งŒ๋“ค์–ด๋‚ด๊ธฐ ๋•Œ๋ฌธ์ž…๋‹ˆ๋‹ค.
16:28
Now, it could have killed us off. It could have killed us off,
354
988330
4000
์ž, ๊ทธ๊ฒƒ์€ ์šฐ๋ฆฌ๋ฅผ ๋ฉธ์ข…์‹œํ‚ฌ ์ˆ˜๋„ ์žˆ์—ˆ์Šต๋‹ˆ๋‹ค. ๋ฉธ์ข…์ด์š”.
16:32
and maybe it nearly did, but you see, we don't know.
355
992330
2000
๊ทธ๋ฆฌ๊ณ  ๊ฑฐ์˜ ๊ทธ๋žฌ์„ ์ˆ˜๋„ ์žˆ์ง€๋งŒ, ์•Œ ์ˆ˜ ์—†๋Š” ์ผ์ด์ฃ .
16:34
But maybe it nearly did.
356
994330
2000
๊ทธ๋Ÿฌ๋‚˜ ์•„๋งˆ ๊ฑฐ์˜ ๊ทธ๋žฌ๊ฒ ์ฃ .
16:36
Has it been tried before?
357
996330
1000
์ด์ „์—๋„ ์ด๋Ÿฐ ์‹œ๋„๊ฐ€ ์žˆ์—ˆ์„๊นŒ์š”?
16:37
What about all those other species?
358
997330
2000
๋‹ค๋ฅธ ๋ชจ๋“  ์ข…๋“ค์€ ์–ด๋• ์„๊นŒ์š”?
16:39
Louise Leakey talked yesterday
359
999330
2000
๋ฃจ์ด์Šค ๋ฆฌํ‚ค๋Š” ์–ด์ œ, ์–ด๋–ป๊ฒŒ ์šฐ๋ฆฌ๋งŒ
16:41
about how we're the only one in this branch left.
360
1001330
3000
์ด์ชฝ ๊ฐ€์ง€์— ๋‚จ๊ฒŒ๋๋Š”์ง€ ์–˜๊ธฐํ–ˆ์Šต๋‹ˆ๋‹ค.
16:44
What happened to the others?
361
1004330
2000
๋‹ค๋ฅธ ์ด๋“ค์—๊ฒ ๋ฌด์Šจ ์ผ์ด ์ผ์–ด๋‚ฌ๋‚˜์š”?
16:46
Could it be that this experiment in imitation,
362
1006330
2000
์ด๊ฒƒ์€ ์ด ๋ชจ๋ฐฉ์˜ ์‹คํ—˜,
16:48
this experiment in a second replicator,
363
1008330
2000
์ด ๋‘๋ฒˆ์งธ ๋ณต์ œ์ž์˜ ์‹คํ—˜์ด
16:50
is dangerous enough to kill people off?
364
1010330
4000
์ธ๋ฅ˜๋ฅผ ๋ฉธ๋ง์‹œํ‚ฌ๋งŒํผ ์œ„ํ—˜ํ•˜๋‹ค๋Š” ์˜๋ฏธ๊ฐ€ ๋  ์ˆ˜ ์žˆ์„๊นŒ์š”?
16:54
Well, we did pull through, and we adapted.
365
1014330
2000
์ž, ์šฐ๋ฆฌ๋Š” ๊ฒฌ๋ŽŒ๋‚ด ์™”๊ณ  ์ ์‘ํ–ˆ์Šต๋‹ˆ๋‹ค.
16:56
But now, we're hitting, as I've just described,
366
1016330
3000
๊ทธ๋Ÿฌ๋‚˜ ์ด์ œ, ์ œ๊ฐ€ ์ด๋ฏธ ์„ค๋ช…ํ•œ ๊ฒƒ์ฒ˜๋Ÿผ,
16:59
we're hitting the third replicator point.
367
1019330
2000
์„ธ๋ฒˆ์งธ ๋ณต์ œ์ž ๋“ฑ์žฅ ์‹œ์ ์ด ๊ฐ€๊นŒ์™€ ์กŒ์Šต๋‹ˆ๋‹ค.
17:01
And this is even more dangerous --
368
1021330
3000
๊ทธ๋ฆฌ๊ณ  ์ด๊ฒƒ์€ ํ›จ์”ฌ ๋” ์œ„ํ—˜ํ•ฉ๋‹ˆ๋‹ค.
17:04
well, it's dangerous again.
369
1024330
2000
์˜ˆ, ๋‹ค์‹œ ์œ„ํ—˜์— ์ฒ˜ํ•œ ๊ฒƒ์ž…๋‹ˆ๋‹ค.
17:06
Why? Because the temes are selfish replicators
370
1026330
4000
์™œ์ฃ ? ๊ทธ๊ฒƒ์€ ํŒ€์ด ์ด๊ธฐ์ ์ธ ๋ณต์ œ์ž์ด๊ณ  ์šฐ๋ฆฌ๋‚˜ ์šฐ์ฃผ
17:10
and they don't care about us, or our planet, or anything else.
371
1030330
3000
๋˜๋Š” ๊ทธ ์–ด๋–ค ๊ฒƒ์— ๋Œ€ํ•ด์„œ๋„ ๊ฑฑ์ •ํ•˜์ง€ ์•Š๊ธฐ ๋•Œ๋ฌธ์ž…๋‹ˆ๋‹ค.
17:13
They're just information, why would they?
372
1033330
3000
๊ทธ๊ฒƒ์€ ๋‹จ์ง€ ์ •๋ณด์ž…๋‹ˆ๋‹ค. ์™œ ๊ทธ๋Ÿด๊นŒ์š”?
17:17
They are using us to suck up the planet's resources
373
1037330
2000
๊ทธ๊ฒƒ์€ ๋” ๋งŽ์€ ์ปดํ“จํ„ฐ๋ฅผ ๋งŒ๋“ค๊ณ , ์ด๊ณณ TED์—์„œ ๋“ฃ๋Š”
17:19
to produce more computers,
374
1039330
2000
๋ชจ๋“  ๋†€๋ผ์šด ๊ฒƒ๋“ค์„ ๋” ๋งŽ์ด ๋งŒ๋“ค์–ด๋‚ด๊ธฐ ์œ„ํ•ด,
17:21
and more of all these amazing things we're hearing about here at TED.
375
1041330
3000
์ง€๊ตฌ์˜ ์ž์›์„ ๋นจ์•„๋“ค์ด๋„๋ก ์šฐ๋ฆฌ๋ฅผ ์ด์šฉํ•˜๊ณ  ์žˆ์Šต๋‹ˆ๋‹ค.
17:24
Don't think, "Oh, we created the Internet for our own benefit."
376
1044330
4000
"์˜ค, ์šฐ๋ฆฌ ์ด์ต์„ ์œ„ํ•ด ์šฐ๋ฆฌ๊ฐ€ ์ธํ„ฐ๋„ท์„ ์ฐฝ์กฐํ–ˆ์–ด."๋ผ๊ณ  ์ƒ๊ฐํ•˜์ง€ ๋งˆ์„ธ์š”.
17:28
That's how it seems to us.
377
1048330
2000
์šฐ๋ฆฌ ๋ˆˆ์— ๊ทธ๋ ‡๊ฒŒ ๋ณด์ผ ๋ฟ์ž…๋‹ˆ๋‹ค.
17:30
Think, temes spreading because they must.
378
1050330
4000
ํŒ€์ด ํผ์ ธ๋‚˜๊ฐ€์•ผ ํ•˜๊ธฐ ๋•Œ๋ฌธ์—, ๊ทธ๋ ‡๊ฒŒ ๋œ ๊ฑฐ๋ผ๊ณ  ์ƒ๊ฐํ•˜์„ธ์š”.
17:34
We are the old machines.
379
1054330
2000
์šฐ๋ฆฌ๋Š” ์˜ค๋ž˜๋œ ๊ธฐ๊ณ„์ž…๋‹ˆ๋‹ค.
17:36
Now, are we going to pull through?
380
1056330
2000
์ด์ œ, ์šฐ๋ฆฌ๊ฐ€ ๊ฒฌ๋ŽŒ๋‚ผ ์ˆ˜ ์žˆ์„๊นŒ์š”?
17:38
What's going to happen?
381
1058330
2000
์–ด๋–ค ์ผ์ด ์ผ์–ด๋‚ ๊นŒ์š”?
17:40
What does it mean to pull through?
382
1060330
2000
๊ฒฌ๋ŽŒ๋‚ธ๋‹ค๋Š” ๊ฒƒ์ด ๋ฌด์—‡์„ ์˜๋ฏธํ• ๊นŒ์š”?
17:42
Well, there are kind of two ways of pulling through.
383
1062330
2000
์ž, ๊ฒฌ๋ŽŒ๋‚ด๋Š” ๊ฒƒ์—๋Š” ๋‘๊ฐ€์ง€ ์ข…๋ฅ˜๊ฐ€ ์žˆ์Šต๋‹ˆ๋‹ค.
17:45
One that is obviously happening all around us now,
384
1065330
2000
์ง€๊ธˆ ์šฐ๋ฆฌ ์ฃผ์œ„์—์„œ ๋ช…๋ฐฑํ•˜๊ฒŒ ์ผ์–ด๋‚˜๊ณ  ์žˆ๋Š”
17:47
is that the temes turn us into teme machines,
385
1067330
4000
์ฒซ๋ฒˆ์งธ๋Š”, ์ด๋Ÿฐ ์ด์‹๋ฌผ๊ณผ ์•ฝ๋ฌผ, ๊ธฐ์ˆ ์—
17:51
with these implants, with the drugs,
386
1071330
2000
์œตํ•ฉ๋˜๊ฒŒ ํ•จ์œผ๋กœ์จ, ํŒ€์ด ์šฐ๋ฆฌ๋ฅผ
17:53
with us merging with the technology.
387
1073330
3000
ํŒ€ ๊ธฐ๊ณ„๋กœ ์ „ํ™˜์‹œํ‚ค๊ณ  ์žˆ๋‹ค๋Š” ๊ฒƒ์ž…๋‹ˆ๋‹ค.
17:56
And why would they do that?
388
1076330
2000
์™œ ํŒ€์ด ์ด๋ ‡๊ฒŒ ํ• ๊นŒ์š”?
17:58
Because we are self-replicating.
389
1078330
2000
์šฐ๋ฆฌ๊ฐ€ ์ž๊ธฐ ๋ณต์ œ์ž์ด๊ธฐ ๋•Œ๋ฌธ์ž…๋‹ˆ๋‹ค.
18:00
We have babies.
390
1080330
2000
์šฐ๋ฆฌ๋Š” ์•„์ด๋“ค์„ ๋‚ณ์Šต๋‹ˆ๋‹ค.
18:02
We make new ones, and so it's convenient to piggyback on us,
391
1082330
3000
์šฐ๋ฆฌ๋Š” ์ƒˆ๋กœ์šด ์ž์†์„ ๋งŒ๋“ค๊ณ , ๊ทธ๋ž˜์„œ ์•„์ง ์ง€๊ตฌ ์ƒ์—
18:05
because we're not yet at the stage on this planet
392
1085330
4000
ํ™•์‹คํ•œ ๋Œ€์•ˆ์ด ์žˆ๋Š” ๋‹จ๊ณ„์— ๋‹ค๋‹ค๋ฅด์ง€ ๋ชปํ–ˆ๊ธฐ ๋•Œ๋ฌธ์—
18:09
where the other option is viable.
393
1089330
2000
์šฐ๋ฆฌ ๋“ฑ์— ์—…ํ˜€๊ฐ€๋Š” ๊ฒƒ์ด ํŽธ๋ฆฌํ•ฉ๋‹ˆ๋‹ค.
18:11
Although it's closer, I heard this morning,
394
1091330
2000
๋‹ค๋งŒ ์˜ค๋Š˜ ์•„์นจ ๋“ค์–ด๋ณด๋‹ˆ, ๊ทธ ์ˆœ๊ฐ„์€
18:13
it's closer than I thought it was.
395
1093330
2000
์ œ๊ฐ€ ์ƒ๊ฐํ–ˆ๋˜ ๊ฒƒ๋ณด๋‹ค ๋” ๊ฐ€๊นŒ์›Œ์กŒ์Šต๋‹ˆ๋‹ค.
18:15
Where the teme machines themselves will replicate themselves.
396
1095330
3000
๋ฐ”๋กœ ํŒ€ ๊ธฐ๊ณ„ ์ž์‹ ์ด ์Šค์Šค๋กœ ๋ณต์ œํ•˜๊ฒŒ ๋  ์ˆœ๊ฐ„์ด ๋ง์ž…๋‹ˆ๋‹ค.
18:18
That way, it wouldn't matter if the planet's climate
397
1098330
4000
๊ทธ๋ ‡๊ฒŒ ๋˜๋ฉด, ํŒ€์€ ์ง€๊ตฌ์˜ ๊ธฐํ›„๊ฐ€ ์™„์ „ํžˆ ๋ถˆ์•ˆ์ •ํ•ด์ง€๊ณ 
18:22
was utterly destabilized,
398
1102330
2000
์ธ๊ฐ„์ด ๋” ์ด์ƒ ์ด๊ณณ์— ์‚ด ์ˆ˜ ์—†์„์ง€๋ผ๋„
18:24
and it was no longer possible for humans to live here.
399
1104330
2000
์ƒ๊ด€์ด ์—†์„ ๊ฒƒ์ž…๋‹ˆ๋‹ค.
18:26
Because those teme machines, they wouldn't need --
400
1106330
2000
์™œ๋ƒํ•˜๋ฉด ์ด๋Ÿฐ ํŒ€ ๋จธ์‹ ์€ ํ๋Š˜ํ๋Š˜ํ•˜๊ณ , ์ถ•์ถ•ํ•˜๊ณ ,
18:28
they're not squishy, wet, oxygen-breathing,
401
1108330
2000
์‚ฐ์†Œ๋ฅผ ํ˜ธํกํ•˜๊ณ , ์˜จ๊ธฐ๋ฅผ ํ•„์š”๋กœ ํ•˜๋Š”
18:30
warmth-requiring creatures.
402
1110330
3000
์ƒ๋ช…์ฒด๊ฐ€ ์•„๋‹ˆ๊ธฐ ๋•Œ๋ฌธ์ž…๋‹ˆ๋‹ค.
18:33
They could carry on without us.
403
1113330
2000
์šฐ๋ฆฌ ์—†์ด๋„ ๊ณ„์†๋  ์ˆ˜ ์žˆ์„ํ…Œ๋‹ˆ๊นŒ์š”.
18:35
So, those are the two possibilities.
404
1115330
3000
์ž, ์ด๋ ‡๊ฒŒ ๋‘๊ฐ€์ง€ ๊ฐ€๋Šฅ์„ฑ์ด ์žˆ์Šต๋‹ˆ๋‹ค.
18:38
The second, I don't think we're that close.
405
1118330
4000
๋‘๋ฒˆ์งธ๋Š”, ์šฐ๋ฆฌ๊ฐ€ ๊ทธ๋ ‡๊ฒŒ ๊ฐ€๊นŒ์ด ์™”๋‹ค๊ณ ๋Š” ์ƒ๊ฐํ•˜์ง€ ์•Š์Šต๋‹ˆ๋‹ค.
18:42
It's coming, but we're not there yet.
406
1122330
2000
๋‹ค๊ฐ€์˜ค๊ณ ๋Š” ์žˆ์œผ๋‚˜ ์•„์ง ๊ทธ๊ณณ์— ๋‹ค๋‹ค๋ฅด์ง€๋Š” ์•Š์•˜์–ด์š”.
18:44
The first, it's coming too.
407
1124330
2000
์ฒซ๋ฒˆ์งธ๋Š”, ๋˜ํ•œ ๋‹ค๊ฐ€์˜ค๊ณ  ์žˆ์Šต๋‹ˆ๋‹ค.
18:46
But the damage that is already being done
408
1126330
3000
๊ทธ๋Ÿฌ๋‚˜ ์ง€๊ตฌ์— ์ด๋ฏธ ๊ฐ€ํ•ด์ง„ ํ”ผํ•ด๋Š”
18:49
to the planet is showing us how dangerous the third point is,
409
1129330
5000
์„ธ๋ฒˆ์งธ ์ง€์ ์ด ์–ผ๋งˆ๋‚˜ ์œ„ํ—˜ํ•œ์ง€๋ฅผ ๋ณด์—ฌ์ฃผ๊ณ  ์žˆ์Šต๋‹ˆ๋‹ค.
18:54
that third danger point, getting a third replicator.
410
1134330
3000
์„ธ๋ฒˆ์งธ ์ง€์ , ์ฆ‰ ์„ธ๋ฒˆ์งธ ๋ณต์ œ์ž๋ฅผ ๊ฐ–๋Š” ๊ฒƒ ๋ง์ด์ฃ .
18:58
And will we get through this third danger point,
411
1138330
2000
๊ทธ๋ฆฌ๊ณ  ์šฐ๋ฆฌ๊ฐ€ ๋‘๋ฒˆ์งธ์™€ ์ฒซ๋ฒˆ์งธ๋ฅผ ํ†ต๊ณผํ•œ ๊ฒƒ์ฒ˜๋Ÿผ
19:00
like we got through the second and like we got through the first?
412
1140330
3000
์ด ์„ธ๋ฒˆ์งธ ์œ„ํ—˜ํ•œ ์ง€์ ์„ ํ†ต๊ณผํ•˜๊ฒŒ ๋ ๊นŒ์š”?
19:04
Maybe we will, maybe we won't.
413
1144330
2000
๊ทธ๋Ÿด ์ˆ˜๋„ ์žˆ๊ณ , ์•„๋‹ ์ˆ˜๋„ ์žˆ์Šต๋‹ˆ๋‹ค.
19:06
I have no idea.
414
1146330
3000
์ €๋กœ์„  ์•Œ ์ˆ˜ ์—†์Šต๋‹ˆ๋‹ค.
19:13
(Applause)
415
1153330
10000
(๋ฐ•์ˆ˜)
19:24
Chris Anderson: That was an incredible talk.
416
1164330
2000
ํฌ๋ฆฌ์Šค ์•ค๋”์Šจ: ๊ต‰์žฅํ•œ ์ด์•ผ๊ธฐ์˜€์Šต๋‹ˆ๋‹ค.
19:26
SB: Thank you. I scared myself.
417
1166330
2000
๊ฐ์‚ฌํ•ฉ๋‹ˆ๋‹ค. ์ œ ์ž์‹ ์ด ๋‹ค ๋ฌด์„ญ๊ตฐ์š”.
19:28
CA: (Laughter)
418
1168330
1000
ํฌ๋ฆฌ์Šค ์•ค๋”์Šจ: ์›ƒ์Œ
์ด ์›น์‚ฌ์ดํŠธ ์ •๋ณด

์ด ์‚ฌ์ดํŠธ๋Š” ์˜์–ด ํ•™์Šต์— ์œ ์šฉํ•œ YouTube ๋™์˜์ƒ์„ ์†Œ๊ฐœํ•ฉ๋‹ˆ๋‹ค. ์ „ ์„ธ๊ณ„ ์ตœ๊ณ ์˜ ์„ ์ƒ๋‹˜๋“ค์ด ๊ฐ€๋ฅด์น˜๋Š” ์˜์–ด ์ˆ˜์—…์„ ๋ณด๊ฒŒ ๋  ๊ฒƒ์ž…๋‹ˆ๋‹ค. ๊ฐ ๋™์˜์ƒ ํŽ˜์ด์ง€์— ํ‘œ์‹œ๋˜๋Š” ์˜์–ด ์ž๋ง‰์„ ๋”๋ธ” ํด๋ฆญํ•˜๋ฉด ๊ทธ๊ณณ์—์„œ ๋™์˜์ƒ์ด ์žฌ์ƒ๋ฉ๋‹ˆ๋‹ค. ๋น„๋””์˜ค ์žฌ์ƒ์— ๋งž์ถฐ ์ž๋ง‰์ด ์Šคํฌ๋กค๋ฉ๋‹ˆ๋‹ค. ์˜๊ฒฌ์ด๋‚˜ ์š”์ฒญ์ด ์žˆ๋Š” ๊ฒฝ์šฐ ์ด ๋ฌธ์˜ ์–‘์‹์„ ์‚ฌ์šฉํ•˜์—ฌ ๋ฌธ์˜ํ•˜์‹ญ์‹œ์˜ค.

https://forms.gle/WvT1wiN1qDtmnspy7