Adam Ostrow: After your final status update

62,531 views ・ 2011-08-01

TED


μ•„λž˜ μ˜λ¬Έμžλ§‰μ„ λ”λΈ”ν΄λ¦­ν•˜μ‹œλ©΄ μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€.

λ²ˆμ—­: Kyo young Chu κ²€ν† : Sungwon Ma
00:15 By the end of this year, there'll be nearly a billion people on this planet that actively use social networking sites. The one thing that all of them have in common is that they're going to die. While that might be a somewhat morbid thought, I think it has some really profound implications that are worth exploring.

00:34 What first got me thinking about this was a blog post authored earlier this year by Derek K. Miller, who was a science and technology journalist who died of cancer. And what Miller did was have his family and friends write a post that went out shortly after he died. Here's what he wrote in starting that out. He said, "Here it is. I'm dead, and this is my last post to my blog. In advance, I asked that once my body finally shut down from the punishments of my cancer, then my family and friends publish this prepared message I wrote -- the first part of the process of turning this from an active website to an archive."

01:09 Now, while as a journalist, Miller's archive may have been better written and more carefully curated than most, the fact of the matter is that all of us today are creating an archive that's something completely different than anything that's been created by any previous generation.

01:25 Consider a few stats for a moment. Right now there are 48 hours of video being uploaded to YouTube every single minute. There are 200 million Tweets being posted every day. And the average Facebook user is creating 90 pieces of content each month.
01:43 So when you think about your parents or your grandparents, at best they may have created some photos or home videos, or a diary that lives in a box somewhere. But today we're all creating this incredibly rich digital archive that's going to live in the cloud indefinitely, years after we're gone. And I think that's going to create some incredibly intriguing opportunities for technologists.

02:05 Now to be clear, I'm a journalist and not a technologist, so what I'd like to do briefly is paint a picture of what the present and the future are going to look like.

02:14 Now we're already seeing some services that are designed to let us decide what happens to our online profile and our social media accounts after we die. One of them actually, fittingly enough, found me when I checked into a deli at a restaurant in New York on foursquare.

02:32 (Recording) Adam Ostrow: Hello.

02:34 Death: Adam?

02:36 AO: Yeah.

02:38 Death: Death can catch you anywhere, anytime, even at the Organic.

02:44 AO: Who is this?

02:46 Death: Go to ifidie.net before it's too late.

02:51 (Laughter)

02:53 Adam Ostrow: Kind of creepy, right? So what that service does, quite simply, is let you create a message or a video that can be posted to Facebook after you die. Another service right now is called 1,000 Memories. And what this lets you do is create an online tribute to your loved ones, complete with photos and videos and stories that they can post after you die.

03:14 But what I think comes next is far more interesting. Now a lot of you are probably familiar with Deb Roy who, back in March, demonstrated how he was able to analyze more than 90,000 hours of home video. I think as machines' ability to understand human language and process vast amounts of data continues to improve, it's going to become possible to analyze an entire life's worth of content -- the Tweets, the photos, the videos, the blog posts -- that we're producing in such massive numbers. And I think as that happens, it's going to become possible for our digital personas to continue to interact in the real world long after we're gone, thanks to the vastness of the amount of content we're creating and technology's ability to make sense of it all.

03:55 Now we're already starting to see some experiments here. One service called My Next Tweet analyzes your entire Twitter stream, everything you've posted onto Twitter, to make some predictions as to what you might say next. Well right now, as you can see, the results can be somewhat comical. You can imagine what something like this might look like five, 10 or 20 years from now as our technical capabilities improve.
04:17 Taking it a step further, MIT's Media Lab is working on robots that can interact more like humans. But what if those robots were able to interact based on the unique characteristics of a specific person, based on the hundreds of thousands of pieces of content that person produces in their lifetime?

04:34 Finally, think back to this famous scene from election night 2008 back in the United States, where CNN beamed a live hologram of hip hop artist will.i.am into their studio for an interview with Anderson Cooper. What if we were able to use that same type of technology to beam a representation of our loved ones into our living rooms -- interacting in a very lifelike way based on all the content they created while they were alive? I think that's going to become completely possible as the amount of data we're producing and technology's ability to understand it both expand exponentially.

05:08 Now in closing, I think what we all need to be thinking about is if we want that to become our reality -- and if so, what it means for a definition of life and everything that comes after it.

05:17 Thank you very much.

05:19 (Applause)
이 μ›Ήμ‚¬μ΄νŠΈ 정보

이 μ‚¬μ΄νŠΈλŠ” μ˜μ–΄ ν•™μŠ΅μ— μœ μš©ν•œ YouTube λ™μ˜μƒμ„ μ†Œκ°œν•©λ‹ˆλ‹€. μ „ 세계 졜고의 μ„ μƒλ‹˜λ“€μ΄ κ°€λ₯΄μΉ˜λŠ” μ˜μ–΄ μˆ˜μ—…μ„ 보게 될 κ²ƒμž…λ‹ˆλ‹€. 각 λ™μ˜μƒ νŽ˜μ΄μ§€μ— ν‘œμ‹œλ˜λŠ” μ˜μ–΄ μžλ§‰μ„ 더블 ν΄λ¦­ν•˜λ©΄ κ·Έκ³³μ—μ„œ λ™μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€. λΉ„λ””μ˜€ μž¬μƒμ— 맞좰 μžλ§‰μ΄ μŠ€ν¬λ‘€λ©λ‹ˆλ‹€. μ˜κ²¬μ΄λ‚˜ μš”μ²­μ΄ μžˆλŠ” 경우 이 문의 양식을 μ‚¬μš©ν•˜μ—¬ λ¬Έμ˜ν•˜μ‹­μ‹œμ˜€.

https://forms.gle/WvT1wiN1qDtmnspy7