How to be "Team Human" in the digital future | Douglas Rushkoff

117,366 views ・ 2019-01-14

TED


μ•„λž˜ μ˜λ¬Έμžλ§‰μ„ λ”λΈ”ν΄λ¦­ν•˜μ‹œλ©΄ μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€.

Translation: TJ Kim / Review: Dahyun Ha

00:13
I got invited to an exclusive resort to deliver a talk about the digital future to what I assumed would be a couple of hundred tech executives. And I was there in the green room, waiting to go on, and instead of bringing me to the stage, they brought five men into the green room who sat around this little table with me. They were tech billionaires. And they started peppering me with these really binary questions, like: Bitcoin or Ethereum? Virtual reality or augmented reality? I don't know if they were taking bets or what. And as they got more comfortable with me, they edged towards their real question of concern. Alaska or New Zealand?

00:57
That's right. These tech billionaires were asking a media theorist for advice on where to put their doomsday bunkers. We spent the rest of the hour on the single question: "How do I maintain control of my security staff after the event?"

01:13
By "the event" they mean the thermonuclear war or climate catastrophe or social unrest that ends the world as we know it, and more importantly, makes their money obsolete.

01:26
And I couldn't help but think: these are the wealthiest, most powerful men in the world, yet they see themselves as utterly powerless to influence the future. The best they can do is hang on for the inevitable catastrophe and then use their technology and money to get away from the rest of us. And these are the winners of the digital economy.

(Laughter)

01:53
The digital renaissance was about the unbridled potential of the collective human imagination. It spanned everything from chaos math and quantum physics to fantasy role-playing and the Gaia hypothesis, right? We believed that human beings connected could create any future we could imagine.

02:20
And then came the dot-com boom. And the digital future became stock futures. And we used all that energy of the digital age to pump steroids into the already dying NASDAQ stock exchange. The tech magazines told us a tsunami was coming. And only the investors who hired the best scenario-planners and futurists would be able to survive the wave.

02:47
And so the future changed from this thing we create together in the present to something we bet on in some kind of a zero-sum winner-takes-all competition.

03:00
And when things get that competitive about the future, humans are no longer valued for our creativity. No, now we're just valued for our data. Because they can use the data to make predictions. Creativity, if anything, creates noise. That makes it harder to predict.

03:17
So we ended up with a digital landscape that really repressed creativity, repressed novelty, it repressed what makes us most human.

03:26
We ended up with social media. Does social media really connect people in new, interesting ways? No, social media is about using our data to predict our future behavior. Or when necessary, to influence our future behavior so that we act more in accordance with our statistical profiles.

03:45
The digital economy -- does it like people? No, if you have a business plan, what are you supposed to do? Get rid of all the people. Human beings, they want health care, they want money, they want meaning. You can't scale with people.

(Laughter)

04:00
Even our digital apps -- they don't help us form any rapport or solidarity. I mean, where's the button on the ride-hailing app for the drivers to talk to one another about their working conditions or to unionize?

04:13
Even our videoconferencing tools, they don't allow us to establish real rapport. However good the resolution of the video, you still can't see if somebody's irises are opening to really take you in. All of the things that we've done to establish rapport, that we've developed over hundreds of thousands of years of evolution, they don't work, you can't see if someone's breath is syncing up with yours. So the mirror neurons never fire, the oxytocin never goes through your body, you never have that experience of bonding with the other human being. And instead, you're left like, "Well, they agreed with me, but did they really, did they really get me?" And we don't blame the technology for that lack of fidelity. We blame the other person.

04:55
You know, even the technologies and the digital initiatives that we have to promote humans are intensely anti-human at the core.

05:05
Think about the blockchain. The blockchain is here to help us have a great humanized economy? No. The blockchain does not engender trust between users, the blockchain simply substitutes for trust in a new, even less transparent way.

05:21
Or the code movement. I mean, education is great, we love education, and it's a wonderful idea that we want kids to be able to get jobs in the digital future, so we'll teach them code now. But since when is education about getting jobs? Education wasn't about getting jobs. Education was compensation for a job well done. The idea of public education was for coal miners, who would work in the coal mines all day, then they'd come home and they should have the dignity to be able to read a novel and understand it. Or the intelligence to be able to participate in democracy.

05:55
When we make it an extension of the job, what are we really doing? We're just letting corporations really externalize the cost of training their workers.

06:05
And the worst of all really is the humane technology movement. I mean, I love these guys, the former guys who used to take the algorithms from Las Vegas slot machines and put them in our social media feed so that we get addicted. Now they've seen the error of their ways and they want to make technology more humane. But when I hear the expression "humane technology," I think about cage-free chickens or something. We're going to be as humane as possible to them, until we take them to the slaughter. So now they're going to let these technologies be as humane as possible, as long as they extract enough data and extract enough money from us to please their shareholders.

06:42
Meanwhile, the shareholders, for their part, they're just thinking, "I need to earn enough money now, so I can insulate myself from the world I'm creating by earning money in this way."

(Laughter)

06:54
No matter how many VR goggles they slap on their faces and whatever fantasy world they go into, they can't externalize the slavery and pollution that was caused through the manufacture of the very device.

07:07
It reminds me of Thomas Jefferson's dumbwaiter. Now, we like to think that he made the dumbwaiter in order to spare his slaves all that labor of carrying the food up to the dining room for the people to eat. That's not what it was for, it wasn't for the slaves, it was for Thomas Jefferson and his dinner guests, so they didn't have to see the slave bringing the food up. The food just arrived magically, like it was coming out of a "Star Trek" replicator. It's part of an ethos that says, human beings are the problem and technology is the solution.

07:40
We can't think that way anymore. We have to stop using technology to optimize human beings for the market and start optimizing technology for the human future.

07:55
But that's a really hard argument to make these days, because humans are not popular beings. I talked about this in front of an environmentalist just the other day, and she said, "Why are you defending humans? Humans destroyed the planet. They deserve to go extinct."

(Laughter)

08:13
Even our popular media hates humans. Watch television, all the sci-fi shows are about how robots are better and nicer than people. Even zombie shows -- what is every zombie show about? Some person, looking at the horizon at some zombie going by, and they zoom in on the person and you see the person's face, and you know what they're thinking: "What's really the difference between that zombie and me? He walks, I walk. He eats, I eat. He kills, I kill." But he's a zombie. At least you're aware of it. If we are actually having trouble distinguishing ourselves from zombies, we have a pretty big problem going on.

(Laughter)

08:52
And don't even get me started on the transhumanists. I was on a panel with a transhumanist, and he's going on about the singularity. "Oh, the day is going to come really soon when computers are smarter than people. And the only option for people at that point is to pass the evolutionary torch to our successor and fade into the background. Maybe at best, upload your consciousness to a silicon chip. And accept your extinction."

(Laughter)

09:18
And I said, "No, human beings are special. We can embrace ambiguity, we understand paradox, we're conscious, we're weird, we're quirky. There should be a place for humans in the digital future." And he said, "Oh, Rushkoff, you're just saying that because you're a human."

(Laughter)

09:36
As if it's hubris.

09:39
OK, I'm on "Team Human." That was the original insight of the digital age. That being human is a team sport, evolution's a collaborative act. Even the trees in the forest, they're not all in competition with each other, they're connected with the vast network of roots and mushrooms that let them communicate with one another and pass nutrients back and forth. If human beings are the most evolved species, it's because we have the most evolved ways of collaborating and communicating. We have language. We have technology.

10:14
It's funny, I used to be the guy who talked about the digital future for people who hadn't yet experienced anything digital. And now I feel like I'm the last guy who remembers what life was like before digital technology.

10:28
It's not a matter of rejecting the digital or rejecting the technological. It's a matter of retrieving the values that we're in danger of leaving behind and then embedding them in the digital infrastructure for the future.

10:41
And that's not rocket science. It's as simple as making a social network that instead of teaching us to see people as adversaries, teaches us to see our adversaries as people. It means creating an economy that doesn't favor a platform monopoly that wants to extract all the value out of people and places, but one that promotes the circulation of value through a community and allows us to establish platform cooperatives that distribute ownership as wide as possible. It means building platforms that don't repress our creativity and novelty in the name of prediction but actually promote creativity and novelty, so that we can come up with some of the solutions to actually get ourselves out of the mess that we're in.

11:27
No, instead of trying to earn enough money to insulate ourselves from the world we're creating, why don't we spend that time and energy making the world a place that we don't feel the need to escape from? There is no escape, there is only one thing going on here.

11:42
Please, don't leave. Join us. We may not be perfect, but whatever happens, at least you won't be alone. Join "Team Human." Find the others. Together, let's make the future that we always wanted.

12:01
Oh, and those tech billionaires who wanted to know how to maintain control of their security force after the apocalypse, you know what I told them? "Start treating those people with love and respect right now. Maybe you won't have an apocalypse to worry about."

12:16
Thank you.

(Applause)
이 μ›Ήμ‚¬μ΄νŠΈ 정보

이 μ‚¬μ΄νŠΈλŠ” μ˜μ–΄ ν•™μŠ΅μ— μœ μš©ν•œ YouTube λ™μ˜μƒμ„ μ†Œκ°œν•©λ‹ˆλ‹€. μ „ 세계 졜고의 μ„ μƒλ‹˜λ“€μ΄ κ°€λ₯΄μΉ˜λŠ” μ˜μ–΄ μˆ˜μ—…μ„ 보게 될 κ²ƒμž…λ‹ˆλ‹€. 각 λ™μ˜μƒ νŽ˜μ΄μ§€μ— ν‘œμ‹œλ˜λŠ” μ˜μ–΄ μžλ§‰μ„ 더블 ν΄λ¦­ν•˜λ©΄ κ·Έκ³³μ—μ„œ λ™μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€. λΉ„λ””μ˜€ μž¬μƒμ— 맞좰 μžλ§‰μ΄ μŠ€ν¬λ‘€λ©λ‹ˆλ‹€. μ˜κ²¬μ΄λ‚˜ μš”μ²­μ΄ μžˆλŠ” 경우 이 문의 양식을 μ‚¬μš©ν•˜μ—¬ λ¬Έμ˜ν•˜μ‹­μ‹œμ˜€.

https://forms.gle/WvT1wiN1qDtmnspy7