How we need to remake the internet | Jaron Lanier

435,371 views ใƒป 2018-05-03

TED


์•„๋ž˜ ์˜๋ฌธ์ž๋ง‰์„ ๋”๋ธ”ํด๋ฆญํ•˜์‹œ๋ฉด ์˜์ƒ์ด ์žฌ์ƒ๋ฉ๋‹ˆ๋‹ค.

๋ฒˆ์—ญ: JY Kang ๊ฒ€ํ† : Noh kyua
00:12
Back in the 1980s, actually, I gave my first talk at TED,
00:16
and I brought some of the very, very first public demonstrations
00:21
of virtual reality ever to the TED stage.
00:26
And at that time, we knew that we were facing a knife-edge future
00:33
where the technology we needed,
00:38
the technology we loved,
00:40
could also be our undoing.
00:43
We knew that if we thought of our technology
00:47
as a means to ever more power,
00:50
if it was just a power trip, we'd eventually destroy ourselves.
00:54
That's what happens
00:55
when you're on a power trip and nothing else.
00:59
So the idealism
01:02
of digital culture back then
01:07
was all about starting with that recognition of the possible darkness
01:12
and trying to imagine a way to transcend it
01:15
with beauty and creativity.
01:19
I always used to end my early TED Talks with a rather horrifying line, which is,
01:26
"We have a challenge.
01:30
We have to create a culture around technology
01:34
that is so beautiful, so meaningful,
01:38
so deep, so endlessly creative,
01:40
so filled with infinite potential
01:44
that it draws us away from committing mass suicide."
01:48
So we talked about extinction as being one and the same
01:54
as the need to create an alluring, infinitely creative future.
01:59
And I still believe that that alternative of creativity
02:05
as an alternative to death
02:07
is very real and true,
02:09
maybe the most true thing there is.
02:11
In the case of virtual reality --
02:13
well, the way I used to talk about it
02:16
is that it would be something like
02:18
what happened when people discovered language.
02:21
With language came new adventures, new depth, new meaning,
02:26
new ways to connect, new ways to coordinate,
02:28
new ways to imagine, new ways to raise children,
02:32
and I imagined, with virtual reality, we'd have this new thing
02:36
that would be like a conversation
02:38
but also like waking-state intentional dreaming.
02:41
We called it post-symbolic communication,
02:44
because it would be like just directly making the thing you experienced
02:49
instead of indirectly making symbols to refer to things.
02:53
It was a beautiful vision, and it's one I still believe in,
02:57
and yet, haunting that beautiful vision
03:01
was the dark side of how it could also turn out.
03:04
And I suppose I could mention
03:09
from one of the very earliest computer scientists,
03:12
whose name was Norbert Wiener,
03:14
and he wrote a book back in the '50s, from before I was even born,
03:18
called "The Human Use of Human Beings."
03:21
And in the book, he described the potential
03:25
to create a computer system that would be gathering data from people
03:32
and providing feedback to those people in real time
03:35
in order to put them kind of partially, statistically, in a Skinner box,
03:40
in a behaviorist system,
03:43
and he has this amazing line where he says,
03:45
one could imagine, as a thought experiment --
03:48
and I'm paraphrasing, this isn't a quote --
03:51
one could imagine a global computer system
03:54
where everybody has devices on them all the time,
03:57
and the devices are giving them feedback based on what they did,
04:00
and the whole population
04:02
is subject to a degree of behavior modification.
04:05
And such a society would be insane,
04:09
could not survive, could not face its problems.
04:12
And then he says, but this is only a thought experiment,
04:15
and such a future is technologically infeasible.
04:18
(Laughter)
04:19
And yet, of course, it's what we have created,
04:22
and it's what we must undo if we are to survive.
04:27
So --
04:28
(Applause)
04:32
I believe that we made a very particular mistake,
04:38
and it happened early on,
04:40
and by understanding the mistake we made,
04:42
we can undo it.
04:44
It happened in the '90s,
04:47
and going into the turn of the century,
04:50
and here's what happened.
04:53
Early digital culture,
04:54
and indeed, digital culture to this day,
04:59
had a sense of, I would say, lefty, socialist mission about it,
05:05
that unlike other things that have been done,
05:08
like the invention of books,
05:09
everything on the internet must be purely public,
05:13
must be available for free,
05:15
because if even one person cannot afford it,
05:18
then that would create this terrible inequity.
05:21
Now of course, there's other ways to deal with that.
05:24
If books cost money, you can have public libraries.
05:27
And so forth.
05:28
But we were thinking, no, no, no, this is an exception.
05:31
This must be pure public commons, that's what we want.
05:35
And so that spirit lives on.
05:38
You can experience it in designs like the Wikipedia, for instance,
05:42
many others.
05:43
But at the same time,
05:45
we also believed, with equal fervor,
05:48
in this other thing that was completely incompatible,
05:52
which is we loved our tech entrepreneurs.
05:55
We loved Steve Jobs; we loved this Nietzschean myth
05:59
of the techie who could dent the universe.
06:03
Right?
06:04
And that mythical power still has a hold on us, as well.
06:10
So you have these two different passions,
06:14
for making everything free
06:16
and for the almost supernatural power of the tech entrepreneur.
06:21
How do you celebrate entrepreneurship when everything's free?
06:26
Well, there was only one solution back then,
06:29
which was the advertising model.
06:31
And so therefore, Google was born free, with ads,
06:35
Facebook was born free, with ads.
06:39
Now in the beginning, it was cute,
06:43
like with the very earliest Google.
06:45
(Laughter)
06:46
The ads really were kind of ads.
06:49
They would be, like, your local dentist or something.
06:51
But there's this thing called Moore's law
06:53
that makes the computers more and more efficient and cheaper.
06:57
Their algorithms get better.
06:58
We actually have universities where people study them,
07:01
and they get better and better.
07:03
And the customers and other entities who use these systems
07:07
just got more and more experienced and got cleverer and cleverer.
07:11
And what started out as advertising
07:14
really can't be called advertising anymore.
07:16
It turned into behavior modification,
07:19
just as Norbert Wiener had worried it might.
07:24
And so I can't call these things social networks anymore.
07:28
I call them behavior modification empires.
07:32
(Applause)
07:34
And I refuse to vilify the individuals.
07:39
I have dear friends at these companies,
07:41
sold a company to Google, even though I think it's one of these empires.
07:46
I don't think this is a matter of bad people who've done a bad thing.
07:51
I think this is a matter of a globally tragic,
07:55
astoundingly ridiculous mistake,
08:00
rather than a wave of evil.
08:04
Let me give you just another layer of detail
08:07
into how this particular mistake functions.
08:11
So with behaviorism,
08:14
you give the creature, whether it's a rat or a dog or a person,
08:19
little treats and sometimes little punishments
08:22
as feedback to what they do.
08:24
So if you have an animal in a cage, it might be candy and electric shocks.
08:30
But if you have a smartphone,
08:33
it's not those things, it's symbolic punishment and reward.
08:40
Pavlov, one of the early behaviorists,
08:42
demonstrated the famous principle.
08:45
You could train a dog to salivate just with the bell, just with the symbol.
08:49
So on social networks,
08:51
social punishment and social reward function as the punishment and reward.
08:56
And we all know the feeling of these things.
08:58
You get this little thrill --
08:59
"Somebody liked my stuff and it's being repeated."
09:02
Or the punishment: "Oh my God, they don't like me,
09:04
maybe somebody else is more popular, oh my God."
09:06
So you have those two very common feelings,
09:09
and they're doled out in such a way that you get caught in this loop.
09:12
As has been publicly acknowledged by many of the founders of the system,
09:16
everybody knew this is what was going on.
09:19
But here's the thing:
09:21
traditionally, in the academic study of the methods of behaviorism,
09:26
there have been comparisons of positive and negative stimuli.
09:32
In this setting, a commercial setting,
09:34
there's a new kind of difference
09:36
that has kind of evaded the academic world for a while,
09:39
and that difference is that whether positive stimuli
09:43
are more effective than negative ones in different circumstances,
09:46
the negative ones are cheaper.
09:48
They're the bargain stimuli.
09:50
So what I mean by that is it's much easier
09:56
to lose trust than to build trust.
09:59
It takes a long time to build love.
10:02
It takes a short time to ruin love.
10:05
Now the customers of these behavior modification empires
10:10
are on a very fast loop.
10:11
They're almost like high-frequency traders.
10:13
They're getting feedbacks from their spends
10:15
or whatever their activities are if they're not spending,
10:18
and they see what's working, and then they do more of that.
10:21
And so they're getting the quick feedback,
10:23
which means they're responding more to the negative emotions,
10:26
because those are the ones that rise faster, right?
10:30
And so therefore, even well-intentioned players
10:34
who think all they're doing is advertising toothpaste
10:37
end up advancing the cause of the negative people,
10:40
the negative emotions, the cranks,
10:42
the paranoids,
10:44
the cynics, the nihilists.
10:47
Those are the ones who get amplified by the system.
10:50
And you can't pay one of these companies to make the world suddenly nice
10:56
and improve democracy
10:57
nearly as easily as you can pay to ruin those things.
11:01
And so this is the dilemma we've gotten ourselves into.
11:05
The alternative is to turn back the clock, with great difficulty,
11:11
and remake that decision.
11:13
Remaking it would mean two things.
11:18
It would mean first that many people, those who could afford to,
11:21
would actually pay for these things.
11:24
You'd pay for search, you'd pay for social networking.
11:28
How would you pay? Maybe with a subscription fee,
11:32
maybe with micro-payments as you use them.
11:34
There's a lot of options.
11:36
If some of you are recoiling, and you're thinking,
11:39
"Oh my God, I would never pay for these things.
11:41
How could you ever get anyone to pay?"
11:43
I want to remind you of something that just happened.
11:46
Around this same time
11:48
that companies like Google and Facebook were formulating their free idea,
11:54
a lot of cyber culture also believed that in the future,
11:59
televisions and movies would be created in the same way,
12:02
kind of like the Wikipedia.
12:04
But then, companies like Netflix, Amazon, HBO,
12:09
said, "Actually, you know, subscribe. We'll give you great TV."
12:13
And it worked!
12:14
We now are in this period called "peak TV," right?
12:18
So sometimes when you pay for stuff, things get better.
12:22
We can imagine a hypothetical --
12:25
(Applause)
12:29
We can imagine a hypothetical world of "peak social media."
12:33
What would that be like?
12:34
It would mean when you get on, you can get really useful,
12:37
authoritative medical advice instead of cranks.
12:41
It could mean when you want to get factual information,
12:44
there's not a bunch of weird, paranoid conspiracy theories.
12:47
We can imagine this wonderful other possibility.
12:52
Ah.
12:53
I dream of it. I believe it's possible.
12:55
I'm certain it's possible.
12:58
And I'm certain that the companies, the Googles and the Facebooks,
13:03
would actually do better in this world.
13:05
I don't believe we need to punish Silicon Valley.
13:09
We just need to remake the decision.
13:12
Of the big tech companies,
13:14
it's really only two that depend on behavior modification and spying
13:20
as their business plan.
13:21
It's Google and Facebook.
13:23
(Laughter)
13:24
And I love you guys.
13:26
Really, I do. Like, the people are fantastic.
13:30
I want to point out, if I may,
13:33
if you look at Google,
13:34
they can propagate cost centers endlessly with all of these companies,
13:39
but they cannot propagate profit centers.
13:41
They cannot diversify, because they're hooked.
13:47
They're hooked on this model, just like their own users.
13:50
They're in the same trap as their users,
13:52
and you can't run a big corporation that way.
13:56
So this is ultimately totally in the benefit of the shareholders
13:58
and other stakeholders of these companies.
14:01
It's a win-win solution.
14:03
It'll just take some time to figure it out.
14:05
A lot of details to work out,
14:07
totally doable.
14:10
(Laughter)
14:14
I don't believe our species can survive unless we fix this.
14:16
We cannot have a society
14:19
in which, if two people wish to communicate,
14:22
the only way that can happen is if it's financed by a third person
14:25
who wishes to manipulate them.
14:35
(Applause)
14:36
(Applause ends)
14:39
In the meantime, if the companies won't change,
14:41
delete your accounts, OK?
14:42
(Laughter)
14:43
(Applause)
14:45
That's enough for now.
14:46
Thank you so much.
(Applause)
์ด ์›น์‚ฌ์ดํŠธ ์ •๋ณด

์ด ์‚ฌ์ดํŠธ๋Š” ์˜์–ด ํ•™์Šต์— ์œ ์šฉํ•œ YouTube ๋™์˜์ƒ์„ ์†Œ๊ฐœํ•ฉ๋‹ˆ๋‹ค. ์ „ ์„ธ๊ณ„ ์ตœ๊ณ ์˜ ์„ ์ƒ๋‹˜๋“ค์ด ๊ฐ€๋ฅด์น˜๋Š” ์˜์–ด ์ˆ˜์—…์„ ๋ณด๊ฒŒ ๋  ๊ฒƒ์ž…๋‹ˆ๋‹ค. ๊ฐ ๋™์˜์ƒ ํŽ˜์ด์ง€์— ํ‘œ์‹œ๋˜๋Š” ์˜์–ด ์ž๋ง‰์„ ๋”๋ธ” ํด๋ฆญํ•˜๋ฉด ๊ทธ๊ณณ์—์„œ ๋™์˜์ƒ์ด ์žฌ์ƒ๋ฉ๋‹ˆ๋‹ค. ๋น„๋””์˜ค ์žฌ์ƒ์— ๋งž์ถฐ ์ž๋ง‰์ด ์Šคํฌ๋กค๋ฉ๋‹ˆ๋‹ค. ์˜๊ฒฌ์ด๋‚˜ ์š”์ฒญ์ด ์žˆ๋Š” ๊ฒฝ์šฐ ์ด ๋ฌธ์˜ ์–‘์‹์„ ์‚ฌ์šฉํ•˜์—ฌ ๋ฌธ์˜ํ•˜์‹ญ์‹œ์˜ค.

https://forms.gle/WvT1wiN1qDtmnspy7