How we need to remake the internet | Jaron Lanier

432,900 views · 2018-05-03

TED



ืชืจื’ื•ื: Ben Bokser ืขืจื™ื›ื”: Ido Dekkers
00:12
Back in the 1980s, actually, I gave my first talk at TED, and I brought some of the very, very first public demonstrations of virtual reality ever to the TED stage. And at that time, we knew that we were facing a knife-edge future where the technology we needed, the technology we loved, could also be our undoing. We knew that if we thought of our technology as a means to ever more power, if it was just a power trip, we'd eventually destroy ourselves. That's what happens when you're on a power trip and nothing else.
00:59
So the idealism of digital culture back then was all about starting with that recognition of the possible darkness and trying to imagine a way to transcend it with beauty and creativity.
01:19
I always used to end my early TED Talks with a rather horrifying line, which is, "We have a challenge. We have to create a culture around technology that is so beautiful, so meaningful, so deep, so endlessly creative, so filled with infinite potential that it draws us away from committing mass suicide." So we talked about extinction as being one and the same as the need to create an alluring, infinitely creative future. And I still believe that that alternative of creativity as an alternative to death is very real and true, maybe the most true thing there is.
02:11
In the case of virtual reality -- well, the way I used to talk about it is that it would be something like what happened when people discovered language. With language came new adventures, new depth, new meaning, new ways to connect, new ways to coordinate, new ways to imagine, new ways to raise children, and I imagined, with virtual reality, we'd have this new thing that would be like a conversation but also like waking-state intentional dreaming. We called it post-symbolic communication, because it would be like just directly making the thing you experienced instead of indirectly making symbols to refer to things.
02:53
It was a beautiful vision, and it's one I still believe in, and yet, haunting that beautiful vision was the dark side of how it could also turn out. And I suppose I could mention from one of the very earliest computer scientists, whose name was Norbert Wiener, and he wrote a book back in the '50s, from before I was even born, called "The Human Use of Human Beings." And in the book, he described the potential to create a computer system that would be gathering data from people and providing feedback to those people in real time in order to put them kind of partially, statistically, in a Skinner box, in a behaviorist system, and he has this amazing line where he says, one could imagine, as a thought experiment -- and I'm paraphrasing, this isn't a quote -- one could imagine a global computer system where everybody has devices on them all the time, and the devices are giving them feedback based on what they did, and the whole population is subject to a degree of behavior modification. And such a society would be insane, could not survive, could not face its problems. And then he says, but this is only a thought experiment, and such a future is technologically infeasible.

(Laughter)
04:19
And yet, of course, it's what we have created, and it's what we must undo if we are to survive. So --

(Applause)
04:32
I believe that we made a very particular mistake, and it happened early on, and by understanding the mistake we made, we can undo it. It happened in the '90s, and going into the turn of the century, and here's what happened. Early digital culture, and indeed, digital culture to this day, had a sense of, I would say, lefty, socialist mission about it, that unlike other things that have been done, like the invention of books, everything on the internet must be purely public, must be available for free, because if even one person cannot afford it, then that would create this terrible inequity. Now of course, there's other ways to deal with that. If books cost money, you can have public libraries. And so forth. But we were thinking, no, no, no, this is an exception. This must be pure public commons, that's what we want. And so that spirit lives on. You can experience it in designs like the Wikipedia, for instance, many others.
05:43
But at the same time, we also believed, with equal fervor, in this other thing that was completely incompatible, which is we loved our tech entrepreneurs. We loved Steve Jobs; we loved this Nietzschean myth of the techie who could dent the universe. Right? And that mythical power still has a hold on us, as well.
06:10
So you have these two different passions, for making everything free and for the almost supernatural power of the tech entrepreneur. How do you celebrate entrepreneurship when everything's free? Well, there was only one solution back then, which was the advertising model. And so therefore, Google was born free, with ads, Facebook was born free, with ads.
06:39
Now in the beginning, it was cute, like with the very earliest Google.

(Laughter)

06:46
The ads really were kind of ads. They would be, like, your local dentist or something. But there's this thing called Moore's law that makes the computers more and more efficient and cheaper. Their algorithms get better. We actually have universities where people study them, and they get better and better. And the customers and other entities who use these systems just got more and more experienced and got cleverer and cleverer. And what started out as advertising really can't be called advertising anymore. It turned into behavior modification, just as Norbert Wiener had worried it might. And so I can't call these things social networks anymore. I call them behavior modification empires.

(Applause)
07:34
And I refuse to vilify the individuals. I have dear friends at these companies, sold a company to Google, even though I think it's one of these empires. I don't think this is a matter of bad people who've done a bad thing. I think this is a matter of a globally tragic, astoundingly ridiculous mistake, rather than a wave of evil.
08:04
Let me give you just another layer of detail into how this particular mistake functions. So with behaviorism, you give the creature, whether it's a rat or a dog or a person, little treats and sometimes little punishments as feedback to what they do. So if you have an animal in a cage, it might be candy and electric shocks. But if you have a smartphone, it's not those things, it's symbolic punishment and reward. Pavlov, one of the early behaviorists, demonstrated the famous principle. You could train a dog to salivate just with the bell, just with the symbol. So on social networks, social punishment and social reward function as the punishment and reward. And we all know the feeling of these things. You get this little thrill -- "Somebody liked my stuff and it's being repeated." Or the punishment: "Oh my God, they don't like me, maybe somebody else is more popular, oh my God." So you have those two very common feelings, and they're doled out in such a way that you get caught in this loop. As has been publicly acknowledged by many of the founders of the system, everybody knew this is what was going on.
09:19
But here's the thing: traditionally, in the academic study of the methods of behaviorism, there have been comparisons of positive and negative stimuli. In this setting, a commercial setting, there's a new kind of difference that has kind of evaded the academic world for a while, and that difference is that whether or not positive stimuli are more effective than negative ones in different circumstances, the negative ones are cheaper. They're the bargain stimuli. So what I mean by that is it's much easier to lose trust than to build trust. It takes a long time to build love. It takes a short time to ruin love.
10:05
Now the customers of these behavior modification empires are on a very fast loop. They're almost like high-frequency traders. They're getting feedbacks from their spends or whatever their activities are if they're not spending, and they see what's working, and then they do more of that. And so they're getting the quick feedback, which means they're responding more to the negative emotions, because those are the ones that rise faster, right? And so therefore, even well-intentioned players who think all they're doing is advertising toothpaste end up advancing the cause of the negative people, the negative emotions, the cranks, the paranoids, the cynics, the nihilists. Those are the ones who get amplified by the system.
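As a rough way to see why a fast feedback loop favors the negative, here is a minimal toy model in Python. Everything in it is an assumption for illustration (the framing names, the response delays, the reallocation step); it is not a description of any real ad platform. A spender splits budget between two emotional framings, engagement from the negative framing is assumed to surface sooner, and the spender greedily shifts budget toward whatever shows measurable results first.

```python
import random

random.seed(0)

RESPONSE_DELAY = {"positive": 5, "negative": 1}   # assumed rounds before engagement is visible
budget = {"positive": 0.5, "negative": 0.5}       # share of spend per framing
pending = []                                      # (round_when_visible, framing)

for round_ in range(300):
    # each framing generates engagement in proportion to its budget share,
    # but that engagement only becomes visible after its response delay
    for framing, share in budget.items():
        if random.random() < share:
            pending.append((round_ + RESPONSE_DELAY[framing], framing))

    visible = [f for due, f in pending if due <= round_]
    pending = [(due, f) for due, f in pending if due > round_]

    # greedy reallocation: "see what's working, then do more of that"
    for framing in visible:
        other = "positive" if framing == "negative" else "negative"
        shift = min(0.01, budget[other])
        budget[other] -= shift
        budget[framing] += shift

print(budget)  # the negative framing ends up with most of the spend
```

The only asymmetry in the model is speed, yet the greedy loop still hands the budget to the negative framing, which is the dynamic being described: fast feedback plus "do more of what works" amplifies whatever reacts first, regardless of anyone's intentions.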
10:50
And you can't pay one of these companies to make the world suddenly nice and improve democracy nearly as easily as you can pay to ruin those things. And so this is the dilemma we've gotten ourselves into.
11:05
The alternative is to turn back the clock, with great difficulty, and remake that decision. Remaking it would mean two things. It would mean first that many people, those who could afford to, would actually pay for these things. You'd pay for search, you'd pay for social networking. How would you pay? Maybe with a subscription fee, maybe with micro-payments as you use them. There's a lot of options. If some of you are recoiling, and you're thinking, "Oh my God, I would never pay for these things. How could you ever get anyone to pay?" I want to remind you of something that just happened. Around this same time that companies like Google and Facebook were formulating their free idea, a lot of cyber culture also believed that in the future, televisions and movies would be created in the same way, kind of like the Wikipedia. But then, companies like Netflix, Amazon, HBO, said, "Actually, you know, subscribe. We'll give you great TV." And it worked! We now are in this period called "peak TV," right? So sometimes when you pay for stuff, things get better.
12:22
We can imagine a hypothetical --

(Applause)

12:29
We can imagine a hypothetical world of "peak social media." What would that be like? It would mean when you get on, you can get really useful, authoritative medical advice instead of cranks. It could mean when you want to get factual information, there's not a bunch of weird, paranoid conspiracy theories. We can imagine this wonderful other possibility. Ah. I dream of it. I believe it's possible. I'm certain it's possible. And I'm certain that the companies, the Googles and the Facebooks, would actually do better in this world. I don't believe we need to punish Silicon Valley. We just need to remake the decision.
13:12
Of the big tech companies, it's really only two that depend on behavior modification and spying as their business plan. It's Google and Facebook.

(Laughter)

13:24
And I love you guys. Really, I do. Like, the people are fantastic. I want to point out, if I may, if you look at Google, they can propagate cost centers endlessly with all of these companies, but they cannot propagate profit centers. They cannot diversify, because they're hooked. They're hooked on this model, just like their own users. They're in the same trap as their users, and you can't run a big corporation that way. So this is ultimately totally in the benefit of the shareholders and other stakeholders of these companies. It's a win-win solution. It'll just take some time to figure it out. A lot of details to work out, totally doable.

(Laughter)
14:10
I don't believe our species can survive unless we fix this. We cannot have a society in which, if two people wish to communicate, the only way that can happen is if it's financed by a third person who wishes to manipulate them.

(Applause)

(Applause ends)

14:36
In the meantime, if the companies won't change, delete your accounts, OK?

(Laughter)

(Applause)

14:43
That's enough for now. Thank you so much.

(Applause)
ืขืœ ืืชืจ ื–ื”

ืืชืจ ื–ื” ื™ืฆื™ื’ ื‘ืคื ื™ื›ื ืกืจื˜ื•ื ื™ YouTube ื”ืžื•ืขื™ืœื™ื ืœืœื™ืžื•ื“ ืื ื’ืœื™ืช. ืชื•ื›ืœื• ืœืจืื•ืช ืฉื™ืขื•ืจื™ ืื ื’ืœื™ืช ื”ืžื•ืขื‘ืจื™ื ืขืœ ื™ื“ื™ ืžื•ืจื™ื ืžื”ืฉื•ืจื” ื”ืจืืฉื•ื ื” ืžืจื—ื‘ื™ ื”ืขื•ืœื. ืœื—ืฅ ืคืขืžื™ื™ื ืขืœ ื”ื›ืชื•ื‘ื™ื•ืช ื‘ืื ื’ืœื™ืช ื”ืžื•ืฆื’ื•ืช ื‘ื›ืœ ื“ืฃ ื•ื™ื“ืื• ื›ื“ื™ ืœื”ืคืขื™ืœ ืืช ื”ืกืจื˜ื•ืŸ ืžืฉื. ื”ื›ืชื•ื‘ื™ื•ืช ื’ื•ืœืœื•ืช ื‘ืกื ื›ืจื•ืŸ ืขื ื”ืคืขืœืช ื”ื•ื•ื™ื“ืื•. ืื ื™ืฉ ืœืš ื”ืขืจื•ืช ืื• ื‘ืงืฉื•ืช, ืื ื ืฆื•ืจ ืื™ืชื ื• ืงืฉืจ ื‘ืืžืฆืขื•ืช ื˜ื•ืคืก ื™ืฆื™ืจืช ืงืฉืจ ื–ื”.

https://forms.gle/WvT1wiN1qDtmnspy7