How we need to remake the internet | Jaron Lanier

430,989 views ・ 2018-05-03

TED


Back in the 1980s, actually, I gave my first talk at TED, and I brought some of the very, very first public demonstrations of virtual reality ever to the TED stage. And at that time, we knew that we were facing a knife-edge future where the technology we needed, the technology we loved, could also be our undoing.
We knew that if we thought of our technology as a means to ever more power, if it was just a power trip, we'd eventually destroy ourselves. That's what happens when you're on a power trip and nothing else. So the idealism of digital culture back then was all about starting with that recognition of the possible darkness and trying to imagine a way to transcend it with beauty and creativity.
I always used to end my early TED Talks with a rather horrifying line, which is, "We have a challenge. We have to create a culture around technology that is so beautiful, so meaningful, so deep, so endlessly creative, so filled with infinite potential that it draws us away from committing mass suicide." So we talked about extinction as being one and the same as the need to create an alluring, infinitely creative future. And I still believe that that alternative of creativity as an alternative to death is very real and true, maybe the most true thing there is.
In the case of virtual reality -- well, the way I used to talk about it is that it would be something like what happened when people discovered language. With language came new adventures, new depth, new meaning, new ways to connect, new ways to coordinate, new ways to imagine, new ways to raise children, and I imagined, with virtual reality, we'd have this new thing that would be like a conversation but also like waking-state intentional dreaming. We called it post-symbolic communication, because it would be like just directly making the thing you experienced instead of indirectly making symbols to refer to things. It was a beautiful vision, and it's one I still believe in, and yet, haunting that beautiful vision was the dark side of how it could also turn out.
And I suppose I could mention one of the very earliest computer scientists, whose name was Norbert Wiener, and he wrote a book back in the '50s, from before I was even born, called "The Human Use of Human Beings." And in the book, he described the potential to create a computer system that would be gathering data from people and providing feedback to those people in real time in order to put them kind of partially, statistically, in a Skinner box, in a behaviorist system, and he has this amazing line where he says, one could imagine, as a thought experiment -- and I'm paraphrasing, this isn't a quote -- one could imagine a global computer system where everybody has devices on them all the time, and the devices are giving them feedback based on what they did, and the whole population is subject to a degree of behavior modification. And such a society would be insane, could not survive, could not face its problems. And then he says, but this is only a thought experiment, and such a future is technologically infeasible.

(Laughter)

And yet, of course, it's what we have created, and it's what we must undo if we are to survive.
04:27
So --
71
267457
1151
04:28
(Applause)
72
268632
3540
04:32
I believe that we made a very particular mistake,
73
272631
5977
04:38
and it happened early on,
74
278632
2234
04:40
and by understanding the mistake we made,
75
280890
2074
04:42
we can undo it.
76
282988
1859
04:44
It happened in the '90s,
77
284871
2559
04:47
and going into the turn of the century,
78
287454
2742
04:50
and here's what happened.
79
290220
1388
Early digital culture, and indeed, digital culture to this day, had a sense of, I would say, lefty, socialist mission about it, that unlike other things that have been done, like the invention of books, everything on the internet must be purely public, must be available for free, because if even one person cannot afford it, then that would create this terrible inequity. Now of course, there's other ways to deal with that. If books cost money, you can have public libraries. And so forth. But we were thinking, no, no, no, this is an exception. This must be pure public commons, that's what we want. And so that spirit lives on. You can experience it in designs like the Wikipedia, for instance, many others.
But at the same time, we also believed, with equal fervor, in this other thing that was completely incompatible, which is we loved our tech entrepreneurs. We loved Steve Jobs; we loved this Nietzschean myth of the techie who could dent the universe. Right? And that mythical power still has a hold on us, as well.
So you have these two different passions, for making everything free and for the almost supernatural power of the tech entrepreneur. How do you celebrate entrepreneurship when everything's free? Well, there was only one solution back then, which was the advertising model. And so therefore, Google was born free, with ads, Facebook was born free, with ads.
Now in the beginning, it was cute, like with the very earliest Google.

(Laughter)

The ads really were kind of ads. They would be, like, your local dentist or something. But there's this thing called Moore's law that makes the computers more and more efficient and cheaper. Their algorithms get better. We actually have universities where people study them, and they get better and better. And the customers and other entities who use these systems just got more and more experienced and got cleverer and cleverer. And what started out as advertising really can't be called advertising anymore. It turned into behavior modification, just as Norbert Wiener had worried it might.
And so I can't call these things social networks anymore. I call them behavior modification empires.

(Applause)
And I refuse to vilify the individuals. I have dear friends at these companies; I sold a company to Google, even though I think it's one of these empires. I don't think this is a matter of bad people who've done a bad thing. I think this is a matter of a globally tragic, astoundingly ridiculous mistake, rather than a wave of evil. Let me give you just another layer of detail into how this particular mistake functions.
So with behaviorism, you give the creature, whether it's a rat or a dog or a person, little treats and sometimes little punishments as feedback to what they do. So if you have an animal in a cage, it might be candy and electric shocks. But if you have a smartphone, it's not those things, it's symbolic punishment and reward. Pavlov, one of the early behaviorists, demonstrated the famous principle. You could train a dog to salivate just with the bell, just with the symbol.
So on social networks, social punishment and social reward function as the punishment and reward. And we all know the feeling of these things. You get this little thrill -- "Somebody liked my stuff and it's being repeated." Or the punishment: "Oh my God, they don't like me, maybe somebody else is more popular, oh my God." So you have those two very common feelings, and they're doled out in such a way that you get caught in this loop. As has been publicly acknowledged by many of the founders of the system, everybody knew this is what was going on.
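(A minimal sketch, in Python, of the reward-and-punishment loop just described. All probabilities and increments are invented for illustration; nothing here is calibrated to any real platform.)

```python
import random

random.seed(42)  # reproducible toy run

def simulate_user(posts: int = 50, reward_prob: float = 0.3) -> float:
    """Toy model: each post draws an intermittent social reward ("likes")
    or punishment (silence), and the urge to keep posting updates on it."""
    engagement = 1.0
    for _ in range(posts):
        if random.random() < reward_prob:
            engagement += 0.5  # the little thrill: "somebody liked my stuff"
        else:
            engagement -= 0.1  # the sting: "oh my God, they don't like me"
        engagement = max(engagement, 0.0)
    return engagement

# Intermittent, unpredictable reward is the classic behaviorist schedule:
# even with punishment seven times out of ten, the occasional thrill
# outweighs it on average, so engagement drifts upward and the loop holds.
print(simulate_user())
```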
But here's the thing: traditionally, in the academic study of the methods of behaviorism, there have been comparisons of positive and negative stimuli. In this setting, a commercial setting, there's a new kind of difference that has kind of evaded the academic world for a while, and that difference is that whether or not positive stimuli are more effective than negative ones in different circumstances, the negative ones are cheaper. They're the bargain stimuli. So what I mean by that is it's much easier to lose trust than to build trust. It takes a long time to build love. It takes a short time to ruin love.
Now the customers of these behavior modification empires are on a very fast loop. They're almost like high-frequency traders. They're getting feedback on their spending, or whatever their activities are if they're not spending, and they see what's working, and then they do more of that. And so they're getting the quick feedback, which means they're responding more to the negative emotions, because those are the ones that rise faster, right?
And so therefore, even well-intentioned players who think all they're doing is advertising toothpaste end up advancing the cause of the negative people, the negative emotions, the cranks, the paranoids, the cynics, the nihilists. Those are the ones who get amplified by the system. And you can't pay one of these companies to make the world suddenly nice and improve democracy nearly as easily as you can pay to ruin those things. And so this is the dilemma we've gotten ourselves into.
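(A toy model, in Python, of why the fast loop favors the bargain stimuli. The dynamics are assumed for illustration: negative engagement spikes quickly and plateaus, positive engagement builds slowly but without bound, and a customer who reallocates spend on last-step returns ends up funding the spike.)

```python
def cumulative_engagement(kind: str, t: int) -> float:
    """Cumulative engagement per unit of spend after t steps (invented curves)."""
    if kind == "positive":
        return 0.1 * t               # trust builds slowly: "a long time to build love"
    return 2.0 * (1 - 0.6 ** t)      # outrage rises fast, then plateaus

budget = {"positive": 0.5, "negative": 0.5}
for step in range(1, 6):
    # Like a high-frequency trader: look only at the latest marginal
    # return and shift 10% of the budget toward whatever just won.
    marginal = {k: cumulative_engagement(k, step) - cumulative_engagement(k, step - 1)
                for k in budget}
    winner = max(marginal, key=marginal.get)
    loser = "positive" if winner == "negative" else "negative"
    shift = min(0.1, budget[loser])
    budget[winner] += shift
    budget[loser] -= shift
    print(step, {k: round(v, 2) for k, v in budget.items()})

# By step 5 the entire budget has drifted to the negative signal, even
# though the positive curve overtakes it over a long enough horizon.
```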
The alternative is to turn back the clock, with great difficulty, and remake that decision. Remaking it would mean two things. It would mean first that many people, those who could afford to, would actually pay for these things. You'd pay for search, you'd pay for social networking. How would you pay? Maybe with a subscription fee, maybe with micro-payments as you use them. There's a lot of options.
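(A back-of-the-envelope sketch, in Python, of the two payment options just mentioned: a flat subscription versus metered micro-payments. The prices are hypothetical placeholders, not a proposal from the talk.)

```python
FLAT_MONTHLY = 8.00       # hypothetical subscription fee, in dollars
PER_SEARCH = 0.002        # hypothetical micro-payment per search
PER_FEED_MINUTE = 0.001   # hypothetical micro-payment per minute of feed

def micro_bill(searches: int, feed_minutes: int) -> float:
    """Monthly bill under usage-based micro-payments."""
    return searches * PER_SEARCH + feed_minutes * PER_FEED_MINUTE

# A light user pays well under the flat fee; a heavy user pays more,
# which is why "there's a lot of options" rather than one right answer.
for usage in [(300, 600), (3000, 9000)]:
    bill = micro_bill(*usage)
    cheaper = "micro-payments" if bill < FLAT_MONTHLY else "subscription"
    print(usage, round(bill, 2), "->", cheaper)
```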
If some of you are recoiling, and you're thinking, "Oh my God, I would never pay for these things. How could you ever get anyone to pay?" I want to remind you of something that just happened. Around this same time that companies like Google and Facebook were formulating their free idea, a lot of cyber culture also believed that in the future, televisions and movies would be created in the same way, kind of like the Wikipedia. But then, companies like Netflix, Amazon, HBO, said, "Actually, you know, subscribe. We'll give you great TV." And it worked! We now are in this period called "peak TV," right? So sometimes when you pay for stuff, things get better.
We can imagine a hypothetical --

(Applause)

We can imagine a hypothetical world of "peak social media." What would that be like? It would mean when you get on, you can get really useful, authoritative medical advice instead of cranks. It could mean when you want to get factual information, there's not a bunch of weird, paranoid conspiracy theories. We can imagine this wonderful other possibility.
Ah. I dream of it. I believe it's possible. I'm certain it's possible. And I'm certain that the companies, the Googles and the Facebooks, would actually do better in this world. I don't believe we need to punish Silicon Valley. We just need to remake the decision.
Of the big tech companies, it's really only two that depend on behavior modification and spying as their business plan. It's Google and Facebook.

(Laughter)

And I love you guys. Really, I do. Like, the people are fantastic.
I want to point out, if I may, if you look at Google, they can propagate cost centers endlessly with all of these companies, but they cannot propagate profit centers. They cannot diversify, because they're hooked. They're hooked on this model, just like their own users. They're in the same trap as their users, and you can't run a big corporation that way. So this is ultimately totally to the benefit of the shareholders and other stakeholders of these companies. It's a win-win solution. It'll just take some time to figure it out. A lot of details to work out, totally doable.

(Laughter)
I don't believe our species can survive unless we fix this. We cannot have a society in which, if two people wish to communicate, the only way that can happen is if it's financed by a third person who wishes to manipulate them.

(Applause)

(Applause ends)

In the meantime, if the companies won't change, delete your accounts, OK?

(Laughter)

(Applause)

That's enough for now. Thank you so much.

(Applause)