How to be "Team Human" in the digital future | Douglas Rushkoff

117,366 views · 2019-01-14

TED



00:13
I got invited to an exclusive resort to deliver a talk about the digital future to what I assumed would be a couple of hundred tech executives. And I was there in the green room, waiting to go on, and instead of bringing me to the stage, they brought five men into the green room who sat around this little table with me. They were tech billionaires. And they started peppering me with these really binary questions, like: Bitcoin or Ethereum? Virtual reality or augmented reality? I don't know if they were taking bets or what.

00:48
And as they got more comfortable with me, they edged towards their real question of concern. Alaska or New Zealand? That's right. These tech billionaires were asking a media theorist for advice on where to put their doomsday bunkers. We spent the rest of the hour on the single question: "How do I maintain control of my security staff after the event?"

01:13
By "the event" they mean the thermonuclear war or climate catastrophe or social unrest that ends the world as we know it, and more importantly, makes their money obsolete. And I couldn't help but think: these are the wealthiest, most powerful men in the world, yet they see themselves as utterly powerless to influence the future. The best they can do is hang on for the inevitable catastrophe and then use their technology and money to get away from the rest of us. And these are the winners of the digital economy. (Laughter)

01:53
The digital renaissance was about the unbridled potential of the collective human imagination. It spanned everything from chaos math and quantum physics to fantasy role-playing and the Gaia hypothesis, right? We believed that human beings connected could create any future we could imagine.

02:20
And then came the dot-com boom. And the digital future became stock futures. And we used all that energy of the digital age to pump steroids into the already dying NASDAQ stock exchange. The tech magazines told us a tsunami was coming. And only the investors who hired the best scenario-planners and futurists would be able to survive the wave. And so the future changed from this thing we create together in the present to something we bet on in some kind of a zero-sum, winner-takes-all competition.

03:00
And when things get that competitive about the future, humans are no longer valued for our creativity. No, now we're just valued for our data. Because they can use the data to make predictions. Creativity, if anything, creates noise. That makes it harder to predict. So we ended up with a digital landscape that really repressed creativity, repressed novelty, it repressed what makes us most human.

03:26
We ended up with social media. Does social media really connect people in new, interesting ways? No, social media is about using our data to predict our future behavior. Or when necessary, to influence our future behavior so that we act more in accordance with our statistical profiles.

03:45
The digital economy -- does it like people? No, if you have a business plan, what are you supposed to do? Get rid of all the people. Human beings, they want health care, they want money, they want meaning. You can't scale with people. (Laughter)

04:00
Even our digital apps -- they don't help us form any rapport or solidarity. I mean, where's the button on the ride-hailing app for the drivers to talk to one another about their working conditions or to unionize?

04:13
Even our videoconferencing tools, they don't allow us to establish real rapport. However good the resolution of the video, you still can't see if somebody's irises are opening to really take you in. All of the things that we've done to establish rapport, that we've developed over hundreds of thousands of years of evolution, they don't work; you can't see if someone's breath is syncing up with yours. So the mirror neurons never fire, the oxytocin never goes through your body, you never have that experience of bonding with the other human being. And instead, you're left like, "Well, they agreed with me, but did they really, did they really get me?" And we don't blame the technology for that lack of fidelity. We blame the other person.

04:55
You know, even the technologies and the digital initiatives that we have to promote humans are intensely anti-human at the core.

05:05
Think about the blockchain. The blockchain is here to help us have a great humanized economy? No. The blockchain does not engender trust between users, the blockchain simply substitutes for trust in a new, even less transparent way.

05:21
Or the code movement. I mean, education is great, we love education, and it's a wonderful idea that we want kids to be able to get jobs in the digital future, so we'll teach them code now. But since when is education about getting jobs? Education wasn't about getting jobs. Education was compensation for a job well done. The idea of public education was for coal miners, who would work in the coal mines all day, then they'd come home and they should have the dignity to be able to read a novel and understand it. Or the intelligence to be able to participate in democracy. When we make it an extension of the job, what are we really doing? We're just letting corporations externalize the cost of training their workers.

06:05
And the worst of all really is the humane technology movement. I mean, I love these guys, the former guys who used to take the algorithms from Las Vegas slot machines and put them in our social media feed so that we get addicted. Now they've seen the error of their ways and they want to make technology more humane. But when I hear the expression "humane technology," I think about cage-free chickens or something. We're going to be as humane as possible to them, until we take them to the slaughter. So now they're going to let these technologies be as humane as possible, as long as they extract enough data and extract enough money from us to please their shareholders.

06:42
Meanwhile, the shareholders, for their part, they're just thinking, "I need to earn enough money now, so I can insulate myself from the world I'm creating by earning money in this way." (Laughter) No matter how many VR goggles they slap on their faces and whatever fantasy world they go into, they can't externalize the slavery and pollution that was caused through the manufacture of the very device.

07:07
It reminds me of Thomas Jefferson's dumbwaiter. Now, we like to think that he made the dumbwaiter in order to spare his slaves all that labor of carrying the food up to the dining room for the people to eat. That's not what it was for, it wasn't for the slaves, it was for Thomas Jefferson and his dinner guests, so they didn't have to see the slave bringing the food up. The food just arrived magically, like it was coming out of a "Star Trek" replicator.

07:32
It's part of an ethos that says, human beings are the problem and technology is the solution. We can't think that way anymore. We have to stop using technology to optimize human beings for the market and start optimizing technology for the human future.

07:55
But that's a really hard argument to make these days, because humans are not popular beings. I talked about this in front of an environmentalist just the other day, and she said, "Why are you defending humans? Humans destroyed the planet. They deserve to go extinct." (Laughter)

08:13
Even our popular media hates humans. Watch television, all the sci-fi shows are about how robots are better and nicer than people. Even zombie shows -- what is every zombie show about? Some person, looking at the horizon at some zombie going by, and they zoom in on the person and you see the person's face, and you know what they're thinking: "What's really the difference between that zombie and me? He walks, I walk. He eats, I eat. He kills, I kill." But he's a zombie. At least you're aware of it. If we are actually having trouble distinguishing ourselves from zombies, we have a pretty big problem going on. (Laughter)

08:52
And don't even get me started on the transhumanists. I was on a panel with a transhumanist, and he's going on about the singularity. "Oh, the day is going to come really soon when computers are smarter than people. And the only option for people at that point is to pass the evolutionary torch to our successor and fade into the background. Maybe at best, upload your consciousness to a silicon chip. And accept your extinction." (Laughter)

09:18
And I said, "No, human beings are special. We can embrace ambiguity, we understand paradox, we're conscious, we're weird, we're quirky. There should be a place for humans in the digital future." And he said, "Oh, Rushkoff, you're just saying that because you're a human." (Laughter) As if it's hubris.

09:39
OK, I'm on "Team Human." That was the original insight of the digital age. That being human is a team sport, evolution's a collaborative act. Even the trees in the forest, they're not all in competition with each other, they're connected with the vast network of roots and mushrooms that let them communicate with one another and pass nutrients back and forth. If human beings are the most evolved species, it's because we have the most evolved ways of collaborating and communicating. We have language. We have technology.

10:14
It's funny, I used to be the guy who talked about the digital future for people who hadn't yet experienced anything digital. And now I feel like I'm the last guy who remembers what life was like before digital technology.

10:28
It's not a matter of rejecting the digital or rejecting the technological. It's a matter of retrieving the values that we're in danger of leaving behind and then embedding them in the digital infrastructure for the future. And that's not rocket science. It's as simple as making a social network that, instead of teaching us to see people as adversaries, teaches us to see our adversaries as people. It means creating an economy that doesn't favor a platform monopoly that wants to extract all the value out of people and places, but one that promotes the circulation of value through a community and allows us to establish platform cooperatives that distribute ownership as widely as possible. It means building platforms that don't repress our creativity and novelty in the name of prediction but actually promote creativity and novelty, so that we can come up with some of the solutions to actually get ourselves out of the mess that we're in.

11:27
No, instead of trying to earn enough money to insulate ourselves from the world we're creating, why don't we spend that time and energy making the world a place that we don't feel the need to escape from? There is no escape, there is only one thing going on here. Please, don't leave. Join us. We may not be perfect, but whatever happens, at least you won't be alone. Join "Team Human." Find the others. Together, let's make the future that we always wanted.

12:01
Oh, and those tech billionaires who wanted to know how to maintain control of their security force after the apocalypse, you know what I told them? "Start treating those people with love and respect right now. Maybe you won't have an apocalypse to worry about." Thank you. (Applause)