How to be "Team Human" in the digital future | Douglas Rushkoff

117,366 views · 2019-01-14

TED


ืชืจื’ื•ื: Yael Ring ืขืจื™ื›ื”: Nurit Noy
00:13
I got invited to an exclusive resort to deliver a talk about the digital future to what I assumed would be a couple of hundred tech executives.

00:23
And I was there in the green room, waiting to go on, and instead of bringing me to the stage, they brought five men into the green room who sat around this little table with me. They were tech billionaires.

00:35
And they started peppering me with these really binary questions, like: Bitcoin or Ethereum? Virtual reality or augmented reality? I don't know if they were taking bets or what.

00:48
And as they got more comfortable with me, they edged towards their real question of concern: Alaska or New Zealand?

00:57
That's right. These tech billionaires were asking a media theorist for advice on where to put their doomsday bunkers.

01:04
We spent the rest of the hour on the single question: "How do I maintain control of my security staff after the event?"

01:13
By "the event" they mean the thermonuclear war or climate catastrophe or social unrest that ends the world as we know it, and more importantly, makes their money obsolete.

01:26
And I couldn't help but think: these are the wealthiest, most powerful men in the world, yet they see themselves as utterly powerless to influence the future. The best they can do is hang on for the inevitable catastrophe and then use their technology and money to get away from the rest of us. And these are the winners of the digital economy.

01:50
(Laughter)

01:53
The digital renaissance was about the unbridled potential of the collective human imagination. It spanned everything from chaos math and quantum physics to fantasy role-playing and the Gaia hypothesis, right? We believed that human beings connected could create any future we could imagine.

02:20
And then came the dot-com boom. And the digital future became stock futures. And we used all that energy of the digital age to pump steroids into the already dying NASDAQ stock exchange.

02:35
The tech magazines told us a tsunami was coming. And only the investors who hired the best scenario-planners and futurists would be able to survive the wave.

02:47
And so the future changed from this thing we create together in the present to something we bet on in some kind of a zero-sum, winner-takes-all competition.

03:00
And when things get that competitive about the future, humans are no longer valued for our creativity. No, now we're just valued for our data. Because they can use the data to make predictions. Creativity, if anything, creates noise. That makes it harder to predict.

03:17
So we ended up with a digital landscape that really repressed creativity, repressed novelty; it repressed what makes us most human. We ended up with social media.

03:28
Does social media really connect people in new, interesting ways? No, social media is about using our data to predict our future behavior. Or, when necessary, to influence our future behavior so that we act more in accordance with our statistical profiles.

03:45
The digital economy -- does it like people? No, if you have a business plan, what are you supposed to do? Get rid of all the people. Human beings, they want health care, they want money, they want meaning. You can't scale with people.

03:59
(Laughter)

04:00
Even our digital apps -- they don't help us form any rapport or solidarity. I mean, where's the button on the ride-hailing app for the drivers to talk to one another about their working conditions or to unionize?

04:13
Even our videoconferencing tools, they don't allow us to establish real rapport. However good the resolution of the video, you still can't see if somebody's irises are opening to really take you in.

04:25
All of the things that we've done to establish rapport, that we've developed over hundreds of thousands of years of evolution, they don't work. You can't see if someone's breath is syncing up with yours. So the mirror neurons never fire, the oxytocin never goes through your body, you never have that experience of bonding with the other human being.

04:43
And instead, you're left like, "Well, they agreed with me, but did they really, did they really get me?" And we don't blame the technology for that lack of fidelity. We blame the other person.

04:55
You know, even the technologies and the digital initiatives that we have to promote humans are intensely anti-human at the core.

05:05
Think about the blockchain. The blockchain is here to help us have a great humanized economy? No. The blockchain does not engender trust between users, the blockchain simply substitutes for trust in a new, even less transparent way.

05:21
Or the code movement. I mean, education is great, we love education, and it's a wonderful idea that we want kids to be able to get jobs in the digital future, so we'll teach them code now. But since when is education about getting jobs?

05:35
Education wasn't about getting jobs. Education was compensation for a job well done. The idea of public education was for coal miners, who would work in the coal mines all day, then they'd come home and they should have the dignity to be able to read a novel and understand it. Or the intelligence to be able to participate in democracy.

05:55
When we make it an extension of the job, what are we really doing? We're just letting corporations really externalize the cost of training their workers.

06:05
And the worst of all really is the humane technology movement. I mean, I love these guys, the former guys who used to take the algorithms from Las Vegas slot machines and put them in our social media feed so that we get addicted.

06:18
Now they've seen the error of their ways and they want to make technology more humane. But when I hear the expression "humane technology," I think about cage-free chickens or something. We're going to be as humane as possible to them, until we take them to the slaughter.

06:33
So now they're going to let these technologies be as humane as possible, as long as they extract enough data and extract enough money from us to please their shareholders. Meanwhile, the shareholders, for their part, they're just thinking, "I need to earn enough money now, so I can insulate myself from the world I'm creating by earning money in this way."

06:51
(Laughter)

06:54
No matter how many VR goggles they slap on their faces and whatever fantasy world they go into, they can't externalize the slavery and pollution that was caused through the manufacture of the very device. It reminds me of Thomas Jefferson's dumbwaiter.

07:10
Now, we like to think that he made the dumbwaiter in order to spare his slaves all that labor of carrying the food up to the dining room for the people to eat. That's not what it was for, it wasn't for the slaves, it was for Thomas Jefferson and his dinner guests, so they didn't have to see the slave bringing the food up. The food just arrived magically, like it was coming out of a "Star Trek" replicator.

07:32
It's part of an ethos that says, human beings are the problem and technology is the solution.

07:40
We can't think that way anymore. We have to stop using technology to optimize human beings for the market and start optimizing technology for the human future.

07:55
But that's a really hard argument to make these days, because humans are not popular beings. I talked about this in front of an environmentalist just the other day, and she said, "Why are you defending humans? Humans destroyed the planet. They deserve to go extinct."

08:10
(Laughter)

08:13
Even our popular media hates humans. Watch television, all the sci-fi shows are about how robots are better and nicer than people. Even zombie shows -- what is every zombie show about?

08:24
Some person, looking at the horizon at some zombie going by, and they zoom in on the person and you see the person's face, and you know what they're thinking: "What's really the difference between that zombie and me? He walks, I walk. He eats, I eat. He kills, I kill."

08:42
But he's a zombie. At least you're aware of it. If we are actually having trouble distinguishing ourselves from zombies, we have a pretty big problem going on.

08:51
(Laughter)

08:52
And don't even get me started on the transhumanists. I was on a panel with a transhumanist, and he's going on about the singularity.

08:59
"Oh, the day is going to come really soon when computers are smarter than people. And the only option for people at that point is to pass the evolutionary torch to our successor and fade into the background. Maybe at best, upload your consciousness to a silicon chip. And accept your extinction."

09:16
(Laughter)

09:18
And I said, "No, human beings are special. We can embrace ambiguity, we understand paradox, we're conscious, we're weird, we're quirky. There should be a place for humans in the digital future."

09:31
And he said, "Oh, Rushkoff, you're just saying that because you're a human."

09:34
(Laughter)

09:36
As if it's hubris.

09:39
OK, I'm on "Team Human." That was the original insight of the digital age. That being human is a team sport, evolution's a collaborative act.

09:52
Even the trees in the forest, they're not all in competition with each other, they're connected with the vast network of roots and mushrooms that let them communicate with one another and pass nutrients back and forth.

10:03
If human beings are the most evolved species, it's because we have the most evolved ways of collaborating and communicating. We have language. We have technology.

10:14
It's funny, I used to be the guy who talked about the digital future for people who hadn't yet experienced anything digital. And now I feel like I'm the last guy who remembers what life was like before digital technology.

10:28
It's not a matter of rejecting the digital or rejecting the technological. It's a matter of retrieving the values that we're in danger of leaving behind and then embedding them in the digital infrastructure for the future. And that's not rocket science.

10:44
It's as simple as making a social network that, instead of teaching us to see people as adversaries, teaches us to see our adversaries as people.

10:54
It means creating an economy that doesn't favor a platform monopoly that wants to extract all the value out of people and places, but one that promotes the circulation of value through a community and allows us to establish platform cooperatives that distribute ownership as widely as possible.

11:12
It means building platforms that don't repress our creativity and novelty in the name of prediction but actually promote creativity and novelty, so that we can come up with some of the solutions to actually get ourselves out of the mess that we're in.

11:27
No, instead of trying to earn enough money to insulate ourselves from the world we're creating, why don't we spend that time and energy making the world a place that we don't feel the need to escape from?

11:38
There is no escape, there is only one thing going on here.

11:42
Please, don't leave. Join us. We may not be perfect, but whatever happens, at least you won't be alone. Join "Team Human." Find the others. Together, let's make the future that we always wanted.

12:01
Oh, and those tech billionaires who wanted to know how to maintain control of their security force after the apocalypse, you know what I told them? "Start treating those people with love and respect right now. Maybe you won't have an apocalypse to worry about."

12:16
Thank you.

12:18
(Applause)
ืขืœ ืืชืจ ื–ื”

ืืชืจ ื–ื” ื™ืฆื™ื’ ื‘ืคื ื™ื›ื ืกืจื˜ื•ื ื™ YouTube ื”ืžื•ืขื™ืœื™ื ืœืœื™ืžื•ื“ ืื ื’ืœื™ืช. ืชื•ื›ืœื• ืœืจืื•ืช ืฉื™ืขื•ืจื™ ืื ื’ืœื™ืช ื”ืžื•ืขื‘ืจื™ื ืขืœ ื™ื“ื™ ืžื•ืจื™ื ืžื”ืฉื•ืจื” ื”ืจืืฉื•ื ื” ืžืจื—ื‘ื™ ื”ืขื•ืœื. ืœื—ืฅ ืคืขืžื™ื™ื ืขืœ ื”ื›ืชื•ื‘ื™ื•ืช ื‘ืื ื’ืœื™ืช ื”ืžื•ืฆื’ื•ืช ื‘ื›ืœ ื“ืฃ ื•ื™ื“ืื• ื›ื“ื™ ืœื”ืคืขื™ืœ ืืช ื”ืกืจื˜ื•ืŸ ืžืฉื. ื”ื›ืชื•ื‘ื™ื•ืช ื’ื•ืœืœื•ืช ื‘ืกื ื›ืจื•ืŸ ืขื ื”ืคืขืœืช ื”ื•ื•ื™ื“ืื•. ืื ื™ืฉ ืœืš ื”ืขืจื•ืช ืื• ื‘ืงืฉื•ืช, ืื ื ืฆื•ืจ ืื™ืชื ื• ืงืฉืจ ื‘ืืžืฆืขื•ืช ื˜ื•ืคืก ื™ืฆื™ืจืช ืงืฉืจ ื–ื”.

https://forms.gle/WvT1wiN1qDtmnspy7