A funny look at the unintended consequences of technology | Chuck Nice

275,018 views ・ 2018-02-27

TED



Translator: Lilian Chiu  Reviewer: NAN-KUN WU
00:12
Future tech always comes with two things: promise and unintended consequences. And it's those consequences that I want to explore.

00:23
And before we get to how future tech may affect us, I'd like to spend a little time exploring the unintended consequences of some of our recent tech, namely, social media.

00:34
Social media, a few short years ago, was the tech of future you. Now it just is you.

00:42
Social media was supposed to bring us together in ways we could never imagine. And the predictors were correct.

00:50
These three girls are talking to one another without the awkward discomfort of eye contact.

00:56
(Laughter)

00:57
I call that advancement.

01:01
We were supposed to be caught up in a communication tsunami, the likes of which the world has never seen. And that did happen. And so did this.
01:13
(Sings) One of these things is not like the other.
20
73153
3044
(唱歌)這些東西當中 有一個和其他的不一樣。
01:16
(Speaks) Now, look at this picture.
21
76221
1697
(說話)看看這張照片。
01:17
If you picked the guy with the book, you’re wrong --
22
77942
2464
如果你選拿書的那個人,你就錯了──
01:20
or, as a certain president would say, "Wrong!"
23
80430
2764
或是某位總統會說:「錯!」
01:23
(Laughter)
24
83218
1922
(笑聲)
01:27
Clearly, three of these guys are reading,
25
87496
2287
很顯然,當中有三個人在閱讀,
01:29
and one guy, on the end, is listening to music
26
89807
2380
而在邊上的那個人則是在聽音樂,
01:32
and playing "Candy Crush."
27
92211
1499
同時在玩「糖果傳奇」。
01:33
(Laughter)
28
93734
1551
(笑聲)
01:35
So are we more connected,
29
95862
1583
所以,我們有比較連結嗎?
01:37
or are we just more connected to our devices?
30
97469
3645
還是我們只是和我們的 裝置比較連結了?
01:42
Social media was supposed to place us in a veritable town square, where we could engage one another with challenging ideas and debates. And instead what we got were trolls. This is an actual tweet that I received.

01:58
"Chuck, no one wants to hear your stupid, ill-informed political views! I hope you get leprosy and die. Love, Dad"

02:06
(Laughter)

02:08
Now, the great thing about that tweet, if you look at it, just like most trolls, it's not that bad, because he wished "leporsy" on me instead of "leprosy," and "leporsy" is not dangerous at all.

02:20
(Laughter)

02:21
(Applause)
02:26
Along with trolls, we got a brand new way of torturing teenagers -- cyberbullying. A concept that my 75-year-old mother just can't seem to wrap her head around.

02:38
"So, uh, did they hit him?"

"No, Mom, they didn't hit him."

"Did they take his money?"

"No, Mom, they didn't take his money."

"Did they put his face in the toilet?"

"No, Mom, they didn't --"

"Well, what did they do?"

"They attacked him on the internet."

"Attacked him on the internet?"

(Laughter)

"Well, why don't you just turn off the internet?"

(Laughter)

"Your whole generation is a bunch of wussies."

(Laughter)

03:03
She's got a point.

(Laughter)

She's got a point.
03:07
And I don't even want to talk about what social media has done to dating. I was on Grindr until I found out it wasn't a sandwich app.

03:16
(Laughter)

03:19
And I can't even tell you about Tinder, except for the fact that if you think there is a limit to the amount of anonymous sex we can have on this planet, you are sadly mistaken.

03:32
(Laughter)
03:33
So where do we go from here? Well, let's just jump right in and play the hits. Driverless cars. Something that has already been around for many years, just without the assistance of computers.

03:44
(Laughter)

03:47
(Applause)

03:50
Because for years, we have been driving while texting, putting on makeup, shaving, reading -- actually reading -- that would be me.

04:00
(Laughter)
04:01
The other thing is that since driverless cars will be shared,
85
241668
2999
還有一件事,因為 無人駕駛汽車將會是共乘的,
04:04
most people won't own cars,
86
244691
1316
大部分人不會擁有汽車,
04:06
and that means the DMV will go away.
87
246031
2318
機動車輛管理局(DMV)將會消失。
04:09
The DMV -- I know what you're saying right now.
88
249000
2260
DMV──我知道你們現在在說什麼。
「這個傢伙不可能會站在這裡
04:11
"There's no way this guy is going to stand up here
89
251284
2367
為 DMV 說好話。」
04:13
and make a case for the DMV."
90
253675
1425
我不知道你們如何, 但我並不希望在我住的世界裡,
04:15
Well, I don't know about you, but I do not want to live in a world
91
255124
3173
04:18
where harsh fluorescent lights,
92
258321
2278
有刺眼的日光燈、
04:20
endless lines,
93
260623
1898
看不到尾巴的排隊隊伍、
04:22
terrible forms to fill out
94
262545
1836
要填寫很糟的表格,
04:24
and disaffected, soulless bureaucrats remind me
95
264405
3975
以及充滿怨氣又無情的官僚,來提醒我
04:28
that I am pretty damn lucky not to work here.
96
268404
3442
我不用在這裡工作是多幸運的事情。
04:31
(Laughter)
97
271870
1150
(笑聲)
04:33
That is the real service they provide.
98
273490
2250
他們真正提供的就是那種服務。
04:36
The DMV:
99
276458
1517
DMV:
04:37
come for the registration renewal,
100
277999
2061
因為要換新駕照而來,
04:40
stay for the satisfaction of knowing you made some pretty good life choices.
101
280084
4251
因為你對你做的人生選擇 感到滿意而留下。
04:44
(Laughter)
102
284359
1847
(笑聲)
04:49
Nobody will own their car in the future, and that means teenagers will not have a place to make out. So you know what that means. That means they will order driverless cars to do just that. I do not want to step into a vehicle and ask the question: "Why does this car smell like awkwardness, failure and shame?"

05:09
(Laughter)

05:12
If I want to ask that question, I'll walk into my own bedroom.

05:15
(Laughter)
05:17
So what else do we have to look forward to?
112
317389
2010
所以,我們還能期待什麼?
05:19
That's right, artificial intelligence.
113
319423
1824
沒錯,人工智慧。
05:21
Artificial intelligence, yes.
114
321271
2291
人工智慧,是的。
05:23
You know, there was a time when artificial intelligence was a joke.
115
323586
3189
曾經有個時期, 人工智慧只是個笑話。
05:26
I mean, literally a quip that you would hear at a cocktail party
116
326799
3632
真的就是你在雞尾酒派對上 會聽到的玩笑話,
05:30
when somebody would bring it up in conversation:
117
330455
2361
有人可能會在談話間提到:
05:32
"Artificial intelligence.
118
332840
1932
「人工智慧。
05:34
The only real artificial intelligence is our American Congress.
119
334796
3556
唯一的人工智慧就是 我們的美國國會。
05:38
Ha, ha, ha, ha, ha."
120
338376
2028
哈哈哈哈哈。」
05:40
Well, it's not funny anymore.
121
340428
1417
嗯,那不再好笑了。
05:41
(Laughter)
122
341869
2039
(笑聲)
05:48
Stephen Hawking, Elon Musk and Bill Gates have all gone on record expressing grave reservations about artificial intelligence. That's like Jesus, Moses and Muhammad coming together and saying, "Guys, guys -- here's something we can all believe in."

06:02
(Laughter)

06:03
You might want to go with that, is all I'm saying.
06:07
We are actually teaching machines how to think, how to understand our behavior, how to defend themselves and even practice deception. What could possibly go wrong?

06:20
(Laughter)

06:23
The one thing that's for sure: the creation always despises its creator. OK? The Titans rose up against the gods; Lucifer against Jehovah. And anybody who has a teenager has heard these words: "I hate you and you're ruining my life! I hate you!"

06:42
Now just imagine that sentiment with a machine that can outthink you and is heavily armed.

06:48
(Laughter)

06:51
The result? Absolutely.

06:54
(Laughter)
06:58
What we need to do before we perfect artificial intelligence is perfect artificial emotions. That way, we can teach the robots or machines how to love us unconditionally, so that when they figure out that the only real problem on this planet is us, instead of destroying us -- which, by the way, is totally logical -- they will find us adorable --

07:24
(Laughter)

07:25
like baby poop.

07:26
(Laughter)

07:27
"Oh my god, I just love the way you just destroyed the planet. I can't stay mad at you, you're so cute! You're so cute!"

07:34
(Laughter)
07:36
Can't talk about this without talking about robotics. OK? Remember when you thought robotics were cool? I remember when I thought robotics were cool, until I figured out that they were going to take everybody's place, from the delivery guy down to the heart surgeon. The one thing, though, that is very disappointing about robotics is the holy grail of robotics, and it hasn't even happened. I'm talking about the robot girlfriend, the dream of one lonely geek in a windowless basement who vowed one day: "I am going to marry my creation."

08:08
And there actually is a movement underway to stop this from happening, for fear of exploitation. And I, for one, am against that movement. I believe we should have robot girlfriends. I just believe that they should come with a feminist protocol and artificial intelligence, so she can take one look at that guy and go, "I am too good for you. I'm leaving."

08:35
(Laughter)

08:36
(Applause)
08:40
And finally,
185
520371
1631
最後,
08:42
I have to talk about bioengineering,
186
522026
2182
我得要談談生物工程,
08:44
an area of science that promises to end disease before it even begins,
187
524714
6053
它是一個科學領域,保證在 疾病爆發之前就把它終止,
08:51
to help us live longer, fuller, healthier lives.
188
531658
4420
協助我們過著更長壽、 更豐足、更健康的生活。
08:57
And when you couple that with implantable hardware,
189
537218
3122
如果你把它 和可植入的硬體連結起來,
09:00
you are looking at the next incarnation of human evolution.
190
540364
4427
你在看的就是下一波的人類演化。
09:04
And all of that sounds great,
191
544815
2316
那些全都聽起來很美好,
09:07
until you figure out where it's really going.
192
547155
2301
直到你發現這條路會通到什麼結果。
09:09
One place:
193
549480
1179
那就是:
09:11
designer babies,
194
551133
1319
設計出來的寶寶,
09:13
where, no matter where you are on the globe
195
553317
2027
不論你在世界的什麼地方,
09:15
or what your ethnicity,
196
555368
1785
不論你是哪個人種,
09:17
babies will end up looking like that.
197
557177
2634
寶寶最後都會像這樣。
09:19
(Laughter)
198
559835
1596
(笑聲)
09:21
That boy is surprised
199
561455
2582
那個男孩很驚訝,
09:24
because he just found out both his parents are black.
200
564061
3479
因為他剛剛發現他的父母都是黑人。
09:27
(Laughter)
201
567564
2708
(笑聲)
09:35
Can you imagine him at a cocktail party in 20 years?
202
575569
3168
你們能想像二十年後 他參加雞尾酒派對的情況?
09:39
"Yeah, both my parents are black.
203
579429
1613
「是的,我的父母都是黑人。
09:41
I mean, it's a little awkward at times,
204
581066
2199
有時候是有點尷尬,
09:43
but you should see my credit rating.
205
583289
1747
但你應該要看我的信用積分。
09:45
Impressive, very impressive."
206
585060
1723
很讓人印象深刻,非常。」
09:46
(Laughter)
207
586807
1439
(笑聲)
09:49
Now, all of this seems scary, and everybody in this room knows that it isn't. Technology isn't scary. Never has been and it never will be. What's scary is us and what we will do with technology. Will we allow it to expose our humanity, showing our true selves and reinforcing the fact that we are indeed our brother's keeper? Or will we allow it to reveal our deepest, darkest demons?

10:22
The true question is not whether or not technology is scary. The true question is: How human are you?

10:33
Thank you.

10:34
(Applause)