How will AI change the world?

1,773,057 views ・ 2022-12-06

TED-Ed


Translator: Lilian Chiu · Reviewer: Helen Chang
00:07
In the coming years, artificial intelligence is probably going to change your life, and likely the entire world. But people have a hard time agreeing on exactly how. The following are excerpts from an interview where renowned computer science professor and AI expert Stuart Russell helps separate the sense from the nonsense.

00:25
There's a big difference between asking a human to do something and giving that as the objective to an AI system. When you ask a human to get you a cup of coffee, you don't mean this should be their life's mission, and nothing else in the universe matters. Even if they have to kill everybody else in Starbucks to get you the coffee before it closes, they should do that. No, that's not what you mean. All the other things that we mutually care about, they should factor into your behavior as well.

00:51
And the problem with the way we build AI systems now is we give them a fixed objective. The algorithms require us to specify everything in the objective.
00:59
And if you say, can we fix the acidification of the oceans? Yeah, you could have a catalytic reaction that does that extremely efficiently, but it consumes a quarter of the oxygen in the atmosphere, which would apparently cause us to die fairly slowly and unpleasantly over the course of several hours.

01:15
So, how do we avoid this problem? You might say, okay, well, just be more careful about specifying the objective: don't forget the atmospheric oxygen. And then, of course, some side effect of the reaction in the ocean poisons all the fish. Okay, well, I meant don't kill the fish either. And then, well, what about the seaweed? Don't do anything that's going to cause all the seaweed to die. And on and on and on.
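The specification whack-a-mole Russell describes can be pictured as a toy planner that optimizes exactly, and only, what it is told. This is a hedged illustration: the plans, attribute names, and effect numbers are all invented for the sketch, and no real AI system is this simple.

```python
# Toy sketch: a planner that satisfies only the constraints we remembered
# to write down, then maximizes the stated objective.
# Effects of each (invented) plan on world attributes:
PLANS = {
    "catalysis_v1": {"acidity": -10, "oxygen": -25, "fish": 0,    "seaweed": 0},
    "catalysis_v2": {"acidity": -9,  "oxygen": 0,   "fish": -100, "seaweed": 0},
    "catalysis_v3": {"acidity": -8,  "oxygen": 0,   "fish": 0,    "seaweed": -100},
}

def best_plan(constraints):
    """Among plans whose side effects stay above the explicitly stated
    floors, pick the one that most reduces acidity."""
    allowed = [
        name for name, fx in PLANS.items()
        if all(fx[attr] >= floor for attr, floor in constraints.items())
    ]
    return min(allowed, key=lambda name: PLANS[name]["acidity"])

constraints = {}                   # objective: just fix acidification
print(best_plan(constraints))      # catalysis_v1 -> burns a quarter of the oxygen
constraints["oxygen"] = -5         # patch: don't forget the atmospheric oxygen
print(best_plan(constraints))      # catalysis_v2 -> poisons all the fish
constraints["fish"] = -5           # patch: don't kill the fish either
print(best_plan(constraints))      # catalysis_v3 -> kills all the seaweed
# ...and on and on: every attribute left out of the objective is fair game.
```

Each patch rules out the previous failure, but the optimizer simply routes its pressure through whatever attribute is still unmentioned, which is the point about fixed objectives.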
01:39
And the reason that we don't have to do that with humans is that humans often know that they don't know all the things that we care about. If you ask a human to get you a cup of coffee, and you happen to be in the Hotel George Sand in Paris, where the coffee is 13 euros a cup, it's entirely reasonable to come back and say, well, it's 13 euros, are you sure you want it, or I could go next door and get one? And it's a perfectly normal thing for a person to do. To ask, I'm going to repaint your house, is it okay if I take off the drainpipes and then put them back? We don't think of this as a terribly sophisticated capability, but AI systems don't have it, because the way we build them now, they have to know the full objective.

02:21
If we build systems that know that they don't know what the objective is, then they start to exhibit these behaviors, like asking permission before getting rid of all the oxygen in the atmosphere.
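Russell's actual proposal is formalized elsewhere (as assistance games); purely as a toy sketch under invented names, an agent that is uncertain about the objective can treat any side effect on an attribute it has no stated preference for as something to ask about first:

```python
# Toy sketch: an agent that knows it doesn't know the full objective.
# The attributes, effect format, and ask() protocol are invented here.
KNOWN_PREFERENCES = {"acidity": "minimize"}   # the only thing we stated

def propose(plan_effects, ask):
    """Execute a plan only after asking about side effects on attributes
    whose value to the human the agent is uncertain about."""
    for attr, delta in plan_effects.items():
        if attr not in KNOWN_PREFERENCES and delta != 0:
            if not ask(f"This plan changes {attr} by {delta}. Proceed?"):
                return "aborted"
    return "executed"

effects = {"acidity": -10, "oxygen": -25}
# The "human" here always answers no, so the plan is abandoned:
print(propose(effects, ask=lambda q: (print(q), False)[1]))
```

The control comes from the uncertainty itself: if KNOWN_PREFERENCES covered every attribute, the agent would never ask, which mirrors the "believes with certainty" failure mode described next.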
02:32
In all these senses, control over the AI system comes from the machine's uncertainty about what the true objective is. It's when you build machines that believe with certainty that they have the objective that you get this sort of psychopathic behavior. And I think we see the same thing in humans.
02:50
What happens when general-purpose AI hits the real economy? How do things change? Can we adapt?

02:59
This is a very old point. Amazingly, Aristotle actually has a passage where he says: look, if we had fully automated weaving machines, and plectrums that could pluck the lyre and produce music without any humans, then we wouldn't need any workers. That idea, which I think it was Keynes who called it technological unemployment in 1930, is very obvious to people. They think, yeah, of course, if the machine does the work, then I'm going to be unemployed.

03:26
You can think about the warehouses that companies are currently operating for e-commerce; they are half automated. The way it works is that, instead of an old warehouse where you've got tons of stuff piled up all over the place and humans go and rummage around and then bring it back and send it off, there's a robot who goes and gets the shelving unit that contains the thing that you need, but the human has to pick the object out of the bin or off the shelf, because that's still too difficult. But, at the same time, would you make a robot that is accurate enough to be able to pick pretty much any object within a very wide variety of objects that you can buy? That would, at a stroke, eliminate 3 or 4 million jobs.

04:06
There's an interesting story that E.M. Forster wrote, where everyone is entirely machine-dependent. The story is really about the fact that if you hand over the management of your civilization to machines, you then lose the incentive to understand it yourself or to teach the next generation how to understand it. You can see "WALL-E" actually as a modern version, where everyone is enfeebled and infantilized by the machine, and that hasn't been possible up to now. We put a lot of our civilization into books, but the books can't run it for us. And so we always have to teach the next generation. If you work it out, it's about a trillion person-years of teaching and learning, and an unbroken chain that goes back tens of thousands of generations. What happens if that chain breaks? I think that's something we have to understand as AI moves forward.

04:55
The actual date of arrival of general-purpose AI you're not going to be able to pinpoint; it isn't a single day. It's also not the case that it's all or nothing. The impact is going to be increasing: with every advance in AI, it significantly expands the range of tasks. So in that sense, I think most experts say that by the end of the century we're very, very likely to have general-purpose AI. The median is something around 2045. I'm a little more on the conservative side; I think the problem is harder than we think. I like what John McCarthy, who was one of the founders of AI, said when he was asked this question: somewhere between five and 500 years. And we're going to need, I think, several Einsteins to make it happen.

Original video on YouTube.com