Translator: Yip Yan Yeung
Reviewer: Sue Lu
00:03
Whether you're thrilled by what AI can do for us
00:06
or terrified by what AI is going to do to us,
00:12
whether it can be funny is probably not top of mind for you.
00:16
It is for me.
00:18
I don't care if it turns all of us into paperclips,
00:22
as long as they're funny paperclips.
00:25
(Laughter)
00:26
And the fact that it makes stuff up, hallucinates,
00:29
for me, that's not a bug, that's a feature.
00:33
My entire career was making stuff up.
00:37
They're called cartoons.
00:40
This is probably the most famous one I hallucinated.
00:46
There are a number of theories of humor that could explain this cartoon.
00:50
There's the superiority theory.
00:53
You're the guy on the phone, not on the other end.
00:57
The incongruity theory.
00:59
There's a mismatch between the politeness of the language
01:03
and the rudeness of the message.
01:06
And the benign violation theory of humor,
01:08
which is sort of a golden ratio theory of humor, if you will.
01:13
See, I got in that term
01:15
(Laughter)
01:17
Which says, for something to be funny,
01:19
it has to have just the right amount of wrong.
01:25
Now I didn’t use any theories to create that cartoon.
01:28
And I once said at a talk at Google,
01:31
there is no algorithm for humor.
01:34
But now with the rapid pace of AI,
01:37
I have to wonder, could there be a bot Mankoff?
01:42
(Laughter)
01:47
You might think my reflexive answer to this would be, how about never?
01:55
But while I don't want to be replaced by a bot,
01:58
I'm not above being helped by it.
02:01
Steve Jobs famously said
02:03
that computers are a bicycle for the mind.
02:07
If that's the case, what's AI, a rocket ship?
02:10
And at my age, you know what?
02:12
I'd settle for a walker.
02:14
(Laughter)
02:20
The fears of machines replacing humans are not new.
02:26
This cartoon anticipates by five decades
02:31
what's now called the alignment problem,
02:34
when the goals of machines and humans go horribly awry,
02:40
at least for one of the parties.
02:43
Cartoons don't happen in a cultural vacuum.
02:48
They're part of the zeitgeist.
02:51
Here's a contemporaneous article
02:55
about the guy who invented cybernetics,
02:58
Norbert Wiener,
02:59
who said thinking machines were putting us on the eve of destruction.
03:05
Now sadly and tragically,
03:08
Norbert Wiener died,
03:11
but not by a thinking machine,
03:12
but by an unthinking one -- he was run over by a bus.
03:16
(Laughter)
03:18
That's not true.
03:19
I made that up.
03:21
(Laughter)
03:22
I hallucinated it because it's funny.
03:26
(Laughter)
03:28
Here's a great update of that older cartoon.
03:33
"To think this all began with letting Autocomplete finish our sentence,"
03:38
which indeed it did.
03:40
So these fears are not new, not novel.
03:45
But now, in the immortal words of Nigel Tufnel of "Spinal Tap,"
03:51
they go to 11.
03:55
They're cranked to the max.
03:57
And here is one of the maximum cranksters of all time,
04:01
Elon Musk, saying,
04:04
"AI is one of the biggest threats to humanity."
04:07
But certainly not as big as Elon Musk.
04:11
(Laughter)
04:13
(Applause and cheers)
04:19
People like Elon have a p(doom) number.
04:22
That's the probability AI is going to wipe us out.
04:28
I think p(doom) is p(dumb).
04:30
(Laughter)
04:32
I'm interested in p(funny),
04:35
and I’ve been using the New Yorker caption contest
04:38
to look into the probability of that.
04:43
Every week since 2005,
04:46
the New Yorker has presented a cartoon without a caption,
04:50
and challenged its readers to come up with the winning caption
04:54
in the caption contest.
04:56
And for that,
04:57
they get the glory of being in the New Yorker magazine,
05:01
a huge amount of money,
05:03
a house in the Bahamas that Sam Bankman-Fried --
05:07
Actually, it's just the glory.
05:12
On the page of the New Yorker,
05:13
there's a contest you enter,
05:16
the finalists from a few weeks before,
05:18
three finalists,
05:20
and a winning caption.
05:21
So it's staggered in that way.
05:24
Each one of these images is funny.
05:28
They're incongruous.
05:29
You'd certainly think they're humorous,
05:31
but they're not funny in a way that you get.
05:34
They're not mentally funny.
05:36
To make it that,
05:37
of course, you need the right caption.
05:40
"Any happily married people here tonight?"
05:43
(Laughter)
05:46
OK, but with up to 10,000 captions
05:50
every week,
05:52
how do you select that?
05:54
Now from 2005 to early 2016,
05:59
that burden fell on me and my assistants,
06:03
but mainly my assistants, to cull,
06:05
to try to cull the good captions
06:08
from what we uncharitably call the “craptions.”
06:12
(Laughter)
06:15
But then, in early 2016,
06:20
for the benefit of all humanity,
06:24
but mainly for me and my assistants,
06:28
we switched to crowdsourcing.
06:30
So now for every contest, you vote online,
06:35
and a funniness score from over a million judgments
06:39
is given for all the captions.
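[Editor's note: as a rough illustration of that crowdsourced scoring idea, here is a minimal sketch, not the contest's actual pipeline (which is far more sophisticated than a plain average). It assumes each online vote is recorded as a rating of 1, 2, or 3 for "unfunny," "somewhat funny," "funny"; the rating scale and all identifiers are assumptions for illustration.]

```python
# Minimal illustration (not the New Yorker's actual pipeline): aggregate
# crowdsourced votes into a per-caption "funniness score" and rank captions.
# Assumes each vote is (caption_id, rating) with rating in {1, 2, 3}.
from collections import defaultdict

def funniness_scores(votes):
    """Return {caption_id: mean rating} from an iterable of (caption_id, rating)."""
    totals = defaultdict(float)
    counts = defaultdict(int)
    for caption_id, rating in votes:
        totals[caption_id] += rating
        counts[caption_id] += 1
    return {cid: totals[cid] / counts[cid] for cid in totals}

def top_captions(votes, k=200):
    """Rank captions by mean score; the finalists tend to come from the top of this list."""
    scores = funniness_scores(votes)
    return sorted(scores, key=scores.get, reverse=True)[:k]

# Example: three captions, a handful of the ~million judgments a contest can draw.
votes = [("c1", 3), ("c1", 2), ("c2", 1), ("c2", 1), ("c3", 3), ("c3", 3)]
print(top_captions(votes, k=2))  # ['c3', 'c1']
```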
06:42
Now overall, I’m against mob rule,
06:45
but actually in this case,
06:46
the mob does a pretty good job.
06:49
Usually -- almost certainly, almost all the time, really --
06:54
the finalists come from the top 200 captions.
06:59
Well, this is popular, not only with the New Yorker,
07:02
but it's caught the eye of data scientists,
07:06
creativity researchers,
07:07
cognitive scientists,
07:09
and AI, of course, and everything adjacent to AI.
07:14
So that's why I wasn't really surprised when --
07:19
Oh, it's up there, thanks.
07:21
I wasn't really surprised when Vincent Vanhoucke,
07:25
then the chief data scientist for Google's DeepMind,
07:29
now their head of robotics,
07:31
sent me this email
07:34
indicating that winning the caption contest
07:38
was actually somewhat of the sine qua non of human creativity
07:43
and intelligence.
07:49
And I was also flattered by that, of course,
07:51
but I didn't think they had any chance at all of doing it.
07:55
And it turned out that was the case.
07:58
All of the AI juju circa 2016
08:03
wasn't up to the task.
08:06
It really couldn't even decode the image.
08:08
So for the sine qua non of the human mind,
08:12
DeepMind was non compos mentis and out of its depth.
08:16
But time and AI marched on,
08:22
AI marching quadruple time.
08:24
Vincent gets back to me and says,
08:27
while human creativity might still be out of reach,
08:30
we think we have understanding well in hand.
08:34
He sends me this ridiculous, uber-nerd example
08:40
of explaining humor.
08:41
And I said, you know what?
08:43
Let me give you a cartoon I did in 1997
08:47
of this other watershed moment
08:50
when IBM's Deep Blue defeated Garry Kasparov,
08:54
the world chess champion.
08:56
And here's the cartoon I did then, and it says,
09:00
"No, I don't want to play chess.
09:02
I just want you to reheat the lasagna."
09:04
(Laughter)
09:07
I rate this explanation a solid B-minus.
09:11
But so what if it was an A?
09:13
Is there ever going to be a beautiful
09:17
New Yorker cartoon anthology of explanations?
09:22
I don't think so.
09:24
But the idea that understanding humor
09:28
could be a stepping stone to creating it
09:31
sort of made sense.
09:34
This paper I was involved in
09:36
tried to look at, compared to smart humans,
09:40
what the best AIs would do on three tasks.
09:44
Could they,
09:45
from winning captions from different contests,
09:48
match them to the right image?
09:50
Could they, between two captions,
09:52
one that won and one that was pretty good,
09:54
pick the right one?
09:56
And could they explain the humor?
09:57
Now for all of them, you know what?
09:59
Yeah, humans were still ahead,
10:02
but AI is closing the gap.
10:06
The most interesting thing about this paper for me
10:08
was it showed a pathway
10:10
by which you could create cartoon humor.
10:14
And that was how we trained it on the contest.
10:19
For 660 --
10:21
653 contests,
10:24
the AI was trained, fine-tuned,
10:27
on these examples which humans annotated:
10:30
a description of the cartoon, an explanation of the humor.
10:34
OK, if you've used ChatGPT, you sort of get the idea now.
10:40
Put in a number of examples like this.
10:43
Put them in the prompt window.
10:46
Rinse and repeat and you get new cartoons.
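[Editor's note: here is a minimal sketch of that "examples in the prompt window" workflow, not the paper's fine-tuning setup. The annotated examples, helper names, and model name are assumptions for illustration; it assumes the `openai` Python package and an API key in the environment.]

```python
# A minimal few-shot prompting sketch: pack human-annotated examples into the
# prompt, then ask a chat model for a caption for a new cartoon description.
from openai import OpenAI  # assumes the `openai` package and OPENAI_API_KEY are set

# Hypothetical annotated examples in the style the talk describes.
EXAMPLES = [
    {
        "description": "A dog sits at an office desk, interviewing a cat.",
        "caption": "Your resume says you have nine lives. Walk me through the first eight.",
        "explanation": "Treats a cat cliche as dry HR due diligence.",
    },
    {
        "description": "Two snowmen stand in a parking lot next to a 'Reserved' sign.",
        "caption": "Technically, we're just holding the spot until spring.",
        "explanation": "The snowmen's impermanence undercuts the formality of 'reserved'.",
    },
]

def build_prompt(new_description: str) -> str:
    """Assemble a few-shot prompt: annotated examples, then the new cartoon."""
    parts = ["Write a New Yorker-style caption for the final cartoon.\n"]
    for ex in EXAMPLES:
        parts.append(
            f"Cartoon: {ex['description']}\n"
            f"Caption: {ex['caption']}\n"
            f"Why it's funny: {ex['explanation']}\n"
        )
    parts.append(f"Cartoon: {new_description}\nCaption:")
    return "\n".join(parts)

def suggest_caption(new_description: str, model: str = "gpt-4o-mini") -> str:
    """Ask a chat model for one caption; the model name is an assumption."""
    client = OpenAI()
    response = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": build_prompt(new_description)}],
    )
    return response.choices[0].message.content.strip()

if __name__ == "__main__":
    # "Rinse and repeat": call this with new descriptions to brainstorm captions.
    print(suggest_caption("A robot stands at an easel, painting a self-portrait."))
```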
10:50
Well, Jack Hessel, the chief author of the paper,
10:54
did something more sophisticated.
10:57
And what he did was create 50 synthetic new cartoons
11:00
generated from this old data
11:03
in which there were five options for captions.
11:06
I picked four of them,
11:08
and I gave them to cartoonist Shannon Wheeler to draw up.
11:13
Here are the results.
11:16
Now Shannon said, well, these are weird.
11:20
They don't really seem like --
11:22
it's sort of an uncanny valley of cartoons.
11:25
They're not quite there.
11:26
But it is interesting.
11:28
All of these are new cartoons that never appeared anywhere
11:33
that are an idea of human --
11:36
of computer creativity.
11:40
And also when you look at it, what it is,
11:42
it's weirdness in, weirdness out.
11:44
The caption contest cartoons are weird.
11:48
But I do see this now as a tool for brainstorming for cartoonists,
11:54
in that we played this completely straight.
11:57
Shannon wasn't able to manipulate the description of the picture
12:00
or the caption.
12:02
Had he done that, it could have been better.
12:04
Also, we could have asked it to make more.
12:07
We could have put in the rankings for the humor.
12:10
We could do all this to improve it.
12:12
So quality comes out of quantity.
12:16
You can get an awful lot of quantity here.
12:18
You can have a human being in the loop to do this.
12:23
But ...
12:26
I would not go so far
12:30
as to give AI a true human sense of humor.
12:35
A human sense of humor is not about making a joke or getting it.
12:40
It's rooted in our vulnerability.
12:45
It's the blessing we get for the curse of mortality.
12:52
Mark Twain said the true source of humor
12:56
is not joy but sorrow.
13:00
If we gave AI the thousand natural shocks
13:06
that flesh is heir to,
13:10
that would be cruel.
13:12
If we did that,
13:14
it might very well want to wipe us out.
13:17
And if they did, all I ask is that they take Elon first.
13:23
(Laughter)
13:26
Thank you.
13:27
(Applause)
13:32
Now, I think, if I'm not mistaken,
13:35
we had a caption contest here, right?
13:38
It's on the next slide
13:40
that I had to decide from among entries by U of M people.
13:44
And this was the image we created for the contest.
13:49
It's that image.
13:51
And the one I picked,
13:54
by Kavya Davuluri, was this one.
13:59
"Above all, I asked the class of 3000 not to forget your humanity."
14:04
(Laughter)
14:06
Now what I will say is, that is really close to a good caption.
14:10
(Laughter)
14:13
Mark Twain said the difference between the right word and the wrong word
14:17
is the difference between the lightning and the lightning bug.
14:22
That caption should be
14:23
"Above all, I ask that the class of 3000 not forget my humanity."
14:30
(Laughter)
14:31
Now you have a joke.
14:32
Thank you very much.
14:33
(Applause)