Jennifer Golbeck: The curly fry conundrum: Why social media "likes" say more than you might think

385,737 views ・ 2014-04-03

TED



Translator: Li Li · Reviewer: 杏儀 歐陽
00:12
If you remember that first decade of the web, it was really a static place. You could go online, you could look at pages, and they were put up either by organizations who had teams to do it or by individuals who were really tech-savvy for the time. And with the rise of social media and social networks in the early 2000s, the web was completely changed to a place where now the vast majority of content we interact with is put up by average users, either in YouTube videos or blog posts or product reviews or social media postings.

00:46
And it's also become a much more interactive place, where people are interacting with others, they're commenting, they're sharing, they're not just reading. So Facebook is not the only place you can do this, but it's the biggest, and it serves to illustrate the numbers. Facebook has 1.2 billion users per month. So half the Earth's Internet population is using Facebook. They are a site, along with others, that has allowed people to create an online persona with very little technical skill, and people responded by putting huge amounts of personal data online. So the result is that we have behavioral, preference, demographic data for hundreds of millions of people, which is unprecedented in history.

01:26
And as a computer scientist, what this means is that I've been able to build models that can predict all sorts of hidden attributes for all of you that you don't even know you're sharing information about. As scientists, we use that to help the way people interact online, but there's less altruistic applications, and there's a problem in that users don't really understand these techniques and how they work, and even if they did, they don't have a lot of control over it. So what I want to talk to you about today is some of these things that we're able to do, and then give us some ideas of how we might go forward to move some control back into the hands of users.

02:02
So this is Target, the company. I didn't just put that logo on this poor, pregnant woman's belly. You may have seen this anecdote that was printed in Forbes magazine where Target sent a flyer to this 15-year-old girl with advertisements and coupons for baby bottles and diapers and cribs two weeks before she told her parents that she was pregnant. Yeah, the dad was really upset. He said, "How did Target figure out that this high school girl was pregnant before she told her parents?" It turns out that they have the purchase history for hundreds of thousands of customers and they compute what they call a pregnancy score, which is not just whether or not a woman's pregnant, but what her due date is. And they compute that not by looking at the obvious things, like, she's buying a crib or baby clothes, but things like, she bought more vitamins than she normally had, or she bought a handbag that's big enough to hold diapers. And by themselves, those purchases don't seem like they might reveal a lot, but it's a pattern of behavior that, when you take it in the context of thousands of other people, starts to actually reveal some insights.
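The scoring idea described here can be sketched as a toy model: each subtle purchase signal carries a weight reflecting how much more often it shows up in one group's purchase histories than another's, and the weights add up into a score. All item names and lift values below are invented for illustration; Target's actual model is not public.

```python
# Toy "pregnancy score": sum of log-lifts over a shopping basket.
# Items that are equally common in both groups (lift 1.0) add nothing.
# All items and weights are hypothetical, not Target's real model.
import math

# hypothetical lift values: P(item | pregnant) / P(item | not pregnant)
LIFT = {
    "prenatal vitamins": 6.0,
    "unscented lotion": 3.5,
    "large handbag": 1.8,
    "cotton balls": 2.2,
    "potato chips": 1.0,   # uninformative: equal in both groups
}

def pregnancy_score(basket):
    """Sum of log-lifts; unknown items are treated as uninformative."""
    return sum(math.log(LIFT.get(item, 1.0)) for item in basket)

basket = ["prenatal vitamins", "large handbag", "potato chips"]
print(round(pregnancy_score(basket), 2))  # → 2.38, positive = more evidence
```

No single item decides the outcome; the score only becomes informative because the weights were calibrated against many other shoppers, which is the point the talk is making.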
03:06
So that's the kind of thing that we do when we're predicting stuff about you on social media. We're looking for little patterns of behavior that, when you detect them among millions of people, lets us find out all kinds of things. So in my lab and with colleagues, we've developed mechanisms where we can quite accurately predict things like your political preference, your personality score, gender, sexual orientation, religion, age, intelligence, along with things like how much you trust the people you know and how strong those relationships are. We can do all of this really well. And again, it doesn't come from what you might think of as obvious information.

03:44
So my favorite example is from this study that was published this year in the Proceedings of the National Academies. If you Google this, you'll find it. It's four pages, easy to read. And they looked at just people's Facebook likes, so just the things you like on Facebook, and used that to predict all these attributes, along with some other ones.
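The likes-to-traits setup can be sketched in miniature: represent each user as a binary vector over pages (liked / not liked) and fit a logistic-regression model against a labeled trait. The data below is synthetic and tiny; the actual study worked with tens of thousands of users and reduced the dimensionality of the user-like matrix before the regression step.

```python
# Minimal sketch of predicting a trait from binary like-vectors with
# logistic regression trained by gradient descent. Pages, like-vectors,
# and labels are all synthetic, purely for illustration.
import math
import random

PAGES = ["curly fries", "science", "thunderstorms", "harley davidson"]

def predict(w, b, x):
    """Probability of the positive label for like-vector x."""
    z = b + sum(wi * xi for wi, xi in zip(w, x))
    return 1.0 / (1.0 + math.exp(-z))

def train(data, epochs=2000, lr=0.5):
    """Stochastic gradient descent on the log-loss."""
    w, b = [0.0] * len(PAGES), 0.0
    for _ in range(epochs):
        for x, y in data:
            err = predict(w, b, x) - y
            b -= lr * err
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
    return w, b

# (like-vector, trait label) pairs -- synthetic training set
data = [
    ([1, 1, 1, 0], 1), ([1, 1, 0, 0], 1), ([0, 1, 1, 0], 1),
    ([0, 0, 0, 1], 0), ([1, 0, 0, 1], 0), ([0, 0, 1, 1], 0),
]
w, b = train(data)
print(predict(w, b, [1, 1, 0, 0]) > 0.5)  # resembles the positive group
```

Note that the model happily assigns weight to a page like "curly fries" whenever liking it happens to correlate with the label in the training data, regardless of the page's content, which is exactly the puzzle the next section takes up.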
04:01
And in their paper they listed the five likes that were most indicative of high intelligence. And among those was liking a page for curly fries. (Laughter) Curly fries are delicious, but liking them does not necessarily mean that you're smarter than the average person. So how is it that one of the strongest indicators of your intelligence is liking this page when the content is totally irrelevant to the attribute that's being predicted?

04:28
And it turns out that we have to look at a whole bunch of underlying theories to see why we're able to do this. One of them is a sociological theory called homophily, which basically says people are friends with people like them. So if you're smart, you tend to be friends with smart people, and if you're young, you tend to be friends with young people, and this is well established for hundreds of years. We also know a lot about how information spreads through networks. It turns out things like viral videos or Facebook likes or other information spreads in exactly the same way that diseases spread through social networks. So this is something we've studied for a long time. We have good models of it. And so you can put those things together and start seeing why things like this happen.

05:09
So if I were to give you a hypothesis, it would be that a smart guy started this page, or maybe one of the first people who liked it would have scored high on that test. And they liked it, and their friends saw it, and by homophily, we know that he probably had smart friends, and so it spread to them, and some of them liked it, and they had smart friends, and so it spread to them, and so it propagated through the network to a host of smart people, so that by the end, the action of liking the curly fries page is indicative of high intelligence, not because of the content, but because the actual action of liking reflects back the common attributes of other people who have done it.
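This hypothesis is easy to simulate. A minimal sketch, with all parameters chosen arbitrarily: build a homophilous network where friendship is more likely between people of similar IQ, seed a page-like at one high-IQ node, let it spread along friendships for a few rounds, and compare the likers' average IQ with the whole population's.

```python
# Toy simulation of homophily + contagion: the liked set ends up with a
# higher average IQ than the population, even though the "page" has no
# content at all. All parameters are arbitrary illustration values.
import math
import random

random.seed(1)
N = 400
iq = [random.gauss(100, 15) for _ in range(N)]

# homophily: connection probability decays with IQ distance
edges = {i: set() for i in range(N)}
for i in range(N):
    for j in range(i + 1, N):
        if random.random() < 0.05 * math.exp(-abs(iq[i] - iq[j]) / 10):
            edges[i].add(j)
            edges[j].add(i)

# seed the like at the smartest node, then spread it along friendships
liked = {max(range(N), key=lambda i: iq[i])}
frontier = set(liked)
for _ in range(4):                     # a few rounds of spreading
    nxt = set()
    for node in frontier:
        for friend in edges[node] - liked:
            if random.random() < 0.5:  # chance a friend also likes it
                nxt.add(friend)
    liked |= nxt
    frontier = nxt

avg_likers = sum(iq[i] for i in liked) / len(liked)
avg_all = sum(iq) / N
print(avg_likers > avg_all)  # the like correlates with IQ by contagion alone
```

Because edges preferentially connect similar people, the like stays in the seed's high-IQ neighborhood, so "liked the page" becomes a statistical marker for intelligence with no causal link to the content.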
05:48
So this is pretty complicated stuff, right? It's a hard thing to sit down and explain to an average user, and even if you do, what can the average user do about it? How do you know that you've liked something that indicates a trait for you that's totally irrelevant to the content of what you've liked? There's a lot of power that users don't have to control how this data is used. And I see that as a real problem going forward.

06:13
So I think there's a couple paths that we want to look at if we want to give users some control over how this data is used, because it's not always going to be used for their benefit. An example I often give is that, if I ever get bored being a professor, I'm going to go start a company that predicts all of these attributes and things like how well you work in teams and if you're a drug user, if you're an alcoholic. We know how to predict all that. And I'm going to sell reports to H.R. companies and big businesses that want to hire you. We totally can do that now. I could start that business tomorrow, and you would have absolutely no control over me using your data like that. That seems to me to be a problem.

06:50
So one of the paths we can go down is the policy and law path. And in some respects, I think that that would be most effective, but the problem is we'd actually have to do it. Observing our political process in action makes me think it's highly unlikely that we're going to get a bunch of representatives to sit down, learn about this, and then enact sweeping changes to intellectual property law in the U.S. so users control their data.

07:16
We could go the policy route, where social media companies say, you know what? You own your data. You have total control over how it's used. The problem is that the revenue models for most social media companies rely on sharing or exploiting users' data in some way. It's sometimes said of Facebook that the users aren't the customer, they're the product. And so how do you get a company to cede control of their main asset back to the users? It's possible, but I don't think it's something that we're going to see change quickly.

07:45
So I think the other path that we can go down that's going to be more effective is one of more science. It's doing science that allowed us to develop all these mechanisms for computing this personal data in the first place. And it's actually very similar research that we'd have to do if we want to develop mechanisms that can say to a user, "Here's the risk of that action you just took." By liking that Facebook page, or by sharing this piece of personal information, you've now improved my ability to predict whether or not you're using drugs or whether or not you get along well in the workplace. And that, I think, can affect whether or not people want to share something, keep it private, or just keep it offline altogether.

08:24
We can also look at things like allowing people to encrypt data that they upload, so it's kind of invisible and worthless to sites like Facebook or third party services that access it, but that select users who the person who posted it want to see it have access to see it.
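The structure of such a scheme can be sketched with hybrid encryption: encrypt the post once under a random content key, then wrap that key separately for each permitted reader. A one-time-pad XOR stands in for a real cipher here purely to show the structure; a production system would use authenticated public-key encryption, and the per-reader shared keys are a hypothetical setup step.

```python
# Sketch of "encrypt what you upload so only chosen friends can read it."
# The site stores only ciphertext plus per-reader wrapped keys.
# XOR one-time pads stand in for real ciphers; illustration only.
import secrets

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def share_post(post: bytes, reader_keys: dict) -> dict:
    pad = secrets.token_bytes(len(post))        # random content key / pad
    return {
        "ciphertext": xor(post, pad),           # what the site stores
        "wrapped": {                            # content key, per reader
            name: xor(pad, key) for name, key in reader_keys.items()
        },
    }

def read_post(bundle: dict, name: str, key: bytes) -> bytes:
    pad = xor(bundle["wrapped"][name], key)     # unwrap the content key
    return xor(bundle["ciphertext"], pad)       # then decrypt the post

# Hypothetical setup: each permitted reader shares a secret with the author.
post = b"off to the doctor today"
keys = {"alice": secrets.token_bytes(len(post)),
        "bob": secrets.token_bytes(len(post))}
bundle = share_post(post, keys)
print(read_post(bundle, "alice", keys["alice"]) == post)  # reader succeeds
```

The hosting site sees only the ciphertext and the wrapped keys, neither of which reveals the post, while any reader the author listed can recover it.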
08:40
This is all super exciting research from an intellectual perspective, and so scientists are going to be willing to do it. So that gives us an advantage over the law side.

08:49
One of the problems that people bring up when I talk about this is, they say, you know, if people start keeping all this data private, all those methods that you've been developing to predict their traits are going to fail. And I say, absolutely, and for me, that's success, because as a scientist, my goal is not to infer information about users, it's to improve the way people interact online, and sometimes that involves inferring things about them, but if users don't want me to use that data, I think they should have the right to do that. I want users to be informed and consenting users of the tools that we develop. And so I think encouraging this kind of science and supporting researchers who want to cede some of that control back to users and away from the social media companies means that going forward, as these tools evolve and advance, we're going to have an educated and empowered user base, and I think all of us can agree that that's a pretty ideal way to go forward.

09:45
Thank you.

09:47
(Applause)