Jennifer Golbeck: The curly fry conundrum: Why social media "likes" say more than you might think

385,737 views ・ 2014-04-03

TED



Translator: Adrienne Lin   Reviewer: Ying Ru Wu
00:12
If you remember that first decade of the web, it was really a static place. You could go online, you could look at pages, and they were put up either by organizations who had teams to do it or by individuals who were really tech-savvy for the time. And with the rise of social media and social networks in the early 2000s, the web was completely changed to a place where now the vast majority of content we interact with is put up by average users, either in YouTube videos or blog posts or product reviews or social media postings.
00:46
And it's also become a much more interactive place, where people are interacting with others, they're commenting, they're sharing, they're not just reading. So Facebook is not the only place you can do this, but it's the biggest, and it serves to illustrate the numbers. Facebook has 1.2 billion users per month. So half the Earth's Internet population is using Facebook. They are a site, along with others, that has allowed people to create an online persona with very little technical skill, and people responded by putting huge amounts of personal data online. So the result is that we have behavioral, preference, demographic data for hundreds of millions of people, which is unprecedented in history.
01:26
And as a computer scientist, what this means is that I've been able to build models that can predict all sorts of hidden attributes for all of you that you don't even know you're sharing information about. As scientists, we use that to help the way people interact online, but there's less altruistic applications, and there's a problem in that users don't really understand these techniques and how they work, and even if they did, they don't have a lot of control over it. So what I want to talk to you about today is some of these things that we're able to do, and then give us some ideas of how we might go forward to move some control back into the hands of users.
02:02
So this is Target, the company. I didn't just put that logo on this poor, pregnant woman's belly. You may have seen this anecdote that was printed in Forbes magazine where Target sent a flyer to this 15-year-old girl with advertisements and coupons for baby bottles and diapers and cribs two weeks before she told her parents that she was pregnant. Yeah, the dad was really upset. He said, "How did Target figure out that this high school girl was pregnant before she told her parents?" It turns out that they have the purchase history for hundreds of thousands of customers and they compute what they call a pregnancy score, which is not just whether or not a woman's pregnant, but what her due date is. And they compute that not by looking at the obvious things, like, she's buying a crib or baby clothes, but things like, she bought more vitamins than she normally had, or she bought a handbag that's big enough to hold diapers. And by themselves, those purchases don't seem like they might reveal a lot, but it's a pattern of behavior that, when you take it in the context of thousands of other people, starts to actually reveal some insights.
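[Editor's note: Target's actual features and model are not public, but the pattern-scoring idea the talk describes can be sketched. Below is a minimal, hypothetical illustration in Python: a logistic regression over made-up purchase features that outputs a "pregnancy score" as a probability.]

```python
# Hypothetical illustration only: Target's real features and model
# are not public. A logistic regression maps purchase-pattern
# features to a "pregnancy score" (a probability).
import numpy as np
from sklearn.linear_model import LogisticRegression

# Columns: vitamins bought vs. personal baseline, unscented-lotion
# purchases, bought a bag big enough to hold diapers (0/1).
X = np.array([
    [3.0, 2, 1],
    [0.0, 0, 0],
    [2.5, 1, 1],
    [0.5, 0, 1],
    [4.0, 3, 0],
    [0.0, 1, 0],
])
# 1 = shopper later confirmed pregnant, 0 = not (made-up labels).
y = np.array([1, 0, 1, 0, 1, 0])

model = LogisticRegression().fit(X, y)

# A new shopper: extra vitamins and a diaper-sized handbag.
shopper = np.array([[2.0, 1, 1]])
print("pregnancy score:", model.predict_proba(shopper)[0, 1])
```

No single feature is decisive; it is the combination, fit against many other shoppers' histories, that carries the signal, which is exactly the point of the anecdote.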
03:06
So that's the kind of thing that we do when we're predicting stuff about you on social media. We're looking for little patterns of behavior that, when you detect them among millions of people, lets us find out all kinds of things. So in my lab and with colleagues, we've developed mechanisms where we can quite accurately predict things like your political preference, your personality score, gender, sexual orientation, religion, age, intelligence, along with things like how much you trust the people you know and how strong those relationships are. We can do all of this really well. And again, it doesn't come from what you might think of as obvious information.
03:44
So my favorite example is from this study that was published this year in the Proceedings of the National Academies. If you Google this, you'll find it. It's four pages, easy to read. And they looked at just people's Facebook likes, so just the things you like on Facebook, and used that to predict all these attributes, along with some other ones. And in their paper they listed the five likes that were most indicative of high intelligence. And among those was liking a page for curly fries. (Laughter) Curly fries are delicious, but liking them does not necessarily mean that you're smarter than the average person. So how is it that one of the strongest indicators of your intelligence is liking this page when the content is totally irrelevant to the attribute that's being predicted?
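[Editor's note: the PNAS study referenced here (Kosinski et al., 2013) describes roughly this pipeline: a sparse user-by-like matrix, reduced with truncated singular value decomposition, feeding a simple linear model. The sketch below reproduces the shape of that pipeline on random placeholder data; the dimensions, density, and trait labels are stand-ins, so the printed accuracy is meaningless.]

```python
# Shape of the likes-based pipeline: a sparse user-by-like matrix,
# truncated SVD, then a linear model. All data below is a random
# placeholder; this shows the mechanics, not the study's results.
import numpy as np
from scipy.sparse import random as sparse_random
from sklearn.decomposition import TruncatedSVD
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_users, n_pages = 1000, 5000

# likes[u, p] = 1 if user u liked page p (placeholder, ~1% density).
likes = sparse_random(n_users, n_pages, density=0.01, random_state=0)
likes.data[:] = 1.0

# Placeholder binary trait (e.g., scored high on an intelligence test).
trait = rng.integers(0, 2, size=n_users)

# Reduce the like matrix to 100 latent components, then fit.
components = TruncatedSVD(n_components=100, random_state=0).fit_transform(likes)
model = LogisticRegression(max_iter=1000).fit(components, trait)
print("training accuracy (placeholder data):", model.score(components, trait))
```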
04:28
And it turns out that we have to look at a whole bunch of underlying theories to see why we're able to do this. One of them is a sociological theory called homophily, which basically says people are friends with people like them. So if you're smart, you tend to be friends with smart people, and if you're young, you tend to be friends with young people, and this is well established for hundreds of years. We also know a lot about how information spreads through networks. It turns out things like viral videos or Facebook likes or other information spreads in exactly the same way that diseases spread through social networks. So this is something we've studied for a long time. We have good models of it.
05:04
And so you can put those things together and start seeing why things like this happen. So if I were to give you a hypothesis, it would be that a smart guy started this page, or maybe one of the first people who liked it would have scored high on that test. And they liked it, and their friends saw it, and by homophily, we know that he probably had smart friends, and so it spread to them, and some of them liked it, and they had smart friends, and so it spread to them, and so it propagated through the network to a host of smart people, so that by the end, the action of liking the curly fries page is indicative of high intelligence, not because of the content, but because the actual action of liking reflects back the common attributes of other people who have done it.
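[Editor's note: this hypothesis can be checked in a toy simulation. The sketch below, with entirely assumed parameters, builds a homophilous friendship graph, seeds the "like" at one smart user, spreads it a few hops like a contagion, and compares the share of smart users among likers with the population baseline.]

```python
# Toy simulation with assumed parameters: homophilous friendships,
# a "like" seeded at one smart user, a few hops of contagion-style
# spread, then compare likers to the population baseline.
import random

random.seed(1)
n = 2000
smart = [random.random() < 0.5 for _ in range(n)]

# Homophily: same-trait pairs become friends far more often.
friends = {i: set() for i in range(n)}
while sum(len(f) for f in friends.values()) < 8 * n:
    a, b = random.randrange(n), random.randrange(n)
    if a != b and random.random() < (0.9 if smart[a] == smart[b] else 0.1):
        friends[a].add(b)
        friends[b].add(a)

# Seed the page like at a smart user and spread it for a few hops,
# each exposed friend adopting with a fixed probability.
seed = next(i for i in range(n) if smart[i])
liked, frontier = {seed}, [seed]
for _ in range(4):
    nxt = []
    for u in frontier:
        for v in friends[u]:
            if v not in liked and random.random() < 0.3:
                liked.add(v)
                nxt.append(v)
    frontier = nxt

print("smart share overall:     ", sum(smart) / n)
print("smart share among likers:", sum(smart[u] for u in liked) / len(liked))
```

On this toy graph the likers skew heavily smart even though "liking curly fries" carries no content signal at all, which is the mechanism the hypothesis proposes.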
05:48
So this is pretty complicated stuff, right? It's a hard thing to sit down and explain to an average user, and even if you do, what can the average user do about it? How do you know that you've liked something that indicates a trait for you that's totally irrelevant to the content of what you've liked? There's a lot of power that users don't have to control how this data is used. And I see that as a real problem going forward.
06:13
So I think there's a couple paths that we want to look at if we want to give users some control over how this data is used, because it's not always going to be used for their benefit. An example I often give is that, if I ever get bored being a professor, I'm going to go start a company that predicts all of these attributes and things like how well you work in teams and if you're a drug user, if you're an alcoholic. We know how to predict all that. And I'm going to sell reports to H.R. companies and big businesses that want to hire you. We totally can do that now. I could start that business tomorrow, and you would have absolutely no control over me using your data like that. That seems to me to be a problem.
06:50
So one of the paths we can go down is the policy and law path. And in some respects, I think that that would be most effective, but the problem is we'd actually have to do it. Observing our political process in action makes me think it's highly unlikely that we're going to get a bunch of representatives to sit down, learn about this, and then enact sweeping changes to intellectual property law in the U.S. so users control their data. We could go the policy route, where social media companies say, you know what? You own your data. You have total control over how it's used. The problem is that the revenue models for most social media companies rely on sharing or exploiting users' data in some way. It's sometimes said of Facebook that the users aren't the customer, they're the product. And so how do you get a company to cede control of their main asset back to the users? It's possible, but I don't think it's something that we're going to see change quickly.
07:45
So I think the other path that we can go down that's going to be more effective is one of more science. It's doing science that allowed us to develop all these mechanisms for computing this personal data in the first place. And it's actually very similar research that we'd have to do if we want to develop mechanisms that can say to a user, "Here's the risk of that action you just took." By liking that Facebook page, or by sharing this piece of personal information, you've now improved my ability to predict whether or not you're using drugs or whether or not you get along well in the workplace. And that, I think, can affect whether or not people want to share something, keep it private, or just keep it offline altogether.
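[Editor's note: one way such a warning could work, sketched under placeholder assumptions: keep a trait model on hand, re-score the user's profile with and without the prospective like, and report the shift in the model's confidence. The model, the like data, and the risk_of_like helper below are all hypothetical.]

```python
# Hypothetical sketch of such a warning: re-score a profile with and
# without a prospective like and report the model's confidence shift.
# The trait model and like data here are random placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_users, n_pages = 500, 50
likes = (rng.random((n_users, n_pages)) < 0.1).astype(float)
trait = rng.integers(0, 2, size=n_users)  # placeholder sensitive trait
model = LogisticRegression(max_iter=1000).fit(likes, trait)

def risk_of_like(profile: np.ndarray, page: int) -> float:
    """Change in predicted P(trait) if `page` were liked."""
    before = model.predict_proba(profile.reshape(1, -1))[0, 1]
    after = profile.copy()
    after[page] = 1.0
    return model.predict_proba(after.reshape(1, -1))[0, 1] - before

me = likes[0]
print(f"confidence shift from liking page 7: {risk_of_like(me, 7):+.3f}")
```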
08:24
We can also look at things like allowing people to encrypt data that they upload, so it's kind of invisible and worthless to sites like Facebook or third party services that access it, but that select users who the person who posted it want to see it have access to see it.
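[Editor's note: as one concrete shape this could take, here is a sketch using the PyNaCl library's sealed boxes: the post is encrypted separately to each chosen friend's public key, so the platform stores only ciphertext while the selected friends can still decrypt. The friends (alice, bob) and key distribution are assumed for illustration.]

```python
# Sketch using PyNaCl sealed boxes (assumed setup: public keys have
# already been exchanged out of band). The platform would store only
# the ciphertexts below; only the chosen friends can decrypt them.
from nacl.public import PrivateKey, SealedBox

alice = PrivateKey.generate()  # hypothetical friend #1
bob = PrivateKey.generate()    # hypothetical friend #2

post = b"vacation photos: close friends only"

# Encrypt the post separately to each selected friend's public key.
ciphertexts = {
    "alice": SealedBox(alice.public_key).encrypt(post),
    "bob": SealedBox(bob.public_key).encrypt(post),
}

# Each friend decrypts with their own private key.
print(SealedBox(alice).decrypt(ciphertexts["alice"]).decode())
```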
08:40
This is all super exciting research from an intellectual perspective, and so scientists are going to be willing to do it. So that gives us an advantage over the law side.
08:49
One of the problems that people bring up when I talk about this is, they say, you know, if people start keeping all this data private, all those methods that you've been developing to predict their traits are going to fail. And I say, absolutely, and for me, that's success, because as a scientist, my goal is not to infer information about users, it's to improve the way people interact online. And sometimes that involves inferring things about them, but if users don't want me to use that data, I think they should have the right to do that. I want users to be informed and consenting users of the tools that we develop.
09:24
And so I think encouraging this kind of science and supporting researchers who want to cede some of that control back to users and away from the social media companies means that going forward, as these tools evolve and advance, means that we're going to have an educated and empowered user base, and I think all of us can agree that that's a pretty ideal way to go forward.

09:45
Thank you.

(Applause)