When technology can read minds, how will we protect our privacy? | Nita Farahany
00:13
In the months following the 2009 presidential election in Iran, protests erupted across the country. The Iranian government violently suppressed what came to be known as the Iranian Green Movement, even blocking mobile signals to cut off communication between the protesters. My parents, who emigrated to the United States in the late 1960s, spend substantial time there, where all of my large, extended family live. When I would call my family in Tehran during some of the most violent crackdowns of the protest, none of them dared discuss with me what was happening. They or I knew to quickly steer the conversation to other topics. All of us understood what the consequences could be of a perceived dissident action.
01:06
But I still wish I could have known what they were thinking or what they were feeling. What if I could have? Or more frighteningly, what if the Iranian government could have? Would they have arrested them based on what their brains revealed? That day may be closer than you think. With our growing capabilities in neuroscience, artificial intelligence and machine learning, we may soon know a lot more of what's happening in the human brain.
01:37
As a bioethicist, a lawyer, a philosopher and an Iranian-American, I'm deeply concerned about what this means for our freedoms and what kinds of protections we need. I believe we need a right to cognitive liberty, as a human right that needs to be protected. If not, our freedom of thought, access and control over our own brains and our mental privacy will be threatened.
02:05
Consider this: the average person thinks thousands of thoughts each day. As a thought takes form, like a math calculation or a number, a word, neurons are interacting in the brain, creating a minuscule electrical discharge. When you have a dominant mental state, like relaxation, hundreds of thousands of neurons are firing in the brain, creating concurrent electrical discharges in characteristic patterns that can be measured with electroencephalography, or EEG.
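To make that concrete: below is a minimal sketch, not from the talk, of how a dominant state like relaxation might be flagged from a single EEG channel by comparing average power in the alpha band (8-12 Hz, associated with relaxation) against the beta band (13-30 Hz, associated with active focus). The 256 Hz sampling rate, the synthetic signal and the simple comparison rule are all illustrative assumptions, not a clinical method.

    import numpy as np

    def band_power(signal, fs, lo, hi):
        """Mean spectral power of `signal` between lo and hi Hz."""
        freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
        psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
        return psd[(freqs >= lo) & (freqs < hi)].mean()

    fs = 256                                  # assumed sampling rate (Hz)
    t = np.arange(0, 4, 1.0 / fs)             # four seconds of one channel
    # Synthetic stand-in for raw EEG: a 10 Hz alpha rhythm plus noise.
    eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)

    alpha = band_power(eeg, fs, 8, 12)        # relaxation-linked band
    beta = band_power(eeg, fs, 13, 30)        # focus-linked band
    print("relaxed" if alpha > beta else "focused")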
02:41
In fact, that's what you're seeing right now. You're seeing my brain activity that was recorded in real time with a simple device that was worn on my head. What you're seeing is my brain activity when I was relaxed and curious. To share this information with you, I wore one of the early consumer-based EEG devices like this one, which recorded the electrical activity in my brain in real time. It's not unlike the fitness trackers that some of you may be wearing to measure your heart rate or the steps that you've taken, or even your sleep activity.
03:19
It's hardly the most sophisticated neuroimaging technique on the market. But it's already the most portable and the most likely to impact our everyday lives. This is extraordinary. Through a simple, wearable device, we can literally see inside the human brain and learn aspects of our mental landscape without ever uttering a word. While we can't reliably decode complex thoughts just yet, we can already gauge a person's mood, and with the help of artificial intelligence, we can even decode some single-digit numbers or shapes or simple words that a person is thinking or hearing, or seeing.
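Under the hood, that kind of decoding is supervised classification: feature vectors extracted from short EEG epochs, labeled by the digit, shape or word the person was attending to. Here is a minimal sketch of the idea with synthetic placeholder features standing in for real recorded trials; the feature count and model choice are assumptions, not the methods used in the studies the talk alludes to.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    # Placeholder data: 500 EEG epochs, each reduced to 32 spectral features,
    # labeled with the single-digit number (0-9) the wearer was thinking of.
    X = rng.normal(size=(500, 32))
    y = rng.integers(0, 10, size=500)

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)

    # On random features this stays near chance (~0.1); real decoders succeed
    # only insofar as the features actually carry stimulus information.
    print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")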
04:06
Despite some inherent limitations in EEG, I think it's safe to say that with our advances in technology, more and more of what's happening in the human brain can and will be decoded over time. Already, using one of these devices, an epileptic can know they're going to have an epileptic seizure before it happens. A paraplegic can type on a computer with their thoughts alone. A US-based company has developed a technology to embed these sensors into the headrest of automobiles so they can track driver concentration, distraction and cognitive load while driving. Nissan, insurance companies and AAA have all taken note.
04:51
You could even watch this choose-your-own-adventure movie "The Moment," which, with an EEG headset, changes the movie based on your brain-based reactions, giving you a different ending every time your attention wanes.
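The talk doesn't say how "The Moment" is implemented; as a rough guess at the general mechanic, the playback loop would poll an attention estimate from the headset and branch the story whenever it dips. The scene graph and the read_attention placeholder below are invented for illustration only.

    import random

    # Hypothetical branching structure: each scene picks its successor from
    # whether the viewer's attention held up while it played.
    SCENES = {
        "opening": {"attentive": "confrontation", "distracted": "quiet_ending"},
        "confrontation": {"attentive": "heroic_ending", "distracted": "quiet_ending"},
    }

    def read_attention():
        """Placeholder for a headset API returning attention in [0, 1]."""
        return random.random()

    scene = "opening"
    while scene in SCENES:
        print("playing:", scene)
        branch = "attentive" if read_attention() >= 0.5 else "distracted"
        scene = SCENES[scene][branch]
    print("playing:", scene, "(final scene)")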
05:11
This may all sound great, and as a bioethicist, I am a huge proponent of empowering people to take charge of their own health and well-being by giving them access to information about themselves, including this incredible new brain-decoding technology. But I worry. I worry that we will voluntarily or involuntarily give up our last bastion of freedom, our mental privacy. That we will trade our brain activity for rebates or discounts on insurance, or free access to social-media accounts ... or even to keep our jobs.
05:54
In fact, in China, the train drivers on the Beijing-Shanghai high-speed rail, the busiest of its kind in the world, are required to wear EEG devices to monitor their brain activity while driving. According to some news sources, in government-run factories in China, the workers are required to wear EEG sensors to monitor their productivity and their emotional state at work. Workers are even sent home if their brains show less-than-stellar concentration on their jobs, or emotional agitation.
06:35
It's not going to happen tomorrow, but we're headed to a world of brain transparency. And I don't think people understand that that could change everything. Everything from our definitions of data privacy to our laws, to our ideas about freedom.
06:50
In fact, in my lab at Duke University, we recently conducted a nationwide study in the United States to see if people appreciated the sensitivity of their brain information. We asked people to rate their perceived sensitivity of 33 different kinds of information, from their social security numbers to the content of their phone conversations, their relationship history, their emotions, their anxiety, the mental images in their mind and the thoughts in their mind. Shockingly, people rated their social security number as far more sensitive than any other kind of information, including their brain data.
07:32
I think this is because people don't yet understand or believe the implications of this new brain-decoding technology. After all, if we can know the inner workings of the human brain, our social security numbers are the least of our worries.

(Laughter)
07:47
Think about it. In a world of total brain transparency, who would dare have a politically dissident thought? Or a creative one? I worry that people will self-censor in fear of being ostracized by society, or that people will lose their jobs because of their waning attention or emotional instability, or because they're contemplating collective action against their employers. That coming out will no longer be an option, because people's brains will long ago have revealed their sexual orientation, their political ideology or their religious preferences, well before they were ready to consciously share that information with other people.
08:31
I worry about the ability of our laws to keep up with technological change. Take the First Amendment of the US Constitution, which protects freedom of speech. Does it also protect freedom of thought? And if so, does that mean that we're free to alter our thoughts however we want? Or can the government or society tell us what we can do with our own brains? Can the NSA spy on our brains using these new mobile devices? Can the companies that collect the brain data through their applications sell this information to third parties? Right now, no laws prevent them from doing so.
09:09
It could be even more problematic in countries that don't share the same freedoms enjoyed by people in the United States. What would've happened during the Iranian Green Movement if the government had been monitoring my family's brain activity, and had believed them to be sympathetic to the protesters? Is it so far-fetched to imagine a society in which people are arrested based on their thoughts of committing a crime, like in the science-fiction dystopian society in "Minority Report"?
09:42
Already, in the United States, in Indiana, an 18-year-old was charged with attempting to intimidate his school by posting a video of himself shooting people in the hallways ... Except the people were zombies and the video was of him playing an augmented-reality video game, all interpreted to be a mental projection of his subjective intent.
10:10
This is exactly why our brains need special protection. If our brains are just as subject to data tracking and aggregation as our financial records and transactions, if our brains can be hacked and tracked like our online activities, our mobile phones and applications, then we're on the brink of a dangerous threat to our collective humanity.
10:33
Before you panic, I believe that there are solutions to these concerns, but we have to start by focusing on the right things. When it comes to privacy protections in general, I think we're fighting a losing battle by trying to restrict the flow of information. Instead, we should be focusing on securing rights and remedies against the misuse of our information. If people had the right to decide how their information was shared, and more importantly, had legal redress if their information was misused against them, say to discriminate against them in an employment setting or in health care or education, this would go a long way to build trust. In fact, in some instances, we want to be sharing more of our personal information.
11:20
Studying aggregated information can tell us so much about our health and our well-being, but to be able to safely share our information, we need special protections for mental privacy. This is why we need a right to cognitive liberty. This right would secure for us our freedom of thought and rumination, our freedom of self-determination, and it would ensure that we have the right to consent to or refuse access and alteration of our brains by others. This right could be recognized as part of the Universal Declaration of Human Rights, which has established mechanisms for the enforcement of these kinds of social rights.
12:03
During the Iranian Green Movement, the protesters used the internet and good old-fashioned word of mouth to coordinate their marches. And some of the most oppressive restrictions in Iran were lifted as a result. But what if the Iranian government had used brain surveillance to detect and prevent the protest? Would the world have ever heard the protesters' cries?
12:33
The time has come for us to call for a cognitive liberty revolution. To make sure that we responsibly advance technology that could enable us to embrace the future while fiercely protecting all of us from any person, company or government that attempts to unlawfully access or alter our innermost lives.

12:58
Thank you.

(Applause)