What tech companies know about your kids | Veronica Barassi

85,272 views ・ 2020-07-03

TED



00:00
Transcriber: Leslie Gauthier
Reviewer: Joanna Pietrulewicz

00:12
Every day, every week, we agree to terms and conditions. And when we do this, we provide companies with the lawful right to do whatever they want with our data and with the data of our children. Which makes us wonder: how much data are we giving away of children, and what are its implications?

00:38
I'm an anthropologist, and I'm also the mother of two little girls. And I started to become interested in this question in 2015 when I suddenly realized that there were vast -- almost unimaginable amounts of data traces that are being produced and collected about children. So I launched a research project, which is called Child Data Citizen, and I aimed at filling in the blank.

01:04
Now you may think that I'm here to blame you for posting photos of your children on social media, but that's not really the point. The problem is way bigger than so-called "sharenting." This is about systems, not individuals. You and your habits are not to blame.

01:24
For the very first time in history, we are tracking the individual data of children from long before they're born -- sometimes from the moment of conception, and then throughout their lives.

01:37
You see, when parents decide to conceive, they go online to look for "ways to get pregnant," or they download ovulation-tracking apps. When they do get pregnant, they post ultrasounds of their babies on social media, they download pregnancy apps or they consult Dr. Google for all sorts of things, like, you know -- for "miscarriage risk when flying" or "abdominal cramps in early pregnancy." I know because I've done it -- and many times.

02:10
And then, when the baby is born, they track every nap, every feed, every life event on different technologies. And all of these technologies transform the baby's most intimate behavioral and health data into profit by sharing it with others.

02:28
So to give you an idea of how this works, in 2019, the British Medical Journal published research that showed that out of 24 mobile health apps, 19 shared information with third parties. And these third parties shared information with 216 other organizations. Of these 216 other fourth parties, only three belonged to the health sector. The other companies that had access to that data were big tech companies like Google, Facebook or Oracle; they were digital advertising companies; and there was also a consumer credit reporting agency. So you get it right: ad companies and credit agencies may already have data points on little babies.

03:21
But mobile apps, web searches and social media are really just the tip of the iceberg, because children are being tracked by multiple technologies in their everyday lives. They're tracked by home technologies and virtual assistants in their homes. They're tracked by educational platforms and educational technologies in their schools. They're tracked by online records and online portals at their doctor's office. They're tracked by their internet-connected toys, their online games and many, many, many, many other technologies.

03:52
So during my research, a lot of parents came up to me and they were like, "So what? Why does it matter if my children are being tracked? We've got nothing to hide."

04:04
Well, it matters. It matters because today individuals are not only being tracked, they're also being profiled on the basis of their data traces. Artificial intelligence and predictive analytics are being used to harness as much data as possible of an individual life from different sources: family history, purchasing habits, social media comments. And then they bring this data together to make data-driven decisions about the individual.

04:36
And these technologies are used everywhere. Banks use them to decide loans. Insurance uses them to decide premiums. Recruiters and employers use them to decide whether one is a good fit for a job or not. Also the police and courts use them to determine whether one is a potential criminal or is likely to recommit a crime.

05:04
We have no knowledge or control over the ways in which those who buy, sell and process our data are profiling us and our children. But these profiles can come to impact our rights in significant ways.

05:20
To give you an example, in 2018 the "New York Times" published the news that the data that had been gathered through online college-planning services -- that are actually completed by millions of high school kids across the US who are looking for a college program or a scholarship -- had been sold to educational data brokers.

05:47
Now, researchers at Fordham who studied educational data brokers revealed that these companies profiled kids as young as two on the basis of different categories: ethnicity, religion, affluence, social awkwardness and many other random categories. And then they sell these profiles together with the name of the kid, their home address and the contact details to different companies, including trade and career institutions, student loans and student credit card companies.

06:28
To push the boundaries, the researchers at Fordham asked an educational data broker to provide them with a list of 14-to-15-year-old girls who were interested in family planning services. The data broker agreed to provide them the list. So imagine how intimate and how intrusive that is for our kids.

06:52
But educational data brokers are really just an example. The truth is that our children are being profiled in ways that we cannot control but that can significantly impact their chances in life.

07:06
So we need to ask ourselves: can we trust these technologies when it comes to profiling our children? Can we? My answer is no.

07:19
As an anthropologist, I believe that artificial intelligence and predictive analytics can be great to predict the course of a disease or to fight climate change. But we need to abandon the belief that these technologies can objectively profile humans and that we can rely on them to make data-driven decisions about individual lives. Because they can't profile humans.

07:43
Data traces are not the mirror of who we are. Humans think one thing and say the opposite, feel one way and act differently. Algorithmic predictions or our digital practices cannot account for the unpredictability and complexity of human experience.

08:00
But on top of that, these technologies are always -- always -- in one way or another, biased.

08:09
You see, algorithms are by definition sets of rules or steps that have been designed to achieve a specific result, OK? But these sets of rules or steps cannot be objective, because they've been designed by human beings within a specific cultural context and are shaped by specific cultural values. So when machines learn, they learn from biased algorithms, and they often learn from biased databases as well.

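A minimal sketch of that failure mode, using a hypothetical toy dataset (the neighborhoods, records and numbers below are invented purely for illustration): a naive frequency-based predictor fitted to historically skewed records hands the skew back as an apparently objective "risk" score.

```python
from collections import Counter

# Hypothetical, toy historical records: (neighborhood, was_flagged_by_police).
# Neighborhood "A" is over-represented because that is where officers
# historically patrolled most, not because risk is actually higher there.
history = [
    ("A", True), ("A", True), ("A", True), ("A", False),
    ("B", True), ("B", False), ("B", False), ("B", False),
]

# "Training": estimate a flag rate per neighborhood from the biased records.
counts = Counter(n for n, _ in history)
flags = Counter(n for n, flagged in history if flagged)
flag_rate = {n: flags[n] / counts[n] for n in counts}

# "Prediction": the model simply echoes the historical skew back as a score.
for neighborhood in ("A", "B"):
    print(neighborhood, f"predicted risk = {flag_rate[neighborhood]:.2f}")
# Prints A predicted risk = 0.75 and B predicted risk = 0.25 -- the patrol
# bias in the data comes out of the model looking like "objective" risk.
```
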
08:37
At the moment, we're seeing the first examples of algorithmic bias. And some of these examples are frankly terrifying.

08:46
This year, the AI Now Institute in New York published a report that revealed that the AI technologies that are being used for predictive policing have been trained on "dirty" data. This is basically data that had been gathered during historical periods of known racial bias and nontransparent police practices. Because these technologies are being trained with dirty data, they're not objective, and their outcomes are only amplifying and perpetrating police bias and error.

09:25
So I think we are faced with a fundamental problem in our society. We are starting to trust technologies when it comes to profiling human beings. We know that in profiling humans, these technologies are always going to be biased and are never really going to be accurate. So what we need now is actually a political solution. We need governments to recognize that our data rights are our human rights.

(Applause and cheers)

09:59
Until this happens, we cannot hope for a more just future. I worry that my daughters are going to be exposed to all sorts of algorithmic discrimination and error. You see, the difference between me and my daughters is that there's no public record out there of my childhood. There's certainly no database of all the stupid things that I've done and thought when I was a teenager.

(Laughter)

10:25
But for my daughters this may be different. The data that is being collected from them today may be used to judge them in the future and can come to prevent their hopes and dreams.

10:40
I think that it's time. It's time that we all step up. It's time that we start working together as individuals, as organizations and as institutions, and that we demand greater data justice for us and for our children before it's too late.

10:56
Thank you.

(Applause)