What tech companies know about your kids | Veronica Barassi

84,501 views ・ 2020-07-03

TED


00:00
Transcriber: Leslie Gauthier
Reviewer: Joanna Pietrulewicz
Chinese translation: Ashley Huang
Chinese review: Yolanda Zhang
00:12
Every day, every week, we agree to terms and conditions. And when we do this, we provide companies with the lawful right to do whatever they want with our data and with the data of our children. Which makes us wonder: how much data are we giving away of children, and what are its implications?
00:38
I'm an anthropologist, and I'm also the mother of two little girls. And I started to become interested in this question in 2015, when I suddenly realized that there were vast -- almost unimaginable amounts of data traces that are being produced and collected about children. So I launched a research project, which is called Child Data Citizen, and I aimed at filling in the blank.
01:04
Now you may think that I'm here to blame you for posting photos of your children on social media, but that's not really the point. The problem is way bigger than so-called "sharenting." This is about systems, not individuals. You and your habits are not to blame.
01:24
For the very first time in history, we are tracking the individual data of children from long before they're born -- sometimes from the moment of conception, and then throughout their lives. You see, when parents decide to conceive, they go online to look for "ways to get pregnant," or they download ovulation-tracking apps. When they do get pregnant, they post ultrasounds of their babies on social media, they download pregnancy apps or they consult Dr. Google for all sorts of things, like, you know -- for "miscarriage risk when flying" or "abdominal cramps in early pregnancy." I know because I've done it -- and many times. And then, when the baby is born, they track every nap, every feed, every life event on different technologies. And all of these technologies transform the baby's most intimate behavioral and health data into profit by sharing it with others.
02:28
So to give you an idea of how this works, in 2019, the British Medical Journal published research that showed that out of 24 mobile health apps, 19 shared information with third parties. And these third parties shared information with 216 other organizations. Of these 216 other fourth parties, only three belonged to the health sector. The other companies that had access to that data were big tech companies like Google, Facebook or Oracle; they were digital advertising companies; and there was also a consumer credit reporting agency. So you get it right: ad companies and credit agencies may already have data points on little babies.
03:21
But mobile apps, web searches and social media are really just the tip of the iceberg, because children are being tracked by multiple technologies in their everyday lives. They're tracked by home technologies and virtual assistants in their homes. They're tracked by educational platforms and educational technologies in their schools. They're tracked by online records and online portals at their doctor's office. They're tracked by their internet-connected toys, their online games and many, many, many, many other technologies.
03:52
So during my research, a lot of parents came up to me and they were like, "So what? Why does it matter if my children are being tracked? We've got nothing to hide."
04:04
Well, it matters. It matters because today individuals are not only being tracked, they're also being profiled on the basis of their data traces. Artificial intelligence and predictive analytics are being used to harness as much data as possible of an individual life from different sources: family history, purchasing habits, social media comments. And then they bring this data together to make data-driven decisions about the individual. And these technologies are used everywhere. Banks use them to decide loans. Insurance uses them to decide premiums. Recruiters and employers use them to decide whether one is a good fit for a job or not. Also the police and courts use them to determine whether one is a potential criminal or is likely to recommit a crime.
05:04
We have no knowledge or control over the ways in which those who buy, sell and process our data are profiling us and our children. But these profiles can come to impact our rights in significant ways.
05:20
To give you an example, in 2018 the "New York Times" published the news that the data that had been gathered through online college-planning services -- that are actually completed by millions of high school kids across the US who are looking for a college program or a scholarship -- had been sold to educational data brokers.
05:47
Now, researchers at Fordham who studied educational data brokers revealed that these companies profiled kids as young as two on the basis of different categories: ethnicity, religion, affluence, social awkwardness and many other random categories. And then they sell these profiles together with the name of the kid, their home address and the contact details to different companies, including trade and career institutions, student loans and student credit card companies.
06:28
To push the boundaries, the researchers at Fordham asked an educational data broker to provide them with a list of 14-to-15-year-old girls who were interested in family planning services. The data broker agreed to provide them the list. So imagine how intimate and how intrusive that is for our kids.
06:52
But educational data brokers are really just an example. The truth is that our children are being profiled in ways that we cannot control, but that can significantly impact their chances in life.
07:06
So we need to ask ourselves: can we trust these technologies when it comes to profiling our children? Can we?
07:17
My answer is no. As an anthropologist, I believe that artificial intelligence and predictive analytics can be great to predict the course of a disease or to fight climate change. But we need to abandon the belief that these technologies can objectively profile humans and that we can rely on them to make data-driven decisions about individual lives. Because they can't profile humans. Data traces are not the mirror of who we are. Humans think one thing and say the opposite, feel one way and act differently. Algorithmic predictions or our digital practices cannot account for the unpredictability and complexity of human experience.
08:00
But on top of that, these technologies are always -- always -- in one way or another, biased. You see, algorithms are by definition sets of rules or steps that have been designed to achieve a specific result, OK? But these sets of rules or steps cannot be objective, because they've been designed by human beings within a specific cultural context and are shaped by specific cultural values. So when machines learn, they learn from biased algorithms, and they often learn from biased databases as well.
08:37
At the moment, we're seeing the first examples of algorithmic bias. And some of these examples are frankly terrifying.
08:46
This year, the AI Now Institute in New York published a report that revealed that the AI technologies that are being used for predictive policing have been trained on "dirty" data. This is basically data that had been gathered during historical periods of known racial bias and nontransparent police practices. Because these technologies are being trained with dirty data, they're not objective, and their outcomes are only amplifying and perpetrating police bias and error.
09:25
So I think we are faced with a fundamental problem in our society. We are starting to trust technologies when it comes to profiling human beings. We know that in profiling humans, these technologies are always going to be biased and are never really going to be accurate. So what we need now is actually a political solution. We need governments to recognize that our data rights are our human rights.
09:52
(Applause and cheers)
09:59
Until this happens, we cannot hope for a more just future. I worry that my daughters are going to be exposed to all sorts of algorithmic discrimination and error. You see, the difference between me and my daughters is that there's no public record out there of my childhood. There's certainly no database of all the stupid things that I've done and thought when I was a teenager.
10:23
(Laughter)
10:25
But for my daughters this may be different. The data that is being collected from them today may be used to judge them in the future and can come to prevent their hopes and dreams.
10:40
I think that it's time. It's time that we all step up. It's time that we start working together as individuals, as organizations and as institutions, and that we demand greater data justice for us and for our children before it's too late.
10:56
Thank you.
10:57
(Applause)