What tech companies know about your kids | Veronica Barassi

84,949 views · 2020-07-03

TED


00:00
Transcriber: Leslie Gauthier   Reviewer: Joanna Pietrulewicz

00:12
Every day, every week, we agree to terms and conditions. And when we do this, we provide companies with the lawful right to do whatever they want with our data and with the data of our children. Which makes us wonder: how much data are we giving away about children, and what are its implications?

00:38
I'm an anthropologist, and I'm also the mother of two little girls. And I started to become interested in this question in 2015 when I suddenly realized that there were vast -- almost unimaginable amounts of data traces that are being produced and collected about children. So I launched a research project, which is called Child Data Citizen, and I aimed at filling in the blank.

01:04
Now you may think that I'm here to blame you for posting photos of your children on social media, but that's not really the point. The problem is way bigger than so-called "sharenting." This is about systems, not individuals. You and your habits are not to blame.

01:24
For the very first time in history, we are tracking the individual data of children from long before they're born -- sometimes from the moment of conception, and then throughout their lives.

01:37
You see, when parents decide to conceive, they go online to look for "ways to get pregnant," or they download ovulation-tracking apps. When they do get pregnant, they post ultrasounds of their babies on social media, they download pregnancy apps or they consult Dr. Google for all sorts of things, like, you know -- for "miscarriage risk when flying" or "abdominal cramps in early pregnancy." I know because I've done it -- and many times.

02:10
And then, when the baby is born, they track every nap, every feed, every life event on different technologies. And all of these technologies transform the baby's most intimate behavioral and health data into profit by sharing it with others.

02:28
So to give you an idea of how this works, in 2019, the British Medical Journal published research that showed that out of 24 mobile health apps, 19 shared information with third parties. And these third parties shared information with 216 other organizations. Of these 216 other fourth parties, only three belonged to the health sector. The other companies that had access to that data were big tech companies like Google, Facebook or Oracle; they were digital advertising companies; and there was also a consumer credit reporting agency. So you get it right: ad companies and credit agencies may already have data points on little babies.

03:21
But mobile apps, web searches and social media are really just the tip of the iceberg, because children are being tracked by multiple technologies in their everyday lives. They're tracked by home technologies and virtual assistants in their homes. They're tracked by educational platforms and educational technologies in their schools. They're tracked by online records and online portals at their doctor's office. They're tracked by their internet-connected toys, their online games and many, many, many, many other technologies.

03:52
So during my research, a lot of parents came up to me and they were like, "So what? Why does it matter if my children are being tracked? We've got nothing to hide."

Well, it matters.

04:07
It matters because today individuals are not only being tracked, they're also being profiled on the basis of their data traces. Artificial intelligence and predictive analytics are being used to harness as much data as possible of an individual life from different sources: family history, purchasing habits, social media comments. And then they bring this data together to make data-driven decisions about the individual.

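To make that aggregation step concrete, here is a minimal, purely hypothetical Python sketch (not from the talk): it merges invented records from the three kinds of sources just mentioned into a single profile and turns it into a decision with made-up weights, which is roughly what a data-driven scoring system does.

# A toy, purely illustrative profiling pipeline. Every field name, record
# and weight below is invented; real systems are far larger and opaque.

family_history = {"household_income": 52000, "num_children": 2}
purchasing_habits = {"monthly_spend": 1400, "late_payments": 1}
social_media = {"comment_sentiment": -0.2, "posts_per_week": 5}

# "Bring this data together": merge the traces into a single profile.
profile = {**family_history, **purchasing_habits, **social_media}

# A "data-driven decision": an arbitrary linear score and threshold,
# standing in for a loan, premium or hiring decision.
weights = {
    "household_income": 0.00001,
    "late_payments": -0.5,
    "comment_sentiment": 0.3,
}
score = sum(weights.get(key, 0.0) * value for key, value in profile.items())
decision = "approve" if score > 0 else "send to manual review"

print(profile)
print(f"score = {score:.3f} -> {decision}")

The sketch makes the same point the talk does: the outcome depends entirely on which traces get merged and which weights someone happened to choose.
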
04:36
And these technologies are used everywhere. Banks use them to decide loans. Insurance uses them to decide premiums. Recruiters and employers use them to decide whether one is a good fit for a job or not. Also the police and courts use them to determine whether one is a potential criminal or is likely to recommit a crime.

05:04
We have no knowledge or control over the ways in which those who buy, sell and process our data are profiling us and our children. But these profiles can come to impact our rights in significant ways.

05:20
To give you an example, in 2018 the "New York Times" published the news that the data that had been gathered through online college-planning services -- that are actually completed by millions of high school kids across the US who are looking for a college program or a scholarship -- had been sold to educational data brokers.

05:47
Now, researchers at Fordham who studied educational data brokers revealed that these companies profiled kids as young as two on the basis of different categories: ethnicity, religion, affluence, social awkwardness and many other random categories. And then they sell these profiles together with the name of the kid, their home address and the contact details to different companies, including trade and career institutions, student loans and student credit card companies.

06:28
To push the boundaries, the researchers at Fordham asked an educational data broker to provide them with a list of 14-to-15-year-old girls who were interested in family planning services. The data broker agreed to provide them the list. So imagine how intimate and how intrusive that is for our kids.

06:52
But educational data brokers are really just an example. The truth is that our children are being profiled in ways that we cannot control but that can significantly impact their chances in life. So we need to ask ourselves: can we trust these technologies when it comes to profiling our children? Can we?

07:17
My answer is no. As an anthropologist, I believe that artificial intelligence and predictive analytics can be great to predict the course of a disease or to fight climate change. But we need to abandon the belief that these technologies can objectively profile humans and that we can rely on them to make data-driven decisions about individual lives. Because they can't profile humans.

07:43
Data traces are not the mirror of who we are. Humans think one thing and say the opposite, feel one way and act differently. Algorithmic predictions or our digital practices cannot account for the unpredictability and complexity of human experience.

08:00
But on top of that, these technologies are always -- always -- in one way or another, biased.

08:09
You see, algorithms are by definition sets of rules or steps that have been designed to achieve a specific result, OK? But these sets of rules or steps cannot be objective, because they've been designed by human beings within a specific cultural context and are shaped by specific cultural values. So when machines learn, they learn from biased algorithms, and they often learn from biased databases as well.

08:37
At the moment, we're seeing the first examples of algorithmic bias. And some of these examples are frankly terrifying.

08:46
This year, the AI Now Institute in New York published a report that revealed that the AI technologies that are being used for predictive policing have been trained on "dirty" data. This is basically data that had been gathered during historical periods of known racial bias and nontransparent police practices. Because these technologies are being trained with dirty data, they're not objective, and their outcomes are only amplifying and perpetuating police bias and error.

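The "dirty data" mechanism can be shown with a small, entirely hypothetical simulation (the neighborhoods, rates and patrol intensities below are invented, not taken from the AI Now report): when historical records reflect where police patrolled rather than how people actually behaved, a model that learns from those records reports the patrol pattern back as "risk."

# Toy simulation of bias from "dirty" training data. All numbers invented.
import random

random.seed(0)

# Ground truth the model never sees: both neighborhoods offend at the same rate.
TRUE_OFFENSE_RATE = 0.05
PATROL_INTENSITY = {"north": 0.9, "south": 0.3}  # share of offenses that get recorded

# Build the "historical" records: an offense enters the data only if a patrol saw it.
records = []
for neighborhood, patrol in PATROL_INTENSITY.items():
    for _ in range(10000):
        offended = random.random() < TRUE_OFFENSE_RATE
        recorded = offended and random.random() < patrol
        records.append((neighborhood, recorded))

# A naive "predictive" model: recorded offense rate per neighborhood.
def predicted_risk(data, neighborhood):
    outcomes = [recorded for place, recorded in data if place == neighborhood]
    return sum(outcomes) / len(outcomes)

for place in PATROL_INTENSITY:
    print(place, f"predicted risk = {predicted_risk(records, place):.2%}")

# The output shows "north" as roughly three times riskier than "south", even though
# the true rates are identical: the model has learned the policing, not the behavior,
# and feeding its predictions back into patrol decisions would amplify the skew.
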
09:25
So I think we are faced with a fundamental problem in our society. We are starting to trust technologies when it comes to profiling human beings. We know that in profiling humans, these technologies are always going to be biased and are never really going to be accurate.

09:43
So what we need now is actually a political solution. We need governments to recognize that our data rights are our human rights.

(Applause and cheers)

Until this happens, we cannot hope for a more just future.

10:04
I worry that my daughters are going to be exposed to all sorts of algorithmic discrimination and error. You see, the difference between me and my daughters is that there's no public record out there of my childhood. There's certainly no database of all the stupid things that I've done and thought when I was a teenager.

(Laughter)

10:25
But for my daughters this may be different. The data that is being collected from them today may be used to judge them in the future and can come to prevent their hopes and dreams.

10:40
I think that it's time. It's time that we all step up. It's time that we start working together as individuals, as organizations and as institutions, and that we demand greater data justice for us and for our children before it's too late.

Thank you.

(Applause)
ืขืœ ืืชืจ ื–ื”

ืืชืจ ื–ื” ื™ืฆื™ื’ ื‘ืคื ื™ื›ื ืกืจื˜ื•ื ื™ YouTube ื”ืžื•ืขื™ืœื™ื ืœืœื™ืžื•ื“ ืื ื’ืœื™ืช. ืชื•ื›ืœื• ืœืจืื•ืช ืฉื™ืขื•ืจื™ ืื ื’ืœื™ืช ื”ืžื•ืขื‘ืจื™ื ืขืœ ื™ื“ื™ ืžื•ืจื™ื ืžื”ืฉื•ืจื” ื”ืจืืฉื•ื ื” ืžืจื—ื‘ื™ ื”ืขื•ืœื. ืœื—ืฅ ืคืขืžื™ื™ื ืขืœ ื”ื›ืชื•ื‘ื™ื•ืช ื‘ืื ื’ืœื™ืช ื”ืžื•ืฆื’ื•ืช ื‘ื›ืœ ื“ืฃ ื•ื™ื“ืื• ื›ื“ื™ ืœื”ืคืขื™ืœ ืืช ื”ืกืจื˜ื•ืŸ ืžืฉื. ื”ื›ืชื•ื‘ื™ื•ืช ื’ื•ืœืœื•ืช ื‘ืกื ื›ืจื•ืŸ ืขื ื”ืคืขืœืช ื”ื•ื•ื™ื“ืื•. ืื ื™ืฉ ืœืš ื”ืขืจื•ืช ืื• ื‘ืงืฉื•ืช, ืื ื ืฆื•ืจ ืื™ืชื ื• ืงืฉืจ ื‘ืืžืฆืขื•ืช ื˜ื•ืคืก ื™ืฆื™ืจืช ืงืฉืจ ื–ื”.

https://forms.gle/WvT1wiN1qDtmnspy7