What tech companies know about your kids | Veronica Barassi

84,949 views · 2020-07-03

TED


์•„๋ž˜ ์˜๋ฌธ์ž๋ง‰์„ ๋”๋ธ”ํด๋ฆญํ•˜์‹œ๋ฉด ์˜์ƒ์ด ์žฌ์ƒ๋ฉ๋‹ˆ๋‹ค.

00:00
Transcriber: Leslie Gauthier Reviewer: Joanna Pietrulewicz
Korean translation: soojin Lim Reviewer: NaYeun Kim

00:12
Every day, every week, we agree to terms and conditions. And when we do this, we provide companies with the lawful right to do whatever they want with our data and with the data of our children. Which makes us wonder: how much data are we giving away of children, and what are its implications?

00:38
I'm an anthropologist, and I'm also the mother of two little girls. And I started to become interested in this question in 2015 when I suddenly realized that there were vast -- almost unimaginable amounts of data traces that are being produced and collected about children. So I launched a research project, which is called Child Data Citizen, and I aimed at filling in the blank.

01:04
Now you may think that I'm here to blame you for posting photos of your children on social media, but that's not really the point. The problem is way bigger than so-called "sharenting." This is about systems, not individuals. You and your habits are not to blame.

01:24
For the very first time in history, we are tracking the individual data of children from long before they're born -- sometimes from the moment of conception, and then throughout their lives. You see, when parents decide to conceive, they go online to look for "ways to get pregnant," or they download ovulation-tracking apps. When they do get pregnant, they post ultrasounds of their babies on social media, they download pregnancy apps or they consult Dr. Google for all sorts of things, like, you know -- for "miscarriage risk when flying" or "abdominal cramps in early pregnancy." I know because I've done it -- and many times.

02:10
And then, when the baby is born, they track every nap, every feed, every life event on different technologies. And all of these technologies transform the baby's most intimate behavioral and health data into profit by sharing it with others.

02:28
So to give you an idea of how this works, in 2019, the British Medical Journal published research that showed that out of 24 mobile health apps, 19 shared information with third parties. And these third parties shared information with 216 other organizations. Of these 216 other fourth parties, only three belonged to the health sector. The other companies that had access to that data were big tech companies like Google, Facebook or Oracle, they were digital advertising companies and there was also a consumer credit reporting agency. So you get it right: ad companies and credit agencies may already have data points on little babies.

03:21
But mobile apps, web searches and social media are really just the tip of the iceberg, because children are being tracked by multiple technologies in their everyday lives. They're tracked by home technologies and virtual assistants in their homes. They're tracked by educational platforms and educational technologies in their schools. They're tracked by online records and online portals at their doctor's office. They're tracked by their internet-connected toys, their online games and many, many, many, many other technologies.

03:52
So during my research, a lot of parents came up to me and they were like, "So what? Why does it matter if my children are being tracked? We've got nothing to hide."

04:04
Well, it matters. It matters because today individuals are not only being tracked, they're also being profiled on the basis of their data traces. Artificial intelligence and predictive analytics are being used to harness as much data as possible of an individual life from different sources: family history, purchasing habits, social media comments. And then they bring this data together to make data-driven decisions about the individual. And these technologies are used everywhere. Banks use them to decide loans. Insurance uses them to decide premiums. Recruiters and employers use them to decide whether one is a good fit for a job or not. Also the police and courts use them to determine whether one is a potential criminal or is likely to recommit a crime.

05:04
We have no knowledge or control over the ways in which those who buy, sell and process our data are profiling us and our children. But these profiles can come to impact our rights in significant ways.

05:20
To give you an example, in 2018 the "New York Times" published the news that the data that had been gathered through online college-planning services -- that are actually completed by millions of high school kids across the US who are looking for a college program or a scholarship -- had been sold to educational data brokers.

05:47
Now, researchers at Fordham who studied educational data brokers revealed that these companies profiled kids as young as two on the basis of different categories: ethnicity, religion, affluence, social awkwardness and many other random categories. And then they sell these profiles together with the name of the kid, their home address and the contact details to different companies, including trade and career institutions, student loans and student credit card companies.

06:28
To push the boundaries, the researchers at Fordham asked an educational data broker to provide them with a list of 14-to-15-year-old girls who were interested in family planning services. The data broker agreed to provide them the list. So imagine how intimate and how intrusive that is for our kids.

06:52
But educational data brokers are really just an example. The truth is that our children are being profiled in ways that we cannot control but that can significantly impact their chances in life.

07:06
So we need to ask ourselves: can we trust these technologies when it comes to profiling our children? Can we? My answer is no.

07:19
As an anthropologist, I believe that artificial intelligence and predictive analytics can be great to predict the course of a disease or to fight climate change. But we need to abandon the belief that these technologies can objectively profile humans and that we can rely on them to make data-driven decisions about individual lives. Because they can't profile humans. Data traces are not the mirror of who we are. Humans think one thing and say the opposite, feel one way and act differently. Algorithmic predictions or our digital practices cannot account for the unpredictability and complexity of human experience.

08:00
But on top of that, these technologies are always -- always -- in one way or another, biased.

08:09
You see, algorithms are by definition sets of rules or steps that have been designed to achieve a specific result, OK? But these sets of rules or steps cannot be objective, because they've been designed by human beings within a specific cultural context and are shaped by specific cultural values. So when machines learn, they learn from biased algorithms, and they often learn from biased databases as well.

08:37
At the moment, we're seeing the first examples of algorithmic bias. And some of these examples are frankly terrifying.

08:46
This year, the AI Now Institute in New York published a report that revealed that the AI technologies that are being used for predictive policing have been trained on "dirty" data. This is basically data that had been gathered during historical periods of known racial bias and nontransparent police practices. Because these technologies are being trained with dirty data, they're not objective, and their outcomes are only amplifying and perpetrating police bias and error.

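To make that mechanism concrete, here is a minimal, hypothetical sketch in Python. Nothing in it comes from the report she cites; the neighborhoods, numbers and "risk score" are made up purely to show how a rule fitted to skewed historical records reproduces the skew.

```python
from collections import defaultdict

# Hypothetical, made-up records of (neighborhood, was_flagged_by_police).
# Neighborhood "A" was heavily over-policed, so it dominates the positives.
history = [("A", 1)] * 80 + [("A", 0)] * 20 + [("B", 1)] * 5 + [("B", 0)] * 95

# "Training" in the crudest sense: estimate P(flagged | neighborhood) by
# counting. A real model is more elaborate, but it fits the same records.
counts = defaultdict(lambda: [0, 0])  # neighborhood -> [flagged, total]
for neighborhood, flagged in history:
    counts[neighborhood][0] += flagged
    counts[neighborhood][1] += 1

def risk_score(neighborhood: str) -> float:
    """The learned 'risk' for a neighborhood is just its historical flag rate."""
    flagged, total = counts[neighborhood]
    return flagged / total

# Two otherwise identical people get very different scores, purely because of
# where the historical (biased) data came from.
print(risk_score("A"))  # 0.8
print(risk_score("B"))  # 0.05
```

The sketch only shows that a data-driven rule is faithful to whatever records it is given; if those records encode over-policing, every score derived from them carries it forward.
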
09:25
So I think we are faced with a fundamental problem in our society. We are starting to trust technologies when it comes to profiling human beings. We know that in profiling humans, these technologies are always going to be biased and are never really going to be accurate. So what we need now is actually a political solution. We need governments to recognize that our data rights are our human rights.

09:52
(Applause and cheers)

09:59
Until this happens, we cannot hope for a more just future. I worry that my daughters are going to be exposed to all sorts of algorithmic discrimination and error. You see, the difference between me and my daughters is that there's no public record out there of my childhood. There's certainly no database of all the stupid things that I've done and thought when I was a teenager.

10:23
(Laughter)

10:25
But for my daughters this may be different. The data that is being collected from them today may be used to judge them in the future and can come to prevent their hopes and dreams.

10:40
I think that it's time. It's time that we all step up. It's time that we start working together as individuals, as organizations and as institutions, and that we demand greater data justice for us and for our children before it's too late.

10:56
Thank you.

10:57
(Applause)