We're building a dystopia just to make people click on ads | Zeynep Tufekci

738,629 views ・ 2017-11-17

TED


μ•„λž˜ μ˜λ¬Έμžλ§‰μ„ λ”λΈ”ν΄λ¦­ν•˜μ‹œλ©΄ μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€.

λ²ˆμ—­: SeungGyu Min κ²€ν† : Jihyeon J. Kim
00:12
So when people voice fears of artificial intelligence, very often, they invoke images of humanoid robots run amok. You know? Terminator? You know, that might be something to consider, but that's a distant threat. Or, we fret about digital surveillance with metaphors from the past. "1984," George Orwell's "1984," it's hitting the bestseller lists again. It's a great book, but it's not the correct dystopia for the 21st century.

00:44
What we need to fear most is not what artificial intelligence will do to us on its own, but how the people in power will use artificial intelligence to control us and to manipulate us in novel, sometimes hidden, subtle and unexpected ways.

01:04
Much of the technology that threatens our freedom and our dignity in the near-term future is being developed by companies in the business of capturing and selling our data and our attention to advertisers and others: Facebook, Google, Amazon, Alibaba, Tencent.

01:26
Now, artificial intelligence has started bolstering their business as well. And it may seem like artificial intelligence is just the next thing after online ads. It's not. It's a jump in category. It's a whole different world, and it has great potential. It could accelerate our understanding of many areas of study and research. But to paraphrase a famous Hollywood philosopher, "With prodigious potential comes prodigious risk."

02:01
Now let's look at a basic fact of our digital lives, online ads. Right? We kind of dismiss them. They seem crude, ineffective. We've all had the experience of being followed on the web by an ad based on something we searched or read. You know, you look up a pair of boots and for a week, those boots are following you around everywhere you go. Even after you succumb and buy them, they're still following you around. We're kind of inured to that kind of basic, cheap manipulation. We roll our eyes and we think, "You know what? These things don't work."

02:33
Except, online, the digital technologies are not just ads.

02:40
Now, to understand that, let's think of a physical world example. You know how, at the checkout counters at supermarkets, near the cashier, there's candy and gum at the eye level of kids? That's designed to make them whine at their parents just as the parents are about to sort of check out. Now, that's a persuasion architecture. It's not nice, but it kind of works. That's why you see it in every supermarket.

03:08
Now, in the physical world, such persuasion architectures are kind of limited, because you can only put so many things by the cashier. Right? And the candy and gum, it's the same for everyone, even though it mostly works only for people who have whiny little humans beside them. In the physical world, we live with those limitations.

03:34
In the digital world, though, persuasion architectures can be built at the scale of billions and they can target, infer, understand and be deployed at individuals one by one by figuring out your weaknesses, and they can be sent to everyone's phone private screen, so it's not visible to us. And that's different. And that's just one of the basic things that artificial intelligence can do.

04:04
Now, let's take an example. Let's say you want to sell plane tickets to Vegas. Right? So in the old world, you could think of some demographics to target based on experience and what you can guess. You might try to advertise to, oh, men between the ages of 25 and 35, or people who have a high limit on their credit card, or retired couples. Right? That's what you would do in the past. With big data and machine learning, that's not how it works anymore.

04:33
So to imagine that, think of all the data that Facebook has on you: every status update you ever typed, every Messenger conversation, every place you logged in from, all your photographs that you uploaded there. If you start typing something and change your mind and delete it, Facebook keeps those and analyzes them, too. Increasingly, it tries to match you with your offline data. It also purchases a lot of data from data brokers. It could be everything from your financial records to a good chunk of your browsing history. Right? In the US, such data is routinely collected, collated and sold. In Europe, they have tougher rules.

05:23
So what happens then is, by churning through all that data, these machine-learning algorithms -- that's why they're called learning algorithms -- they learn to understand the characteristics of people who purchased tickets to Vegas before. When they learn this from existing data, they also learn how to apply this to new people. So if they're presented with a new person, they can classify whether that person is likely to buy a ticket to Vegas or not.
05:57
Fine. You're thinking, an offer to buy tickets to Vegas. I can ignore that. But the problem isn't that. The problem is, we no longer really understand how these complex algorithms work. We don't understand how they're doing this categorization. It's giant matrices, thousands of rows and columns, maybe millions of rows and columns, and not the programmers and not anybody who looks at it, even if you have all the data, understands anymore how exactly it's operating any more than you'd know what I was thinking right now if you were shown a cross section of my brain. It's like we're not programming anymore, we're growing intelligence that we don't truly understand.

06:52
And these things only work if there's an enormous amount of data, so they also encourage deep surveillance on all of us so that the machine learning algorithms can work. That's why Facebook wants to collect all the data it can about you. The algorithms work better.

07:08
So let's push that Vegas example a bit. What if the system that we do not understand was picking up that it's easier to sell Vegas tickets to people who are bipolar and about to enter the manic phase. Such people tend to become overspenders, compulsive gamblers. They could do this, and you'd have no clue that's what they were picking up on.

07:35
I gave this example to a bunch of computer scientists once and afterwards, one of them came up to me. He was troubled and he said, "That's why I couldn't publish it." I was like, "Couldn't publish what?" He had tried to see whether you can indeed figure out the onset of mania from social media posts before clinical symptoms, and it had worked, and it had worked very well, and he had no idea how it worked or what it was picking up on.

08:06
Now, the problem isn't solved if he doesn't publish it, because there are already companies that are developing this kind of technology, and a lot of the stuff is just off the shelf. This is not very difficult anymore.

08:21
Do you ever go on YouTube meaning to watch one video and an hour later you've watched 27? You know how YouTube has this column on the right that says, "Up next" and it autoplays something? It's an algorithm picking what it thinks that you might be interested in and maybe not find on your own. It's not a human editor. It's what algorithms do. It picks up on what you have watched and what people like you have watched, and infers that that must be what you're interested in, what you want more of, and just shows you more. It sounds like a benign and useful feature, except when it isn't.

09:01
So in 2016, I attended rallies of then-candidate Donald Trump to study as a scholar the movement supporting him. I study social movements, so I was studying it, too. And then I wanted to write something about one of his rallies, so I watched it a few times on YouTube. YouTube started recommending to me and autoplaying to me white supremacist videos in increasing order of extremism. If I watched one, it served up one even more extreme and autoplayed that one, too. If you watch Hillary Clinton or Bernie Sanders content, YouTube recommends and autoplays conspiracy left, and it goes downhill from there.

09:52
Well, you might be thinking, this is politics, but it's not. This isn't about politics. This is just the algorithm figuring out human behavior. I once watched a video about vegetarianism on YouTube and YouTube recommended and autoplayed a video about being vegan. It's like you're never hardcore enough for YouTube.

10:12
(Laughter)

10:14
So what's going on? Now, YouTube's algorithm is proprietary, but here's what I think is going on. The algorithm has figured out that if you can entice people into thinking that you can show them something more hardcore, they're more likely to stay on the site watching video after video going down that rabbit hole while Google serves them ads.
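
What the speaker is inferring about the "Up next" column can be boiled down to an objective function: the recommender is rewarded for predicted watch time (and hence ad exposure), not for accuracy or well-being. The sketch below is deliberately crude; YouTube's real algorithm is proprietary, as the talk notes, and every name and number here is invented.

```python
# Crude sketch of an engagement-maximizing "Up next" choice.
# Not YouTube's actual system; the candidates, the "intensity" field and the
# predicted watch times are invented for illustration.
candidates = [
    {"title": "mild take",     "intensity": 0.2, "predicted_minutes": 3.0},
    {"title": "stronger take", "intensity": 0.6, "predicted_minutes": 7.5},
    {"title": "hardcore take", "intensity": 0.9, "predicted_minutes": 11.0},
]

def pick_up_next(pool):
    # The only thing being optimized is predicted watch time (ad exposure);
    # nothing in this objective scores truthfulness or the viewer's well-being.
    return max(pool, key=lambda video: video["predicted_minutes"])

print(pick_up_next(candidates)["title"])  # the most "engaging" item wins the slot
```

If, in the training data, more extreme content happens to keep people watching longer, an objective like this keeps serving more extreme content, which is the rabbit-hole pattern the talk describes.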
10:43
Now, with nobody minding the ethics of the store, these sites can profile people who are Jew haters, who think that Jews are parasites and who have such explicit anti-Semitic content, and let you target them with ads. They can also mobilize algorithms to find for you look-alike audiences, people who do not have such explicit anti-Semitic content on their profile but who the algorithm detects may be susceptible to such messages, and lets you target them with ads, too.
11:30
Now, this may sound like an implausible example, but this is real. ProPublica investigated this and found that you can indeed do this on Facebook, and Facebook helpfully offered up suggestions on how to broaden that audience. BuzzFeed tried it for Google, and very quickly they found, yep, you can do it on Google, too. And it wasn't even expensive. The ProPublica reporter spent about 30 dollars to target this category.

12:02
So last year, Donald Trump's social media manager disclosed that they were using Facebook dark posts to demobilize people, not to persuade them, but to convince them not to vote at all. And to do that, they targeted specifically, for example, African-American men in key cities like Philadelphia, and I'm going to read exactly what he said. I'm quoting. They were using "nonpublic posts whose viewership the campaign controls so that only the people we want to see it see it. We modeled this. It will dramatically affect her ability to turn these people out."

12:45
What's in those dark posts? We have no idea. Facebook won't tell us.

12:52
So Facebook also algorithmically arranges the posts that your friends put on Facebook, or the pages you follow. It doesn't show you everything chronologically. It puts the order in the way that the algorithm thinks will entice you to stay on the site longer.

13:11
Now, so this has a lot of consequences. You may be thinking somebody is snubbing you on Facebook. The algorithm may never be showing your post to them. The algorithm is prioritizing some of them and burying the others. Experiments show that what the algorithm picks to show you can affect your emotions. But that's not all. It also affects political behavior.

13:41
So in 2010, in the midterm elections, Facebook did an experiment on 61 million people in the US that was disclosed after the fact. So some people were shown, "Today is election day," the simpler one, and some people were shown the one with that tiny tweak with those little thumbnails of your friends who clicked on "I voted." This simple tweak. OK? So the pictures were the only change, and that post shown just once turned out an additional 340,000 voters in that election, according to this research as confirmed by the voter rolls.

14:32
A fluke? No. Because in 2012, they repeated the same experiment. And that time, that civic message shown just once turned out an additional 270,000 voters. For reference, the 2016 US presidential election was decided by about 100,000 votes.

15:01
Now, Facebook can also very easily infer what your politics are, even if you've never disclosed them on the site. Right? These algorithms can do that quite easily. What if a platform with that kind of power decides to turn out supporters of one candidate over the other? How would we even know about it?

15:25
Now, we started from someplace seemingly innocuous -- online ads following us around -- and we've landed someplace else. As a public and as citizens, we no longer know if we're seeing the same information or what anybody else is seeing, and without a common basis of information, little by little, public debate is becoming impossible, and we're just at the beginning stages of this.

15:54
These algorithms can quite easily infer things like your people's ethnicity, religious and political views, personality traits, intelligence, happiness, use of addictive substances, parental separation, age and genders, just from Facebook likes. These algorithms can identify protesters even if their faces are partially concealed. These algorithms may be able to detect people's sexual orientation just from their dating profile pictures.

16:33
Now, these are probabilistic guesses, so they're not going to be 100 percent right, but I don't see the powerful resisting the temptation to use these technologies just because there are some false positives, which will of course create a whole other layer of problems. Imagine what a state can do with the immense amount of data it has on its citizens. China is already using face detection technology to identify and arrest people.

17:05
And here's the tragedy: we're building this infrastructure of surveillance authoritarianism merely to get people to click on ads. And this won't be Orwell's authoritarianism. This isn't "1984." Now, if authoritarianism is using overt fear to terrorize us, we'll all be scared, but we'll know it, we'll hate it and we'll resist it. But if the people in power are using these algorithms to quietly watch us, to judge us and to nudge us, to predict and identify the troublemakers and the rebels, to deploy persuasion architectures at scale and to manipulate individuals one by one using their personal, individual weaknesses and vulnerabilities, and if they're doing it at scale through our private screens so that we don't even know what our fellow citizens and neighbors are seeing, that authoritarianism will envelop us like a spider's web and we may not even know we're in it.

18:22
So Facebook's market capitalization is approaching half a trillion dollars. It's because it works great as a persuasion architecture. But the structure of that architecture is the same whether you're selling shoes or whether you're selling politics. The algorithms do not know the difference. The same algorithms set loose upon us to make us more pliable for ads are also organizing our political, personal and social information flows, and that's what's got to change.

19:02
Now, don't get me wrong, we use digital platforms because they provide us with great value. I use Facebook to keep in touch with friends and family around the world. I've written about how crucial social media is for social movements. I have studied how these technologies can be used to circumvent censorship around the world.

19:27
But it's not that the people who run, you know, Facebook or Google are maliciously and deliberately trying to make the country or the world more polarized and encourage extremism. I read the many well-intentioned statements that these people put out. But it's not the intent or the statements people in technology make that matter, it's the structures and business models they're building. And that's the core of the problem.

20:04
Either Facebook is a giant con of half a trillion dollars and ads don't work on the site, it doesn't work as a persuasion architecture, or its power of influence is of great concern. It's either one or the other. It's similar for Google, too.

20:24
So what can we do? This needs to change. Now, I can't offer a simple recipe, because we need to restructure the whole way our digital technology operates. Everything from the way technology is developed to the way the incentives, economic and otherwise, are built into the system. We have to face and try to deal with the lack of transparency created by the proprietary algorithms, the structural challenge of machine learning's opacity, all this indiscriminate data that's being collected about us.

21:05
We have a big task in front of us. We have to mobilize our technology, our creativity and yes, our politics so that we can build artificial intelligence that supports us in our human goals but that is also constrained by our human values.

21:27
And I understand this won't be easy. We might not even easily agree on what those terms mean. But if we take seriously how these systems that we depend on for so much operate, I don't see how we can postpone this conversation anymore.

21:49
These structures are organizing how we function and they're controlling what we can and we cannot do. And many of these ad-financed platforms, they boast that they're free. In this context, it means that we are the product that's being sold.

22:10
We need a digital economy where our data and our attention is not for sale to the highest-bidding authoritarian or demagogue.

22:23
(Applause)

22:30
So to go back to that Hollywood paraphrase, we do want the prodigious potential of artificial intelligence and digital technology to blossom, but for that, we must face this prodigious menace, open-eyed and now.

22:48
Thank you.

22:49
(Applause)
이 μ›Ήμ‚¬μ΄νŠΈ 정보

이 μ‚¬μ΄νŠΈλŠ” μ˜μ–΄ ν•™μŠ΅μ— μœ μš©ν•œ YouTube λ™μ˜μƒμ„ μ†Œκ°œν•©λ‹ˆλ‹€. μ „ 세계 졜고의 μ„ μƒλ‹˜λ“€μ΄ κ°€λ₯΄μΉ˜λŠ” μ˜μ–΄ μˆ˜μ—…μ„ 보게 될 κ²ƒμž…λ‹ˆλ‹€. 각 λ™μ˜μƒ νŽ˜μ΄μ§€μ— ν‘œμ‹œλ˜λŠ” μ˜μ–΄ μžλ§‰μ„ 더블 ν΄λ¦­ν•˜λ©΄ κ·Έκ³³μ—μ„œ λ™μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€. λΉ„λ””μ˜€ μž¬μƒμ— 맞좰 μžλ§‰μ΄ μŠ€ν¬λ‘€λ©λ‹ˆλ‹€. μ˜κ²¬μ΄λ‚˜ μš”μ²­μ΄ μžˆλŠ” 경우 이 문의 양식을 μ‚¬μš©ν•˜μ—¬ λ¬Έμ˜ν•˜μ‹­μ‹œμ˜€.

https://forms.gle/WvT1wiN1qDtmnspy7