Jennifer Golbeck: The curly fry conundrum: Why social media "likes" say more than you might think

383,322 views

2014-04-03 ・ TED


Translated by Jihyeon J. Kim ・ Reviewed by Catherine YOO
00:12
If you remember that first decade of the web, it was really a static place. You could go online, you could look at pages, and they were put up either by organizations who had teams to do it or by individuals who were really tech-savvy for the time. And with the rise of social media and social networks in the early 2000s, the web was completely changed to a place where now the vast majority of content we interact with is put up by average users, either in YouTube videos or blog posts or product reviews or social media postings. And it's also become a much more interactive place, where people are interacting with others, they're commenting, they're sharing, they're not just reading.
00:54
So Facebook is not the only place you can do this, but it's the biggest, and it serves to illustrate the numbers. Facebook has 1.2 billion users per month. So half the Earth's Internet population is using Facebook. They are a site, along with others, that has allowed people to create an online persona with very little technical skill, and people responded by putting huge amounts of personal data online. So the result is that we have behavioral, preference, demographic data for hundreds of millions of people, which is unprecedented in history. And as a computer scientist, what this means is that I've been able to build models that can predict all sorts of hidden attributes for all of you that you don't even know you're sharing information about.
01:37
As scientists, we use that to help the way people interact online, but there's less altruistic applications, and there's a problem in that users don't really understand these techniques and how they work, and even if they did, they don't have a lot of control over it. So what I want to talk to you about today is some of these things that we're able to do, and then give us some ideas of how we might go forward to move some control back into the hands of users.
02:02
So this is Target, the company. I didn't just put that logo on this poor, pregnant woman's belly. You may have seen this anecdote that was printed in Forbes magazine where Target sent a flyer to this 15-year-old girl with advertisements and coupons for baby bottles and diapers and cribs two weeks before she told her parents that she was pregnant. Yeah, the dad was really upset. He said, "How did Target figure out that this high school girl was pregnant before she told her parents?"
02:29
It turns out that they have the purchase history for hundreds of thousands of customers and they compute what they call a pregnancy score, which is not just whether or not a woman's pregnant, but what her due date is. And they compute that not by looking at the obvious things, like, she's buying a crib or baby clothes, but things like, she bought more vitamins than she normally had, or she bought a handbag that's big enough to hold diapers. And by themselves, those purchases don't seem like they might reveal a lot, but it's a pattern of behavior that, when you take it in the context of thousands of other people, starts to actually reveal some insights. So that's the kind of thing that we do when we're predicting stuff about you on social media. We're looking for little patterns of behavior that, when you detect them among millions of people, lets us find out all kinds of things.
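The scoring idea described here can be sketched in a few lines. This is a hedged illustration, not Target's actual method: every product name and probability below is invented, and the naive-Bayes-style sum of log-likelihood ratios is just one plausible way to combine individually weak purchase signals into a single score.

```python
# Toy "pregnancy score": each purchase is a weak signal; summing
# log-likelihood ratios over a basket combines them into one score.
# All product names and rates are invented for illustration.
import math

# (P(bought item | pregnant), P(bought item | not pregnant)) -- invented.
RATES = {
    "unscented_lotion": (0.30, 0.10),
    "extra_vitamins":   (0.40, 0.15),
    "large_handbag":    (0.25, 0.12),
    "cotton_balls":     (0.35, 0.20),
}

def pregnancy_score(basket):
    """Sum of log-likelihood ratios for the items in a shopping basket."""
    return sum(math.log(p_preg / p_not)
               for item, (p_preg, p_not) in RATES.items()
               if item in basket)

# One item says little; a pattern of items says much more.
weak = pregnancy_score({"cotton_balls"})
strong = pregnancy_score({"unscented_lotion", "extra_vitamins", "large_handbag"})
print(f"one item: {weak:.2f}, pattern of items: {strong:.2f}")
```

The point of the sketch is the one the talk makes: no single purchase is revealing, but the combined score separates the two groups once the per-item rates have been estimated over many shoppers.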
03:19
So in my lab and with colleagues, we've developed mechanisms where we can quite accurately predict things like your political preference, your personality score, gender, sexual orientation, religion, age, intelligence, along with things like how much you trust the people you know and how strong those relationships are. We can do all of this really well. And again, it doesn't come from what you might think of as obvious information. So my favorite example is from this study that was published this year in the Proceedings of the National Academies. If you Google this, you'll find it. It's four pages, easy to read. And they looked at just people's Facebook likes, so just the things you like on Facebook, and used that to predict all these attributes, along with some other ones.
04:01
And in their paper they listed the five likes that were most indicative of high intelligence. And among those was liking a page for curly fries. (Laughter) Curly fries are delicious, but liking them does not necessarily mean that you're smarter than the average person. So how is it that one of the strongest indicators of your intelligence is liking this page when the content is totally irrelevant to the attribute that's being predicted? And it turns out that we have to look at a whole bunch of underlying theories to see why we're able to do this.
04:34
One of them is a sociological theory called homophily, which basically says people are friends with people like them. So if you're smart, you tend to be friends with smart people, and if you're young, you tend to be friends with young people, and this is well established for hundreds of years. We also know a lot about how information spreads through networks. It turns out things like viral videos or Facebook likes or other information spreads in exactly the same way that diseases spread through social networks. So this is something we've studied for a long time. We have good models of it. And so you can put those things together and start seeing why things like this happen.
05:09
So if I were to give you a hypothesis, it would be that a smart guy started this page, or maybe one of the first people who liked it would have scored high on that test. And they liked it, and their friends saw it, and by homophily, we know that he probably had smart friends, and so it spread to them, and some of them liked it, and they had smart friends, and so it spread to them, and so it propagated through the network to a host of smart people, so that by the end, the action of liking the curly fries page is indicative of high intelligence, not because of the content, but because the actual action of liking reflects back the common attributes of other people who have done it.
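The propagation story above can be simulated in a few lines. This is a toy model, not a fitted model of any real network: the network size, homophily strength, and adoption probability are all invented. It only demonstrates the mechanism: in a homophilous network, a like seeded at a high-attribute node spreads mostly to similar nodes, so the likers end up skewing high on that attribute.

```python
# Toy simulation: homophilous friendships + wave-by-wave spread of a "like".
import random

random.seed(42)
N = 2000
smart = [random.random() < 0.5 for _ in range(N)]  # the hidden attribute

def make_friends(i, k=8, same_prob=0.9):
    """Draw k friends for node i; ~90% share i's attribute (homophily)."""
    friends = set()
    while len(friends) < k:
        j = random.randrange(N)
        want_same = random.random() < same_prob
        if j != i and smart[j] == (smart[i] if want_same else not smart[i]):
            friends.add(j)
    return friends

friends = [make_friends(i) for i in range(N)]

# Seed the page at one high-attribute node, then spread it in waves:
# each exposed friend likes it with some probability and exposes their friends.
seed = next(i for i in range(N) if smart[i])
liked = {seed}
frontier = [seed]
for _ in range(5):
    nxt = []
    for i in frontier:
        for j in friends[i]:
            if j not in liked and random.random() < 0.4:
                liked.add(j)
                nxt.append(j)
    frontier = nxt

share_likers = sum(smart[i] for i in liked) / len(liked)
share_all = sum(smart) / N
print(f"high-attribute share among likers: {share_likers:.2f} vs overall: {share_all:.2f}")
```

The liking action becomes "indicative" of the attribute purely because of who the likers are, exactly as in the curly-fries hypothesis.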
05:48
So this is pretty complicated stuff, right? It's a hard thing to sit down and explain to an average user, and even if you do, what can the average user do about it? How do you know that you've liked something that indicates a trait for you that's totally irrelevant to the content of what you've liked? There's a lot of power that users don't have to control how this data is used. And I see that as a real problem going forward.
06:13
So I think there's a couple paths that we want to look at if we want to give users some control over how this data is used, because it's not always going to be used for their benefit. An example I often give is that, if I ever get bored being a professor, I'm going to go start a company that predicts all of these attributes and things like how well you work in teams and if you're a drug user, if you're an alcoholic. We know how to predict all that. And I'm going to sell reports to H.R. companies and big businesses that want to hire you. We totally can do that now. I could start that business tomorrow, and you would have absolutely no control over me using your data like that. That seems to me to be a problem.
06:50
So one of the paths we can go down is the policy and law path. And in some respects, I think that that would be most effective, but the problem is we'd actually have to do it. Observing our political process in action makes me think it's highly unlikely that we're going to get a bunch of representatives to sit down, learn about this, and then enact sweeping changes to intellectual property law in the U.S. so users control their data. We could go the policy route, where social media companies say, you know what? You own your data. You have total control over how it's used. The problem is that the revenue models for most social media companies rely on sharing or exploiting users' data in some way. It's sometimes said of Facebook that the users aren't the customer, they're the product. And so how do you get a company to cede control of their main asset back to the users? It's possible, but I don't think it's something that we're going to see change quickly.
07:45
So I think the other path that we can go down that's going to be more effective is one of more science. It's doing science that allowed us to develop all these mechanisms for computing this personal data in the first place. And it's actually very similar research that we'd have to do if we want to develop mechanisms that can say to a user, "Here's the risk of that action you just took." By liking that Facebook page, or by sharing this piece of personal information, you've now improved my ability to predict whether or not you're using drugs or whether or not you get along well in the workplace. And that, I think, can affect whether or not people want to share something, keep it private, or just keep it offline altogether. We can also look at things like allowing people to encrypt data that they upload, so it's kind of invisible and worthless to sites like Facebook or third party services that access it, but that select users who the person who posted it want to see it have access to see it.
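That encryption idea can be sketched as follows. This is a toy, not a real design: XOR with a random one-time key stands in for a proper cipher, and handing the post key out via a dictionary stands in for wrapping it with each chosen friend's public key. A production system would use authenticated encryption (e.g. AES-GCM or NaCl's box) and real key exchange.

```python
# Toy sketch: encrypt a post once with a random per-post key, then give
# that key only to chosen friends, so the hosting site stores ciphertext
# it cannot read. NOT real cryptography -- illustration only.
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    return bytes(d ^ k for d, k in zip(data, key))

def post_encrypted(plaintext: bytes, recipients: list[str]):
    post_key = secrets.token_bytes(len(plaintext))  # one-time key sized to the post
    ciphertext = xor_bytes(plaintext, post_key)
    # The site stores only the ciphertext; the key goes to chosen friends
    # out of band (in reality: wrapped with each friend's public key).
    key_store = {user: post_key for user in recipients}
    return ciphertext, key_store

def read_post(user: str, ciphertext: bytes, key_store: dict):
    if user not in key_store:
        return None  # the site, advertisers, and non-friends see only noise
    return xor_bytes(ciphertext, key_store[user])

ct, keys = post_encrypted(b"personal update", recipients=["alice", "bob"])
print(read_post("alice", ct, keys))  # the chosen friend recovers the post
print(read_post("eve", ct, keys))    # everyone else gets nothing
```

The design point is the one in the talk: the platform keeps serving the bytes, but the data is worthless to anyone the poster didn't choose.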
08:40
This is all super exciting research from an intellectual perspective, and so scientists are going to be willing to do it. So that gives us an advantage over the law side. One of the problems that people bring up when I talk about this is, they say, you know, if people start keeping all this data private, all those methods that you've been developing to predict their traits are going to fail. And I say, absolutely, and for me, that's success, because as a scientist, my goal is not to infer information about users, it's to improve the way people interact online. And sometimes that involves inferring things about them, but if users don't want me to use that data, I think they should have the right to do that. I want users to be informed and consenting users of the tools that we develop.
09:24
And so I think encouraging this kind of science and supporting researchers who want to cede some of that control back to users and away from the social media companies means that going forward, as these tools evolve and advance, means that we're going to have an educated and empowered user base, and I think all of us can agree that that's a pretty ideal way to go forward. Thank you.

09:47
(Applause)
이 μ›Ήμ‚¬μ΄νŠΈ 정보

이 μ‚¬μ΄νŠΈλŠ” μ˜μ–΄ ν•™μŠ΅μ— μœ μš©ν•œ YouTube λ™μ˜μƒμ„ μ†Œκ°œν•©λ‹ˆλ‹€. μ „ 세계 졜고의 μ„ μƒλ‹˜λ“€μ΄ κ°€λ₯΄μΉ˜λŠ” μ˜μ–΄ μˆ˜μ—…μ„ 보게 될 κ²ƒμž…λ‹ˆλ‹€. 각 λ™μ˜μƒ νŽ˜μ΄μ§€μ— ν‘œμ‹œλ˜λŠ” μ˜μ–΄ μžλ§‰μ„ 더블 ν΄λ¦­ν•˜λ©΄ κ·Έκ³³μ—μ„œ λ™μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€. λΉ„λ””μ˜€ μž¬μƒμ— 맞좰 μžλ§‰μ΄ μŠ€ν¬λ‘€λ©λ‹ˆλ‹€. μ˜κ²¬μ΄λ‚˜ μš”μ²­μ΄ μžˆλŠ” 경우 이 문의 양식을 μ‚¬μš©ν•˜μ—¬ λ¬Έμ˜ν•˜μ‹­μ‹œμ˜€.

https://forms.gle/WvT1wiN1qDtmnspy7