Why you should get paid for your data | Jennifer Zhu Scott

78,179 views ・ 2020-03-19

TED


μ•„λž˜ μ˜λ¬Έμžλ§‰μ„ λ”λΈ”ν΄λ¦­ν•˜μ‹œλ©΄ μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€.

00:13
I grew up in the late '70s in rural China
00:17
during the final years of my country's pursuit of absolute equality
00:21
at the expense of liberty.
00:24
At that time, everybody had a job,
00:27
but everyone was struggling.
00:29
In the early '80s, my dad was an electrician,
00:32
and my mom worked two shifts in the local hospital.
00:35
But still, we didn't have enough food,
00:38
and our living conditions were dismal.
00:41
We were undoubtedly equal --
00:43
we were equally poor.
00:45
The state owned everything.
00:47
We owned nothing.
00:49
The story I'm going to share with you is about my struggles
00:53
of overcoming adversity
00:55
with my resilience, grit and sheer determination.
01:01
No, I'm just kidding, I'm not going to do that to you.
01:03
(Laughter)
01:06
Instead, I'm going to tell you,
01:10
what I'm going to talk about today is about a new form of collective poverty
01:14
that many of us don't recognize
01:16
and that urgently needs to be understood.
01:18
I'm sure you've noticed that in the past 20 years,
01:21
that asset has emerged.
01:23
It's been generating wealth at a breakneck pace.
01:26
As a tool, it has brought businesses deep customer insights,
01:30
operational efficiency
01:32
and enormous top-line growth.
01:35
But for some,
01:36
it has also provided a device to manipulate a democratic election
01:41
or perform surveillance for profit or political purposes.
01:47
What is this miracle asset?
01:49
You've guessed it: it's data.
01:52
Seven out of the top 10 most valuable companies in the world are tech companies
01:56
that either directly generate profit from data
01:59
or are empowered by data from the core.
02:03
Multiple surveys show
02:04
that the vast majority of business decision makers
02:07
regard data as an essential asset for success.
02:10
We have all experienced how data is driving this major paradigm shift
02:16
for our personal, economic and political lives.
02:20
Whoever owns the data owns the future.
02:23
But who's producing the data?
02:26
I assume everyone in this room has a smartphone,
02:28
several social media accounts
02:30
and has done a Google search or two in the past week.
02:33
We are all producing data. Yes.
02:36
It is estimated that by 2030, 10 years from now,
02:40
there will be about 125 billion connected devices in the world.
02:44
That's an average of about 15 devices per person.
02:49
We are already producing data every day.
02:51
We'll be producing exponentially more.
02:55
Google, Facebook and Tencent's combined revenue in 2018
02:59
was 236 billion US dollars.
03:02
Now, how many of you have received payment from them
03:05
for the data you generate for them?
03:07
None, right?
03:08
Data has immense value but is centrally controlled and owned.
03:12
You are all walking raw materials for those large data companies,
03:17
but none of you are paid.
03:19
Not only that,
03:20
you're not even considered as part of this equation for income.
03:25
So once again,
03:27
we are undoubtedly equal:
03:29
we're equally poor.
03:31
Somebody else owns everything, and we own nothing.
03:34
Sounds familiar, doesn't it?
03:36
So what should we do?
03:38
There might be some clues in how my life turned out
03:40
after that difficult start.
03:41
Things began to look up for my family in the '80s.
03:44
The system evolved,
03:45
and people began to be allowed to own a piece of what we created.
03:50
"People diving into the ocean,"
03:51
or "xia hai," the Chinese term,
03:54
described those who left state-owned enterprise jobs
03:57
and started their own businesses.
03:59
Private ownership of a business
04:01
became personal ownership of cars,
04:04
properties, food, clothes and things.
04:08
The economic machine started rolling,
04:11
and people's lives began to improve.
04:13
For the first time,
04:14
to get rich was glorious.
04:17
So in the '90s, when I went to study in Chengdu in west China,
04:22
many young individuals like myself
04:24
were well-positioned to take advantage of the new system.
04:27
After I graduated from my university,
04:30
I cofounded my first business and moved to Shenzhen,
04:33
the brand-new special economic zone that used to be a fishing village.
04:37
Twenty years later,
04:38
Shenzhen has become a global innovation powerhouse.
04:44
Private ownership was a form of liberty we didn't have before.
04:49
It created unprecedented opportunities for our generations,
04:54
motivating us to work and study incredibly hard.
04:58
The result was that more than 850 million people rose out of poverty.
05:04
According to the World Bank,
05:06
China's extreme poverty rate in 1981, when I was a little kid, was 88 percent.
05:12
By 2015, 0.7 percent.
05:16
I am a product of that success,
05:18
and I am very happy to share that today, I have my own AI business,
05:22
and I lead a very worldly and dynamic life,
05:25
a path that was unimaginable when I was a kid in west China.
05:30
Of course, this prosperity came with a trade-off,
05:34
with equality, the environment and freedom.
05:38
And obviously I'm not here to argue that China has it all figured out.
05:42
We haven't.
05:43
Nor that data is fully comparable to physical assets.
05:47
It is not.
05:49
But my life experience allowed me to see what's hiding in plain sight.
05:54
Currently, the public discourse
05:56
is so focused on the regulatory and privacy issue
06:00
when it comes to data ownership.
06:02
But I want to ask:
06:03
What if we look at data ownership in completely different ways?
06:07
What if data ownership is, in fact,
06:10
a personal, individual and economic issue?
06:15
What if, in the new digital economy,
06:17
we are allowed to own a piece of what we create
06:20
and give people the liberty of private data ownership?
06:26
The legal concept of ownership is when you can possess,
06:30
use, gift, pass on, destroy
06:34
or trade it or sell your asset
06:38
at a price accepted by you.
06:41
What if we give that same definition to individuals' data,
06:45
so individuals can use or destroy our data
06:49
or we trade it at our chosen price?
06:52
Now, I know some of you might say,
06:53
"I would never, ever trade my data for any amount of money."
06:57
But that, let me remind you, is exactly what you're doing now,
07:00
except you're giving your data away for free.
07:03
Plus, privacy is a very personal and nuanced issue.
07:07
You might have the privilege to prioritize your privacy over money,
07:12
but for millions of small business owners in China
07:15
who can't get bank loans easily,
07:17
using their data to gain rapid loan approval from AI-powered lenders
07:22
can answer their more pressing needs.
07:24
What's private to you
07:26
is different from what's private to others.
07:29
What's private to you now
07:31
is different from what was private when you were in college.
07:35
Or, at least, I hope so.
07:36
(Laughter)
07:38
We are always, although often subconsciously,
07:42
making such trade-offs
07:44
based on our diverse personal beliefs and life priorities.
07:48
That is why data ownership would be incomplete
07:52
without a pricing power.
07:54
By assigning pricing power to individuals,
07:57
we gain a tool to reflect our personal and nuanced preferences.
08:01
So, for example, you could choose to donate your data for free
08:05
if a contribution to a particular medical research
08:08
is very meaningful for you.
08:11
Or, if we had the tools to set our behavior data
08:14
at a price of, say, 100,000 US dollars,
08:17
I doubt any political group would be able to target
08:19
or manipulate your vote.
08:22
You control. You decide.
08:25
Now, I know this sounds probably implausible,
08:27
but trends are already pointing to
08:29
a growing and very powerful individual data ownership movement.
08:34
First, start-ups are already creating tools
08:36
to allow us to take back some control.
08:39
A new browser called Brave
08:42
empowers users with "Brave Shields" -- they literally call it that --
08:45
by aggressively blocking data-grabbing ads and trackers,
08:51
and avoiding leaking data the way other browsers do.
08:53
In return, users can take back some bargaining and pricing power.
08:58
When users opt in to accept ads,
09:01
Brave rewards users with "basic attention tokens"
09:05
that can redeem content behind paywalls from publishers.
09:10
And I've been using Brave for a few months.
09:12
It has already blocked more than 200,000 ads and trackers
09:16
and saved hours of my time.
09:19
Now, I know some of you interact with your browser
09:22
more than with your partners, so --
09:24
(Laughter)
09:25
you should at least find one that doesn't waste your time and is not creepy.
09:30
(Laughter)
09:34
Do you think Google is indispensable?
09:37
Think again.
09:38
A search engine is indispensable.
09:41
Google just has the monopoly --
09:43
for now.
09:44
A search engine called DuckDuckGo doesn't store your personal information
09:48
or follow you around with ads
09:50
or track your personal browsing history.
09:52
Instead, it gives all users the same search results
09:55
rather than results based on your personal browsing records.
10:00
In London, a company called digi.me
10:03
offers an app you can download on your smartphone
10:06
that helps to import and consolidate your data generated by you
10:11
from your Fitbit, Spotify,
10:13
social media accounts ...
10:15
And you can choose where to store your data,
10:18
and digi.me will help you to make your data work for you
10:22
by providing insights that used to be exclusively accessible
10:25
by large data companies.
10:28
In DC, a new initiative called UBDI, U-B-D-I,
10:33
Universal Basic Data Income,
10:35
helps people to make money
10:37
by sharing anonymous insights through their data
10:42
for companies that can use them for market research.
10:45
And whenever a company purchases a study,
10:48
users get paid in cash and UBDI points to track their contribution,
10:52
potentially as much as 1,000 US dollars per year
10:55
per their estimation.
10:59
UBDI could be a very feasible path for universal basic income
11:03
in the AI economy.
11:05
Further, individual awareness of privacy and data ownership
11:11
is growing fast
11:13
as we all become aware of this monster we have unleashed in our pocket.
11:18
I'm a mother of two preteen girls,
11:21
and trust me,
11:22
the single biggest source of stress and anxiety as a parent,
11:28
for me, is my children's relationship with technology.
11:33
This is a three-page agreement my husband and I make them sign
11:37
before they receive their first [mobile phone].
11:39
(Laughter)
11:41
We want to help them to become
11:45
digital citizens,
11:47
but only if we can make them become smart and responsible ones.
11:53
I help them to understand what kind of data should never be shared.
11:58
So if you Google me,
12:00
in fact -- actually, sorry -- if you DuckDuckGo me,
12:04
you will find maybe a lot about me and my work,
12:07
but you may find no information about my daughters.
12:10
When they grow up,
12:11
if they want to put themselves out there, it's their choice, not mine,
12:15
despite that I insist they're the most beautiful,
12:17
smartest and most extraordinary kids in the world, of course.
12:23
And I know many people are having similar conversations
12:26
and making similar decisions,
12:28
which gives me hope
12:29
that a truly smart data-rich future will be here soon.
12:33
But I want to highlight Clause 6 of this agreement.
12:37
It says, "I will never, ever search for any information online
12:40
if I would be embarrassed if seen by Grandma Dawnie."
12:44
(Laughter)
12:45
Try it. It's really effective.
12:47
(Laughter)
12:49
Throughout history,
12:51
there has always been a trade-off between liberty and equality
12:57
in the pursuit of prosperity.
13:00
The world has constantly been going through the cycle of wealth accumulation
13:07
to wealth redistribution.
13:09
As the tension between the haves and have-nots
13:12
is breaking so many countries,
13:14
it is in everyone's interest,
13:17
including the large data companies,
13:19
to prevent this new form of inequality.
13:23
Of course, individual data ownership is neither the perfect nor the complete answer
13:29
to this profoundly complex question
13:31
of what makes a good digital society.
13:36
But according to McKinsey,
13:40
AI will add 13 trillion US dollars of economic output in the next 10 years.
13:46
Data generated by individuals will no doubt contribute
13:50
to this enormous growth.
13:53
Shouldn't we at least consider an economic model
13:56
that empowers the people?
13:59
And if private ownership helped to lift more than 850 million people
14:03
out of poverty,
14:05
it is our duty
14:07
and we owe it to future generations
14:10
to create a more inclusive AI economy
14:14
that will empower the people in addition to businesses.
14:18
Thank you.
14:20
(Applause)