How a handful of tech companies control billions of minds every day | Tristan Harris

947,059 views ・ 2017-07-28

TED


μ•„λž˜ μ˜λ¬Έμžλ§‰μ„ λ”λΈ”ν΄λ¦­ν•˜μ‹œλ©΄ μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€.

Translated by: JY Kang Β· Reviewed by: Boram Cho

00:12
I want you to imagine walking into a room, a control room with a bunch of people, a hundred people, hunched over a desk with little dials, and that that control room will shape the thoughts and feelings of a billion people.

00:32
This might sound like science fiction, but this actually exists right now, today. I know because I used to be in one of those control rooms.

00:44
I was a design ethicist at Google, where I studied how do you ethically steer people's thoughts? Because what we don't talk about is how the handful of people working at a handful of technology companies through their choices will steer what a billion people are thinking today.

01:02
Because when you pull out your phone and they design how this works or what's on the feed, it's scheduling little blocks of time in our minds. If you see a notification, it schedules you to have thoughts that maybe you didn't intend to have. If you swipe over that notification, it schedules you into spending a little bit of time getting sucked into something that maybe you didn't intend to get sucked into.

01:27
When we talk about technology, we tend to talk about it as this blue sky opportunity. It could go any direction. And I want to get serious for a moment and tell you why it's going in a very specific direction. Because it's not evolving randomly.

01:44
There's a hidden goal driving the direction of all of the technology we make, and that goal is the race for our attention.

01:52
Because every news site, TED, elections, politicians, games, even meditation apps have to compete for one thing, which is our attention, and there's only so much of it.

02:08
And the best way to get people's attention is to know how someone's mind works. And there's a whole bunch of persuasive techniques that I learned in college at a lab called the Persuasive Technology Lab to get people's attention.

02:21
A simple example is YouTube. YouTube wants to maximize how much time you spend. And so what do they do? They autoplay the next video. And let's say that works really well. They're getting a little bit more of people's time.

02:36
Well, if you're Netflix, you look at that and say, well, that's shrinking my market share, so I'm going to autoplay the next episode. But then if you're Facebook, you say, that's shrinking all of my market share, so now I have to autoplay all the videos in the newsfeed before waiting for you to click play.

02:52
So the internet is not evolving at random. The reason it feels like it's sucking us in the way it is is because of this race for attention.

03:03
We know where this is going. Technology is not neutral, and it becomes this race to the bottom of the brain stem of who can go lower to get it.

03:13
Let me give you an example of Snapchat. If you didn't know, Snapchat is the number one way that teenagers in the United States communicate. So if you're like me, and you use text messages to communicate, Snapchat is that for teenagers, and there's, like, a hundred million of them that use it.

03:31
And they invented a feature called Snapstreaks, which shows the number of days in a row that two people have communicated with each other. In other words, what they just did is they gave two people something they don't want to lose.

03:44
Because if you're a teenager, and you have 150 days in a row, you don't want that to go away. And so think of the little blocks of time that that schedules in kids' minds.

03:54
This isn't theoretical: when kids go on vacation, it's been shown they give their passwords to up to five other friends to keep their Snapstreaks going, even when they can't do it. And they have, like, 30 of these things, and so they have to get through taking photos of just pictures or walls or ceilings just to get through their day.

04:13
So it's not even like they're having real conversations. We have a temptation to think about this as, oh, they're just using Snapchat the way we used to gossip on the telephone. It's probably OK.

04:24
Well, what this misses is that in the 1970s, when you were just gossiping on the telephone, there wasn't a hundred engineers on the other side of the screen who knew exactly how your psychology worked and orchestrated you into a double bind with each other.

04:38
Now, if this is making you feel a little bit of outrage, notice that that thought just comes over you. Outrage is a really good way also of getting your attention, because we don't choose outrage. It happens to us.

04:52
And if you're the Facebook newsfeed, whether you'd want to or not, you actually benefit when there's outrage. Because outrage doesn't just schedule a reaction in emotional time, space, for you. We want to share that outrage with other people.

05:07
So we want to hit share and say, "Can you believe the thing that they said?"

05:12
And so outrage works really well at getting attention, such that if Facebook had a choice between showing you the outrage feed and a calm newsfeed, they would want to show you the outrage feed, not because someone consciously chose that, but because that worked better at getting your attention.

05:31
And the newsfeed control room is not accountable to us. It's only accountable to maximizing attention.

05:39
It's also accountable, because of the business model of advertising, for anybody who can pay the most to actually walk into the control room and say, "That group over there, I want to schedule these thoughts into their minds."

05:51
So you can target, you can precisely target a lie directly to the people who are most susceptible. And because this is profitable, it's only going to get worse.

06:05
So I'm here today because the costs are so obvious. I don't know a more urgent problem than this, because this problem is underneath all other problems.

06:18
It's not just taking away our agency to spend our attention and live the lives that we want, it's changing the way that we have our conversations, it's changing our democracy, and it's changing our ability to have the conversations and relationships we want with each other. And it affects everyone, because a billion people have one of these in their pocket.

06:45
So how do we fix this? We need to make three radical changes to technology and to our society.

06:55
The first is we need to acknowledge that we are persuadable. Once you start understanding that your mind can be scheduled into having little thoughts or little blocks of time that you didn't choose, wouldn't we want to use that understanding and protect against the way that that happens?

07:12
I think we need to see ourselves fundamentally in a new way. It's almost like a new period of human history, like the Enlightenment, but almost a kind of self-aware Enlightenment, that we can be persuaded, and there might be something we want to protect.

07:27
The second is we need new models and accountability systems so that as the world gets better and more and more persuasive over time -- because it's only going to get more persuasive -- that the people in those control rooms are accountable and transparent to what we want.

07:42
The only form of ethical persuasion that exists is when the goals of the persuader are aligned with the goals of the persuadee. And that involves questioning big things, like the business model of advertising.

07:54
Lastly, we need a design renaissance, because once you have this view of human nature, that you can steer the timelines of a billion people -- just imagine, there's people who have some desire about what they want to do and what they want to be thinking and what they want to be feeling and how they want to be informed, and we're all just tugged into these other directions. And you have a billion people just tugged into all these different directions.

08:20
Well, imagine an entire design renaissance that tried to orchestrate the exact and most empowering time-well-spent way for those timelines to happen. And that would involve two things: one would be protecting against the timelines that we don't want to be experiencing, the thoughts that we wouldn't want to be happening, so that when that ding happens, not having the ding that sends us away; and the second would be empowering us to live out the timeline that we want.

08:43
So let me give you a concrete example. Today, let's say your friend cancels dinner on you, and you are feeling a little bit lonely. And so what do you do in that moment? You open up Facebook.

08:56
And in that moment, the designers in the control room want to schedule exactly one thing, which is to maximize how much time you spend on the screen.

09:06
Now, instead, imagine if those designers created a different timeline that was the easiest way, using all of their data, to actually help you get out with the people that you care about? Just think, alleviating all loneliness in society, if that was the timeline that Facebook wanted to make possible for people.

09:26
Or imagine a different conversation. Let's say you wanted to post something supercontroversial on Facebook, which is a really important thing to be able to do, to talk about controversial topics.

09:35
And right now, when there's that big comment box, it's almost asking you, what key do you want to type? In other words, it's scheduling a little timeline of things you're going to continue to do on the screen.

09:46
And imagine instead that there was another button there saying, what would be most time well spent for you? And you click "host a dinner." And right there underneath the item it said, "Who wants to RSVP for the dinner?"

09:56
And so you'd still have a conversation about something controversial, but you'd be having it in the most empowering place on your timeline, which would be at home that night with a bunch of friends over to talk about it.

10:09
So imagine we're running, like, a find and replace on all of the timelines that are currently steering us towards more and more screen time persuasively and replacing all of those timelines with what do we want in our lives.

10:26
It doesn't have to be this way.

10:30
Instead of handicapping our attention, imagine if we used all of this data and all of this power and this new view of human nature to give us a superhuman ability to focus and a superhuman ability to put our attention to what we cared about and a superhuman ability to have the conversations that we need to have for democracy.

10:51
The most complex challenges in the world require not just us to use our attention individually. They require us to use our attention and coordinate it together.

11:04
Climate change is going to require that a lot of people are being able to coordinate their attention in the most empowering way together. And imagine creating a superhuman ability to do that.

11:19
Sometimes the world's most pressing and important problems are not these hypothetical future things that we could create in the future. Sometimes the most pressing problems are the ones that are right underneath our noses, the things that are already directing a billion people's thoughts.

11:36
And maybe instead of getting excited about the new augmented reality and virtual reality and these cool things that could happen, which are going to be susceptible to the same race for attention, if we could fix the race for attention on the thing that's already in a billion people's pockets.

11:52
Maybe instead of getting excited about the most exciting new cool fancy education apps, we could fix the way kids' minds are getting manipulated into sending empty messages back and forth.

12:04
(Applause)

12:08
Maybe instead of worrying about hypothetical future runaway artificial intelligences that are maximizing for one goal, we could solve the runaway artificial intelligence that already exists right now, which are these newsfeeds maximizing for one thing.

12:26
It's almost like instead of running away to colonize new planets, we could fix the one that we're already on.

12:32
(Applause)

12:40
Solving this problem is critical infrastructure for solving every other problem. There's nothing in your life or in our collective problems that does not require our ability to put our attention where we care about.

12:55
At the end of our lives, all we have is our attention and our time. What will be time well spent for ours? Thank you.

13:04
(Applause)

13:17
Chris Anderson: Tristan, thank you. Hey, stay up here a sec. First of all, thank you. I know we asked you to do this talk on pretty short notice, and you've had quite a stressful week getting this thing together, so thank you.

13:30
Some people listening might say, what you complain about is addiction, and all these people doing this stuff, for them it's actually interesting. All these design decisions have built user content that is fantastically interesting. The world's more interesting than it ever has been. What's wrong with that?

13:46
Tristan Harris: I think it's really interesting. One way to see this is if you're just YouTube, for example, you want to always show the more interesting next video. You want to get better and better at suggesting that next video, but even if you could propose the perfect next video that everyone would want to watch, it would just be better and better at keeping you hooked on the screen. So what's missing in that equation is figuring out what our boundaries would be. You would want YouTube to know something about, say, falling asleep.
14:12
The CEO of Netflix recently said,
284
852960
1616
μžμ‹ λ“€μ˜ μ΅œλŒ€ κ²½μŸμžλŠ” 페이슀뢁, 유튜브, 그리고 쑸음이라고 ν–ˆμ£ .
14:14
"our biggest competitors are Facebook, YouTube and sleep."
285
854600
2736
14:17
And so what we need to recognize is that the human architecture is limited
286
857360
4456
μš°λ¦¬κ°€ μ•Œμ•„μ•Ό ν•  것은, μΈκ°„μ˜ κ΅¬μ‘°μ—λŠ” ν•œκ³„κ°€ μžˆλ‹€λŠ” μ‚¬μ‹€μž…λ‹ˆλ‹€.
14:21
and that we have certain boundaries or dimensions of our lives
287
861840
2976
우리 μ‚Άμ—λŠ” μΌμ •ν•œ 경계와 곡간이 있고
14:24
that we want to be honored and respected,
288
864840
1976
μš°λ¦¬λŠ” κ·Έ 경계와 곡간을 지킀고 μ‹Άμ–΄ ν•΄μš”.
14:26
and technology could help do that.
289
866840
1816
기술이 κ·Έκ±Έ λ„μšΈ 수 μžˆμŠ΅λ‹ˆλ‹€.
14:28
(Applause)
290
868680
2616
(λ°•μˆ˜)
14:31
CA: I mean, could you make the case
291
871320
1696
CA: κ·ΈλŸ¬λ‹ˆκΉŒ 문제점 μ€‘μ˜ ν•˜λ‚˜κ°€
14:33
that part of the problem here is that we've got a naΓ―ve model of human nature?
292
873040
6056
인간 본성에 λŒ€ν•œ λͺ¨λΈμ΄ λ„ˆλ¬΄ μˆœμ§„ν•˜λ‹€λŠ” κ±°μ£ ?
14:39
So much of this is justified in terms of human preference,
293
879120
2736
인간이 μ„ ν˜Έν•œλ‹€λŠ” 이유둜 λŒ€λΆ€λΆ„μ΄ μ •λ‹Ήν™”λ˜κ³ 
14:41
where we've got these algorithms that do an amazing job
294
881880
2616
μΈκ°„μ˜ κΈ°ν˜Έμ— 맞게 λ†€λžλ„λ‘ μ΅œμ ν™”ν•˜λŠ” μ•Œκ³ λ¦¬μ¦˜μ„ κ°–κ³  μžˆλŠ”λ°
14:44
of optimizing for human preference,
295
884520
1696
14:46
but which preference?
296
886240
1336
μ„ ν˜Έν•œλ‹€λŠ” 게 λ‹€ 같은 건 μ•„λ‹ˆλ‹ˆκΉŒμš”.
14:47
There's the preferences of things that we really care about
14:51
when we think about them
14:52
versus the preferences of what we just instinctively click on.
14:55
If we could implant that more nuanced view of human nature in every design,
15:00
would that be a step forward?
15:01
TH: Absolutely. I mean, I think right now
15:03
it's as if all of our technology is basically only asking our lizard brain
15:07
what's the best way to just impulsively get you to do
15:09
the next tiniest thing with your time,
15:11
instead of asking you in your life
15:13
what would be most time well spent for you?
15:15
What would be the perfect timeline that might include something later,
15:19
would be time well spent for you here at TED in your last day here?
15:22
CA: So if Facebook and Google and everyone said to us first up,
15:25
"Hey, would you like us to optimize for your reflective brain
15:28
or your lizard brain? You choose."
15:29
TH: Right. That would be one way. Yes.
15:34
CA: You said persuadability, that's an interesting word to me
15:37
because to me there's two different types of persuadability.
15:40
There's the persuadability that we're trying right now
15:42
of reason and thinking and making an argument,
15:44
but I think you're almost talking about a different kind,
15:47
a more visceral type of persuadability,
15:49
of being persuaded without even knowing that you're thinking.
15:52
TH: Exactly. The reason I care about this problem so much is
15:55
I studied at a lab called the Persuasive Technology Lab at Stanford
15:58
that taught [students how to recognize] exactly these techniques.
16:01
There's conferences and workshops that teach people all these covert ways
16:04
of getting people's attention and orchestrating people's lives.
16:07
And it's because most people don't know that that exists
16:09
that this conversation is so important.
16:11
CA: Tristan, you and I, we both know so many people from all these companies.
16:15
There are actually many here in the room,
16:17
and I don't know about you, but my experience of them
16:20
is that there is no shortage of good intent.
16:22
People want a better world.
16:24
They are actually -- they really want it.
16:28
And I don't think anything you're saying is that these are evil people.
16:32
It's a system where there's these unintended consequences
16:36
that have really got out of control --
16:38
TH: Of this race for attention.
16:39
It's the classic race to the bottom when you have to get attention,
16:42
and it's so tense.
16:44
The only way to get more is to go lower on the brain stem,
16:46
to go lower into outrage, to go lower into emotion,
16:49
to go lower into the lizard brain.
16:51
CA: Well, thank you so much for helping us all get a little bit wiser about this.
16:54
Tristan Harris, thank you. TH: Thank you very much.
16:57
(Applause)
이 μ›Ήμ‚¬μ΄νŠΈ 정보

이 μ‚¬μ΄νŠΈλŠ” μ˜μ–΄ ν•™μŠ΅μ— μœ μš©ν•œ YouTube λ™μ˜μƒμ„ μ†Œκ°œν•©λ‹ˆλ‹€. μ „ 세계 졜고의 μ„ μƒλ‹˜λ“€μ΄ κ°€λ₯΄μΉ˜λŠ” μ˜μ–΄ μˆ˜μ—…μ„ 보게 될 κ²ƒμž…λ‹ˆλ‹€. 각 λ™μ˜μƒ νŽ˜μ΄μ§€μ— ν‘œμ‹œλ˜λŠ” μ˜μ–΄ μžλ§‰μ„ 더블 ν΄λ¦­ν•˜λ©΄ κ·Έκ³³μ—μ„œ λ™μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€. λΉ„λ””μ˜€ μž¬μƒμ— 맞좰 μžλ§‰μ΄ μŠ€ν¬λ‘€λ©λ‹ˆλ‹€. μ˜κ²¬μ΄λ‚˜ μš”μ²­μ΄ μžˆλŠ” 경우 이 문의 양식을 μ‚¬μš©ν•˜μ—¬ λ¬Έμ˜ν•˜μ‹­μ‹œμ˜€.

https://forms.gle/WvT1wiN1qDtmnspy7