How Twitter needs to change | Jack Dorsey

183,606 views · 2019-06-07

TED


์•„๋ž˜ ์˜๋ฌธ์ž๋ง‰์„ ๋”๋ธ”ํด๋ฆญํ•˜์‹œ๋ฉด ์˜์ƒ์ด ์žฌ์ƒ๋ฉ๋‹ˆ๋‹ค.

๋ฒˆ์—ญ: Soobin Ahn ๊ฒ€ํ† : Yunjung Nam

00:13
Chris Anderson: What worries you right now? You've been very open about lots of issues on Twitter. What would be your top worry about where things are right now?

00:23
Jack Dorsey: Right now, the health of the conversation. So, our purpose is to serve the public conversation, and we have seen a number of attacks on it. We've seen abuse, we've seen harassment, we've seen manipulation, automation, human coordination, misinformation.

00:46
So these are all dynamics that we were not expecting 13 years ago when we were starting the company. But we do now see them at scale, and what worries me most is just our ability to address it in a systemic way that is scalable, that has a rigorous understanding of how we're taking action, a transparent understanding of how we're taking action and a rigorous appeals process for when we're wrong, because we will be wrong.

01:20
Whitney Pennington Rodgers: I'm really glad to hear that that's something that concerns you, because I think there's been a lot written about people who feel they've been abused and harassed on Twitter, and I think no one more so than women and women of color and black women. And there's been data that's come out -- Amnesty International put out a report a few months ago where they showed that a subset of active black female Twitter users were receiving, on average, some form of harassment in one in 10 of their tweets.

01:50
And so when you think about health for the community on Twitter, I'm interested to hear, "health for everyone," but specifically: How are you looking to make Twitter a safe space for that subset, for women, for women of color and black women?

02:05
JD: Yeah. So it's a pretty terrible situation when you're coming to a service that, ideally, you want to learn something about the world from, and you spend the majority of your time reporting abuse, receiving abuse, receiving harassment.

02:23
So what we're looking most deeply at is just the incentives that the platform naturally provides and the service provides. Right now, the dynamic of the system makes it super-easy to harass and to abuse others through the service, and unfortunately, the majority of our system in the past worked entirely based on people reporting harassment and abuse. So about midway last year, we decided that we were going to apply a lot more machine learning, a lot more deep learning to the problem, and try to be a lot more proactive around where abuse is happening, so that we can take the burden off the victim completely.

03:09
And we've made some progress recently. About 38 percent of abusive tweets are now proactively identified by machine learning algorithms so that people don't actually have to report them. But those that are identified are still reviewed by humans, so we do not take down content or accounts without a human actually reviewing it. But that was from zero percent just a year ago. So that meant, at that zero percent, every single person who received abuse had to actually report it, which was a lot of work for them, a lot of work for us and just ultimately unfair.
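
The flow Dorsey describes here -- a classifier proactively flags likely abuse, and enforcement happens only after human review -- can be pictured in a few lines of Python. This is a minimal illustrative sketch: the Tweet and ReviewQueue types, the abuse_score model, and the threshold are hypothetical stand-ins, not Twitter's actual pipeline.

```python
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class Tweet:
    tweet_id: str
    text: str

@dataclass
class ReviewQueue:
    """Holds proactively flagged tweets until a human rules on them."""
    pending: List[Tweet] = field(default_factory=list)

    def submit(self, tweet: Tweet) -> None:
        self.pending.append(tweet)

def proactive_scan(
    tweets: List[Tweet],
    abuse_score: Callable[[str], float],  # hypothetical ML model: text -> [0, 1]
    queue: ReviewQueue,
    threshold: float = 0.9,
) -> None:
    """Flag likely-abusive tweets without waiting for a victim's report.

    Nothing is removed here: flagged tweets only enter the human review
    queue, matching "we do not take down content or accounts without a
    human actually reviewing it."
    """
    for tweet in tweets:
        if abuse_score(tweet.text) >= threshold:
            queue.submit(tweet)
```

The point of the design is that the machine only shifts the reporting burden off the victim; the take-down decision itself stays with a person.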

03:46
The other thing that we're doing is making sure that we, as a company, have representation of all the communities that we're trying to serve. We can't build a business that is successful unless we have a diversity of perspective inside of our walls that actually feel these issues every single day. And that's not just with the team that's doing the work, it's also within our leadership as well. So we need to continue to build empathy for what people are experiencing and give them better tools to act on it, and also give our customers a much better and easier approach to handle some of the things that they're seeing.

04:24
So a lot of what we're doing is around technology, but we're also looking at the incentives on the service: What does Twitter incentivize you to do when you first open it up? And in the past, it's incented a lot of outrage, it's incented a lot of mob behavior, it's incented a lot of group harassment. And we have to look a lot deeper at some of the fundamentals of what the service is doing to make the bigger shifts. We can make a bunch of small shifts around technology, as I just described, but ultimately, we have to look deeply at the dynamics in the network itself, and that's what we're doing.

05:05
CA: But what's your sense -- what is the kind of thing that you might be able to change that would actually fundamentally shift behavior?

05:15
JD: Well, one of the things -- we started the service with this concept of following an account, as an example, and I don't believe that's why people actually come to Twitter. I believe Twitter is best as an interest-based network. People come with a particular interest. They have to do a ton of work to find and follow the related accounts around those interests. What we could do instead is allow you to follow an interest, follow a hashtag, follow a trend, follow a community, which gives us the opportunity to show all of the accounts, all the topics, all the moments, all the hashtags that are associated with that particular topic and interest, which really opens up the perspective that you see. But that is a huge fundamental shift, to bias the entire network away from just an account bias towards a topics and interest bias.

06:15
CA: Because isn't it the case that one reason why you have so much content on there is a result of putting millions of people around the world in this kind of gladiatorial contest with each other for followers, for attention? Like, from the point of view of people who just read Twitter, that's not an issue, but for the people who actually create it, everyone's out there saying, "You know, I wish I had a few more 'likes,' followers, retweets." And so they're constantly experimenting, trying to find the path to do that. And what we've all discovered is that the number one path to do that is to be some form of provocative, obnoxious, eloquently obnoxious -- like, eloquent insults are a dream on Twitter, where you rapidly pile up -- and it becomes this self-fueling process of driving outrage. How do you defuse that?

07:12
JD: Yeah, I mean, I think you're spot on, but that goes back to the incentives. Like, one of the choices we made in the early days was we had this number that showed how many people follow you. We decided that number should be big and bold, and anything that's on the page that's big and bold has importance, and those are the things that you want to drive. Was that the right decision at the time? Probably not.

07:37
If I had to start the service again, I would not emphasize the follower count as much. I would not emphasize the "like" count as much. I don't think I would even create "like" in the first place, because it doesn't actually push what we believe now to be the most important thing, which is healthy contribution back to the network and conversation to the network, participation within conversation, learning something from the conversation. Those are not things that we thought of 13 years ago, and we believe are extremely important right now.

08:08
So we have to look at how we display the follower count, how we display retweet count, how we display "likes," and just ask the deep question: Is this really the number that we want people to drive up? Is this the thing that, when you open Twitter, you see, "That's the thing I need to increase?" And I don't believe that's the case right now.

08:28
(Applause)

08:30
WPR: I think we should look at some of the tweets that are coming in from the audience as well.

08:35
CA: Let's see what you guys are asking. I mean, this is -- generally, one of the amazing things about Twitter is how you can use it for crowd wisdom, you know, that more knowledge, more questions, more points of view than you can imagine, and sometimes, many of them are really healthy.

08:53
WPR: I think one I saw that passed already quickly down here: "What's Twitter's plan to combat foreign meddling in the 2020 US election?" I think that's something that's an issue we're seeing on the internet in general, that we have a lot of malicious automated activity happening. And on Twitter, for example, in fact, we have some work that's come from our friends at Zignal Labs, and maybe we can even see that to give us an example of what exactly I'm talking about, where you have these bots, if you will, or coordinated automated malicious account activity, that is being used to influence things like elections.

09:31
And in this example we have from Zignal, which they've shared with us using the data that they have from Twitter, you actually see that in this case, white represents the humans -- human accounts, each dot is an account. The pinker it is, the more automated the activity is. And you can see how you have a few humans interacting with bots. In this case, it's related to the election in Israel and spreading misinformation about Benny Gantz, and as we know, in the end, that was an election that Netanyahu won by a slim margin, and that may have been in some cases influenced by this.

10:10
And when you think about that happening on Twitter, what are the things that you're doing, specifically, to ensure you don't have misinformation like this spreading in this way, influencing people in ways that could affect democracy?

10:23
JD: Just to back up a bit, we asked ourselves a question: Can we actually measure the health of a conversation, and what does that mean? And in the same way that you have indicators and we have indicators as humans in terms of are we healthy or not, such as temperature, the flushness of your face, we believe that we could find the indicators of conversational health. And we worked with a lab called Cortico at MIT to propose four starter indicators that we believe we could ultimately measure on the system.

11:05
And the first one is what we're calling shared attention. It's a measure of how much of the conversation is attentive on the same topic versus disparate. The second one is called shared reality, and this is what percentage of the conversation shares the same facts -- not whether those facts are truthful or not, but are we sharing the same facts as we converse? The third is receptivity: How much of the conversation is receptive or civil, or the inverse, toxic? And then the fourth is variety of perspective. So, are we seeing filter bubbles or echo chambers, or are we actually getting a variety of opinions within the conversation? And implicit in all four of these is the understanding that, as they increase, the conversation gets healthier and healthier.

12:00
So our first step is to see if we can measure these online, which we believe we can. We have the most momentum around receptivity. We have a toxicity score, a toxicity model, on our system that can actually measure, with some pretty high degree of confidence, whether you are likely to walk away from a conversation that you're having on Twitter because you feel it's toxic. We're working to measure the rest, and the next step, as we build up solutions, is to watch how these measurements trend over time and continue to experiment. And our goal is to make sure that these are balanced, because if you increase one, you might decrease another. If you increase variety of perspective, you might actually decrease shared reality.
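
The four indicators, and the point that they must be balanced rather than individually maximized, can be summarized in a small sketch. Everything here is a hypothetical illustration: the field names mirror the talk, but the normalization to [0, 1] and the is_balanced floor are assumptions, not anything Twitter or Cortico published.

```python
from dataclasses import dataclass

@dataclass
class HealthIndicators:
    """The four starter conversational-health indicators from the talk.

    Each value is assumed normalized to [0, 1], higher = healthier;
    the measurement models behind them are omitted in this sketch.
    """
    shared_attention: float        # same topic vs. disparate topics
    shared_reality: float          # share of the conversation using the same facts
    receptivity: float             # receptive/civil vs. toxic
    variety_of_perspective: float  # range of opinions vs. echo chamber

    def is_balanced(self, floor: float = 0.4) -> bool:
        """No indicator may be sacrificed to pump up another -- e.g.,
        raising variety of perspective can depress shared reality."""
        return min(self.shared_attention, self.shared_reality,
                   self.receptivity, self.variety_of_perspective) >= floor
```

The minimum-over-all-indicators check is one simple way to encode the trade-off Dorsey just described: a system graded on the weakest indicator can't win by maximizing only one of them.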

12:51
CA: Just picking up on some of the questions flooding in here.

12:56
JD: Constant questioning.

12:58
CA: A lot of people are puzzled why, like, how hard is it to get rid of Nazis from Twitter?

13:08
JD: (Laughs) So we have policies around violent extremist groups, and the majority of our work and our terms of service works on conduct, not content. So we're actually looking for conduct. Conduct being using the service to repeatedly or episodically harass someone, using hateful imagery that might be associated with the KKK or the American Nazi Party. Those are all things that we act on immediately.

13:47
We're in a situation right now where that term is used fairly loosely, and we just cannot take any one mention of that word accusing someone else as a factual indication that they should be removed from the platform. So a lot of our models are based around, number one: Is this account associated with a violent extremist group? And if so, we can take action. And we have done so on the KKK and the American Nazi Party and others. And number two: Are they using imagery or conduct that would associate them as such as well?
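
Dorsey's two-part test is effectively a conduct-based decision rule. A deliberately tiny sketch of that rule, with hypothetical boolean inputs -- producing those signals reliably is where the actual modeling work lies:

```python
def warrants_extremism_action(
    affiliated_with_violent_extremist_group: bool,
    uses_associated_imagery_or_conduct: bool,
) -> bool:
    """Conduct-based rule: documented affiliation or conduct triggers
    action; a mere accusation ("they're a Nazi") does not."""
    return (affiliated_with_violent_extremist_group
            or uses_associated_imagery_or_conduct)
```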

14:22
CA: How many people do you have working on content moderation to look at this?

14:26
JD: It varies. We want to be flexible on this, because we want to make sure that we're, number one, building algorithms instead of just hiring massive amounts of people, because we need to make sure that this is scalable, and there are no amount of people that can actually scale this. So this is why we've done so much work around proactive detection of abuse that humans can then review. We want to have a situation where algorithms are constantly scouring every single tweet and bringing the most interesting ones to the top so that humans can bring their judgment to whether we should take action or not, based on our terms of service.

15:05
WPR: But there's not an amount of people that are scalable, but how many people do you currently have monitoring these accounts, and how do you figure out what's enough?

15:14
JD: They're completely flexible. Sometimes we associate folks with spam. Sometimes we associate folks with abuse and harassment. We're going to make sure that we have flexibility in our people so that we can direct them at what is most needed. Sometimes, the elections. We've had a string of elections in Mexico, one coming up in India, obviously, the election last year, the midterm election, so we just want to be flexible with our resources.

15:42
So when people -- just as an example, if you go to our current terms of service and you bring the page up, and you're wondering about abuse and harassment that you just received and whether it was against our terms of service to report it, the first thing you see when you open that page is around intellectual property protection. You scroll down and you get to abuse, harassment and everything else that you might be experiencing. So I don't know how that happened over the company's history, but we put that above the thing that people want the most information on and to actually act on. And just our ordering shows the world what we believed was important. So we're changing all that. We're ordering it the right way, but we're also simplifying the rules so that they're human-readable, so that people can actually understand themselves when something is against our terms and when something is not.

16:48
And then we're making -- again, our big focus is on removing the burden of work from the victims. So that means pushing more towards technology, rather than humans doing the work -- that means the humans receiving the abuse and also the humans having to review that work. So we want to make sure that we're not just encouraging more work around something that's super, super negative, and we want to have a good balance between the technology and where humans can actually be creative, which is the judgment of the rules, and not just all the mechanical stuff of finding and reporting them. So that's how we think about it.

17:27
CA: I'm curious to dig in more about what you said. I mean, I love that you said you are looking for ways to re-tweak the fundamental design of the system to discourage some of the reactive behavior, and perhaps -- to use Tristan Harris-type language -- engage people's more reflective thinking. How far advanced is that? What would alternatives to that "like" button be?

17:55
JD: Well, first and foremost, my personal goal with the service is that I believe fundamentally that public conversation is critical. There are existential problems facing the world -- facing the entire world, not any one particular nation-state -- that global public conversation benefits. And that is one of the unique dynamics of Twitter, that it is completely open, it is completely public, it is completely fluid, and anyone can see any other conversation and participate in it. So there are conversations like climate change. There are conversations like the displacement of work through artificial intelligence. There are conversations like economic disparity. No matter what any one nation-state does, they will not be able to solve the problem alone. It takes coordination around the world, and that's where I think Twitter can play a part.

18:49
The second thing is that Twitter, right now, when you go to it, you don't necessarily walk away feeling like you learned something. Some people do. Some people have a very, very rich network, a very rich community that they learn from every single day. But it takes a lot of work and a lot of time to build up to that. So we want to get people to those topics and those interests much, much faster and make sure that they're finding something that, no matter how much time they spend on Twitter -- and I don't want to maximize the time on Twitter, I want to maximize what they actually take away from it and what they learn from it, and --

19:29
CA: Well, do you, though? Because that's the core question that a lot of people want to know. Surely, Jack, you're constrained, to a huge extent, by the fact that you're a public company, you've got investors pressing on you, the number one way you make your money is from advertising -- that depends on user engagement. Are you willing to sacrifice user time, if need be, to go for a more reflective conversation?

19:56
JD: Yeah; more relevance means less time on the service, and that's perfectly fine, because we want to make sure that, like, you're coming to Twitter, and you see something immediately that you learn from and that you push. We can still serve an ad against that. That doesn't mean you need to spend any more time to see more. The second thing we're looking at --

20:17
CA: But just -- on that goal, daily active usage, if you're measuring that, that doesn't necessarily mean things that people value every day. It may well mean things that people are drawn to like a moth to the flame, every day. We are addicted, because we see something that pisses us off, so we go in and add fuel to the fire, and the daily active usage goes up, and there's more ad revenue there, but we all get angrier with each other. How do you define ... "Daily active usage" seems like a really dangerous term to be optimizing.

20:49
(Applause)

20:54
JD: Taken alone, it is, but you didn't let me finish the other metric, which is, we're watching for conversations and conversation chains. So we want to incentivize healthy contribution back to the network, and what we believe that is is actually participating in conversation that is healthy, as defined by those four indicators I articulated earlier. So you can't just optimize around one metric. You have to balance and look constantly at what is actually going to create a healthy contribution to the network and a healthy experience for people. Ultimately, we want to get to a metric where people can tell us, "Hey, I learned something from Twitter, and I'm walking away with something valuable." That is our goal ultimately over time, but that's going to take some time.

21:43
CA: You come over to many, I think to me, as this enigma. This is possibly unfair, but I woke up the other night with this picture of how I found I was thinking about you and the situation: that we're on this great voyage with you on this ship called the "Twittanic" --

22:03
(Laughter)

22:04
and there are people on board in steerage who are expressing discomfort, and you, unlike many other captains, are saying, "Well, tell me, talk to me, listen to me, I want to hear." And they talk to you, and they say, "We're worried about the iceberg ahead." And you go, "You know, that is a powerful point, and our ship, frankly, hasn't been built properly for steering as well as it might." And we say, "Please do something." And you go to the bridge, and we're waiting, and we look, and then you're showing this extraordinary calm, but we're all standing outside, saying, "Jack, turn the fucking wheel!" You know?

22:42
(Laughter)

(Applause)

22:46
I mean --

(Applause)

22:49
It's democracy at stake. It's our culture at stake. It's our world at stake. And Twitter is amazing and shapes so much. It's not as big as some of the other platforms, but the people of influence use it to set the agenda, and it's just hard to imagine a more important role in the world than to ... I mean, you're doing a brilliant job of listening, Jack, and hearing people, but to actually dial up the urgency and move on this stuff -- will you do that?

23:24
JD: Yes, and we have been moving substantially. I mean, there's been a few dynamics in Twitter's history. One, when I came back to the company, we were in a pretty dire state in terms of our future, and not just from how people were using the platform, but from a corporate narrative as well. So we had to fix a bunch of the foundation, turn the company around, go through two crazy layoffs, because we just got too big for what we were doing, and we focused all of our energy on this concept of serving the public conversation. And that took some work.

24:07
And as we dived into that, we realized some of the issues with the fundamentals. We could do a bunch of superficial things to address what you're talking about, but we need the changes to last, and that means going really, really deep and paying attention to what we started 13 years ago and really questioning how the system works and how the framework works and what is needed for the world today, given how quickly everything is moving and how people are using it.

24:40
So we are working as quickly as we can, but quickness will not get the job done. It's focus, it's prioritization, it's understanding the fundamentals of the network and building a framework that scales and that is resilient to change, and being open about where we are and being transparent about where we are so that we can continue to earn trust. So I'm proud of all the frameworks that we've put in place. I'm proud of our direction. We obviously can move faster, but that required just stopping a bunch of stupid stuff we were doing in the past.

25:21
CA: All right. Well, I suspect there are many people here who, if given the chance, would love to help you on this change-making agenda you're on, and I don't know if Whitney -- Jack, thank you for coming here and speaking so openly. It took courage. I really appreciate what you said, and good luck with your mission.

25:39
JD: Thank you so much. Thanks for having me.

25:41
(Applause)

25:45
Thank you.
์ด ์›น์‚ฌ์ดํŠธ ์ •๋ณด

์ด ์‚ฌ์ดํŠธ๋Š” ์˜์–ด ํ•™์Šต์— ์œ ์šฉํ•œ YouTube ๋™์˜์ƒ์„ ์†Œ๊ฐœํ•ฉ๋‹ˆ๋‹ค. ์ „ ์„ธ๊ณ„ ์ตœ๊ณ ์˜ ์„ ์ƒ๋‹˜๋“ค์ด ๊ฐ€๋ฅด์น˜๋Š” ์˜์–ด ์ˆ˜์—…์„ ๋ณด๊ฒŒ ๋  ๊ฒƒ์ž…๋‹ˆ๋‹ค. ๊ฐ ๋™์˜์ƒ ํŽ˜์ด์ง€์— ํ‘œ์‹œ๋˜๋Š” ์˜์–ด ์ž๋ง‰์„ ๋”๋ธ” ํด๋ฆญํ•˜๋ฉด ๊ทธ๊ณณ์—์„œ ๋™์˜์ƒ์ด ์žฌ์ƒ๋ฉ๋‹ˆ๋‹ค. ๋น„๋””์˜ค ์žฌ์ƒ์— ๋งž์ถฐ ์ž๋ง‰์ด ์Šคํฌ๋กค๋ฉ๋‹ˆ๋‹ค. ์˜๊ฒฌ์ด๋‚˜ ์š”์ฒญ์ด ์žˆ๋Š” ๊ฒฝ์šฐ ์ด ๋ฌธ์˜ ์–‘์‹์„ ์‚ฌ์šฉํ•˜์—ฌ ๋ฌธ์˜ํ•˜์‹ญ์‹œ์˜ค.

https://forms.gle/WvT1wiN1qDtmnspy7