How technology can fight extremism and online harassment | Yasmin Green

75,506 views

2018-06-27 ・ TED



Translation: Hyowon Ahn — Review: Jihyeon J. Kim
00:13
My relationship with the internet reminds me of the setup
00:17
to a clichéd horror movie.
00:19
You know, the blissfully happy family moves in to their perfect new home,
00:24
excited about their perfect future,
00:26
and it's sunny outside and the birds are chirping ...
00:30
And then it gets dark.
00:32
And there are noises from the attic.
00:35
And we realize that that perfect new house isn't so perfect.
00:40
When I started working at Google in 2006,
00:43
Facebook was just a two-year-old,
00:45
and Twitter hadn't yet been born.
00:47
And I was in absolute awe of the internet and all of its promise
00:52
to make us closer
00:53
and smarter
00:55
and more free.
00:57
But as we were doing the inspiring work of building search engines
01:01
and video-sharing sites and social networks,
01:04
criminals, dictators and terrorists were figuring out
01:09
how to use those same platforms against us.
01:13
And we didn't have the foresight to stop them.
01:16
Over the last few years, geopolitical forces have come online to wreak havoc.
01:21
And in response,
01:23
Google supported a few colleagues and me to set up a new group called Jigsaw,
01:27
with a mandate to make people safer from threats like violent extremism,
01:32
censorship, persecution --
01:35
threats that feel very personal to me because I was born in Iran,
01:39
and I left in the aftermath of a violent revolution.
01:43
But I've come to realize that even if we had all of the resources
01:47
of all of the technology companies in the world,
01:51
we'd still fail
01:53
if we overlooked one critical ingredient:
01:57
the human experiences of the victims and perpetrators of those threats.
02:04
There are many challenges I could talk to you about today.
02:07
I'm going to focus on just two.
02:09
The first is terrorism.
02:13
So in order to understand the radicalization process,
02:16
we met with dozens of former members of violent extremist groups.
02:21
One was a British schoolgirl,
02:25
who had been taken off of a plane at London Heathrow
02:28
as she was trying to make her way to Syria to join ISIS.
02:34
And she was 13 years old.
02:37
So I sat down with her and her father, and I said, "Why?"
02:42
And she said,
02:44
"I was looking at pictures of what life is like in Syria,
02:47
and I thought I was going to go and live in the Islamic Disney World."
02:52
That's what she saw in ISIS.
02:54
She thought she'd meet and marry a jihadi Brad Pitt
02:58
and go shopping in the mall all day and live happily ever after.
03:02
ISIS understands what drives people,
03:05
and they carefully craft a message for each audience.
03:11
Just look at how many languages
03:12
they translate their marketing material into.
03:15
They make pamphlets, radio shows and videos
03:18
in not just English and Arabic,
03:20
but German, Russian, French, Turkish, Kurdish,
03:25
Hebrew,
03:26
Mandarin Chinese.
03:29
I've even seen an ISIS-produced video in sign language.
03:34
Just think about that for a second:
03:36
ISIS took the time and made the effort
03:38
to ensure their message is reaching the deaf and hard of hearing.
03:45
It's actually not tech-savviness
03:47
that is the reason why ISIS wins hearts and minds.
03:49
It's their insight into the prejudices, the vulnerabilities, the desires
03:54
of the people they're trying to reach
03:55
that does that.
03:57
That's why it's not enough
03:59
for the online platforms to focus on removing recruiting material.
04:04
If we want to have a shot at building meaningful technology
04:08
that's going to counter radicalization,
04:10
we have to start with the human journey at its core.
04:13
So we went to Iraq
04:16
to speak to young men who'd bought into ISIS's promise
04:18
of heroism and righteousness,
04:22
who'd taken up arms to fight for them
04:24
and then who'd defected
04:25
after they witnessed the brutality of ISIS's rule.
04:28
And I'm sitting there in this makeshift prison in the north of Iraq
04:32
with this 23-year-old who had actually trained as a suicide bomber
04:36
before defecting.
04:39
And he says,
04:41
"I arrived in Syria full of hope,
04:44
and immediately, I had two of my prized possessions confiscated:
04:48
my passport and my mobile phone."
04:52
The symbols of his physical and digital liberty
04:54
were taken away from him on arrival.
04:57
And then this is the way he described that moment of loss to me.
05:01
He said,
05:02
"You know in 'Tom and Jerry,'
05:06
when Jerry wants to escape, and then Tom locks the door
05:09
and swallows the key
05:10
and you see it bulging out of his throat as it travels down?"
05:14
And of course, I really could see the image that he was describing,
05:17
and I really did connect with the feeling that he was trying to convey,
05:21
which was one of doom,
05:23
when you know there's no way out.
05:26
And I was wondering:
05:28
What, if anything, could have changed his mind
05:31
the day that he left home?
05:32
So I asked,
05:33
"If you knew everything that you know now
05:37
about the suffering and the corruption, the brutality --
05:40
that day you left home,
05:41
would you still have gone?"
05:43
And he said, "Yes."
05:45
And I thought, "Holy crap, he said 'Yes.'"
05:48
And then he said,
05:49
"At that point, I was so brainwashed,
05:52
I wasn't taking in any contradictory information.
05:56
I couldn't have been swayed."
05:59
"Well, what if you knew everything that you know now
06:01
six months before the day that you left?"
06:05
"At that point, I think it probably would have changed my mind."
06:10
Radicalization isn't this yes-or-no choice.
06:14
It's a process, during which people have questions --
06:17
about ideology, religion, the living conditions.
06:20
And they're coming online for answers,
06:23
which is an opportunity to reach them.
06:25
And there are videos online from people who have answers --
06:28
defectors, for example, telling the story of their journey
06:31
into and out of violence;
06:33
stories like the one from that man I met in the Iraqi prison.
06:37
There are locals who've uploaded cell phone footage
06:40
of what life is really like in the caliphate under ISIS's rule.
06:44
There are clerics who are sharing peaceful interpretations of Islam.
06:48
But you know what?
06:50
These people don't generally have the marketing prowess of ISIS.
06:54
They risk their lives to speak up and confront terrorist propaganda,
06:58
and then they tragically don't reach the people
07:00
who most need to hear from them.
07:03
And we wanted to see if technology could change that.
07:06
So in 2016, we partnered with Moonshot CVE
07:10
to pilot a new approach to countering radicalization
07:13
called the "Redirect Method."
07:16
It uses the power of online advertising
07:19
to bridge the gap between those susceptible to ISIS's messaging
07:24
and those credible voices that are debunking that messaging.
07:28
And it works like this:
07:29
someone looking for extremist material --
07:31
say they search for "How do I join ISIS?" --
07:34
will see an ad appear
07:37
that invites them to watch a YouTube video of a cleric, of a defector --
07:42
someone who has an authentic answer.
07:44
And that targeting is based not on a profile of who they are,
07:48
but of determining something that's directly relevant
07:51
to their query or question.
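The query-triggered targeting the talk describes can be sketched as a toy lookup. This is a hypothetical illustration only, not Jigsaw's or Moonshot CVE's actual system; the trigger phrases and video descriptions are invented for the example. The key property it demonstrates is the one stated in the talk: matching keys off the query itself, not off a profile of the person.

```python
from typing import Optional

# Hypothetical mapping from risky query patterns to counter-narrative
# videos (the kinds of voices the talk mentions: defectors, locals, clerics).
COUNTER_CONTENT = {
    "join isis": "video: a defector describes his journey into and out of ISIS",
    "caliphate life": "video: cell-phone footage of life under ISIS's rule",
    "jihad duty": "video: a cleric's peaceful interpretation of Islam",
}

def redirect_ad(query: str) -> Optional[str]:
    """Return a counter-narrative ad if the query matches a risky pattern.

    Targeting depends only on the query text, never on who is asking.
    """
    normalized = query.lower()
    for trigger, ad in COUNTER_CONTENT.items():
        # Match when every word of the trigger phrase appears in the query.
        if all(word in normalized for word in trigger.split()):
            return ad
    return None  # ordinary query: show no ad

print(redirect_ad("How do I join ISIS?"))
print(redirect_ad("best pizza near me"))  # no match: None
```
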
07:54
During our eight-week pilot in English and Arabic,
07:56
we reached over 300,000 people
08:00
who had expressed an interest in or sympathy towards a jihadi group.
08:06
These people were now watching videos
08:08
that could prevent them from making devastating choices.
08:13
And because violent extremism isn't confined to any one language,
08:17
religion or ideology,
08:18
the Redirect Method is now being deployed globally
08:22
to protect people being courted online by violent ideologues,
08:26
whether they're Islamists, white supremacists
08:28
or other violent extremists,
08:31
with the goal of giving them the chance to hear from someone
08:33
on the other side of that journey;
08:36
to give them the chance to choose a different path.
08:40
It turns out that often the bad guys are good at exploiting the internet,
08:46
not because they're some kind of technological geniuses,
08:50
but because they understand what makes people tick.
08:54
I want to give you a second example:
08:58
online harassment.
09:00
Online harassers also work to figure out what will resonate
09:04
with another human being.
09:05
But not to recruit them like ISIS does,
09:08
but to cause them pain.
09:11
Imagine this:
09:13
you're a woman,
09:15
you're married,
09:16
you have a kid.
09:18
You post something on social media,
09:20
and in a reply, you're told that you'll be raped,
09:24
that your son will be watching,
09:26
details of when and where.
09:29
In fact, your home address is put online for everyone to see.
09:33
That feels like a pretty real threat.
09:37
Do you think you'd go home?
09:39
Do you think you'd continue doing the thing that you were doing?
09:43
Would you continue doing that thing that's irritating your attacker?
09:48
Online abuse has been this perverse art
09:51
of figuring out what makes people angry,
09:54
what makes people afraid,
09:56
what makes people insecure,
09:58
and then pushing those pressure points until they're silenced.
10:02
When online harassment goes unchecked,
10:04
free speech is stifled.
10:07
And even the people hosting the conversation
10:09
throw up their arms and call it quits,
10:11
closing their comment sections and their forums altogether.
10:14
That means we're actually losing spaces online
10:17
to meet and exchange ideas.
10:19
And where online spaces remain,
10:22
we descend into echo chambers with people who think just like us.
10:27
But that enables the spread of disinformation;
10:30
that facilitates polarization.
10:34
What if technology instead could enable empathy at scale?
10:40
This was the question that motivated our partnership
10:42
with Google's Counter Abuse team,
10:44
Wikipedia
10:46
and newspapers like the New York Times.
10:47
We wanted to see if we could build machine-learning models
10:50
that could understand the emotional impact of language.
10:55
Could we predict which comments were likely to make someone else leave
10:58
the online conversation?
11:00
And that's no mean feat.
11:04
That's no trivial accomplishment
11:06
for AI to be able to do something like that.
11:08
I mean, just consider these two examples of messages
11:12
that could have been sent to me last week.
11:15
"Break a leg at TED!"
11:17
... and
11:18
"I'll break your legs at TED."
11:20
(Laughter)
11:22
You are human,
11:23
that's why that's an obvious difference to you,
11:25
even though the words are pretty much the same.
11:28
But for AI, it takes some training to teach the models
11:31
to recognize that difference.
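The gap between those two messages is obvious to a human and invisible to surface-level matching. A minimal sketch (the `naive_flag` filter here is hypothetical, not the actual Perspective model) shows why a keyword approach treats the idiom and the threat identically:

```python
import re

def naive_flag(msg: str) -> bool:
    """Flag any message containing 'break' near a 'leg' word.

    Too crude to tell "Break a leg!" (good luck) from "I'll break your
    legs" (a threat) -- which is why the models need real training.
    """
    words = re.findall(r"[a-z']+", msg.lower())
    return "break" in words and any(w.startswith("leg") for w in words)

friendly = "Break a leg at TED!"
threat = "I'll break your legs at TED."
print(naive_flag(friendly), naive_flag(threat))  # prints: True True
```

Both messages trip the same filter, so distinguishing them requires a model that has learned something about context and intent, not just vocabulary.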
11:32
The beauty of building AI that can tell the difference
11:36
is that AI can then scale to the size of the online toxicity phenomenon,
11:41
and that was our goal in building our technology called Perspective.
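Perspective is available today as a public REST API (the Comment Analyzer endpoint). As I understand the published API, a request asks for scores on attributes such as TOXICITY; a minimal sketch of the request shape, with `API_KEY` as a placeholder:

```python
import json

# Public Comment Analyzer endpoint; replace API_KEY with a real key.
API_URL = ("https://commentanalyzer.googleapis.com/v1alpha1/"
           "comments:analyze?key=API_KEY")

def build_request(text: str) -> dict:
    """Payload asking Perspective to score one comment for toxicity."""
    return {
        "comment": {"text": text},
        "requestedAttributes": {"TOXICITY": {}},
    }

payload = build_request("I'll break your legs at TED.")
print(json.dumps(payload))
# POST this JSON to API_URL; the response carries the score at
# attributeScores.TOXICITY.summaryScore.value (a probability-like 0-1).
```
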
11:45
With the help of Perspective,
11:46
the New York Times, for example,
11:48
has increased spaces online for conversation.
11:51
Before our collaboration,
11:52
they only had comments enabled on just 10 percent of their articles.
11:57
With the help of machine learning,
11:59
they have that number up to 30 percent.
12:01
So they've tripled it,
12:02
and we're still just getting started.
12:04
But this is about way more than just making moderators more efficient.
12:10
Right now I can see you,
12:11
and I can gauge how what I'm saying is landing with you.
12:16
You don't have that opportunity online.
12:18
Imagine if machine learning could give commenters,
12:22
as they're typing,
12:23
real-time feedback about how their words might land,
12:27
just like facial expressions do in a face-to-face conversation.
12:32
Machine learning isn't perfect,
12:34
and it still makes plenty of mistakes.
12:37
But if we can build technology
12:38
that understands the emotional impact of language,
12:42
we can build empathy.
12:43
That means that we can have dialogue between people
12:46
with different politics,
12:47
different worldviews,
12:49
different values.
12:51
And we can reinvigorate the spaces online that most of us have given up on.
12:57
When people use technology to exploit and harm others,
13:01
they're preying on our human fears and vulnerabilities.
13:06
If we ever thought that we could build an internet
13:09
insulated from the dark side of humanity,
13:12
we were wrong.
13:14
If we want today to build technology
13:16
that can overcome the challenges that we face,
13:19
we have to throw our entire selves into understanding the issues
13:23
and into building solutions
13:25
that are as human as the problems they aim to solve.
13:30
Let's make that happen.
13:31
Thank you.
13:33
(Applause)
이 μ›Ήμ‚¬μ΄νŠΈ 정보

이 μ‚¬μ΄νŠΈλŠ” μ˜μ–΄ ν•™μŠ΅μ— μœ μš©ν•œ YouTube λ™μ˜μƒμ„ μ†Œκ°œν•©λ‹ˆλ‹€. μ „ 세계 졜고의 μ„ μƒλ‹˜λ“€μ΄ κ°€λ₯΄μΉ˜λŠ” μ˜μ–΄ μˆ˜μ—…μ„ 보게 될 κ²ƒμž…λ‹ˆλ‹€. 각 λ™μ˜μƒ νŽ˜μ΄μ§€μ— ν‘œμ‹œλ˜λŠ” μ˜μ–΄ μžλ§‰μ„ 더블 ν΄λ¦­ν•˜λ©΄ κ·Έκ³³μ—μ„œ λ™μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€. λΉ„λ””μ˜€ μž¬μƒμ— 맞좰 μžλ§‰μ΄ μŠ€ν¬λ‘€λ©λ‹ˆλ‹€. μ˜κ²¬μ΄λ‚˜ μš”μ²­μ΄ μžˆλŠ” 경우 이 문의 양식을 μ‚¬μš©ν•˜μ—¬ λ¬Έμ˜ν•˜μ‹­μ‹œμ˜€.

https://forms.gle/WvT1wiN1qDtmnspy7