Fake videos of real people -- and how to spot them | Supasorn Suwajanakorn

1,274,992 views

2018-07-25 ・ TED



μ•„λž˜ μ˜λ¬Έμžλ§‰μ„ λ”λΈ”ν΄λ¦­ν•˜μ‹œλ©΄ μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€.

Translation: Soobin Ahn · Review: Jihyeon J. Kim
00:12 Look at these images.
00:14 Now, tell me which Obama here is real.
00:16 (Video) Barack Obama: To help families refinance their homes,
00:19 to invest in things like high-tech manufacturing,
00:22 clean energy
00:23 and the infrastructure that creates good new jobs.
00:26 Supasorn Suwajanakorn: Anyone?
00:28 The answer is none of them.
00:30 (Laughter)
00:31 None of these is actually real.
00:33 So let me tell you how we got here.
00:35 My inspiration for this work
00:37 was a project meant to preserve our last chance for learning about the Holocaust
00:42 from the survivors.
00:44 It's called New Dimensions in Testimony,
00:47 and it allows you to have interactive conversations
00:50 with a hologram of a real Holocaust survivor.
00:53 (Video) Man: How did you survive the Holocaust?
00:55 (Video) Hologram: How did I survive?
00:57 I survived,
01:00 I believe,
01:01 because providence watched over me.
01:05 SS: Turns out these answers were prerecorded in a studio.
01:09 Yet the effect is astounding.
01:11 You feel so connected to his story and to him as a person.
01:16 I think there's something special about human interaction
01:19 that makes it much more profound
01:22 and personal
01:24 than what books or lectures or movies could ever teach us.
01:28 So I saw this and began to wonder,
01:30 can we create a model like this for anyone?
01:33 A model that looks, talks and acts just like them?
01:37 So I set out to see if this could be done
01:39 and eventually came up with a new solution
01:41 that can build a model of a person using nothing but these:
01:45 existing photos and videos of a person.
01:48 If you can leverage this kind of passive information,
01:51 just photos and video that are out there,
01:53 that's the key to scaling to anyone.
01:56 By the way, here's Richard Feynman,
01:57 who in addition to being a Nobel Prize winner in physics
02:01 was also known as a legendary teacher.
02:05 Wouldn't it be great if we could bring him back
02:07 to give his lectures and inspire millions of kids,
02:10 perhaps not just in English but in any language?
02:14 Or if you could ask our grandparents for advice and hear those comforting words
02:19 even if they're no longer with us?
02:21 Or maybe using this tool, book authors, alive or not,
02:25 could read aloud all of their books for anyone interested.
02:29 The creative possibilities here are endless,
02:31 and to me, that's very exciting.
02:34 And here's how it's working so far.
02:36 First, we introduce a new technique
02:38 that can reconstruct a highly detailed 3D face model from any image
02:42 without ever 3D-scanning the person.
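The talk does not spell out the reconstruction algorithm, so here is a rough, hypothetical sketch of the general idea: fitting a parametric 3D face model to 2D landmarks detected in a single photo. The names fit_face_shape, mean_shape and basis are illustrative assumptions, not the authors' code.

import numpy as np

def fit_face_shape(landmarks_2d, mean_shape, basis, reg=0.1):
    """Fit the coefficients of a linear 3D face model so its projected
    landmarks match 2D landmarks detected in one photo.
    landmarks_2d: (68, 2); mean_shape: (68, 3); basis: (K, 68, 3)."""
    K = basis.shape[0]
    # Weak-perspective, frontal camera assumed for brevity: the
    # projection just keeps the x and y coordinates.
    A = basis[:, :, :2].reshape(K, -1).T            # (68*2, K)
    b = (landmarks_2d - mean_shape[:, :2]).ravel()  # (68*2,)
    # Ridge-regularized least squares keeps the fitted face plausible.
    return np.linalg.solve(A.T @ A + reg * np.eye(K), A.T @ b)

def reconstruct_mesh(mean_shape, basis, coeffs):
    # Full 3D shape = mean shape + weighted sum of basis shapes.
    return mean_shape + np.tensordot(coeffs, basis, axes=1)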
02:45 And here's the same output model from different views.
02:49 This also works on videos,
02:51 by running the same algorithm on each video frame
02:54 and generating a moving 3D model.
02:57 And here's the same output model from different angles.
03:01 It turns out this problem is very challenging,
03:04 but the key trick is that we are going to analyze
03:07 a large photo collection of the person beforehand.
03:10 For George W. Bush, we can just search on Google,
03:14 and from that, we are able to build an average model,
03:16 an iteratively refined model to recover the expression
03:19 in fine details, like creases and wrinkles.
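A minimal sketch of the "analyze a large photo collection beforehand" trick, under the same illustrative assumptions as above: fit every photo, average the fits into a person-specific model, and repeat so the average sharpens. Averaging over many photos cancels per-photo noise; the actual method recovers far finer detail than landmarks alone.

import numpy as np

def build_person_model(photo_landmarks, mean_shape, basis, iters=3):
    """photo_landmarks: list of (68, 2) arrays, one per collected photo."""
    person_mean = mean_shape.copy()
    for _ in range(iters):
        fits = [reconstruct_mesh(person_mean, basis,
                                 fit_face_shape(lm, person_mean, basis))
                for lm in photo_landmarks]
        # Re-fit against the running average so it converges on a
        # stable, person-specific shape.
        person_mean = np.mean(fits, axis=0)
    return person_mean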
03:23 What's fascinating about this
03:24 is that the photo collection can come from your typical photos.
03:28 It doesn't really matter what expression you're making
03:30 or where you took those photos.
03:32 What matters is that there are a lot of them.
03:35 And we are still missing color here,
03:36 so next, we develop a new blending technique
03:39 that improves upon a single averaging method
03:42 and produces sharp facial textures and colors.
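The talk does not give the blending formula; one way to see why a plain average blurs while a weighted blend stays sharp is this hypothetical per-pixel scheme, which weights each aligned texture by local gradient energy so detailed, in-focus photos dominate:

import numpy as np

def blend_textures(textures):
    """textures: (N, H, W, 3) face textures already warped into a
    common layout. Returns a sharp (H, W, 3) blended texture."""
    stack = np.asarray(textures, dtype=np.float64)
    gray = stack.mean(axis=3)                 # (N, H, W)
    gy, gx = np.gradient(gray, axis=(1, 2))
    sharpness = np.hypot(gx, gy) + 1e-6       # local detail per pixel
    w = sharpness / sharpness.sum(axis=0, keepdims=True)
    return (stack * w[..., None]).sum(axis=0)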
03:45 And this can be done for any expression.
03:49 Now we have control of a model of a person,
03:52 and the way it's controlled now is by a sequence of static photos.
03:55 Notice how the wrinkles come and go, depending on the expression.
04:00 We can also use a video to drive the model.
04:02 (Video) Daniel Craig: Right, but somehow,
04:05 we've managed to attract some more amazing people.
04:10 SS: And here's another fun demo.
04:11 So what you see here are controllable models
04:13 of people I built from their internet photos.
04:16 Now, if you transfer the motion from the input video,
04:19 we can actually drive the entire party.
04:21 George W. Bush: It's a difficult bill to pass,
04:23 because there's a lot of moving parts,
04:26 and the legislative processes can be ugly.
04:31 (Applause)
04:32 SS: So coming back a little bit,
04:34 our ultimate goal, rather, is to capture their mannerisms
04:38 or the unique way each of these people talks and smiles.
04:41 So to do that, can we actually teach the computer
04:43 to imitate the way someone talks
04:45 by only showing it video footage of the person?
04:48 And what I did exactly was, I let a computer watch
04:51 14 hours of pure Barack Obama giving addresses.
04:55 And here's what we can produce given only his audio.
04:58 (Video) BO: The results are clear.
05:00 America's businesses have created 14.5 million new jobs
05:05 over 75 straight months.
05:07 SS: So what's being synthesized here is only the mouth region,
05:10 and here's how we do it.
05:12 Our pipeline uses a neural network
05:14 to convert input audio into these mouth points.
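One plausible shape for such a network (layer sizes and features here are illustrative guesses, not the trained model from the talk): a recurrent layer maps a sequence of audio features, e.g. MFCCs, to per-frame mouth landmark coordinates, with recurrence supplying the context that mouth shape depends on neighboring sounds.

import torch
import torch.nn as nn

class AudioToMouth(nn.Module):
    def __init__(self, n_audio_feats=28, n_mouth_points=18, hidden=60):
        super().__init__()
        # The LSTM lets each output frame depend on surrounding audio
        # (coarticulation), not just the current instant.
        self.rnn = nn.LSTM(n_audio_feats, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_mouth_points * 2)  # (x, y)

    def forward(self, audio_feats):        # (batch, time, n_audio_feats)
        h, _ = self.rnn(audio_feats)
        out = self.head(h)                 # (batch, time, 36)
        return out.view(out.shape[0], out.shape[1], -1, 2)

Trained with a simple regression loss against mouth points tracked in footage like the 14 hours above, such a model can then be run on new audio alone.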
05:18 (Video) BO: We get it through our job or through Medicare or Medicaid.
05:22 SS: Then we synthesize the texture, enhance details and teeth,
05:26 and blend it into the head and background from a source video.
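The final composite can be sketched as a feathered alpha blend that pastes the synthesized mouth back into a source frame (a simplification: Poisson-style blending or the pipeline's own method would hide seams better):

import numpy as np
import cv2

def composite_mouth(frame, mouth_texture, mouth_mask):
    """frame, mouth_texture: aligned (H, W, 3) uint8 images;
    mouth_mask: (H, W) floats in [0, 1], 1 inside the mouth region."""
    # Feather the mask so the paste fades smoothly into the head.
    soft = cv2.GaussianBlur(mouth_mask.astype(np.float32), (31, 31), 0)
    alpha = soft[..., None]
    out = (alpha * mouth_texture.astype(np.float32)
           + (1.0 - alpha) * frame.astype(np.float32))
    return out.astype(np.uint8)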
05:29 (Video) BO: Women can get free checkups,
05:31 and you can't get charged more just for being a woman.
05:34 Young people can stay on a parent's plan until they turn 26.
05:39 SS: I think these results seem very realistic and intriguing,
05:42 but at the same time frightening, even to me.
05:45 Our goal was to build an accurate model of a person, not to misrepresent them.
05:49 But one thing that concerns me is its potential for misuse.
05:53 People have been thinking about this problem for a long time,
05:56 since the days when Photoshop first hit the market.
05:59 As a researcher, I'm also working on countermeasure technology,
06:03 and I'm part of an ongoing effort at AI Foundation,
06:06 which uses a combination of machine learning and human moderators
06:10 to detect fake images and videos,
06:12 fighting against my own work.
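A sketch of how a "machine learning plus human moderators" split typically works (the thresholds and scoring model are placeholders, not details from the talk): the detector auto-labels confident cases and routes the ambiguous middle band to people.

def triage(item, score_fn, fake_thresh=0.9, real_thresh=0.1):
    """score_fn: any trained detector returning P(fake) in [0, 1]."""
    p_fake = score_fn(item)
    if p_fake >= fake_thresh:
        return "flagged_fake"
    if p_fake <= real_thresh:
        return "likely_real"
    # Humans handle the uncertain middle band.
    return "send_to_human_moderator"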
06:14 And one of the tools we plan to release is called Reality Defender,
06:17 which is a web-browser plug-in that can flag potentially fake content
06:21 automatically, right in the browser.
06:24 (Applause)
06:28 Despite all this, though,
06:30 fake videos could do a lot of damage,
06:32 even before anyone has a chance to verify,
06:35 so it's very important that we make everyone aware
06:38 of what's currently possible
06:40 so we can have the right assumption and be critical about what we see.
06:44 There's still a long way to go before we can fully model individual people
06:49 and before we can ensure the safety of this technology.
06:53 But I'm excited and hopeful,
06:54 because if we use it right and carefully,
06:58 this tool can allow any individual's positive impact on the world
07:02 to be massively scaled
07:04 and really help shape our future the way we want it to be.
07:07 Thank you.
07:08 (Applause)
이 μ›Ήμ‚¬μ΄νŠΈ 정보

이 μ‚¬μ΄νŠΈλŠ” μ˜μ–΄ ν•™μŠ΅μ— μœ μš©ν•œ YouTube λ™μ˜μƒμ„ μ†Œκ°œν•©λ‹ˆλ‹€. μ „ 세계 졜고의 μ„ μƒλ‹˜λ“€μ΄ κ°€λ₯΄μΉ˜λŠ” μ˜μ–΄ μˆ˜μ—…μ„ 보게 될 κ²ƒμž…λ‹ˆλ‹€. 각 λ™μ˜μƒ νŽ˜μ΄μ§€μ— ν‘œμ‹œλ˜λŠ” μ˜μ–΄ μžλ§‰μ„ 더블 ν΄λ¦­ν•˜λ©΄ κ·Έκ³³μ—μ„œ λ™μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€. λΉ„λ””μ˜€ μž¬μƒμ— 맞좰 μžλ§‰μ΄ μŠ€ν¬λ‘€λ©λ‹ˆλ‹€. μ˜κ²¬μ΄λ‚˜ μš”μ²­μ΄ μžˆλŠ” 경우 이 문의 양식을 μ‚¬μš©ν•˜μ—¬ λ¬Έμ˜ν•˜μ‹­μ‹œμ˜€.

https://forms.gle/WvT1wiN1qDtmnspy7