How a Deepfake Almost Ruined My Political Career | Cara Hunter | TED

40,936 views ใƒป 2024-12-18

TED


์•„๋ž˜ ์˜๋ฌธ์ž๋ง‰์„ ๋”๋ธ”ํด๋ฆญํ•˜์‹œ๋ฉด ์˜์ƒ์ด ์žฌ์ƒ๋ฉ๋‹ˆ๋‹ค.

Translator: chloe park / Reviewer: DK Kim
00:03
This talk contains mature language
00:08
“You’re a little whore,
00:10
and we've all seen your little video."
00:14
That was the text message that was sent to me in April of 2022.
00:20
I'm sitting in my grandmother's living room
00:22
on what is her 90th birthday,
00:25
surrounded by family and friends
00:27
as my phone blows up with messages from strangers
00:32
right across the country
00:34
who say they have seen a video of me
00:37
engaging in hardcore pornographic activity with a man.
00:42
I knew this was impossible.
00:44
With just three weeks out from my election,
00:46
I felt as though my career was crumbling before my very eyes.
00:51
My heart pounded,
00:52
my mind raced, sweat beaded on my skin.
00:56
And then I watched the video, and my worst fear was realized.
01:01
Although this woman in the video was not me,
01:05
she looked exactly like me.
01:09
Impossibly like me.
01:11
Eerily like me.
01:13
I had so many questions running through my mind.
01:16
Was this AI?
01:17
Was it not?
01:19
Who made this?
01:20
How did they make it?
01:21
Why did they make it?
01:23
So I did what anyone would do,
01:25
and I approached my local police service to ask for advice, for guidance.
01:30
And really, where did I go from there?
01:32
But they informed me that they wouldn't have the cybercrime technology to assist,
01:36
to find out where this video came from.
01:39
And it was from that moment I knew that I was on my own.
01:43
Now to set the stage, as you can probably tell,
01:45
I’m from Ireland,
01:47
and to be exact, I'm from Northern Ireland,
01:49
which is an even smaller place.
01:51
We have just 1.8 million people,
01:54
very similar to the size of Vienna.
01:57
So you can imagine a rumor of this sinister nature,
02:00
particularly in the world of politics, can go very far, very fast.
02:06
And that old saying, “seeing is believing,” began to haunt me.
02:11
And in the weeks leading up to my election,
02:13
this video, this false video,
02:15
was shared thousands and thousands of times across WhatsApp.
02:20
And attached to this video were photos of me at work,
02:24
smiling, campaigning,
02:26
building a sense of trust with my constituents.
02:29
And as the weeks went on,
02:31
messages flooded in faster and faster,
02:34
and they were of a very vile and sexual nature.
02:38
Ding!
02:39
"We've all seen your little video."
02:41
Ding!
02:43
"You should be ashamed of yourself."
02:45
Ding!
02:46
"Ah, now I see how you got your position in politics."
(Sigh)
02:53
It was very difficult.
02:55
And having been in politics since the age of 23,
02:58
and at this point I've been in it for about four to five years,
03:02
and I'm from Northern Ireland, which is a post-conflict society,
03:05
still very deeply divided.
03:07
So I anticipated challenges,
03:10
I anticipated disagreements,
03:12
I even anticipated attacks; it is politics, after all.
03:16
But what I did not anticipate was this moment.
03:19
This was different.
03:21
This was the moment where misogyny meets the misuse of technology,
03:26
and even had the potential to impact the outcome of a democratic election.
03:32
And the sad thing for me was this lie became so far spread,
03:37
so far, so fast,
03:39
that even my own family started to believe it.
03:42
Some would say that they'd heard it at a golf club,
03:45
others would say they heard it at the bar,
03:48
and of course, some even said they heard it in a locker room.
03:52
A really good example of how far this went
03:54
was people that I knew my entire life
03:57
would pass me in the street without whispering a word.
04:00
People like school teachers,
04:01
people I had a sense of trust with
04:03
and, you know, an affinity with.
04:05
And that was really hard.
04:07
It felt like overnight I was wearing a scarlet letter.
04:11
And as things moved on
04:12
and we were about two, three weeks out from the election,
04:15
I kept receiving messages, and it got wider and wider.
04:19
It was global.
04:20
Not only was I receiving messages from Dublin and from London,
04:23
but I was also receiving messages from Massachusetts, Manhattan,
04:27
and I was getting so many follows on my political social media,
04:31
predominantly from men hungry for more of this scandal.
04:35
And this intersection of online harms impacting my real life
04:40
was something I found utterly strange and surreal.
04:45
But it got to the point where I was recognized on the street
04:49
and approached by a stranger who asked me for a sexual favor.
04:55
And it was just, for me, it was like in the blink of an eye,
04:58
everything had just changed,
05:01
and it was utterly humiliating.
05:02
I didn't want to leave the house,
05:04
and I had turned notifications off in my phone
05:07
just so I could kind of catch my breath,
05:09
but this wasn't ideal in the lead up, of course, to an election.
05:13
And for me, I think that was the purpose of this false video, was to do just that.
05:19
But what hurt the most for me
05:22
was sitting down my father
05:24
and having to explain to him this strange, surreal situation.
05:30
My father is an Irishman,
05:32
completely disconnected from tech,
05:34
and so having to explain
05:36
this horrific situation was an entire fabrication
05:40
was very hard to do.
05:42
This was this strange moment
05:45
where the online world met my life, my reality.
05:50
Not only having the impact to ruin my reputation,
05:54
but have the capacity to change the outcome
05:57
of a democratic election.
06:00
And, you know,
06:01
for years I spent so much time building trust with my constituents.
06:05
I mean, we all know how much people like politicians.
(Laughter)
06:09
You know, we’re as likeable as the tax man.
06:11
So for me, it was hard.
06:14
It was really hard
06:15
because it was years of hard work.
06:18
You know, I’m so passionate about my job,
06:20
and this video, this complete falsehood,
06:23
had the ability to just undermine years of hard work in mere seconds.
06:28
But instead of succumbing entirely to victimhood,
06:31
I ask myself today,
06:32
you know, where do we go from here?
06:35
And how can AI evolve to prevent something like this happening again?
06:40
Not only has it happened to me,
06:42
but we want to future-proof and ensure that this doesn't happen
06:45
to the women of tomorrow.
06:47
How can we, you and I, people who care about people,
06:51
ensure that this is a tech for good?
06:54
How can we, the policymakers, the creators, the consumers,
06:58
ensure we regulate AI and things like social media,
07:02
putting humans and humanity at the center of artificial intelligence?
07:08
AI can change the world.
07:11
In fact, as we've heard today, it already has.
07:13
In a matter of seconds,
07:15
people who speak completely different languages
07:17
can connect and understand one another.
07:19
And we've even seen the Pope as a style icon in a puffer jacket.
07:24
So some really important uses right there.
07:27
But then in my case as well,
07:28
we can also see how it is weaponized against the truth.
07:32
And good examples of this would be art that appears like reality,
07:36
AI-generated reviews unfairly boosting certain products
07:40
and things like chatbot propaganda.
07:43
And then, politically speaking,
07:45
we've seen over the years deepfakes of Nancy Pelosi slurring,
07:49
Joe Biden cursing
07:51
and even President Zelensky
07:52
asking his soldiers to surrender their weapons.
07:57
So when AI is used like this, to manipulate,
08:00
it can be a threat to our democracy.
08:03
And the tech is becoming so advanced
08:06
that it's hard to differentiate fact from fiction.
08:10
So how does AI interfere with politics?
08:14
And for us as politicians, what should we be worried about?
08:18
Could truth and democracy become shattered by AI?
08:22
Has it already?
08:24
Well, to dive a little deeper here,
08:26
I think firstly we need to talk about the concept of truth.
08:30
Without truth, democracy collapses.
08:33
Truth allows us to make informed decisions,
08:37
it enables us to hold leaders accountable, which is very important.
08:41
And it also allows us, as political representatives,
08:44
to create a sense of trust with our citizens and our constituents.
08:49
But without that truth,
08:50
democracy is vulnerable to misinformation,
08:54
manipulation, and, of course, corruption.
08:57
When AI erodes truth,
09:00
it erodes trust,
09:02
and it undermines our democracy.
09:05
And for me, in my experience with a deepfake,
09:09
I've seen what a fantastic distortion tool that deepfakes really are.
09:14
So how can we safeguard democracy from this ever-advancing technology?
09:19
It's becoming ever harder to distinguish between real and synthetic content.
09:24
And politically, what role does AI play in the future?
09:28
And I can't talk about politics without talking about media as well.
09:32
They're undeniably linked, they're intertwined.
09:35
And I think journalism has its own battle here as well.
09:39
From AI algorithms boosting articles unfairly
09:42
to clickbait headlines,
09:44
and then also moments where they can manipulate the public as well.
09:49
But politically speaking,
09:50
we've seen AI-tailored political messaging
09:53
influencing voters,
09:54
we’ve seen it adding to existing bias.
09:58
And definitely I think we all have that aunt that’s on Facebook
10:00
and kind of believes anything.
10:02
So for me, as a politician,
10:04
I think it's really important we dive a little deeper
10:06
into the relationship of AI, journalism and media.
10:11
But it also puts us at risk
10:13
of creating a more divided and reactionary society,
10:17
because falsehoods can create a lot of reaction.
10:20
And for myself, coming from Northern Ireland,
10:22
which is that post-conflict society,
10:24
I do have concerns about how it could shape our political landscape
10:28
and other places across the globe.
10:32
Sadly, this deepfake video
10:34
is not the only instance of me having experienced abuse with AI.
10:39
Just six months ago,
10:41
I received 15 deepfake images of myself in lingerie
10:46
posing provocatively.
10:48
And I thought to myself, here we go again.
10:52
And you know, I spoke with some other female politicians.
10:56
Thankfully, where I represent, we have more women getting into politics.
11:01
But I had a really good conversation with them,
11:03
and it’s around, if this is happening to you now,
11:05
what happens to me tomorrow?
11:07
And I think this really strikes at the heart of the climate of fear
11:11
that AI can create for those in public life.
11:15
And I don't blame women.
11:16
It's very sad.
11:17
I don't blame women for not wanting to get into politics
11:20
when they see this kind of technology come forward,
11:22
so that's so important that we safeguard it.
11:25
What also concerned me was the position of elderly voters
11:30
and perhaps their media literacy,
11:32
their comprehension of this technology.
11:34
People who don't use the internet,
11:36
who perhaps are not aware of AI and its many, many uses.
11:40
So that was really concerning as well.
11:43
But it doesn't have to be this way.
11:44
We can be part of the change.
11:47
For me and my video,
11:49
I still don't know, to this day, who did this.
11:52
I can imagine why they did it, but I don't know who did it.
11:56
And sadly for me and for across the globe,
11:59
it still wouldn't be considered a crime.
12:03
So from being the enemy of fair, transparent,
12:06
good-natured politics,
12:07
to warfare, to international interference,
12:11
despite its beauty and its potential,
12:13
AI is still a cause for concern.
12:16
But it doesn't have to be this way.
12:19
I feel passionately that AI can be a humanistic technology
12:23
with human values
12:24
that complements the lives that we live
12:27
to make us the very best versions of ourselves.
12:31
But to do that,
12:32
I think we need to embed ethics into this technology,
12:36
to eliminate bias, to install empathy
12:40
and make sure it is aligned with human values and human principles.
12:45
Who knows, our democracy could depend on it.
12:49
And today I know we have some of the brightest minds in this room.
12:53
I heard some of the talks earlier, and I'm certain of that.
12:57
And I know each and every one of you have weight on your shoulders
13:00
when looking to the future.
13:02
But I also know each and every one of you want to see this tech for good.
13:07
And what gives me hope
13:09
is witnessing the movement right across the globe
13:13
to see this hard journey begin,
13:15
to regulate this ever-advancing technology,
13:18
which can be used for bad or for good.
13:22
But perhaps we, each and every one of us in this room,
13:25
can be part of finding the solution.
13:28
(Applause)
์ด ์›น์‚ฌ์ดํŠธ ์ •๋ณด

์ด ์‚ฌ์ดํŠธ๋Š” ์˜์–ด ํ•™์Šต์— ์œ ์šฉํ•œ YouTube ๋™์˜์ƒ์„ ์†Œ๊ฐœํ•ฉ๋‹ˆ๋‹ค. ์ „ ์„ธ๊ณ„ ์ตœ๊ณ ์˜ ์„ ์ƒ๋‹˜๋“ค์ด ๊ฐ€๋ฅด์น˜๋Š” ์˜์–ด ์ˆ˜์—…์„ ๋ณด๊ฒŒ ๋  ๊ฒƒ์ž…๋‹ˆ๋‹ค. ๊ฐ ๋™์˜์ƒ ํŽ˜์ด์ง€์— ํ‘œ์‹œ๋˜๋Š” ์˜์–ด ์ž๋ง‰์„ ๋”๋ธ” ํด๋ฆญํ•˜๋ฉด ๊ทธ๊ณณ์—์„œ ๋™์˜์ƒ์ด ์žฌ์ƒ๋ฉ๋‹ˆ๋‹ค. ๋น„๋””์˜ค ์žฌ์ƒ์— ๋งž์ถฐ ์ž๋ง‰์ด ์Šคํฌ๋กค๋ฉ๋‹ˆ๋‹ค. ์˜๊ฒฌ์ด๋‚˜ ์š”์ฒญ์ด ์žˆ๋Š” ๊ฒฝ์šฐ ์ด ๋ฌธ์˜ ์–‘์‹์„ ์‚ฌ์šฉํ•˜์—ฌ ๋ฌธ์˜ํ•˜์‹ญ์‹œ์˜ค.

https://forms.gle/WvT1wiN1qDtmnspy7