How a Deepfake Almost Ruined My Political Career | Cara Hunter | TED

41,195 views · 2024-12-18

TED


ืื ื ืœื—ืฅ ืคืขืžื™ื™ื ืขืœ ื”ื›ืชื•ื‘ื™ื•ืช ื‘ืื ื’ืœื™ืช ืœืžื˜ื” ื›ื“ื™ ืœื”ืคืขื™ืœ ืืช ื”ืกืจื˜ื•ืŸ.

ืชืจื’ื•ื: zeeva livshitz
00:03
This talk contains mature language
00:08
"You're a little whore,
00:10
and we've all seen your little video."
00:14
That was the text message that was sent to me in April of 2022.
00:20
I'm sitting in my grandmother's living room
00:22
on what is her 90th birthday,
00:25
surrounded by family and friends
00:27
as my phone blows up with messages from strangers
00:32
right across the country
00:34
who say they have seen a video of me
00:37
engaging in hardcore pornographic activity with a man.
00:42
I knew this was impossible.
00:44
With just three weeks out from my election,
00:46
I felt as though my career was crumbling before my very eyes.
00:51
My heart pounded,
00:52
my mind raced, sweat beaded on my skin.
00:56
And then I watched the video, and my worst fear was realized.
01:01
Although this woman in the video was not me,
01:05
she looked exactly like me.
01:09
Impossibly like me.
01:11
Eerily like me.
01:13
I had so many questions running through my mind.
01:16
Was this AI?
01:17
Was it not?
01:19
Who made this?
01:20
How did they make it?
01:21
Why did they make it?
01:23
So I did what anyone would do,
01:25
and I approached my local police service to ask for advice, for guidance.
01:30
And really, where did I go from there?
01:32
But they informed me that they wouldn't have the cybercrime technology to assist,
01:36
to find out where this video came from.
01:39
And it was from that moment I knew that I was on my own.
01:43
Now to set the stage, as you can probably tell,
01:45
I'm from Ireland,
01:47
and to be exact, I'm from Northern Ireland,
01:49
which is an even smaller place.
01:51
We have just 1.8 million people,
01:54
very similar to the size of Vienna.
01:57
So you can imagine a rumor of this sinister nature,
02:00
particularly in the world of politics, can go very far, very fast.
02:06
And that old saying, "seeing is believing," began to haunt me.
02:11
And in the weeks leading up to my election,
02:13
this video, this false video,
02:15
was shared thousands and thousands of times across WhatsApp.
02:20
And attached to this video were photos of me at work,
02:24
smiling, campaigning,
02:26
building a sense of trust with my constituents.
02:29
And as the weeks went on,
02:31
messages flooded in faster and faster,
02:34
and they were of a very vile and sexual nature.
02:38
Ding!
02:39
"We've all seen your little video."
02:41
Ding!
02:43
"You should be ashamed of yourself."
02:45
Ding!
02:46
"Ah, now I see how you got your position in politics."
02:53
It was very difficult.
02:55
And having been in politics since the age of 23,
02:58
and at this point I've been in it for about four to five years,
03:02
and I'm from Northern Ireland, which is a post-conflict society,
03:05
still very deeply divided.
03:07
So I anticipated challenges,
03:10
I anticipated disagreements,
03:12
I even anticipated attacks; it is politics, after all.
03:16
But what I did not anticipate was this moment.
03:19
This was different.
03:21
This was the moment where misogyny meets the misuse of technology,
03:26
and even had the potential to impact the outcome of a democratic election.
03:32
And the sad thing for me was this lie became so far spread,
03:37
so far, so fast,
03:39
that even my own family started to believe it.
03:42
Some would say that they'd heard it at a golf club,
03:45
others would say they heard it at the bar,
03:48
and of course, some even said they heard it in a locker room.
03:52
A really good example of how far this went
03:54
was that people I knew my entire life
03:57
would pass me in the street without whispering a word.
04:00
People like school teachers,
04:01
people I had a sense of trust with
04:03
and, you know, an affinity with.
04:05
And that was really hard.
04:07
It felt like overnight I was wearing a scarlet letter.
04:11
And as things moved on
04:12
and we were about two, three weeks out from the election,
04:15
I kept receiving messages, and it got wider and wider.
04:19
It was global.
04:20
Not only was I receiving messages from Dublin and from London,
04:23
but I was also receiving messages from Massachusetts, Manhattan,
04:27
and I was getting so many follows on my political social media,
04:31
predominantly from men hungry for more of this scandal.
04:35
And this intersection of online harms impacting my real life
04:40
was something I found utterly strange and surreal.
04:45
But it got to the point where I was recognized on the street
04:49
and approached by a stranger who asked me for a sexual favor.
04:55
And it was just, for me, it was like in the blink of an eye,
04:58
everything had just changed,
05:01
and it was utterly humiliating.
05:02
I didn't want to leave the house,
05:04
and I had turned notifications off on my phone
05:07
just so I could kind of catch my breath,
05:09
but this wasn't ideal in the lead-up, of course, to an election.
05:13
And for me, I think that was the purpose of this false video: to do just that.
05:19
But what hurt the most for me
05:22
was sitting my father down
05:24
and having to explain to him this strange, surreal situation.
05:30
My father is an Irishman,
05:32
completely disconnected from tech,
05:34
and so having to explain
05:36
that this horrific situation was an entire fabrication
05:40
was very hard to do.
05:42
This was this strange moment
05:45
where the online world met my life, my reality.
05:50
Not only did it have the power to ruin my reputation,
05:54
but also the capacity to change the outcome
05:57
of a democratic election.
06:00
And, you know,
06:01
for years I spent so much time building trust with my constituents.
06:05
I mean, we all know how much people like politicians.
06:09
You know, we're as likeable as the tax man.
06:11
So for me, it was hard.
06:14
It was really hard
06:15
because it was years of hard work.
06:18
You know, I'm so passionate about my job,
06:20
and this video, this complete falsehood,
06:23
had the ability to just undermine years of hard work in mere seconds.
06:28
But instead of succumbing entirely to victimhood,
06:31
I ask myself today,
06:32
you know, where do we go from here?
06:35
And how can AI evolve to prevent something like this happening again?
06:40
Not only has it happened to me,
06:42
but we want to future-proof and ensure that this doesn't happen
06:45
to the women of tomorrow.
06:47
How can we, you and I, people who care about people,
06:51
ensure that this is a tech for good?
06:54
How can we, the policymakers, the creators, the consumers,
06:58
ensure we regulate AI and things like social media,
07:02
putting humans and humanity at the center of artificial intelligence?
07:08
AI can change the world.
07:11
In fact, as we've heard today, it already has.
07:13
In a matter of seconds,
07:15
people who speak completely different languages
07:17
can connect and understand one another.
07:19
And we've even seen the Pope as a style icon in a puffer jacket.
07:24
So some really important uses right there.
07:27
But then in my case as well,
07:28
we can also see how it is weaponized against the truth.
07:32
And good examples of this would be art that appears like reality,
07:36
AI-generated reviews unfairly boosting certain products
07:40
and things like chatbot propaganda.
07:43
And then, politically speaking,
07:45
we've seen over the years deepfakes of Nancy Pelosi slurring,
07:49
Joe Biden cursing
07:51
and even President Zelensky
07:52
asking his soldiers to surrender their weapons.
07:57
So when AI is used like this, to manipulate,
08:00
it can be a threat to our democracy.
08:03
And the tech is becoming so advanced
08:06
that it's hard to differentiate fact from fiction.
08:10
So how does AI interfere with politics?
08:14
And for us as politicians, what should we be worried about?
08:18
Could truth and democracy become shattered by AI?
08:22
Has it already?
08:24
Well, to dive a little deeper here,
08:26
I think firstly we need to talk about the concept of truth.
08:30
Without truth, democracy collapses.
08:33
Truth allows us to make informed decisions,
08:37
it enables us to hold leaders accountable, which is very important.
08:41
And it also allows us, as political representatives,
08:44
to create a sense of trust with our citizens and our constituents.
08:49
But without that truth,
08:50
democracy is vulnerable to misinformation,
08:54
manipulation, and, of course, corruption.
08:57
When AI erodes truth,
09:00
it erodes trust,
09:02
and it undermines our democracy.
09:05
And for me, in my experience with a deepfake,
09:09
I've seen what a fantastic distortion tool deepfakes really are.
09:14
So how can we safeguard democracy from this ever-advancing technology?
09:19
It's becoming ever harder to distinguish between real and synthetic content.
09:24
And politically, what role does AI play in the future?
09:28
And I can't talk about politics without talking about media as well.
09:32
They're undeniably linked, they're intertwined.
09:35
And I think journalism has its own battle here as well.
09:39
From AI algorithms boosting articles unfairly
09:42
to clickbait headlines,
09:44
and then also moments where they can manipulate the public as well.
09:49
But politically speaking,
09:50
we've seen AI-tailored political messaging
09:53
influencing voters,
09:54
we've seen it adding to existing bias.
09:58
And definitely I think we all have that aunt that's on Facebook
10:00
and kind of believes anything.
10:02
So for me, as a politician,
10:04
I think it's really important we dive a little deeper
10:06
into the relationship of AI, journalism and media.
10:11
But it also puts us at risk
10:13
of creating a more divided and reactionary society,
10:17
because falsehoods can create a lot of reaction.
10:20
And for myself, coming from Northern Ireland,
10:22
which is that post-conflict society,
10:24
I do have concerns about how it could shape our political landscape
10:28
and that of other places across the globe.
10:32
Sadly, this deepfake video
10:34
is not the only instance of me having experienced abuse with AI.
10:39
Just six months ago,
10:41
I received 15 deepfake images of myself in lingerie
10:46
posing provocatively.
10:48
And I thought to myself, here we go again.
10:52
And you know, I spoke with some other female politicians.
10:56
Thankfully, where I represent, we have more women getting into politics.
11:01
But I had a really good conversation with them,
11:03
and it centered around this: if this is happening to you now,
11:05
what happens to me tomorrow?
11:07
And I think this really strikes at the heart of the climate of fear
11:11
that AI can create for those in public life.
11:15
And I don't blame women.
11:16
It's very sad.
11:17
I don't blame women for not wanting to get into politics
11:20
when they see this kind of technology come forward,
11:22
so it's so important that we safeguard it.
11:25
What also concerned me was the position of elderly voters
11:30
and perhaps their media literacy,
11:32
their comprehension of this technology.
11:34
People who don't use the internet,
11:36
who perhaps are not aware of AI and its many, many uses.
11:40
So that was really concerning as well.
11:43
But it doesn't have to be this way.
11:44
We can be part of the change.
11:47
For me and my video,
11:49
I still don't know, to this day, who did this.
11:52
I can imagine why they did it, but I don't know who did it.
11:56
And sadly, for me and for others across the globe,
11:59
it still wouldn't be considered a crime.
12:03
So from being the enemy of fair, transparent,
12:06
good-natured politics,
12:07
to warfare, to international interference,
12:11
despite its beauty and its potential,
12:13
AI is still a cause for concern.
12:16
But it doesn't have to be this way.
12:19
I feel passionately that AI can be a humanistic technology
12:23
with human values
12:24
that complements the lives that we live
12:27
to make us the very best versions of ourselves.
12:31
But to do that,
12:32
I think we need to embed ethics into this technology,
12:36
to eliminate bias, to instill empathy
12:40
and make sure it is aligned with human values and human principles.
12:45
Who knows, our democracy could depend on it.
12:49
And today I know we have some of the brightest minds in this room.
12:53
I heard some of the talks earlier, and I'm certain of that.
12:57
And I know each and every one of you has weight on your shoulders
13:00
when looking to the future.
13:02
But I also know each and every one of you wants to see this tech used for good.
13:07
And what gives me hope
13:09
is witnessing the movement right across the globe
13:13
to see this hard journey begin,
13:15
to regulate this ever-advancing technology,
13:18
which can be used for bad or for good.
13:22
But perhaps we, each and every one of us in this room,
13:25
can be part of finding the solution.
13:28
(Applause)
ืขืœ ืืชืจ ื–ื”

ืืชืจ ื–ื” ื™ืฆื™ื’ ื‘ืคื ื™ื›ื ืกืจื˜ื•ื ื™ YouTube ื”ืžื•ืขื™ืœื™ื ืœืœื™ืžื•ื“ ืื ื’ืœื™ืช. ืชื•ื›ืœื• ืœืจืื•ืช ืฉื™ืขื•ืจื™ ืื ื’ืœื™ืช ื”ืžื•ืขื‘ืจื™ื ืขืœ ื™ื“ื™ ืžื•ืจื™ื ืžื”ืฉื•ืจื” ื”ืจืืฉื•ื ื” ืžืจื—ื‘ื™ ื”ืขื•ืœื. ืœื—ืฅ ืคืขืžื™ื™ื ืขืœ ื”ื›ืชื•ื‘ื™ื•ืช ื‘ืื ื’ืœื™ืช ื”ืžื•ืฆื’ื•ืช ื‘ื›ืœ ื“ืฃ ื•ื™ื“ืื• ื›ื“ื™ ืœื”ืคืขื™ืœ ืืช ื”ืกืจื˜ื•ืŸ ืžืฉื. ื”ื›ืชื•ื‘ื™ื•ืช ื’ื•ืœืœื•ืช ื‘ืกื ื›ืจื•ืŸ ืขื ื”ืคืขืœืช ื”ื•ื•ื™ื“ืื•. ืื ื™ืฉ ืœืš ื”ืขืจื•ืช ืื• ื‘ืงืฉื•ืช, ืื ื ืฆื•ืจ ืื™ืชื ื• ืงืฉืจ ื‘ืืžืฆืขื•ืช ื˜ื•ืคืก ื™ืฆื™ืจืช ืงืฉืจ ื–ื”.

https://forms.gle/WvT1wiN1qDtmnspy7