How a Deepfake Almost Ruined My Political Career | Cara Hunter | TED

40,512 views ・ 2024-12-18

Translator: Jovina Ma · Reviewer: Bruce Wang
(This talk contains mature language)

"You're a little whore, and we've all seen your little video."

That was the text message that was sent to me in April of 2022. I'm sitting in my grandmother's living room on what is her 90th birthday, surrounded by family and friends, as my phone blows up with messages from strangers right across the country who say they have seen a video of me engaging in hardcore pornographic activity with a man.
I knew this was impossible. With just three weeks out from my election, I felt as though my career was crumbling before my very eyes. My heart pounded, my mind raced, sweat beaded on my skin. And then I watched the video, and my worst fear was realized. Although this woman in the video was not me, she looked exactly like me. Impossibly like me. Eerily like me. I had so many questions running through my mind. Was this AI? Was it not? Who made this? How did they make it? Why did they make it?

So I did what anyone would do, and I approached my local police service to ask for advice, for guidance. And really, where did I go from there? But they informed me that they wouldn't have the cybercrime technology to assist, to find out where this video came from. And it was from that moment I knew that I was on my own.
Now to set the stage, as you can probably tell, I'm from Ireland, and to be exact, I'm from Northern Ireland, which is an even smaller place. We have just 1.8 million people, very similar to the size of Vienna. So you can imagine a rumor of this sinister nature, particularly in the world of politics, can go very far, very fast. And that old saying, "seeing is believing," began to haunt me.

And in the weeks leading up to my election, this video, this false video, was shared thousands and thousands of times across WhatsApp. And attached to this video were photos of me at work, smiling, campaigning, building a sense of trust with my constituents. And as the weeks went on, messages flooded in faster and faster, and they were of a very vile and sexual nature.

Ding! "We've all seen your little video."

Ding! "You should be ashamed of yourself."

Ding! "Ah, now I see how you got your position in politics."
It was very difficult. And having been in politics since the age of 23, and at this point I've been in it for about four to five years, and I'm from Northern Ireland, which is a post-conflict society, still very deeply divided. So I anticipated challenges, I anticipated disagreements, I even anticipated attacks; it is politics, after all. But what I did not anticipate was this moment. This was different. This was the moment where misogyny meets the misuse of technology, and even had the potential to impact the outcome of a democratic election.

And the sad thing for me was this lie became so far spread, so far, so fast, that even my own family started to believe it. Some would say that they'd heard it at a golf club, others would say they heard it at the bar, and of course, some even said they heard it in a locker room. A really good example of how far this went was people that I knew my entire life would pass me in the street without whispering a word. People like school teachers, people I had a sense of trust with and, you know, an affinity with. And that was really hard. It felt like overnight I was wearing a scarlet letter.
And as things moved on, and we were about two, three weeks out from the election, I kept receiving messages, and it got wider and wider. It was global. Not only was I receiving messages from Dublin and from London, but I was also receiving messages from Massachusetts, Manhattan, and I was getting so many follows on my political social media, predominantly from men hungry for more of this scandal. And this intersection of online harms impacting my real life was something I found utterly strange and surreal.

But it got to the point where I was recognized on the street and approached by a stranger who asked me for a sexual favor. And it was just, for me, it was like in the blink of an eye, everything had just changed, and it was utterly humiliating. I didn't want to leave the house, and I had turned notifications off on my phone just so I could kind of catch my breath, but this wasn't ideal in the lead-up, of course, to an election. And for me, I think that was the purpose of this false video: to do just that.
But what hurt the most for me was sitting my father down and having to explain to him this strange, surreal situation. My father is an Irishman, completely disconnected from tech, and so having to explain that this horrific situation was an entire fabrication was very hard to do. This was this strange moment where the online world met my life, my reality. Not only having the impact to ruin my reputation, but having the capacity to change the outcome of a democratic election.

And, you know, for years I spent so much time building trust with my constituents. I mean, we all know how much people like politicians. You know, we're as likeable as the tax man. So for me, it was hard. It was really hard, because it was years of hard work. You know, I'm so passionate about my job, and this video, this complete falsehood, had the ability to just undermine years of hard work in mere seconds.
But instead of succumbing entirely to victimhood, I ask myself today, you know, where do we go from here? And how can AI evolve to prevent something like this happening again? Not only has it happened to me, but we want to future-proof and ensure that this doesn't happen to the women of tomorrow. How can we, you and I, people who care about people, ensure that this is a tech for good? How can we, the policymakers, the creators, the consumers, ensure we regulate AI and things like social media, putting humans and humanity at the center of artificial intelligence?

AI can change the world. In fact, as we've heard today, it already has. In a matter of seconds, people who speak completely different languages can connect and understand one another. And we've even seen the Pope as a style icon in a puffer jacket. So some really important uses right there.
But then in my case as well, we can also see how it is weaponized against the truth. And good examples of this would be art that appears like reality, AI-generated reviews unfairly boosting certain products and things like chatbot propaganda. And then, politically speaking, we've seen over the years deepfakes of Nancy Pelosi slurring, Joe Biden cursing and even President Zelensky asking his soldiers to surrender their weapons.

So when AI is used like this, to manipulate, it can be a threat to our democracy. And the tech is becoming so advanced that it's hard to differentiate fact from fiction. So how does AI interfere with politics? And for us as politicians, what should we be worried about? Could truth and democracy become shattered by AI? Has it already?
Well, to dive a little deeper here, I think firstly we need to talk about the concept of truth. Without truth, democracy collapses. Truth allows us to make informed decisions, it enables us to hold leaders accountable, which is very important. And it also allows us, as political representatives, to create a sense of trust with our citizens and our constituents. But without that truth, democracy is vulnerable to misinformation, manipulation and, of course, corruption. When AI erodes truth, it erodes trust, and it undermines our democracy. And for me, in my experience with a deepfake, I've seen what a fantastic distortion tool deepfakes really are.

So how can we safeguard democracy from this ever-advancing technology? It's becoming ever harder to distinguish between real and synthetic content. And politically, what role does AI play in the future?
And I can't talk about politics without talking about media as well. They're undeniably linked, they're intertwined. And I think journalism has its own battle here as well. From AI algorithms boosting articles unfairly, to clickbait headlines, and then also moments where they can manipulate the public as well. But politically speaking, we've seen AI-tailored political messaging influencing voters, we've seen it adding to existing bias. And definitely, I think we all have that aunt that's on Facebook and kind of believes anything. So for me, as a politician, I think it's really important we dive a little deeper into the relationship of AI, journalism and media.

But it also puts us at risk of creating a more divided and reactionary society, because falsehoods can create a lot of reaction. And for myself, coming from Northern Ireland, which is that post-conflict society, I do have concerns about how it could shape our political landscape and other places across the globe.
Sadly, this deepfake video is not the only instance of me having experienced abuse with AI. Just six months ago, I received 15 deepfake images of myself in lingerie, posing provocatively. And I thought to myself, here we go again. And you know, I spoke with some other female politicians. Thankfully, where I represent, we have more women getting into politics. But I had a really good conversation with them, and it's around: if this is happening to you now, what happens to me tomorrow? And I think this really strikes at the heart of the climate of fear that AI can create for those in public life.

And I don't blame women. It's very sad. I don't blame women for not wanting to get into politics when they see this kind of technology come forward, so it's so important that we safeguard against it. What also concerned me was the position of elderly voters and perhaps their media literacy, their comprehension of this technology. People who don't use the internet, who perhaps are not aware of AI and its many, many uses. So that was really concerning as well.
But it doesn't have to be this way. We can be part of the change. For me and my video, I still don't know, to this day, who did this. I can imagine why they did it, but I don't know who did it. And sadly, for me and across the globe, it still wouldn't be considered a crime.

So from being the enemy of fair, transparent, good-natured politics, to warfare, to international interference, despite its beauty and its potential, AI is still a cause for concern. But it doesn't have to be this way. I feel passionately that AI can be a humanistic technology, with human values, that complements the lives that we live to make us the very best versions of ourselves. But to do that, I think we need to embed ethics into this technology, to eliminate bias, to install empathy and make sure it is aligned with human values and human principles. Who knows, our democracy could depend on it.
And today I know we have some of the brightest minds in this room. I heard some of the talks earlier, and I'm certain of that. And I know each and every one of you has weight on your shoulders when looking to the future. But I also know each and every one of you wants to see this tech used for good. And what gives me hope is witnessing the movement right across the globe to see this hard journey begin, to regulate this ever-advancing technology, which can be used for bad or for good. But perhaps we, each and every one of us in this room, can be part of finding the solution.

(Applause)