When technology can read minds, how will we protect our privacy? | Nita Farahany

167,640 views ・ 2018-12-18

TED


00:13
In the months following the 2009 presidential election in Iran, protests erupted across the country. The Iranian government violently suppressed what came to be known as the Iranian Green Movement, even blocking mobile signals to cut off communication between the protesters.

00:34
My parents, who emigrated to the United States in the late 1960s, spent substantial time there, where all of my large, extended family live. When I would call my family in Tehran during some of the most violent crackdowns of the protest, none of them dared discuss with me what was happening. They or I knew to quickly steer the conversation to other topics. All of us understood what the consequences could be of a perceived dissident action.

01:06
But I still wish I could have known what they were thinking or what they were feeling. What if I could have? Or more frighteningly, what if the Iranian government could have? Would they have arrested them based on what their brains revealed?

01:22
That day may be closer than you think. With our growing capabilities in neuroscience, artificial intelligence and machine learning, we may soon know a lot more of what's happening in the human brain.

01:37
As a bioethicist, a lawyer, a philosopher and an Iranian-American, I'm deeply concerned about what this means for our freedoms and what kinds of protections we need. I believe we need a right to cognitive liberty, as a human right that needs to be protected. If not, our freedom of thought, access and control over our own brains and our mental privacy will be threatened.

02:05
Consider this: the average person thinks thousands of thoughts each day. As a thought takes form, like a math calculation or a number, a word, neurons are interacting in the brain, creating a minuscule electrical discharge. When you have a dominant mental state, like relaxation, hundreds of thousands of neurons are firing in the brain, creating concurrent electrical discharges in characteristic patterns that can be measured with electroencephalography, or EEG.

02:41
In fact, that's what you're seeing right now. You're seeing my brain activity that was recorded in real time with a simple device that was worn on my head. What you're seeing is my brain activity when I was relaxed and curious.

02:58
To share this information with you, I wore one of the early consumer-based EEG devices like this one, which recorded the electrical activity in my brain in real time. It's not unlike the fitness trackers that some of you may be wearing to measure your heart rate or the steps that you've taken, or even your sleep activity.

03:19
It's hardly the most sophisticated neuroimaging technique on the market. But it's already the most portable and the most likely to impact our everyday lives.

03:29
This is extraordinary. Through a simple, wearable device, we can literally see inside the human brain and learn aspects of our mental landscape without ever uttering a word. While we can't reliably decode complex thoughts just yet, we can already gauge a person's mood, and with the help of artificial intelligence, we can even decode some single-digit numbers or shapes or simple words that a person is thinking or hearing, or seeing.

04:06
Despite some inherent limitations in EEG, I think it's safe to say that with our advances in technology, more and more of what's happening in the human brain can and will be decoded over time. Already, using one of these devices, an epileptic can know they're going to have an epileptic seizure before it happens. A paraplegic can type on a computer with their thoughts alone.

04:34
A US-based company has developed a technology to embed these sensors into the headrest of automobiles so they can track driver concentration, distraction and cognitive load while driving. Nissan, insurance companies and AAA have all taken note.

04:51
You could even watch this choose-your-own-adventure movie "The Moment," which, with an EEG headset, changes the movie based on your brain-based reactions, giving you a different ending every time your attention wanes.

05:11
This may all sound great, and as a bioethicist, I am a huge proponent of empowering people to take charge of their own health and well-being by giving them access to information about themselves, including this incredible new brain-decoding technology.

05:29
But I worry. I worry that we will voluntarily or involuntarily give up our last bastion of freedom, our mental privacy. That we will trade our brain activity for rebates or discounts on insurance, or free access to social-media accounts ... or even to keep our jobs.

05:54
In fact, in China, the train drivers on the Beijing-Shanghai high-speed rail, the busiest of its kind in the world, are required to wear EEG devices to monitor their brain activity while driving. According to some news sources, in government-run factories in China, the workers are required to wear EEG sensors to monitor their productivity and their emotional state at work. Workers are even sent home if their brains show less-than-stellar concentration on their jobs, or emotional agitation.

06:35
It's not going to happen tomorrow, but we're headed to a world of brain transparency. And I don't think people understand that that could change everything. Everything from our definitions of data privacy to our laws, to our ideas about freedom.

06:50
In fact, in my lab at Duke University, we recently conducted a nationwide study in the United States to see if people appreciated the sensitivity of their brain information. We asked people to rate their perceived sensitivity of 33 different kinds of information, from their social security numbers to the content of their phone conversations, their relationship history, their emotions, their anxiety, the mental images in their mind and the thoughts in their mind. Shockingly, people rated their social security number as far more sensitive than any other kind of information, including their brain data.

07:32
I think this is because people don't yet understand or believe the implications of this new brain-decoding technology. After all, if we can know the inner workings of the human brain, our social security numbers are the least of our worries.

07:46
(Laughter)

07:47
Think about it. In a world of total brain transparency, who would dare have a politically dissident thought? Or a creative one? I worry that people will self-censor in fear of being ostracized by society, or that people will lose their jobs because of their waning attention or emotional instability, or because they're contemplating collective action against their employers. That coming out will no longer be an option, because people's brains will long ago have revealed their sexual orientation, their political ideology or their religious preferences, well before they were ready to consciously share that information with other people.

08:31
I worry about the ability of our laws to keep up with technological change. Take the First Amendment of the US Constitution, which protects freedom of speech. Does it also protect freedom of thought? And if so, does that mean that we're free to alter our thoughts however we want? Or can the government or society tell us what we can do with our own brains? Can the NSA spy on our brains using these new mobile devices? Can the companies that collect the brain data through their applications sell this information to third parties?

09:05
Right now, no laws prevent them from doing so. It could be even more problematic in countries that don't share the same freedoms enjoyed by people in the United States. What would've happened during the Iranian Green Movement if the government had been monitoring my family's brain activity, and had believed them to be sympathetic to the protesters?

09:30
Is it so far-fetched to imagine a society in which people are arrested based on their thoughts of committing a crime, like in the science-fiction dystopian society in "Minority Report"?

09:42
Already, in the United States, in Indiana, an 18-year-old was charged with attempting to intimidate his school by posting a video of himself shooting people in the hallways ... Except the people were zombies and the video was of him playing an augmented-reality video game, all interpreted to be a mental projection of his subjective intent.

10:10
This is exactly why our brains need special protection. If our brains are just as subject to data tracking and aggregation as our financial records and transactions, if our brains can be hacked and tracked like our online activities, our mobile phones and applications, then we're on the brink of a dangerous threat to our collective humanity.

10:33
Before you panic, I believe that there are solutions to these concerns, but we have to start by focusing on the right things. When it comes to privacy protections in general, I think we're fighting a losing battle by trying to restrict the flow of information. Instead, we should be focusing on securing rights and remedies against the misuse of our information. If people had the right to decide how their information was shared, and more importantly, have legal redress if their information was misused against them, say to discriminate against them in an employment setting or in health care or education, this would go a long way to build trust. In fact, in some instances, we want to be sharing more of our personal information. Studying aggregated information can tell us so much about our health and our well-being, but to be able to safely share our information, we need special protections for mental privacy.

11:33
This is why we need a right to cognitive liberty. This right would secure for us our freedom of thought and rumination, our freedom of self-determination, and it would ensure that we have the right to consent to or refuse access and alteration of our brains by others. This right could be recognized as part of the Universal Declaration of Human Rights, which has established mechanisms for the enforcement of these kinds of social rights.

12:03
During the Iranian Green Movement, the protesters used the internet and good old-fashioned word of mouth to coordinate their marches. And some of the most oppressive restrictions in Iran were lifted as a result.

12:20
But what if the Iranian government had used brain surveillance to detect and prevent the protest? Would the world have ever heard the protesters' cries?

12:33
The time has come for us to call for a cognitive liberty revolution. To make sure that we responsibly advance technology that could enable us to embrace the future while fiercely protecting all of us from any person, company or government that attempts to unlawfully access or alter our innermost lives.

12:58
Thank you.

12:59
(Applause)