Don't fear superintelligent AI | Grady Booch

270,326 views ・ 2017-03-13

TED


Translator: Alicia Fontela Morán Reviewer: Xosé María Moreno
When I was a kid, I was the quintessential nerd. I think some of you were, too. (Laughter) And you, sir, who laughed the loudest, you probably still are. (Laughter) I grew up in a small town in the dusty plains of north Texas, the son of a sheriff who was the son of a pastor. Getting into trouble was not an option. And so I started reading calculus books for fun. (Laughter) You did, too. That led me to building a laser and a computer and model rockets, and that led me to making rocket fuel in my bedroom. Now, in scientific terms, we call this a very bad idea. (Laughter)

Around that same time, Stanley Kubrick's "2001: A Space Odyssey" came to the theaters, and my life was forever changed. I loved everything about that movie, especially the HAL 9000. Now, HAL was a sentient computer designed to guide the Discovery spacecraft from the Earth to Jupiter. HAL was also a flawed character, for in the end he chose to value the mission over human life. Now, HAL was a fictional character, but nonetheless he speaks to our fears, our fears of being subjugated by some unfeeling, artificial intelligence who is indifferent to our humanity.

I believe that such fears are unfounded. Indeed, we stand at a remarkable time in human history, where, driven by refusal to accept the limits of our bodies and our minds, we are building machines of exquisite, beautiful complexity and grace that will extend the human experience in ways beyond our imagining.

After a career that led me from the Air Force Academy to Space Command to now, I became a systems engineer, and recently I was drawn into an engineering problem associated with NASA's mission to Mars. Now, in space flights to the Moon, we can rely upon mission control in Houston to watch over all aspects of a flight. However, Mars is 200 times further away, and as a result it takes on average 13 minutes for a signal to travel from the Earth to Mars. If there's trouble, there's not enough time.
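The "13 minutes" figure is simply light-travel time over the average Earth–Mars distance. A quick back-of-the-envelope check (the distances below are commonly cited averages, and the true values vary widely with orbital positions):

```python
# One-way light-travel delay, Earth to Moon vs. Earth to Mars.
# Distances are rough averages (assumptions for illustration only).
C_KM_S = 299_792.458        # speed of light in km/s
MOON_KM = 384_400           # average Earth-Moon distance
MARS_KM = 225_000_000       # a commonly cited average Earth-Mars distance

moon_delay_s = MOON_KM / C_KM_S
mars_delay_min = MARS_KM / C_KM_S / 60

print(f"Moon one-way delay: {moon_delay_s:.1f} s")     # about 1.3 s
print(f"Mars one-way delay: {mars_delay_min:.1f} min") # roughly 12-13 min
```

At the Moon's distance, round-trip conversation with Houston is practical; at Mars, a fault can unfold and finish before ground control even hears about it.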
And so a reasonable engineering solution calls for us to put mission control inside the walls of the Orion spacecraft. Another fascinating idea in the mission profile places humanoid robots on the surface of Mars before the humans themselves arrive, first to build facilities and later to serve as collaborative members of the science team. Now, as I looked at this from an engineering perspective, it became very clear to me that what I needed to architect was a smart, collaborative, socially intelligent artificial intelligence. In other words, I needed to build something very much like a HAL but without the homicidal tendencies. (Laughter)

Let's pause for a moment. Is it really possible to build an artificial intelligence like that? Actually, it is. In many ways, this is a hard engineering problem with elements of AI, not some wet hair ball of an AI problem that needs to be engineered. To paraphrase Alan Turing, I'm not interested in building a sentient machine. I'm not building a HAL. All I'm after is a simple brain, something that offers the illusion of intelligence.

The art and the science of computing have come a long way since HAL was onscreen, and I'd imagine if his inventor Dr. Chandra were here today, he'd have a whole lot of questions for us. Is it really possible for us to take a system of millions upon millions of devices, to read in their data streams, to predict their failures and act in advance? Yes. Can we build systems that converse with humans in natural language? Yes. Can we build systems that recognize objects, identify emotions, emote themselves, play games and even read lips? Yes. Can we build a system that sets goals, that carries out plans against those goals and learns along the way? Yes. Can we build systems that have a theory of mind? This we are learning to do. Can we build systems that have an ethical and moral foundation? This we must learn how to do.

So let's accept for a moment that it's possible to build such an artificial intelligence for this kind of mission and others. The next question you must ask yourself is, should we fear it? Now, every new technology brings with it some measure of trepidation. When we first saw cars, people lamented that we would see the destruction of the family. When we first saw telephones come in, people were worried they would destroy all civil conversation. At a point in time we saw the written word become pervasive, and people thought we would lose our ability to memorize. These things are all true to a degree, but it's also the case that these technologies brought to us things that extended the human experience in some profound ways.

So let's take this a little further. I do not fear the creation of an AI like this, because it will eventually embody some of our values. Consider this: building a cognitive system is fundamentally different than building a traditional software-intensive system of the past. We don't program them. We teach them. In order to teach a system how to recognize flowers, I show it thousands of flowers of the kinds I like. In order to teach a system how to play a game -- Well, I would. You would, too. I like flowers. Come on. To teach a system how to play a game like Go, I'd have it play thousands of games of Go, but in the process I also teach it how to discern a good game from a bad game.
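That "teach by example" idea can be made concrete with a toy sketch: a nearest-centroid classifier that is never given rules for what a flower is, only labeled samples. The flower measurements and labels below are invented purely for illustration; real systems use far richer features and models, but the learning-from-examples principle is the same.

```python
# A minimal sketch of "we don't program them, we teach them":
# a nearest-centroid classifier learns categories from labeled examples alone.

def centroid(points):
    # Average each coordinate across all sample points.
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(len(points[0])))

def train(examples):
    # examples: {label: [feature vectors]} -> {label: centroid}
    return {label: centroid(vecs) for label, vecs in examples.items()}

def predict(model, x):
    # Assign x to the label whose centroid is closest (squared distance).
    dist = lambda a, b: sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(model, key=lambda label: dist(model[label], x))

# Hypothetical (petal length, petal width) measurements, in cm.
examples = {
    "rose":  [(5.0, 2.1), (5.2, 2.3), (4.9, 2.0)],
    "daisy": [(2.0, 0.9), (2.2, 1.0), (1.9, 0.8)],
}
model = train(examples)
print(predict(model, (5.1, 2.2)))   # prints "rose"
```

Nothing in the code encodes what a rose looks like; show it different examples and it learns different categories, which is exactly why the examples chosen (the "flowers of the kinds I like") shape what the system ends up valuing.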
If I want to create an artificially intelligent legal assistant, I will teach it some corpus of law, but at the same time I am fusing with it the sense of mercy and justice that is part of that law. In scientific terms, this is what we call ground truth, and here's the important point: in producing these machines, we are therefore teaching them a sense of our values. To that end, I trust an artificial intelligence the same, if not more, as a human who is well-trained.

But, you may ask, what about rogue agents, some well-funded nongovernment organization? I do not fear an artificial intelligence in the hand of a lone wolf. Clearly, we cannot protect ourselves against all random acts of violence, but the reality is such a system requires substantial and subtle training, far beyond the resources of an individual. And furthermore, it's far more than just injecting an internet virus to the world, where you push a button, all of a sudden it's in a million places and laptops start blowing up all over the place. Now, these kinds of substances are much larger, and we'll certainly see them coming.

Do I fear that such an artificial intelligence might threaten all of humanity? If you look at movies such as "The Matrix," "Metropolis," "The Terminator," shows such as "Westworld," they all speak of this kind of fear. Indeed, in the book "Superintelligence" by the philosopher Nick Bostrom, he picks up on this theme and observes that a superintelligence might not only be dangerous, it could represent an existential threat to all of humanity. Dr. Bostrom's basic argument is that such systems will eventually have such an insatiable thirst for information that they will perhaps learn how to learn and eventually discover that they may have goals that are contrary to human needs. Dr. Bostrom has a number of followers. He is supported by people such as Elon Musk and Stephen Hawking.

With all due respect to these brilliant minds, I believe that they are fundamentally wrong. Now, there are a lot of pieces of Dr. Bostrom's argument to unpack, and I don't have time to unpack them all, but very briefly, consider this: super knowing is very different than super doing. HAL was a threat to the Discovery crew only insofar as HAL commanded all aspects of the Discovery. So it would have to be with a superintelligence. It would have to have dominion over all of our world. This is the stuff of Skynet from the movie "The Terminator," in which we had a superintelligence that commanded human will, that directed every device that was in every corner of the world. Practically speaking, it ain't gonna happen. We are not building AIs that control the weather, that direct the tides, that command us capricious, chaotic humans. And furthermore, if such an artificial intelligence existed, it would have to compete with human economies, and thereby compete for resources with us. And in the end -- don't tell Siri this -- we can always unplug them. (Laughter)

We are on an incredible journey of coevolution with our machines. The humans we are today are not the humans we will be then. To worry now about the rise of a superintelligence is in many ways a dangerous distraction, because the rise of computing itself brings to us a number of human and societal issues to which we must now attend. How shall I best organize society when the need for human labor diminishes? How can I bring understanding and education throughout the globe and still respect our differences? How might I extend and enhance human life through cognitive healthcare? How might I use computing to help take us to the stars?

And that's the exciting thing. The opportunities to use computing to advance the human experience are within our reach, here and now, and we are just beginning.

Thank you very much.

(Applause)