The Turing test: Can a computer pass for a human? - Alex Gendler

2016-04-25

TED-Ed



What is consciousness? Can an artificial machine really think? Does the mind just consist of neurons in the brain, or is there some intangible spark at its core? For many, these have been vital considerations for the future of artificial intelligence. But British computer scientist Alan Turing decided to disregard all these questions in favor of a much simpler one: can a computer talk like a human?

This question led to an idea for measuring artificial intelligence that would famously come to be known as the Turing test. In his 1950 paper, "Computing Machinery and Intelligence," Turing proposed the following game: a human judge has a text conversation with unseen players and evaluates their responses. To pass the test, a computer must be able to replace one of the players without substantially changing the results. In other words, a computer would be considered intelligent if its conversation couldn't be easily distinguished from a human's.
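The game Turing proposed can be simulated in a few lines. The sketch below is purely illustrative (the players and their canned replies are hypothetical, not from Turing's paper): a judge sees replies from two unseen players labeled A and B and must guess which is the machine. The machine passes if judges do no better than chance.

```python
import random

def human_player(prompt: str) -> str:
    return "Honestly, I'd have to think about that."

def machine_player(prompt: str) -> str:
    # a trivially successful machine: it mimics the human exactly
    return "Honestly, I'd have to think about that."

def run_trial(judge_guess) -> bool:
    # hide which player is which behind the labels A and B
    players = [("human", human_player), ("machine", machine_player)]
    random.shuffle(players)
    replies = {label: fn("What is consciousness?")
               for label, (_, fn) in zip("AB", players)}
    guess = judge_guess(replies["A"], replies["B"])  # judge returns "A" or "B"
    identity = dict(zip("AB", (name for name, _ in players)))
    return identity[guess] == "machine"

# A judge who cannot tell the replies apart is reduced to guessing:
random.seed(0)
accuracy = sum(run_trial(lambda a, b: random.choice("AB"))
               for _ in range(1000)) / 1000
print(accuracy)  # near 0.5: the judge's guesses are no better than chance
```

Any real test would of course use interactive conversation rather than one canned exchange; the point is only that "passing" is defined statistically, by the judge's accuracy, not by any inspection of the machine's internals.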
Turing predicted that by the year 2000, machines with 100 megabytes of memory would be able to easily pass his test. But he may have jumped the gun. Even though today's computers have far more memory than that, few have succeeded, and those that have done well focused more on finding clever ways to fool judges than on using overwhelming computing power.
Though it was never subjected to a real test, the first program with some claim to success was called ELIZA. With only a fairly short and simple script, it managed to mislead many people by mimicking a psychologist, encouraging them to talk more and reflecting their own questions back at them.
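ELIZA's trick of reflecting statements back can be sketched in a few lines. This is a minimal illustration in the spirit of the original, not Weizenbaum's actual script: a handful of regex rules match the user's phrasing, and matched fragments have their pronouns swapped before being echoed back as a question.

```python
import re

# pronoun swaps applied to the reflected fragment
REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are",
               "you": "I", "your": "my"}

# (pattern, response template) pairs, tried in order; the catch-all
# fallback keeps the speaker talking, as ELIZA's did
RULES = [
    (r"i need (.*)", "Why do you need {0}?"),
    (r"i am (.*)", "How long have you been {0}?"),
    (r"(.*)", "Please tell me more."),
]

def reflect(fragment: str) -> str:
    return " ".join(REFLECTIONS.get(w, w) for w in fragment.lower().split())

def respond(text: str) -> str:
    for pattern, template in RULES:
        m = re.match(pattern, text.lower())
        if m:
            return template.format(*(reflect(g) for g in m.groups()))

print(respond("I am worried about my exam"))
# → "How long have you been worried about your exam?"
```

Nothing here understands anything: the apparent empathy comes entirely from pattern matching and pronoun substitution, which is exactly why its success said more about the judges than about the program.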
Another early script, PARRY, took the opposite approach by imitating a paranoid schizophrenic who kept steering the conversation back to his own preprogrammed obsessions. Their success in fooling people highlighted one weakness of the test: humans regularly attribute intelligence to a whole range of things that are not actually intelligent.
Nonetheless, annual competitions like the Loebner Prize have made the test more formal, with judges knowing ahead of time that some of their conversation partners are machines. But while the quality has improved, many chatbot programmers have used strategies similar to those of ELIZA and PARRY. 1997's winner, Catherine, could carry on amazingly focused and intelligent conversation, but mostly if the judge wanted to talk about Bill Clinton. And the more recent winner, Eugene Goostman, was given the persona of a 13-year-old Ukrainian boy, so judges interpreted its non sequiturs and awkward grammar as language and culture barriers.
Meanwhile, other programs like Cleverbot have taken a different approach, statistically analyzing huge databases of real conversations to determine the best responses. Some also store memories of previous conversations in order to improve over time. But while Cleverbot's individual responses can sound incredibly human, its lack of a consistent personality and inability to deal with brand new topics are a dead giveaway.
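The retrieval idea behind programs like Cleverbot can be sketched as follows. This is a toy illustration, not Cleverbot's actual algorithm or data: past conversations are stored as (prompt, reply) pairs, and a new input is answered with the reply whose stored prompt overlaps it most.

```python
# hypothetical mini-corpus of past exchanges (real systems mine millions)
CORPUS = [
    ("hello how are you", "i am fine thanks"),
    ("what is your name", "my name is cleo"),
    ("do you like music", "yes i love jazz"),
]

def overlap(a: str, b: str) -> int:
    # crude similarity: number of shared words
    return len(set(a.split()) & set(b.split()))

def reply(text: str) -> str:
    # answer with the reply attached to the most similar stored prompt
    prompt, answer = max(CORPUS, key=lambda pair: overlap(text.lower(), pair[0]))
    return answer

print(reply("hey, how are you today?"))  # → "i am fine thanks"
```

This also makes the failure mode in the transcript concrete: each reply is borrowed from a different past speaker, so there is no consistent personality, and an input with no close match in the corpus gets a nonsensical answer.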
Who in Turing's day could have predicted that today's computers would be able to pilot spacecraft, perform delicate surgeries, and solve massive equations, but still struggle with the most basic small talk? Human language turns out to be an amazingly complex phenomenon that can't be captured by even the largest dictionary. Chatbots can be baffled by simple pauses, like "umm..." or questions with no correct answer. And a simple conversational sentence, like, "I took the juice out of the fridge and gave it to him, but forgot to check the date," requires a wealth of underlying knowledge and intuition to parse.

It turns out that simulating a human conversation takes more than just increasing memory and processing power, and as we get closer to Turing's goal, we may have to deal with all those big questions about consciousness after all.