This computer is learning to read your mind | DIY Neuroscience, a TED series

126,903 views ・ 2018-09-15 ・ TED



00:00
Translator: Joseph Geni Reviewer: Krystian Aparta
00:12
Greg Gage: Mind-reading. You've seen this in sci-fi movies: machines that can read our thoughts. However, there are devices today that can read the electrical activity from our brains. We call this the EEG. Is there information contained in these brainwaves? And if so, could we train a computer to read our thoughts? My buddy Nathan has been working to hack the EEG to build a mind-reading machine.

00:34
[DIY Neuroscience]

00:36
So this is how the EEG works. Inside your head is a brain, and that brain is made out of billions of neurons. Each of those neurons sends an electrical message to each other. These small messages can combine to make an electrical wave that we can detect on a monitor. Now traditionally, the EEG can tell us large-scale things, for example if you're asleep or if you're alert. But can it tell us anything else? Can it actually read our thoughts? We're going to test this, and we're not going to start with some complex thoughts. We're going to do something very simple. Can we interpret what someone is seeing using only their brainwaves?

01:08
Nathan's going to begin by placing electrodes on Christy's head.

01:11
Nathan: My life is tangled.

(Laughter)

01:14
GG: And then he's going to show her a bunch of pictures from four different categories.

01:18
Nathan: Face, house, scenery and weird pictures.

01:20
GG: As we show Christy hundreds of these images, we are also capturing the electrical waves onto Nathan's computer. We want to see if we can detect any visual information about the photos contained in the brainwaves, so when we're done, we're going to see if the EEG can tell us what kind of picture Christy is looking at, and if it does, each category should trigger a different brain signal.

01:40
OK, so we collected all the raw EEG data, and this is what we got. It all looks pretty messy, so let's arrange them by picture. Now, still a bit too noisy to see any differences, but if we average the EEG across all image types by aligning them to when the image first appeared, we can remove this noise, and pretty soon, we can see some dominant patterns emerge for each category.

02:02
Now the signals all still look pretty similar. Let's take a closer look. About a hundred milliseconds after the image comes on, we see a positive bump in all four cases, and we call this the P100, and what we think that is is what happens in your brain when you recognize an object.

02:17
But damn, look at that signal for the face. It looks different than the others. There's a negative dip about 170 milliseconds after the image comes on. What could be going on here? Research shows that our brain has a lot of neurons that are dedicated to recognizing human faces, so this N170 spike could be all those neurons firing at once in the same location, and we can detect that in the EEG.

02:39
So there are two takeaways here. One, our eyes can't really detect the differences in patterns without averaging out the noise, and two, even after removing the noise, our eyes can only pick up the signals associated with faces. So this is where we turn to machine learning. Now, our eyes are not very good at picking up patterns in noisy data, but machine learning algorithms are designed to do just that, so could we take a lot of pictures and a lot of data and feed it in and train a computer to be able to interpret what Christy is looking at in real time?

03:09
We're trying to code the information that's coming out of her EEG in real time and predict what it is that her eyes are looking at. And if it works, what we should see is every time that she gets a picture of scenery, it should say scenery, scenery, scenery, scenery. A face -- face, face, face, face, but it's not quite working that way, is what we're discovering.

03:33
(Laughter)

03:36
OK.

03:38
Director: So what's going on here? GG: We need a new career, I think.

(Laughter)

03:42
OK, so that was a massive failure. But we're still curious: How far could we push this technology? And we looked back at what we did. We noticed that the data was coming into our computer very quickly, without any timing of when the images came on, and that's the equivalent of reading a very long sentence without spaces between the words. It would be hard to read, but once we add the spaces, individual words appear and it becomes a lot more understandable.

04:07
But what if we cheat a little bit? By using a sensor, we can tell the computer when the image first appears. That way, the brainwave stops being a continuous stream of information, and instead becomes individual packets of meaning. Also, we're going to cheat a little bit more, by limiting the categories to two. Let's see if we can do some real-time mind-reading.

04:25
In this new experiment, we're going to constrict it a little bit more so that we know the onset of the image, and we're going to limit the categories to "face" or "scenery."

04:35
Nathan: Face. Correct. Scenery. Correct.

04:40
GG: So right now, every time the image comes on, we're taking a picture of the onset of the image and decoding the EEG. It's getting correct.

04:47
Nathan: Yes. Face. Correct.

04:49
GG: So there is information in the EEG signal, which is cool. We just had to align it to the onset of the image.

04:55
Nathan: Scenery. Correct. Face. Yeah.

05:00
GG: This means there is some information there, so if we know at what time the picture came on, we can tell what type of picture it was, possibly, at least on average, by looking at these evoked potentials.

05:12
Nathan: Exactly.

05:14
GG: If you had told me at the beginning of this project this was possible, I would have said no way. I literally did not think we could do this. Did our mind-reading experiment really work? Yes, but we had to do a lot of cheating. It turns out you can find some interesting things in the EEG, for example if you're looking at someone's face, but it does have a lot of limitations. Perhaps advances in machine learning will make huge strides, and one day we will be able to decode what's going on in our thoughts. But for now, the next time a company says that they can harness your brainwaves to be able to control devices, it is your right, it is your duty to be skeptical.
이 μ›Ήμ‚¬μ΄νŠΈ 정보

이 μ‚¬μ΄νŠΈλŠ” μ˜μ–΄ ν•™μŠ΅μ— μœ μš©ν•œ YouTube λ™μ˜μƒμ„ μ†Œκ°œν•©λ‹ˆλ‹€. μ „ 세계 졜고의 μ„ μƒλ‹˜λ“€μ΄ κ°€λ₯΄μΉ˜λŠ” μ˜μ–΄ μˆ˜μ—…μ„ 보게 될 κ²ƒμž…λ‹ˆλ‹€. 각 λ™μ˜μƒ νŽ˜μ΄μ§€μ— ν‘œμ‹œλ˜λŠ” μ˜μ–΄ μžλ§‰μ„ 더블 ν΄λ¦­ν•˜λ©΄ κ·Έκ³³μ—μ„œ λ™μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€. λΉ„λ””μ˜€ μž¬μƒμ— 맞좰 μžλ§‰μ΄ μŠ€ν¬λ‘€λ©λ‹ˆλ‹€. μ˜κ²¬μ΄λ‚˜ μš”μ²­μ΄ μžˆλŠ” 경우 이 문의 양식을 μ‚¬μš©ν•˜μ—¬ λ¬Έμ˜ν•˜μ‹­μ‹œμ˜€.

https://forms.gle/WvT1wiN1qDtmnspy7