Can machines read your emotions? - Kostas Karpouzis

334,937 views ・ 2016-11-29

TED-Ed



Translator: Songzhe Gao · Reviewer: Cissy Yun
00:07
With every year, machines surpass humans in more and more activities
00:11
we once thought only we were capable of.
00:14
Today's computers can beat us in complex board games,
00:18
transcribe speech in dozens of languages,
00:21
and instantly identify almost any object.
00:24
But the robots of tomorrow may go further
00:27
by learning to figure out what we're feeling.
00:30
And why does that matter?
00:32
Because if machines and the people who run them
00:34
can accurately read our emotional states,
00:37
they may be able to assist us or manipulate us
00:40
at unprecedented scales.
00:43
But before we get there,
00:44
how can something so complex as emotion be converted into mere numbers,
00:49
the only language machines understand?
00:53
Essentially the same way our own brains interpret emotions,
00:56
by learning how to spot them.
00:58
American psychologist Paul Ekman identified certain universal emotions
01:04
whose visual cues are understood the same way across cultures.
01:09
For example, an image of a smile signals joy to modern urban dwellers
01:14
and aboriginal tribesmen alike.
01:16
And according to Ekman,
01:18
anger,
01:18
disgust,
01:19
fear,
01:20
joy,
01:21
sadness,
01:21
and surprise are equally recognizable.
01:25
As it turns out, computers are rapidly getting better at image recognition
01:29
thanks to machine learning algorithms, such as neural networks.
01:34
These consist of artificial nodes that mimic our biological neurons
01:38
by forming connections and exchanging information.
01:41
To train the network, sample inputs pre-classified into different categories,
01:46
such as photos marked happy or sad,
01:49
are fed into the system.
01:51
The network then learns to classify those samples
01:53
by adjusting the relative weights assigned to particular features.
01:58
The more training data it's given,
02:00
the better the algorithm becomes at correctly identifying new images.
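That training loop can be made concrete with a small sketch. The Python below is a minimal, assumption-laden illustration: two hypothetical hand-picked features (mouth curvature and brow height) stand in for a photo, and a single artificial neuron adjusts the relative weights on those features until pre-labeled "happy" and "sad" samples are classified correctly. Real systems learn their features from raw pixels across many connected nodes.

```python
import numpy as np

# Hypothetical pre-classified samples: each row is one face described by
# two made-up features (mouth curvature, brow height); 1 = happy, 0 = sad.
rng = np.random.default_rng(0)
happy = rng.normal([0.8, 0.6], 0.2, size=(50, 2))
sad = rng.normal([-0.7, -0.5], 0.2, size=(50, 2))
X = np.vstack([happy, sad])
y = np.concatenate([np.ones(50), np.zeros(50)])

# One artificial neuron: a weighted sum of features passed through a sigmoid.
w, b = np.zeros(2), 0.0
for _ in range(500):
    p = 1 / (1 + np.exp(-(X @ w + b)))   # predicted probability of "happy"
    w -= 0.5 * (X.T @ (p - y)) / len(y)  # adjust weights to reduce error
    b -= 0.5 * np.mean(p - y)

# Classify a new, unseen sample with the learned weights.
new_face = np.array([0.7, 0.4])
print("happy" if 1 / (1 + np.exp(-(new_face @ w + b))) > 0.5 else "sad")
```

The more labeled samples the loop sees, the more reliable the learned weights become, which is exactly the narration's point about training data.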
02:04
This is similar to our own brains,
02:06
which learn from previous experiences to shape how new stimuli are processed.
02:11
Recognition algorithms aren't just limited to facial expressions.
02:15
Our emotions manifest in many ways.
02:17
There's body language and vocal tone,
02:20
changes in heart rate, complexion, and skin temperature,
02:23
or even word frequency and sentence structure in our writing.
02:28
You might think that training neural networks to recognize these
02:31
would be a long and complicated task
02:33
until you realize just how much data is out there,
02:36
and how quickly modern computers can process it.
02:40
From social media posts,
02:41
uploaded photos and videos,
02:43
and phone recordings,
02:44
to heat-sensitive security cameras
02:46
and wearables that monitor physiological signs,
02:50
the big question is not how to collect enough data,
02:52
but what we're going to do with it.
02:55
There are plenty of beneficial uses for computerized emotion recognition.
02:59
Robots using algorithms to identify facial expressions
03:02
can help children learn
03:04
or provide lonely people with a sense of companionship.
03:07
Social media companies are considering using algorithms
03:10
to help prevent suicides by flagging posts that contain specific words or phrases.
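As a toy illustration of that kind of flagging, here is a sketch with an entirely hypothetical phrase list; production systems weigh context, history, and many other signals rather than bare string matches.

```python
# Hypothetical watchlist, for illustration only; not any company's
# actual screening criteria.
WATCHLIST = ("no reason to live", "can't go on")

def flag_post(text: str) -> bool:
    """Flag a post for human review if it contains a watchlisted phrase."""
    lowered = text.lower()
    return any(phrase in lowered for phrase in WATCHLIST)

print(flag_post("Some days I feel there's no reason to live."))  # True
print(flag_post("Great hike today!"))                            # False
```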
03:17
And emotion recognition software can help treat mental disorders
03:21
or even provide people with low-cost automated psychotherapy.
03:25
Despite the potential benefits,
03:27
the prospect of a massive network automatically scanning our photos,
03:30
communications,
03:31
and physiological signs is also quite disturbing.
03:36
What are the implications for our privacy when such impersonal systems
03:40
are used by corporations to exploit our emotions through advertising?
03:45
And what becomes of our rights
03:46
if authorities think they can identify the people likely to commit crimes
03:50
before they even make a conscious decision to act?
03:54
Robots currently have a long way to go
03:57
in distinguishing emotional nuances, like irony,
04:00
and scales of emotions, just how happy or sad someone is.
04:04
Nonetheless, they may eventually be able to accurately read our emotions
04:09
and respond to them.
04:11
Whether they can empathize with our fear of unwanted intrusion, however,
04:15
that's another story.