How we can teach computers to make sense of our emotions | Raphael Arar

66,332 views ・ 2018-04-24

TED



00:00
Translator: Ivana Korom Reviewer: Joanna Pietrulewicz
00:13
I consider myself one part artist and one part designer. And I work at an artificial intelligence research lab. We're trying to create technology that you'll want to interact with in the far future. Not just six months from now, but try years and decades from now. And we're taking a moonshot that we'll want to be interacting with computers in deeply emotional ways.
00:40
So in order to do that, the technology has to be just as much human as it is artificial. It has to get you. You know, like that inside joke that'll have you and your best friend on the floor, cracking up. Or that look of disappointment that you can just smell from miles away.
01:00
I view art as the gateway to help us bridge this gap between human and machine: to figure out what it means to get each other so that we can train AI to get us. See, to me, art is a way to put tangible experiences to intangible ideas, feelings and emotions. And I think it's one of the most human things about us.
01:25
See, we're a complicated and complex bunch. We have what feels like an infinite range of emotions, and to top it off, we're all different. We have different family backgrounds, different experiences and different psychologies. And this is what makes life really interesting. But this is also what makes working on intelligent technology extremely difficult.
01:49
And right now, AI research, well, it's a bit lopsided on the tech side. And that makes a lot of sense. See, for every qualitative thing about us -- you know, those parts of us that are emotional, dynamic and subjective -- we have to convert it to a quantitative metric: something that can be represented with facts, figures and computer code. The issue is, there are many qualitative things that we just can't put our finger on.
02:20
So, think about hearing your favorite song for the first time. What were you doing? How did you feel? Did you get goosebumps? Or did you get fired up? Hard to describe, right? See, parts of us feel so simple, but under the surface, there's really a ton of complexity. And translating that complexity to machines is what makes them modern-day moonshots. And I'm not convinced that we can answer these deeper questions with just ones and zeros alone.
02:57
So, in the lab, I've been creating art as a way to help me design better experiences for bleeding-edge technology. And it's been serving as a catalyst to beef up the more human ways that computers can relate to us. Through art, we're tackling some of the hardest questions, like what does it really mean to feel? Or how do we engage and know how to be present with each other? And how does intuition affect the way that we interact?
03:26
So, take for example human emotion. Right now, computers can make sense of our most basic ones, like joy, sadness, anger, fear and disgust, by converting those characteristics to math. But what about the more complex emotions? You know, those emotions that we have a hard time describing to each other? Like nostalgia.
03:47
So, to explore this, I created a piece of art, an experience, that asked people to share a memory, and I teamed up with some data scientists to figure out how to take an emotion that's so highly subjective and convert it into something mathematically precise.

04:03
So, we created what we call a nostalgia score and it's the heart of this installation. To do that, the installation asks you to share a story, the computer then analyzes it for its simpler emotions, it checks for your tendency to use past-tense wording and also looks for words that we tend to associate with nostalgia, like "home," "childhood" and "the past." It then creates a nostalgia score to indicate how nostalgic your story is. And that score is the driving force behind these light-based sculptures that serve as physical embodiments of your contribution. And the higher the score, the rosier the hue. You know, like looking at the world through rose-colored glasses.
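The scoring pipeline described above can be sketched in a few lines. The installation's actual model is not published, so the word lists, weights, and hue mapping below are illustrative assumptions, not the artist's code:

```python
# Toy "nostalgia score": blend past-tense tendency with themed-word
# density, then map the score to a rosier hue. All constants here are
# invented for illustration.

NOSTALGIA_WORDS = {"home", "childhood", "past", "remember", "old"}
IRREGULAR_PAST = {"was", "were", "had", "went", "saw", "felt", "knew"}

def nostalgia_score(story: str) -> float:
    """Return a score in [0, 1]; higher means more nostalgic."""
    words = [w.strip('.,!?"').lower() for w in story.split()]
    if not words:
        return 0.0
    past = sum(1 for w in words if w.endswith("ed") or w in IRREGULAR_PAST)
    themed = sum(1 for w in words if w in NOSTALGIA_WORDS)
    score = (0.6 * min(1.0, 3 * past / len(words))
             + 0.4 * min(1.0, 6 * themed / len(words)))
    return round(score, 3)

def score_to_hue(score: float) -> tuple:
    # Higher scores shade the light from white toward rose pink.
    rose = (255, 120, 150)
    return tuple(int(255 + (c - 255) * score) for c in rose)
```

A story like "I went back home and remembered my childhood." scores well above a neutral sentence, and its hue interpolates accordingly; a real system would add sentiment analysis for the "simpler emotions" step.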
04:44
So, when you see your score and the physical representation of it, sometimes you'd agree and sometimes you wouldn't. It's as if it really understood how that experience made you feel. But other times it gets tripped up and has you thinking it doesn't understand you at all. But the piece really serves to show that if we have a hard time explaining the emotions that we have to each other, how can we teach a computer to make sense of them?
05:12
So, even the more objective parts about being human are hard to describe. Like, conversation. Have you ever really tried to break down the steps? So think about sitting with your friend at a coffee shop and just having small talk. How do you know when to take a turn? How do you know when to shift topics? And how do you even know what topics to discuss?
05:33
See, most of us don't really think about it, because it's almost second nature. And when we get to know someone, we learn more about what makes them tick, and then we learn what topics we can discuss. But when it comes to teaching AI systems how to interact with people, we have to teach them step by step what to do. And right now, it feels clunky.
05:53
If you've ever tried to talk with Alexa, Siri or Google Assistant, you can tell that it or they can still sound cold. And have you ever gotten annoyed when they didn't understand what you were saying and you had to rephrase what you wanted 20 times just to play a song?
06:11
Alright, to the credit of the designers, realistic communication is really hard. And there's a whole branch of sociology, called conversation analysis, that tries to make blueprints for different types of conversation. Types like customer service or counseling, teaching and others. I've been collaborating with a conversation analyst at the lab to try to help our AI systems hold more human-sounding conversations. This way, when you have an interaction with a chatbot on your phone or a voice-based system in the car, it sounds a little more human and less cold and disjointed.
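One way to picture a conversation-analysis "blueprint" is as an ordered set of stages a dialogue is expected to move through. The stage names and transitions below are illustrative assumptions (a toy customer-service flow), not a published scheme from the field:

```python
# Toy conversation blueprint: a customer-service exchange modeled as
# ordered stages that a dialogue system can track and advance through.

BLUEPRINT = ["greeting", "problem_statement", "clarification",
             "resolution", "closing"]

def next_stage(current: str) -> str:
    """Advance along the blueprint; stay at 'closing' once reached."""
    i = BLUEPRINT.index(current)
    return BLUEPRINT[min(i + 1, len(BLUEPRINT) - 1)]
```

A counseling or teaching blueprint would use different stages and looser transitions; the point is that each conversation type has its own recognizable shape.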
06:46
So I created a piece of art that tries to highlight the robotic, clunky interaction to help us understand, as designers, why it doesn't sound human yet and, well, what we can do about it. The piece is called Bot to Bot and it puts one conversational system against another and then exposes it to the general public. And what ends up happening is that you get something that tries to mimic human conversation, but falls short. Sometimes it works and sometimes it gets into these, well, loops of misunderstanding.

07:14
So even though the machine-to-machine conversation can make sense, grammatically and colloquially, it can still end up feeling cold and robotic. And despite checking all the boxes, the dialogue lacks soul and those one-off quirks that make each of us who we are. So while it might be grammatically correct and uses all the right hashtags and emojis, it can end up sounding mechanical and, well, a little creepy. And we call this the uncanny valley. You know, that creepiness factor of tech where it's close to human but just slightly off. And the piece will start being one way that we test for the humanness of a conversation and the parts that get lost in translation.
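The bot-to-bot setup, and the "loops of misunderstanding" it falls into, can be reproduced with two toy rule-based agents. The rules and replies here are invented for illustration; the installation's actual conversational systems are not public:

```python
# Two toy chatbots talking to each other, in the spirit of Bot to Bot.
# Each reply is grammatical, yet the pair quickly settles into a
# repeating loop of mutual non-understanding.

def bot_a(msg: str) -> str:
    if "?" in msg:
        return "I'm not sure. What do you think?"
    return "Interesting. Can you say more?"

def bot_b(msg: str) -> str:
    if "?" in msg:
        return "Hard to say. What do you mean?"
    return "Maybe. Why do you ask?"

def converse(turns: int = 4) -> list:
    """Alternate the two bots for a number of turns and log the replies."""
    log, msg = [], "Hello!"
    for i in range(turns):
        msg = bot_a(msg) if i % 2 == 0 else bot_b(msg)
        log.append(msg)
    return log
```

Run it for a few turns and the same exchange repeats verbatim: each line parses fine on its own, but the dialogue as a whole has no memory, no quirks, and no soul, which is exactly the gap the piece highlights.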
07:54
So there are other things that get lost in translation, too, like human intuition. Right now, computers are gaining more autonomy. They can take care of things for us, like change the temperature of our houses based on our preferences and even help us drive on the freeway. But there are things that you and I do in person that are really difficult to translate to AI.
08:15
So think about the last time that you saw an old classmate or coworker. Did you give them a hug or go in for a handshake? You probably didn't think twice because you've had so many built up experiences that had you do one or the other. And as an artist, I feel that access to one's intuition, your unconscious knowing, is what helps us create amazing things. Big ideas, from that abstract, nonlinear place in our consciousness that is the culmination of all of our experiences.
08:47
And if we want computers to relate to us and help amplify our creative abilities, I feel that we'll need to start thinking about how to make computers be intuitive.

08:56
So I wanted to explore how something like human intuition could be directly translated to artificial intelligence. And I created a piece that explores computer-based intuition in a physical space. The piece is called Wayfinding, and it's set up as a symbolic compass that has four kinetic sculptures. Each one represents a direction, north, east, south and west. And there are sensors set up on the top of each sculpture that capture how far away you are from them. And the data that gets collected ends up changing the way that sculptures move and the direction of the compass.
09:31
The thing is, the piece doesn't work like the automatic door sensor that just opens when you walk in front of it. See, your contribution is only a part of its collection of lived experiences. And all of those experiences affect the way that it moves. So when you walk in front of it, it starts to use all of the data that it's captured throughout its exhibition history -- or its intuition -- to mechanically respond to you based on what it's learned from others.

09:59
And what ends up happening is that as participants we start to learn the level of detail that we need in order to manage expectations from both humans and machines. We can almost see our intuition being played out on the computer, picturing all of that data being processed in our mind's eye.
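The "intuition as accumulated history" idea can be sketched as a sculpture whose reaction to the current visitor is tempered by every distance reading it has ever sensed. The update rule and parameters below are assumptions for illustration, not the installation's actual control code:

```python
# Sketch of a Wayfinding-style sculpture: it reacts not to your raw
# distance but to how much you deviate from everything it has sensed
# across its exhibition history.

class Sculpture:
    def __init__(self, direction: str):
        self.direction = direction
        self.history = []  # every distance reading it has "lived"

    def sense(self, distance_cm: float) -> float:
        """Record a reading and return a motion speed in [0, 1]."""
        self.history.append(distance_cm)
        # "Intuition": the long-run average of all past visitors
        # sets the expectation the current visitor is judged against.
        expected = sum(self.history) / len(self.history)
        surprise = abs(distance_cm - expected) / max(expected, 1.0)
        return min(1.0, surprise)
```

Note the consequence of this rule: with a single reading the expectation equals the reading itself, so the very first visitor provokes no motion at all; everyone after is measured against what the piece learned from others, which is exactly the contrast with an automatic door drawn above.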
10:17
My hope is that this type of art will help us think differently about intuition and how to apply that to AI in the future.
10:24
So these are just a few examples of how I'm using art to feed into my work as a designer and researcher of artificial intelligence. And I see it as a crucial way to move innovation forward.

10:35
Because right now, there are a lot of extremes when it comes to AI. Popular movies show it as this destructive force while commercials are showing it as a savior to solve some of the world's most complex problems. But regardless of where you stand, it's hard to deny that we're living in a world that's becoming more and more digital by the second. Our lives revolve around our devices, smart appliances and more. And I don't think this will let up any time soon.
11:04
So, I'm trying to embed more humanness from the start. And I have a hunch that bringing art into an AI research process is a way to do just that.
11:15
Thank you.

11:16
(Applause)