Technology that knows what you're feeling | Poppy Crum

137,815 views ・ 2018-07-10

TED


Translator: Clare Wong
Reviewer: Karen Lo

00:12
What happens when technology knows more about us than we do? A computer now can detect our slightest facial microexpressions and be able to tell the difference between a real smile and a fake one. That's only the beginning. Technology has become incredibly intelligent and already knows a lot about our internal states. And whether we like it or not, we are already sharing parts of our inner lives that are out of our control.

00:43
That seems like a problem, because a lot of us like to keep what's going on inside from what people actually see. We want to have agency over what we share and what we don't. We all like to have a poker face.

00:59
But I'm here to tell you that I think that's a thing of the past. And while that might sound scary, it's not necessarily a bad thing.

01:09
I've spent a lot of time studying the circuits in the brain that create the unique perceptual realities that we each have. And now I bring that together with the capabilities of current technology to create new technology that does make us better, feel more, connect more. And I believe to do that, we have to be OK losing some of our agency.

01:30
With some animals, it's really amazing, and we get to see into their internal experiences. We get this upfront look at the mechanistic interaction between how they respond to the world around them and the state of their biological systems. This is where evolutionary pressures like eating, mating and making sure we don't get eaten drive deterministic behavioral responses to information in the world. And we get to see into this window, into their internal states and their biological experiences. It's really pretty cool.

02:03
Now, stay with me for a moment -- I'm a violinist, not a singer. But the spider's already given me a critical review.

02:16
(Video) (Singing in a low pitch) (Singing in a middle pitch) (Singing in a high pitch) (Singing in a low pitch) (Singing in a middle pitch) (Singing in a high pitch)

02:33
(Laughter)

02:36
Poppy Crum: It turns out, some spiders tune their webs like violins to resonate with certain sounds. And likely, the harmonics of my voice as it went higher coupled with how loud I was singing recreated either the predatory call of an echolocating bat or a bird, and the spider did what it should. It predictively told me to bug off.

02:56
I love this. The spider's responding to its external world in a way that we get to see and know what's happening to its internal world. Biology is controlling the spider's response; it's wearing its internal state on its sleeve.

03:13
But us, humans -- we're different. We like to think we have cognitive control over what people see, know and understand about our internal states -- our emotions, our insecurities, our bluffs, our trials and tribulations -- and how we respond. We get to have our poker face.

03:35
Or maybe we don't. Try this with me. Your eye responds to how hard your brain is working. The response you're about to see is driven entirely by mental effort and has nothing to do with changes in lighting. We know this from neuroscience. I promise, your eyes are doing the same thing as the subject in our lab, whether you want them to or not. At first, you'll hear some voices. Try and understand them and keep watching the eye in front of you. It's going to be hard at first; then one voice should drop out, and it should get really easy. You're going to see the change in effort in the diameter of the pupil.

04:10
(Video) (Two overlapping voices talking)
(Single voice) Intelligent technology depends on personal data.
(Two overlapping voices talking)
(Single voice) Intelligent technology depends on personal data.

04:21
PC: Your pupil doesn't lie. Your eye gives away your poker face. When your brain's having to work harder, your autonomic nervous system drives your pupil to dilate. When it's not, it contracts. When I take away one of the voices, the cognitive effort to understand the talkers gets a lot easier. I could have put the two voices in different spatial locations, I could have made one louder. You would have seen the same thing.

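As a rough illustration, here is a minimal sketch of how that task-evoked pupil response could be quantified, assuming a pupil-diameter trace from an eye tracker. The sampling rate, condition lengths, and simulated values are illustrative, not data from the talk.

```python
import numpy as np

# Hypothetical trace: pupil diameter in mm at 60 Hz from an eye tracker.
# Values are simulated for illustration, not recorded data.
fs = 60
rng = np.random.default_rng(0)
two_talkers = 4.2 + 0.1 * rng.standard_normal(10 * fs)  # effortful listening
one_talker = 3.6 + 0.1 * rng.standard_normal(10 * fs)   # easy listening
trace = np.concatenate([two_talkers, one_talker])

# Task-evoked pupil responses are expressed relative to a resting baseline,
# which separates the effort signal from overall lighting level.
baseline = trace[: 2 * fs].mean()
tepr = (trace - baseline) / baseline  # fractional dilation from baseline

# The pupil stays dilated while two talkers compete, then contracts
# once only one voice remains and listening gets easier.
print(f"mean dilation, two talkers: {tepr[: 10 * fs].mean():+.3f}")
print(f"mean dilation, one talker:  {tepr[10 * fs :].mean():+.3f}")
```
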
04:45
We might think we have more agency over the reveal of our internal state than that spider, but maybe we don't. Today's technology is starting to make it really easy to see the signals and tells that give us away. The amalgamation of sensors paired with machine learning on us, around us and in our environments, is a lot more than cameras and microphones tracking our external actions.

05:12
Our bodies radiate our stories from changes in the temperature of our physiology. We can look at these as infrared thermal images showing up behind me, where reds are hotter and blues are cooler. The dynamic signature of our thermal response gives away our changes in stress, how hard our brain is working, whether we're paying attention and engaged in the conversation we might be having and even whether we're experiencing a picture of fire as if it were real. We can actually see people give off heat on their cheeks in response to an image of flame.

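A hedged sketch of what reading that thermal signature might look like in code, assuming each frame arrives as a 2D array of temperatures in degrees Celsius and that a cheek region has already been located by a face tracker. The region coordinates, threshold, and data below are invented for illustration.

```python
import numpy as np

def cheek_warming(frames, roi=(slice(60, 80), slice(40, 60)), delta_c=0.3):
    """Flag frames where mean cheek temperature rises above a resting baseline.

    frames: iterable of 2D arrays of temperatures in degrees C (e.g., from an
    infrared camera). roi is an illustrative cheek region assumed to come
    from a face tracker; delta_c is an arbitrary response threshold.
    """
    means = np.array([f[roi].mean() for f in frames])
    baseline = means[:10].mean()        # first frames taken as resting state
    return means - baseline > delta_c   # True where the response exceeds it

# Simulated stream: cheek temperature creeps up after frame 30, as it might
# when a viewer responds to an image of flame as if it were real.
rng = np.random.default_rng(1)
frames = [33.5 + 0.05 * rng.standard_normal((120, 160))
          + (0.5 if i > 30 else 0.0) for i in range(60)]
print(cheek_warming(frames).nonzero()[0][:5])  # first flagged frame indices
```
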
05:48
But aside from giving away our poker bluffs, what if dimensions of data from someone's thermal response gave away a glow of interpersonal interest? Tracking the honesty of feelings in someone's thermal image might be a new part of how we fall in love and see attraction.

06:06
Our technology can listen, develop insights and make predictions about our mental and physical health just by analyzing the timing dynamics of our speech and language picked up by microphones. Groups have shown that changes in the statistics of our language paired with machine learning can predict the likelihood someone will develop psychosis.

06:27
I'm going to take it a step further and look at linguistic changes and changes in our voice that show up with a lot of different conditions. Dementia, diabetes can alter the spectral coloration of our voice. Changes in our language associated with Alzheimer's can sometimes show up more than 10 years before clinical diagnosis. What we say and how we say it tells a much richer story than we used to think.

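A toy sketch of the kind of pipeline such groups describe: extract simple statistics of language and speech timing, then train a classifier. The features, data, and labels here are made up for illustration; published models use far richer linguistic and acoustic representations and proper validation.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def language_features(transcript, pause_durations):
    """Toy features of the kind such studies draw on: lexical diversity
    plus pause statistics from the timing of speech."""
    words = transcript.lower().split()
    type_token_ratio = len(set(words)) / max(len(words), 1)
    return [type_token_ratio, np.mean(pause_durations), np.max(pause_durations)]

# Hypothetical labeled examples: 1 = later diagnosis, 0 = control.
X = np.array([
    language_features("the the um thing there went thing", [0.9, 1.4, 2.0]),
    language_features("we drove to the coast and watched the tide", [0.3, 0.4, 0.5]),
    language_features("it it was um the the place", [1.1, 1.6, 1.8]),
    language_features("she described the painting in careful detail", [0.2, 0.5, 0.4]),
])
y = np.array([1, 0, 1, 0])

clf = LogisticRegression().fit(X, y)
print(clf.predict_proba(X)[:, 1])  # per-speaker risk-like scores
```
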
06:53
And devices we already have in our homes could, if we let them, give us invaluable insight back. The chemical composition of our breath gives away our feelings. There's a dynamic mixture of acetone, isoprene and carbon dioxide that changes when our heart speeds up, when our muscles tense, and all without any obvious change in our behaviors.

07:18
Alright, I want you to watch this clip with me. Some things might be going on on the side screens, but try and focus on the image in the front and the man at the window.

07:31
(Eerie music)

07:39
(Woman screams)

07:50
PC: Sorry about that. I needed to get a reaction.

(Laughter)

07:55
I'm actually tracking the carbon dioxide you exhale in the room right now. We've installed tubes throughout the theater, lower to the ground, because CO2 is heavier than air. But they're connected to a device in the back that lets us measure, in real time, with high precision, the continuous differential concentration of CO2. The clouds on the sides are actually the real-time data visualization of the density of our CO2.

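A minimal sketch of that differential measurement, assuming a stream of CO2 readings in parts per million from the sensor at one sample per second. The smoothing windows and the simulated jump-scare surge are illustrative choices, not the setup described in the talk.

```python
import numpy as np

def differential_co2(ppm, fs=1.0, window_s=30):
    """Continuous differential CO2: smoothed deviation from a slow baseline.

    ppm: 1D array of CO2 readings in parts per million, fs samples/second.
    A short moving average suppresses sensor noise; subtracting a longer
    running baseline leaves only the audience-driven fluctuations.
    """
    w = int(window_s * fs)
    smooth = np.convolve(ppm, np.ones(w) / w, mode="same")
    baseline = np.convolve(ppm, np.ones(4 * w) / (4 * w), mode="same")
    return smooth - baseline

# Simulated hour of readings with a brief exhalation surge (the jump scare).
rng = np.random.default_rng(2)
ppm = 600 + 5 * rng.standard_normal(3600)
ppm[1800:1900] += 40
diff = differential_co2(ppm)
print(int(diff.argmax()))  # index of the peak deviation, near the surge
```
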
08:25
You might still see a patch of red on the screen, because we're showing increases with larger colored clouds, larger colored areas of red. And that's the point where a lot of us jumped. It's our collective suspense driving a change in carbon dioxide.

08:43
Alright, now, watch this with me one more time.

08:46
(Cheerful music)

08:54
(Woman laughs)

09:05
PC: You knew it was coming. But it's a lot different when we changed the creator's intent. Changing the music and the sound effects completely alters the emotional impact of that scene. And we can see it in our breath. Suspense, fear, joy all show up as reproducible, visually identifiable moments.

09:27
We broadcast a chemical signature of our emotions.

09:35
It is the end of the poker face. Our spaces, our technology will know what we're feeling. We will know more about each other than we ever have. We get a chance to reach in and connect to the experience and sentiments that are fundamental to us as humans in our senses, emotionally and socially.

09:55
I believe it is the era of the empath. And we are enabling the capabilities that true technological partners can bring to how we connect with each other and with our technology. If we recognize the power of becoming technological empaths, we get this opportunity where technology can help us bridge the emotional and cognitive divide. And in that way, we get to change how we tell our stories.

10:19
We can enable a better future for technologies like augmented reality to extend our own agency and connect us at a much deeper level. Imagine a high school counselor being able to realize that an outwardly cheery student really was having a deeply hard time, where reaching out can make a crucial, positive difference. Or authorities, being able to know the difference between someone having a mental health crisis and a different type of aggression, and responding accordingly. Or an artist, knowing the direct impact of their work.

10:52
Leo Tolstoy defined his perspective of art by whether what the creator intended was experienced by the person on the other end. Today's artists can know what we're feeling. But regardless of whether it's art or human connection, today's technologies will know and can know what we're experiencing on the other side, and this means we can be closer and more authentic.

11:14
But I realize a lot of us have a really hard time with the idea of sharing our data, and especially the idea that people know things about us that we didn't actively choose to share. Anytime we talk to someone, look at someone or choose not to look, data is exchanged, given away, that people use to learn, make decisions about their lives and about ours.

11:45
I'm not looking to create a world where our inner lives are ripped open and our personal data and our privacy given away to people and entities where we don't want to see it go. But I am looking to create a world where we can care about each other more effectively, we can know more about when someone is feeling something that we ought to pay attention to. And we can have richer experiences from our technology.

12:10
Any technology can be used for good or bad. Transparency to engagement and effective regulation are absolutely critical to building the trust for any of this. But the benefits that "empathetic technology" can bring to our lives are worth solving the problems that make us uncomfortable. And if we don't, there are too many opportunities and feelings we're going to be missing out on.

12:35
Thank you.

(Applause)