Art that reveals how technology frames reality | Jiabao Li

79,380 views ・ 2020-04-06

TED


Translator: Lilian Chiu · Reviewer: NAN-KUN WU

00:12
I'm an artist and an engineer. And lately, I've been thinking a lot about how technology mediates the way we perceive reality. And it's being done in a superinvisible and nuanced way. Technology is designed to shape our sense of reality by masking itself as the actual experience of the world. As a result, we are becoming unconscious and unaware that it is happening at all.

00:45
Take the glasses I usually wear, for example. These have become part of the way I ordinarily experience my surroundings. I barely notice them, even though they are constantly framing reality for me. The technology I am talking about is designed to do the same thing: change what we see and think but go unnoticed.

01:07
Now, the only time I do notice my glasses is when something happens to draw my attention to them, like when they get dirty or my prescription changes. So I asked myself, "As an artist, what can I create to draw the same kind of attention to the ways digital media -- like news organizations, social media platforms, advertising and search engines -- are shaping our reality?" So I created a series of perceptual machines to help us defamiliarize and question the ways we see the world.

01:48
For example, nowadays, many of us have this kind of allergic reaction to ideas that are different from ours. We may not even realize that we've developed this kind of mental allergy.

02:04
So I created a helmet that creates this artificial allergy to the color red. It simulates this hypersensitivity by making red things look bigger when you are wearing it. It has two modes: nocebo and placebo.

02:21
In nocebo mode, it creates this sensorial experience of hyperallergy. Whenever I see red, the red expands. It's similar to social media's amplification effect: when you look at something that bothers you, you tend to stick with like-minded people, exchange messages and memes, and become even more angry. Sometimes, a trivial discussion gets amplified and blown way out of proportion. Maybe that's even why we are living in the politics of anger.

02:56
In placebo mode, it's an artificial cure for this allergy. Whenever you see red, the red shrinks. It's a palliative, like in digital media: when we encounter people with different opinions, we unfollow them and remove them completely from our feeds. It cures this allergy by avoiding it. But this way of intentionally ignoring opposing ideas makes the human community hyperfragmented and separated.

03:27
The device inside the helmet reshapes reality and projects it into our eyes through a set of lenses to create an augmented reality. I picked the color red because it's intense and emotional, it has high visibility and it's political.

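The grow-and-shrink behaviour can be approximated in software. Here is a minimal sketch in Python with OpenCV, assuming the headset supplies one camera frame at a time as a BGR image; the red thresholds, kernel size and function name are illustrative assumptions rather than details from the talk.

```python
# A minimal sketch of the helmet's two modes, assuming the headset hands us one
# camera frame at a time as a BGR image (a hypothetical pipeline; the talk does
# not publish the device's actual code or parameters).
import cv2
import numpy as np

def simulate_red_allergy(frame_bgr: np.ndarray, mode: str = "nocebo") -> np.ndarray:
    """Grow (nocebo) or shrink (placebo) the red regions of a frame."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)

    # Red hues wrap around 0 degrees, so combine two ranges.
    lower = cv2.inRange(hsv, (0, 100, 100), (10, 255, 255))
    upper = cv2.inRange(hsv, (170, 100, 100), (180, 255, 255))
    red_mask = cv2.bitwise_or(lower, upper)

    kernel = np.ones((25, 25), np.uint8)  # how aggressively red grows or shrinks
    if mode == "nocebo":
        # Hyperallergy: red areas swell into their surroundings.
        new_mask = cv2.dilate(red_mask, kernel)
        fill = frame_bgr.copy()
        fill[:] = (0, 0, 255)  # solid red in BGR
    else:
        # Placebo: red areas contract; the vacated rim is filled from around it.
        new_mask = cv2.erode(red_mask, kernel)
        fill = cv2.inpaint(frame_bgr, red_mask, 3, cv2.INPAINT_TELEA)

    out = frame_bgr.copy()
    changed = cv2.bitwise_xor(red_mask, new_mask) > 0  # pixels the mode added or removed
    out[changed] = fill[changed]
    return out
```

In nocebo mode the newly covered rim is painted solid red, so red objects appear to swell; in placebo mode the vacated rim is inpainted from its surroundings, so they appear to recede.
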
03:44
So what if we take a look at the last American presidential election map through the helmet?

(Laughter)

03:50
You can see that it doesn't matter if you're a Democrat or a Republican, because the mediation alters our perceptions. The allergy exists on both sides.

04:03
In digital media, what we see every day is often mediated, but it's also very nuanced. If we are not aware of this, we will keep being vulnerable to many kinds of mental allergies.

04:18
Our perception is not only part of our identities, but in digital media, it's also a part of the value chain. Our visual field is packed with so much information that our perception has become a commodity with real estate value. Designs are used to exploit our unconscious biases, algorithms favor content that reaffirms our opinions, so that every little corner of our field of view is being colonized to sell ads. Like, when this little red dot comes out in your notifications, it grows and expands, and to your mind, it's huge.

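As a toy illustration of that reinforcing loop (and emphatically not any platform's actual ranking code), here is a sketch in which items closest to a user's existing stance are ranked first; the one-dimensional "stance" score is an invented simplification.

```python
# An illustrative sketch, not any platform's real algorithm: ranking purely by
# predicted agreement keeps surfacing content that reaffirms existing opinions.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    stance: float  # hypothetical opinion score on a -1.0 .. 1.0 axis

def rank_feed(posts: list[Post], user_stance: float) -> list[Post]:
    """Items closest to the user's stance come first, so the feed keeps confirming it."""
    return sorted(posts, key=lambda p: abs(p.stance - user_stance))

if __name__ == "__main__":
    feed = [Post("Opposing view", -0.8), Post("Neutral report", 0.0), Post("Agreeable hot take", 0.9)]
    for post in rank_feed(feed, user_stance=0.85):
        print(f"{post.stance:+.1f}  {post.text}")
```
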
05:00
So I started to think of ways to put a little dirt on, or change, the lenses of my glasses, and came up with another project. Now, keep in mind this is conceptual. It's not a real product. It's a web browser plug-in that could help us to notice the things that we would usually ignore. Like the helmet, the plug-in reshapes reality, but this time, directly in the digital media itself.

05:29
It shouts out the hidden filtered voices. What you should be noticing now will be bigger and vibrant, like here, this story about gender bias emerging from the sea of cats.

(Laughter)

05:42
The plug-in could dilute the things that are being amplified by an algorithm. Like, here in this comment section, there are lots of people shouting about the same opinions. The plug-in makes their comments super small.

(Laughter)

05:58
So now the amount of pixel presence they have on the screen is proportional to the actual value they are contributing to the conversation.

(Laughter)

(Applause)

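That proportionality rule can be sketched with a simple novelty score, assuming the comments are available as plain strings: near-duplicates of earlier comments get a low score and therefore a small display scale. The similarity heuristic below is an assumption made for illustration, not the plug-in's actual method.

```python
# A minimal sketch of the "dilution" idea: redundant comments get a smaller
# display scale, so pixel presence roughly tracks how much new a comment adds.
from difflib import SequenceMatcher

def novelty(comment: str, earlier: list[str]) -> float:
    """1.0 for a brand-new point, near 0.0 for a near-duplicate of earlier comments."""
    if not earlier:
        return 1.0
    most_similar = max(SequenceMatcher(None, comment.lower(), e.lower()).ratio()
                       for e in earlier)
    return 1.0 - most_similar

def display_scales(comments: list[str],
                   min_scale: float = 0.2,
                   max_scale: float = 1.5) -> list[float]:
    """Map each comment to a font/size multiplier proportional to its novelty."""
    scales, seen = [], []
    for c in comments:
        n = novelty(c, seen)
        scales.append(min_scale + n * (max_scale - min_scale))
        seen.append(c)
    return scales

if __name__ == "__main__":
    thread = [
        "This is outrageous!!!",
        "This is outrageous!!",
        "this is OUTRAGEOUS",
        "Here is a primary source that complicates the story: ...",
    ]
    for comment, scale in zip(thread, display_scales(thread)):
        print(f"{scale:.2f}x  {comment}")
```
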
06:16
The plug-in also shows the real estate value of our visual field and how much of our perception is being commoditized. Different from ad blockers, for every ad you see on the web page, it shows the amount of money you should be earning.

(Laughter)

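One way to picture that "real estate" figure, under invented numbers: given the pixel rectangles of the ads on a page and a hypothetical per-impression rate, the share of the viewport they occupy implies a payout. Both the rate and the formula are illustrative assumptions; the talk does not specify how the value would be computed.

```python
# A rough sketch of the "real estate value" overlay. The per-impression rate is
# made up purely for illustration.
from dataclasses import dataclass

@dataclass
class Rect:
    width: int
    height: int

def attention_value(viewport: Rect, ads: list[Rect],
                    dollars_per_full_screen_impression: float = 0.01) -> tuple[float, float]:
    """Return (fraction of the visual field occupied by ads, implied payout)."""
    viewport_area = viewport.width * viewport.height
    ad_area = sum(a.width * a.height for a in ads)
    share = min(ad_area / viewport_area, 1.0)
    return share, share * dollars_per_full_screen_impression

if __name__ == "__main__":
    share, payout = attention_value(Rect(1920, 1080), [Rect(728, 90), Rect(300, 600)])
    print(f"Ads occupy {share:.1%} of the visual field; implied value ${payout:.4f} per view")
```
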
06:34
We are living in a battlefield between reality and commercially distributed reality, so the next version of the plug-in could strike away that commercial reality and show you things as they really are.

(Laughter)

(Applause)

06:55
Well, you can imagine how many directions this could really go. Believe me, I know the risks are high if this were to become a real product. And I created this with good intentions, to train our perception and eliminate biases. But the same approach could be used with bad intentions, like forcing citizens to install a plug-in like that to control the public narrative. It's challenging to make it fair and personal without it just becoming another layer of mediation.

07:27
So what does all this mean for us? Even though technology is creating this isolation, we could use it to make the world connected again by breaking the existing model and going beyond it. By exploring how we interface with these technologies, we could step out of our habitual, almost machine-like behavior and finally find common ground between each other.

07:54
Technology is never neutral. It provides a context and frames reality. It's part of the problem and part of the solution. We could use it to uncover our blind spots and retrain our perception, and consequently, choose how we see each other.

08:13
Thank you.

(Applause)