Art that reveals how technology frames reality | Jiabao Li

79,380 views ・ 2020-04-06

TED



Translator: Jiasi Hao / Reviewer: Yanyan Hong
00:12
I'm an artist and an engineer. And lately, I've been thinking a lot about how technology mediates the way we perceive reality. And it's being done in a super-invisible and nuanced way. Technology is designed to shape our sense of reality by masking itself as the actual experience of the world. As a result, we are becoming unconscious and unaware that it is happening at all.

00:45
Take the glasses I usually wear, for example. These have become part of the way I ordinarily experience my surroundings. I barely notice them, even though they are constantly framing reality for me. The technology I am talking about is designed to do the same thing: change what we see and think but go unnoticed.

01:07
Now, the only time I do notice my glasses is when something happens to draw my attention to them, like when they get dirty or my prescription changes.

01:18
So I asked myself, "As an artist, what can I create to draw the same kind of attention to the ways digital media -- like news organizations, social media platforms, advertising and search engines -- are shaping our reality?"

01:36
So I created a series of perceptual machines to help us defamiliarize and question the ways we see the world.

01:48
For example, nowadays, many of us have this kind of allergic reaction to ideas that are different from ours. We may not even realize that we've developed this kind of mental allergy.

02:04
So I created a helmet that creates this artificial allergy to the color red. It simulates this hypersensitivity by making red things look bigger when you are wearing it. It has two modes: nocebo and placebo.

02:21
In nocebo mode, it creates this sensorial experience of hyperallergy. Whenever I see red, the red expands. It's similar to social media's amplification effect: when you look at something that bothers you, you tend to stick with like-minded people and exchange messages and memes, and you become even more angry. Sometimes, a trivial discussion gets amplified and blown way out of proportion. Maybe that's even why we are living in the politics of anger.

02:56
In placebo mode, it's an artificial cure for this allergy. Whenever you see red, the red shrinks. It's a palliative, like in digital media: when we encounter people with different opinions, we unfollow them and remove them completely from our feeds. It cures this allergy by avoiding it. But this way of intentionally ignoring opposing ideas makes human communities hyperfragmented and separated.

03:27
The device inside the helmet reshapes reality and projects it into our eyes through a set of lenses to create an augmented reality. I picked the color red because it's intense and emotional, it has high visibility and it's political.

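The two modes can be sketched as a few lines of image processing on a single video frame. The sketch below is conceptual and assumes a browser canvas pipeline; the red-detection threshold, the neighborhood radius, and the fill colors are illustrative guesses, not details of the actual device.

```ts
// Conceptual sketch only -- not the artist's actual implementation.
// Takes one video frame (an ImageData from a canvas) and exaggerates or
// suppresses red regions, mimicking the helmet's nocebo/placebo modes.

type Mode = "nocebo" | "placebo";

// Heuristic: a pixel counts as "red" when the red channel clearly dominates.
function isRed(d: Uint8ClampedArray, i: number): boolean {
  return d[i] > 120 && d[i] > 1.6 * d[i + 1] && d[i] > 1.6 * d[i + 2];
}

function applyAllergy(frame: ImageData, mode: Mode, radius = 4): ImageData {
  const { width: w, height: h, data: src } = frame;
  const out = new ImageData(new Uint8ClampedArray(src), w, h);

  // Precompute a binary mask of red pixels.
  const mask = new Uint8Array(w * h);
  for (let p = 0; p < w * h; p++) mask[p] = isRed(src, p * 4) ? 1 : 0;

  for (let y = 0; y < h; y++) {
    for (let x = 0; x < w; x++) {
      // Check whether the neighborhood contains red and non-red pixels.
      let redNearby = false;
      let nonRedNearby = false;
      for (let dy = -radius; dy <= radius; dy++) {
        for (let dx = -radius; dx <= radius; dx++) {
          const nx = x + dx;
          const ny = y + dy;
          if (nx < 0 || ny < 0 || nx >= w || ny >= h) continue;
          if (mask[ny * w + nx]) redNearby = true;
          else nonRedNearby = true;
        }
      }
      const i = (y * w + x) * 4;
      if (mode === "nocebo" && !mask[y * w + x] && redNearby) {
        // Dilation: paint non-red border pixels red, so red regions expand.
        out.data[i] = 220; out.data[i + 1] = 30; out.data[i + 2] = 30;
      } else if (mode === "placebo" && mask[y * w + x] && nonRedNearby) {
        // Erosion: desaturate red border pixels, so red regions shrink.
        const gray = (src[i] + src[i + 1] + src[i + 2]) / 3;
        out.data[i] = out.data[i + 1] = out.data[i + 2] = gray;
      }
    }
  }
  return out;
}
```

Dilating the red mask is what makes red "expand" in nocebo mode; eroding its boundary is the placebo "shrink." A real-time device would use proper morphological operators on the GPU rather than this brute-force neighborhood loop.
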
03:44
So what if we take a look at the last American presidential election map through the helmet? (Laughter) You can see that it doesn't matter if you're a Democrat or a Republican, because the mediation alters our perceptions. The allergy exists on both sides.

04:03
In digital media, what we see every day is often mediated, but it's also very nuanced. If we are not aware of this, we will keep being vulnerable to many kinds of mental allergies.

04:18
Our perception is not only part of our identities, but in digital media, it's also a part of the value chain. Our visual field is packed with so much information that our perception has become a commodity with real estate value. Designs are used to exploit our unconscious biases, and algorithms favor content that reaffirms our opinions, so that every little corner of our field of view is being colonized to sell ads. Like when this little red dot comes out in your notifications: it grows and expands, and to your mind, it's huge.

05:00
So I started to think of ways to put a little dirt on my glasses, or change their lenses, and came up with another project. Now, keep in mind this is conceptual. It's not a real product. It's a web browser plug-in that could help us to notice the things that we would usually ignore.

05:20
Like the helmet, the plug-in reshapes reality, but this time directly in the digital media itself. It shouts out the hidden, filtered voices. What you should be noticing now will be bigger and more vibrant, like here, this story about gender bias emerging from the sea of cats. (Laughter)

05:42
The plug-in could dilute the things that are being amplified by an algorithm. Like here in this comment section, there are lots of people shouting about the same opinions. The plug-in makes their comments super small. (Laughter) So now the amount of pixel presence they have on the screen is proportional to the actual value they are contributing to the conversation. (Laughter) (Applause)

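A minimal sketch of how a content script might implement this dilution, assuming comments live in DOM nodes matched by a hypothetical `.comment` selector; the normalized-prefix key is only a crude stand-in for real semantic similarity.

```ts
// Conceptual content-script sketch -- the `.comment` selector and the
// similarity heuristic are assumptions, not any real platform's API.

function normalize(text: string): string {
  return text.toLowerCase().replace(/[^a-z0-9 ]/g, "").trim();
}

function diluteRepeatedOpinions(root: Document = document): void {
  const seen = new Map<string, number>(); // opinion key -> times seen so far
  for (const el of Array.from(root.querySelectorAll<HTMLElement>(".comment"))) {
    // Crude proxy for "same opinion": the first 80 normalized characters.
    const key = normalize(el.innerText).slice(0, 80);
    const count = (seen.get(key) ?? 0) + 1;
    seen.set(key, count);
    // First occurrence keeps full size; repeats shrink toward a 20% floor,
    // so pixel presence roughly tracks the new information contributed.
    el.style.fontSize = `${Math.max(100 / count, 20)}%`;
  }
}

diluteRepeatedOpinions();
```

Run in reverse, with a scale factor above 100% for rarely-seen keys, the same mechanism would "shout out" the hidden, filtered voices mentioned above.
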
06:16
The plug-in also shows the real estate value of our visual field and how much of our perception is being commoditized. Different from ad blockers, for every ad you see on the web page, it shows the amount of money you should be earning. (Laughter)

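One way to sketch that overlay is to annotate each ad with a per-view estimate derived from a flat CPM, since display ads are typically priced per thousand impressions. The `.ad` selector and the $2.50 CPM below are stand-in assumptions, not measured figures.

```ts
// Conceptual sketch -- the `.ad` selector and the flat CPM are assumptions.
const ASSUMED_CPM_USD = 2.5; // hypothetical average price per 1,000 ad views

function labelAdValue(root: Document = document): void {
  for (const ad of Array.from(root.querySelectorAll<HTMLElement>(".ad"))) {
    const perView = ASSUMED_CPM_USD / 1000; // one impression of one ad
    const tag = root.createElement("div");
    tag.textContent = `Your attention here is worth ~$${perView.toFixed(4)}`;
    tag.style.cssText = "background:#ffd54f;color:#000;font-size:11px;padding:2px;";
    ad.prepend(tag); // overlay the estimate directly above the ad
  }
}

labelAdValue();
```
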
06:34
We are living in a battlefield between reality and commercially distributed reality, so the next version of the plug-in could strike away that commercial reality and show you things as they really are. (Laughter) (Applause)

06:55
Well, you can imagine how many directions this could really go. Believe me, I know the risks are high if this were to become a real product. And I created this with good intentions to train our perception and eliminate biases. But the same approach could be used with bad intentions, like forcing citizens to install a plug-in like that to control the public narrative. It's challenging to make it fair and personal without it just becoming another layer of mediation.

07:27
So what does all this mean for us? Even though technology is creating this isolation, we could use it to make the world connected again by breaking the existing model and going beyond it. By exploring how we interface with these technologies, we could step out of our habitual, almost machine-like behavior and finally find common ground between each other.

07:54
Technology is never neutral. It provides a context and frames reality. It's part of the problem and part of the solution. We could use it to uncover our blind spots and retrain our perception, and consequently, choose how we see each other.

08:09
and consequently, choose how we see each other.
133
489949
3506
这么一来,我们就能 选择如何看待彼此。
08:13
Thank you.
134
493892
1191
谢谢大家。
(掌声)
08:15
(Applause)
135
495107
2685