Art that reveals how technology frames reality | Jiabao Li

80,475 views · 2020-04-06

TED


ืชืจื’ื•ื: Shimon Rottenberg ืขืจื™ื›ื”: Sigal Tifferet
00:12
I'm an artist and an engineer.
00:15
And lately, I've been thinking a lot about how technology mediates
00:21
the way we perceive reality.
00:23
And it's being done in a superinvisible and nuanced way.
00:29
Technology is designed to shape our sense of reality
00:32
by masking itself as the actual experience of the world.
00:37
As a result, we are becoming unconscious and unaware
00:42
that it is happening at all.
00:45
Take the glasses I usually wear, for example.
00:47
These have become part of the way I ordinarily experience my surroundings.
00:52
I barely notice them,
00:53
even though they are constantly framing reality for me.
00:58
The technology I am talking about is designed to do the same thing:
01:02
change what we see and think
01:04
but go unnoticed.
01:07
Now, the only time I do notice my glasses
01:11
is when something happens to draw my attention to it,
01:14
like when it gets dirty or my prescription changes.
01:18
So I asked myself, "As an artist, what can I create
01:23
to draw the same kind of attention
01:25
to the ways digital media -- like news organizations, social media platforms,
01:31
advertising and search engines --
01:34
are shaping our reality?"
01:36
So I created a series of perceptual machines
01:41
to help us defamiliarize and question
01:45
the ways we see the world.
01:48
For example, nowadays, many of us have this kind of allergic reaction
01:54
to ideas that are different from ours.
01:57
We may not even realize that we've developed this kind of mental allergy.
02:04
So I created a helmet that creates this artificial allergy to the color red.
02:10
It simulates this hypersensitivity by making red things look bigger
02:14
when you are wearing it.
02:16
It has two modes: nocebo and placebo.
02:21
In nocebo mode, it creates this sensorial experience of hyperallergy.
02:26
Whenever I see red, the red expands.
02:30
It's similar to social media's amplification effect,
02:33
like when you look at something that bothers you,
02:36
you tend to stick with like-minded people
02:39
and exchange messages and memes, and you become even more angry.
02:44
Sometimes, a trivial discussion gets amplified
02:47
and blown way out of proportion.
02:51
Maybe that's even why we are living in the politics of anger.
02:56
In placebo mode, it's an artificial cure for this allergy.
03:00
Whenever you see red, the red shrinks.
03:03
It's a palliative, like in digital media.
03:06
When you encounter people with different opinions,
03:09
we will unfollow them,
03:10
remove them completely out of our feeds.
03:14
It cures this allergy by avoiding it.
03:17
But this way of intentionally ignoring opposing ideas
03:22
makes human community hyperfragmented and separated.
03:27
The device inside the helmet reshapes reality
03:30
and projects into our eyes through a set of lenses
03:33
to create an augmented reality.
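The red-amplifying effect itself is easy to approximate in software. Below is a minimal sketch, assuming a camera frame has already been drawn to a canvas and read out as ImageData; the function name, color thresholds, and radius are illustrative assumptions, not the artist's actual implementation.

```ts
// Crude "nocebo" pass: any sufficiently red pixel is stamped over a
// (2*radius+1)^2 neighborhood, so red regions visually expand.
// A "placebo" pass would do the opposite: erode the red mask and fill
// the freed ring from surrounding pixels.
function expandRed(frame: ImageData, radius = 4): ImageData {
  const { width, height, data } = frame;
  const out = new ImageData(new Uint8ClampedArray(data), width, height);

  // Heuristic "is this pixel red?" test on the RGBA buffer.
  const isRed = (i: number) =>
    data[i] > 140 && data[i] > 1.6 * data[i + 1] && data[i] > 1.6 * data[i + 2];

  for (let y = 0; y < height; y++) {
    for (let x = 0; x < width; x++) {
      const i = (y * width + x) * 4;
      if (!isRed(i)) continue;
      for (let dy = -radius; dy <= radius; dy++) {
        for (let dx = -radius; dx <= radius; dx++) {
          const nx = x + dx;
          const ny = y + dy;
          if (nx < 0 || ny < 0 || nx >= width || ny >= height) continue;
          const j = (ny * width + nx) * 4;
          out.data[j] = data[i];         // copy the red pixel's R
          out.data[j + 1] = data[i + 1]; // G
          out.data[j + 2] = data[i + 2]; // B
        }
      }
    }
  }
  return out;
}
```

Each frame would be passed through this before being shown, e.g. ctx.putImageData(expandRed(ctx.getImageData(0, 0, w, h)), 0, 0).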
03:35
I picked the color red, because it's intense and emotional,
03:40
it has high visibility
03:42
and it's political.
03:44
So what if we take a look
03:45
at the last American presidential election map
03:48
through the helmet?
03:49
(Laughter)
03:50
You can see that it doesn't matter if you're a Democrat or a Republican,
03:54
because the mediation alters our perceptions.
03:58
The allergy exists on both sides.
04:03
In digital media,
04:04
what we see every day is often mediated,
04:07
but it's also very nuanced.
04:09
If we are not aware of this,
04:11
we will keep being vulnerable to many kinds of mental allergies.
04:18
Our perception is not only part of our identities,
04:22
but in digital media, it's also a part of the value chain.
04:27
Our visual field is packed with so much information
04:31
that our perception has become a commodity with real estate value.
04:38
Designs are used to exploit our unconscious biases,
04:41
algorithms favor content that reaffirms our opinions,
04:45
so that every little corner of our field of view is being colonized
04:49
to sell ads.
04:51
Like, when this little red dot comes out in your notifications,
04:55
it grows and expands, and to your mind, it's huge.
05:00
So I started to think of ways to put a little dirt,
05:03
or change the lenses of my glasses,
05:06
and came up with another project.
05:09
Now, keep in mind this is conceptual. It's not a real product.
05:13
It's a web browser plug-in
05:15
that could help us to notice the things that we would usually ignore.
05:20
Like the helmet, the plug-in reshapes reality,
05:24
but this time, directly into the digital media itself.
05:29
It shouts out the hidden filtered voices.
05:32
What you should be noticing now
05:33
will be bigger and vibrant,
05:36
like here, this story about gender bias emerging from the sea of cats.
05:40
(Laughter)
05:42
The plug-in could dilute the things that are being amplified by an algorithm.
05:48
Like, here in this comment section,
05:50
there are lots of people shouting about the same opinions.
05:54
The plug-in makes their comments super small.
05:57
(Laughter)
05:58
So now the amount of pixel presence they have on the screen
06:03
is proportional to the actual value they are contributing to the conversation.
06:07
(Laughter)
06:11
(Applause)
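The comment-rescaling behavior can be sketched as a content script. This is only a toy, assuming a placeholder ".comment" selector and a crude novelty heuristic (word overlap with earlier comments); deciding what actually counts as value contributed is the hard part and is only stubbed here.

```ts
// Hypothetical content script: shrink comments that mostly repeat what
// earlier comments on the page already said, so their pixel presence
// roughly tracks novelty. ".comment" is a stand-in selector.
const seen: Set<string>[] = [];

function words(text: string): Set<string> {
  return new Set(text.toLowerCase().match(/[\p{L}\p{N}]+/gu) ?? []);
}

function jaccard(a: Set<string>, b: Set<string>): number {
  let inter = 0;
  for (const w of a) if (b.has(w)) inter++;
  const union = a.size + b.size - inter;
  return union === 0 ? 0 : inter / union;
}

document.querySelectorAll<HTMLElement>(".comment").forEach((el) => {
  const w = words(el.innerText);
  // Novelty = 1 minus the best match against any earlier comment.
  const maxSim = seen.reduce((m, s) => Math.max(m, jaccard(w, s)), 0);
  const novelty = 1 - maxSim;
  seen.push(w);

  // Scale on-screen size with novelty, never below 30%.
  const scale = 0.3 + 0.7 * novelty;
  el.style.transform = `scale(${scale})`;
  el.style.transformOrigin = "top left";
});
```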
06:16
The plug-in also shows the real estate value of our visual field
06:21
and how much of our perception is being commoditized.
06:24
Different from ad blockers,
06:26
for every ad you see on the web page,
06:29
it shows the amount of money you should be earning.
06:32
(Laughter)
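The "money you should be earning" figure could only ever be a rough estimate. A toy version of the arithmetic, using an entirely made-up CPM, might look like this.

```ts
// Toy estimate of one viewer's share of ad revenue.
// assumedCpmUsd is a made-up assumption; real rates vary hugely by
// site, ad format, and audience, so any on-screen figure is approximate.
const assumedCpmUsd = 4.0; // what an advertiser pays per 1,000 impressions

function earningsForAdsSeen(adsSeen: number): number {
  return adsSeen * (assumedCpmUsd / 1000); // e.g. 250 ads seen -> $1.00
}
```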
06:34
We are living in a battlefield between reality
06:36
and commercial distributed reality,
06:39
so the next version of the plug-in could strike away that commercial reality
06:44
and show you things as they really are.
06:47
(Laughter)
06:50
(Applause)
06:55
Well, you can imagine how many directions this could really go.
06:58
Believe me, I know the risks are high if this were to become a real product.
07:04
And I created this with good intentions
07:06
to train our perception and eliminate biases.
07:10
But the same approach could be used with bad intentions,
07:14
like forcing citizens to install a plug-in like that
07:17
to control the public narrative.
07:20
It's challenging to make it fair and personal
07:23
without it just becoming another layer of mediation.
07:27
So what does all this mean for us?
07:31
Even though technology is creating this isolation,
07:34
we could use it to make the world connected again
07:38
by breaking the existing model and going beyond it.
07:42
By exploring how we interface with these technologies,
07:45
we could step out of our habitual, almost machine-like behavior
07:51
and finally find common ground between each other.
07:54
Technology is never neutral.
07:57
It provides a context and frames reality.
08:00
It's part of the problem and part of the solution.
08:03
We could use it to uncover our blind spots and retrain our perception
08:09
and consequently, choose how we see each other.
08:13
Thank you.
08:15
(Applause)
ืขืœ ืืชืจ ื–ื”

ืืชืจ ื–ื” ื™ืฆื™ื’ ื‘ืคื ื™ื›ื ืกืจื˜ื•ื ื™ YouTube ื”ืžื•ืขื™ืœื™ื ืœืœื™ืžื•ื“ ืื ื’ืœื™ืช. ืชื•ื›ืœื• ืœืจืื•ืช ืฉื™ืขื•ืจื™ ืื ื’ืœื™ืช ื”ืžื•ืขื‘ืจื™ื ืขืœ ื™ื“ื™ ืžื•ืจื™ื ืžื”ืฉื•ืจื” ื”ืจืืฉื•ื ื” ืžืจื—ื‘ื™ ื”ืขื•ืœื. ืœื—ืฅ ืคืขืžื™ื™ื ืขืœ ื”ื›ืชื•ื‘ื™ื•ืช ื‘ืื ื’ืœื™ืช ื”ืžื•ืฆื’ื•ืช ื‘ื›ืœ ื“ืฃ ื•ื™ื“ืื• ื›ื“ื™ ืœื”ืคืขื™ืœ ืืช ื”ืกืจื˜ื•ืŸ ืžืฉื. ื”ื›ืชื•ื‘ื™ื•ืช ื’ื•ืœืœื•ืช ื‘ืกื ื›ืจื•ืŸ ืขื ื”ืคืขืœืช ื”ื•ื•ื™ื“ืื•. ืื ื™ืฉ ืœืš ื”ืขืจื•ืช ืื• ื‘ืงืฉื•ืช, ืื ื ืฆื•ืจ ืื™ืชื ื• ืงืฉืจ ื‘ืืžืฆืขื•ืช ื˜ื•ืคืก ื™ืฆื™ืจืช ืงืฉืจ ื–ื”.

https://forms.gle/WvT1wiN1qDtmnspy7