Art that reveals how technology frames reality | Jiabao Li

80,436 views ・ 2020-04-06

TED



Translator: Tijana Ćopić · Reviewer: Ivana Korom
00:12 I'm an artist and an engineer. And lately, I've been thinking a lot about how technology mediates the way we perceive reality. And it's being done in a superinvisible and nuanced way.
00:29 Technology is designed to shape our sense of reality by masking itself as the actual experience of the world. As a result, we are becoming unconscious and unaware that it is happening at all.
00:45 Take the glasses I usually wear, for example. These have become part of the way I ordinarily experience my surroundings. I barely notice them, even though they are constantly framing reality for me.
00:58 The technology I am talking about is designed to do the same thing: change what we see and think, but go unnoticed.
01:07 Now, the only time I do notice my glasses is when something happens to draw my attention to them, like when they get dirty or my prescription changes.
01:18 So I asked myself, "As an artist, what can I create to draw the same kind of attention to the ways digital media -- like news organizations, social media platforms, advertising and search engines -- are shaping our reality?"
01:36 So I created a series of perceptual machines to help us defamiliarize and question the ways we see the world.
01:48 For example, nowadays, many of us have this kind of allergic reaction to ideas that are different from ours. We may not even realize that we've developed this kind of mental allergy.
02:04 So I created a helmet that creates this artificial allergy to the color red. It simulates this hypersensitivity by making red things look bigger when you are wearing it.

02:16 It has two modes: nocebo and placebo.
02:21 In nocebo mode, it creates this sensorial experience of hyperallergy. Whenever I see red, the red expands.

02:30 It's similar to social media's amplification effect, like when you look at something that bothers you, you tend to stick with like-minded people and exchange messages and memes, and you become even more angry. Sometimes, a trivial discussion gets amplified and blown way out of proportion. Maybe that's even why we are living in the politics of anger.
02:56 In placebo mode, it's an artificial cure for this allergy. Whenever you see red, the red shrinks. It's a palliative, like in digital media. When we encounter people with different opinions, we unfollow them, remove them completely from our feeds.

03:14 It cures this allergy by avoiding it. But this way of intentionally ignoring opposing ideas makes the human community hyperfragmented and separated.
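The two modes can be pictured as a simple image transform: find the "red" pixels, then grow that region (nocebo) or shrink it (placebo). The sketch below is a minimal illustration of the idea, not the helmet's actual software; the color thresholds and the 4-neighbour morphology are my assumptions.

```python
import numpy as np

def red_mask(img):
    """Boolean mask of 'red' pixels: red channel high, green/blue low (assumed thresholds)."""
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    return (r > 150) & (g < 100) & (b < 100)

def dilate(mask):
    """One step of 4-neighbour binary dilation: a pixel turns on if any neighbour is on."""
    out = mask.copy()
    out[1:, :] |= mask[:-1, :]
    out[:-1, :] |= mask[1:, :]
    out[:, 1:] |= mask[:, :-1]
    out[:, :-1] |= mask[:, 1:]
    return out

def apply_helmet(img, mode="nocebo", steps=2):
    """Return a copy of img with red regions expanded (nocebo) or shrunk (placebo)."""
    mask = red_mask(img)
    out = img.copy()
    if mode == "nocebo":
        # Allergy: the red grows outward, step by step.
        for _ in range(steps):
            mask = dilate(mask)
        out[mask] = (255, 0, 0)          # paint the grown region red
    else:
        # Placebo: erosion is the dual of dilation on the inverted mask.
        shrunk = mask
        for _ in range(steps):
            shrunk = ~dilate(~shrunk)
        out[mask & ~shrunk] = (128, 128, 128)  # fade where the red receded
    return out
```

With a single red pixel on a 9×9 canvas, two nocebo steps grow it into a 13-pixel diamond, while one placebo step makes it vanish entirely.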
03:27 The device inside the helmet reshapes reality and projects it into our eyes through a set of lenses to create an augmented reality.

03:35 I picked the color red because it's intense and emotional, it has high visibility, and it's political.
03:44 So what if we take a look at the last American presidential election map through the helmet?

03:49 (Laughter)

03:50 You can see that it doesn't matter if you're a Democrat or a Republican, because the mediation alters our perceptions. The allergy exists on both sides.
04:03 In digital media, what we see every day is often mediated, but it's also very nuanced. If we are not aware of this, we will keep being vulnerable to many kinds of mental allergies.
04:18 Our perception is not only part of our identities, but in digital media, it's also a part of the value chain. Our visual field is packed with so much information that our perception has become a commodity with real estate value.
04:38 Designs are used to exploit our unconscious biases, algorithms favor content that reaffirms our opinions, so that every little corner of our field of view is being colonized to sell ads.

04:51 Like, when this little red dot comes out in your notifications, it grows and expands, and to your mind, it's huge.
05:00 So I started to think of ways to put a little dirt on, or change the lenses of, my glasses, and came up with another project.

05:09 Now, keep in mind this is conceptual. It's not a real product. It's a web browser plug-in that could help us to notice the things that we would usually ignore.
05:20 Like the helmet, the plug-in reshapes reality, but this time, directly into the digital media itself. It shouts out the hidden filtered voices. What you should be noticing now will be bigger and vibrant, like here, this story about gender bias emerging from the sea of cats.

05:40 (Laughter)
05:42 The plug-in could dilute the things that are being amplified by an algorithm. Like, here in this comment section, there are lots of people shouting about the same opinions. The plug-in makes their comments super small.

05:57 (Laughter)

05:58 So now the amount of pixel presence they have on the screen is proportional to the actual value they are contributing to the conversation.

06:07 (Laughter)

06:11 (Applause)
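One way to sketch this comment-diluting behaviour: score each comment by how much it adds that earlier comments haven't already said, then map that score to a font size, so repeated opinions literally shrink on the page. The novelty heuristic and the pixel range are assumptions of mine; the talk does not specify how "actual value" would be measured.

```python
def contribution_scores(comments):
    """Score each comment by the fraction of its words that are new
    to the conversation so far -- a crude proxy for 'actual value'."""
    seen = set()
    scores = []
    for text in comments:
        words = set(text.lower().split())
        if not words:
            scores.append(0.0)
            continue
        novel = words - seen
        scores.append(len(novel) / len(words))
        seen |= words
    return scores

def font_sizes(comments, min_px=6, max_px=18):
    """Map each comment's score linearly onto a pixel font size:
    pure repetition renders at min_px, fully novel comments at max_px."""
    return [round(min_px + s * (max_px - min_px))
            for s in contribution_scores(comments)]
```

A comment that merely repeats an earlier one drops to the minimum size, while one that brings new information stays close to full size.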
06:16 The plug-in also shows the real estate value of our visual field and how much of our perception is being commoditized. Different from ad blockers, for every ad you see on the web page, it shows the amount of money you should be earning.

06:32 (Laughter)
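The ad-labelling idea amounts to a back-of-the-envelope calculation: for each ad rectangle, compute what share of your viewport it occupies and attach an estimated revenue per view derived from an average CPM (cost per 1,000 impressions). Every number below is an illustrative assumption, not real ad-market data.

```python
def ad_earnings(ads, viewport_w=1280, viewport_h=800, cpm_usd=2.50):
    """Label each ad with its viewport share and an estimated revenue per view.

    ads: list of (width_px, height_px) rectangles.
    cpm_usd: assumed average revenue per 1,000 impressions.
    """
    viewport = viewport_w * viewport_h
    per_view = cpm_usd / 1000.0          # revenue for a single impression
    return [
        {"share": round(w * h / viewport, 4),  # fraction of your visual field sold
         "usd_per_view": per_view}
        for (w, h) in ads
    ]
```

A standard 300×250 rectangle on a 1280×800 viewport occupies about 7% of the screen and, at this assumed CPM, yields a quarter of a cent per view, which is roughly what the plug-in would overlay on the ad.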
06:34 We are living in a battlefield between reality and commercially distributed reality, so the next version of the plug-in could strike away that commercial reality and show you things as they really are.

06:47 (Laughter)

06:50 (Applause)
06:55 Well, you can imagine how many directions this could really go. Believe me, I know the risks are high if this were to become a real product.

07:04 And I created this with good intentions, to train our perception and eliminate biases. But the same approach could be used with bad intentions, like forcing citizens to install a plug-in like that to control the public narrative. It's challenging to make it fair and personal without it just becoming another layer of mediation.
07:27 So what does all this mean for us? Even though technology is creating this isolation, we could use it to make the world connected again by breaking the existing model and going beyond it. By exploring how we interface with these technologies, we could step out of our habitual, almost machine-like behavior and finally find common ground with one another.
07:54 Technology is never neutral. It provides a context and frames reality. It's part of the problem and part of the solution. We could use it to uncover our blind spots and retrain our perception and, consequently, choose how we see each other.

08:13 Thank you.

08:15 (Applause)