Art that reveals how technology frames reality | Jiabao Li

79,380 views ・ 2020-04-06

TED



Translator: Vida Grujic  Reviewer: Sanda L
00:12
I'm an artist and an engineer.

00:15
And lately, I've been thinking a lot about how technology mediates the way we perceive reality.

00:23
And it's being done in a super-invisible and nuanced way.

00:29
Technology is designed to shape our sense of reality by masking itself as the actual experience of the world.

00:37
As a result, we are becoming unconscious and unaware that it is happening at all.
00:45
Take the glasses I usually wear, for example. These have become part of the way I ordinarily experience my surroundings. I barely notice them, even though they are constantly framing reality for me.

00:58
The technology I am talking about is designed to do the same thing: change what we see and think, but go unnoticed.

01:07
Now, the only time I do notice my glasses is when something happens to draw my attention to them, like when they get dirty or my prescription changes.
01:18
So I asked myself, "As an artist, what can I create to draw the same kind of attention to the ways digital media -- like news organizations, social media platforms, advertising and search engines -- are shaping our reality?"

01:36
So I created a series of perceptual machines to help us defamiliarize and question the ways we see the world.
01:48
For example, nowadays, many of us have this kind of allergic reaction to ideas that are different from ours. We may not even realize that we've developed this kind of mental allergy.

02:04
So I created a helmet that creates this artificial allergy to the color red. It simulates this hypersensitivity by making red things look bigger when you are wearing it.

02:16
It has two modes: nocebo and placebo.

02:21
In nocebo mode, it creates this sensorial experience of hyperallergy. Whenever I see red, the red expands.
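The two modes can be reduced to a toy rule: score how "red" something is, then scale it up in nocebo mode or down in placebo mode. This is a purely illustrative sketch; the redness score and the scale factors are my assumptions, not the real helmet's optics.

```javascript
// Conceptual sketch of the helmet's two modes (illustrative only).
// redness(): crude score in [0, 1] of how much red dominates green/blue.
function redness(r, g, b) {
  return Math.max(0, (r - (g + b) / 2) / 255);
}

// scaleFactor(): nocebo makes red things grow (the hyperallergy),
// placebo makes them shrink (the artificial "cure").
function scaleFactor(r, g, b, mode) {
  const s = redness(r, g, b);
  if (mode === "nocebo") return 1 + s;        // pure red doubles in size
  if (mode === "placebo") return 1 - 0.5 * s; // pure red shrinks by half
  return 1;                                   // anything else: unchanged
}
```

A pure-red pixel (255, 0, 0) scores 1, so it renders at 2x in nocebo mode and 0.5x in placebo mode, while a green pixel passes through untouched.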
02:30
It's similar to social media's amplification effect, like when you look at something that bothers you, you tend to stick with like-minded people and exchange messages and memes, and you become even more angry.

02:44
Sometimes, a trivial discussion gets amplified and blown way out of proportion.

02:51
Maybe that's even why we are living in the politics of anger.
02:56
In placebo mode, it's an artificial cure for this allergy. Whenever you see red, the red shrinks.

03:03
It's a palliative, like in digital media. When we encounter people with different opinions, we unfollow them, remove them completely from our feeds.

03:14
It cures this allergy by avoiding it. But this way of intentionally ignoring opposing ideas makes the human community hyperfragmented and separated.
03:27
The device inside the helmet reshapes reality and projects it into our eyes through a set of lenses to create an augmented reality.

03:35
I picked the color red because it's intense and emotional, it has high visibility and it's political.
03:44
So what if we take a look at the last American presidential election map through the helmet?

03:49
(Laughter)

03:50
You can see that it doesn't matter if you're a Democrat or a Republican, because the mediation alters our perceptions. The allergy exists on both sides.
04:03
In digital media, what we see every day is often mediated, but it's also very nuanced. If we are not aware of this, we will keep being vulnerable to many kinds of mental allergies.
04:18
Our perception is not only part of our identities, but in digital media, it's also a part of the value chain.

04:27
Our visual field is packed with so much information that our perception has become a commodity with real estate value.

04:38
Designs are used to exploit our unconscious biases, algorithms favor content that reaffirms our opinions, so that every little corner of our field of view is being colonized to sell ads.

04:51
Like, when this little red dot comes out in your notifications, it grows and expands, and to your mind, it's huge.
05:00
So I started to think of ways to put a little dirt on, or change the lenses of, my glasses, and came up with another project.

05:09
Now, keep in mind this is conceptual. It's not a real product.

05:13
It's a web browser plug-in that could help us to notice the things that we would usually ignore.

05:20
Like the helmet, the plug-in reshapes reality, but this time, directly into the digital media itself.
05:29
It shouts out the hidden filtered voices. What you should be noticing now will be bigger and more vibrant, like here, this story about gender bias emerging from the sea of cats.

05:40
(Laughter)
05:42
The plug-in could dilute the things that are being amplified by an algorithm. Like, here in this comment section, there are lots of people shouting about the same opinions.

05:54
The plug-in makes their comments super small.

05:57
(Laughter)

05:58
So now the amount of pixel presence they have on the screen is proportional to the actual value they are contributing to the conversation.

06:07
(Laughter)

06:11
(Applause)
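The diluting rule described above can be sketched as: shrink each comment by how closely it repeats an earlier one, so screen area tracks new information. The similarity metric here (word-overlap Jaccard) is a stand-in of my own; the talk never specifies one.

```javascript
// Conceptual sketch of the plug-in's comment-diluting rule (illustrative only).
// jaccard(): word-overlap similarity between two comments, in [0, 1].
function jaccard(a, b) {
  const wa = new Set(a.toLowerCase().split(/\s+/));
  const wb = new Set(b.toLowerCase().split(/\s+/));
  const inter = [...wa].filter((w) => wb.has(w)).length;
  const union = new Set([...wa, ...wb]).size;
  return union === 0 ? 0 : inter / union;
}

// sizeFactors(): each comment's display scale is 1 minus its maximum
// similarity to any earlier comment, so near-duplicates shrink toward 0.
function sizeFactors(comments) {
  return comments.map((c, i) => {
    let maxSim = 0;
    for (let j = 0; j < i; j++) maxSim = Math.max(maxSim, jaccard(c, comments[j]));
    return 1 - maxSim;
  });
}
```

With this rule, the first "I agree" renders at full size and an identical repeat collapses to size 0, while a comment sharing no words with its predecessors stays at full size.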
06:16
The plug-in also shows the real estate value of our visual field and how much of our perception is being commoditized.

06:24
Different from ad blockers, for every ad you see on the web page, it shows the amount of money you should be earning.

06:32
(Laughter)
06:34
We are living in a battlefield between reality and commercially distributed reality, so the next version of the plug-in could strike away that commercial reality and show you things as they really are.

06:47
(Laughter)

06:50
(Applause)
06:55
Well, you can imagine how many directions this could really go.

06:58
Believe me, I know the risks are high if this were to become a real product.

07:04
And I created this with good intentions, to train our perception and eliminate biases. But the same approach could be used with bad intentions, like forcing citizens to install a plug-in like that to control the public narrative.

07:20
It's challenging to make it fair and personal without it just becoming another layer of mediation.
07:27
So what does all this mean for us?

07:31
Even though technology is creating this isolation, we could use it to make the world connected again by breaking the existing model and going beyond it.

07:42
By exploring how we interface with these technologies, we could step out of our habitual, almost machine-like behavior and finally find common ground between each other.

07:54
Technology is never neutral. It provides a context and frames reality. It's part of the problem and part of the solution.

08:03
We could use it to uncover our blind spots and retrain our perception and, consequently, choose how we see each other.

08:13
Thank you.

08:15
(Applause)