Sebastian Deterding: What your designs say about you

33,131 views ・ 2012-05-31

TED



Translator: Klara Pelcl Reviewer: Lenka Tušar
00:15
We are today talking about moral persuasion: What is moral and immoral in trying to change people's behaviors by using technology and using design?

00:24
And I don't know what you expect, but when I was thinking about that issue, I early on realized what I'm not able to give you are answers. I'm not able to tell you what is moral or immoral, because we're living in a pluralist society. My values can be radically different from your values, which means that what I consider moral or immoral based on that might not necessarily be what you consider moral or immoral.

00:50
But I also realized there is one thing that I could give you, and that is what this guy behind me gave the world -- Socrates. It is questions. What I can do and what I would like to do with you is give you, like that initial question, a set of questions to figure out for yourselves, layer by layer, like peeling an onion, getting at the core of what you believe is moral or immoral persuasion. And I'd like to do that with a couple of examples of technologies where people have used game elements to get people to do things.
01:25
So it's at first a very simple, very obvious question I would like to give you: What are your intentions if you are designing something?

01:32
And obviously, intentions are not the only thing, so here is another example for one of these applications. There are a couple of these kinds of Eco dashboards right now -- dashboards built into cars -- which try to motivate you to drive more fuel-efficiently. This here is Nissan's MyLeaf, where your driving behavior is compared with the driving behavior of other people, so you can compete for who drives a route the most fuel-efficiently. And these things are very effective, it turns out -- so effective that they motivate people to engage in unsafe driving behaviors, like not stopping at a red light, because that way you have to stop and restart the engine, and that would use quite some fuel, wouldn't it?

02:10
So despite this being a very well-intended application, obviously there was a side effect of that.
02:17
Here's another example for one of these side effects. Commendable: a site that allows parents to give their kids little badges for doing the things that parents want their kids to do, like tying their shoes. And at first that sounds very nice, very benign, well-intended. But it turns out, if you look into research on people's mindset, caring about outcomes, caring about public recognition, caring about these kinds of public tokens of recognition is not necessarily very helpful for your long-term psychological well-being. It's better if you care about learning something. It's better when you care about yourself than how you appear in front of other people.

02:56
So that kind of motivational tool that is used actually, in and of itself, has a long-term side effect, in that every time we use a technology that uses something like public recognition or status, we're actually positively endorsing this as a good and normal thing to care about -- that way, possibly having a detrimental effect on the long-term psychological well-being of ourselves as a culture.
03:20
So that's a second, very obvious question: What are the effects of what you're doing -- the effects you're having with the device, like less fuel, as well as the effects of the actual tools you're using to get people to do things -- public recognition?

03:35
Now is that all -- intention, effect? Well, there are some technologies which obviously combine both. Both good long-term and short-term effects and a positive intention like Fred Stutzman's "Freedom," where the whole point of that application is -- well, we're usually so bombarded with constant requests by other people, with this device, you can shut off the Internet connectivity of your PC of choice for a pre-set amount of time, to actually get some work done.
04:01
And I think most of us will agree that's something well-intended, and also has good consequences. In the words of Michel Foucault, it is a "technology of the self." It is a technology that empowers the individual to determine its own life course, to shape itself.

04:17
But the problem is, as Foucault points out, that every technology of the self has a technology of domination as its flip side. As you see in today's modern liberal democracies, the society, the state, not only allows us to determine our self, to shape our self, it also demands it of us. It demands that we optimize ourselves, that we control ourselves, that we self-manage continuously, because that's the only way in which such a liberal society works. These technologies want us to stay in the game that society has devised for us. They want us to fit in even better. They want us to optimize ourselves to fit in.
05:01
Now, I don't say that is necessarily a bad thing; I just think that this example points us to a general realization, and that is: no matter what technology or design you look at, even something we consider as well-intended and as good in its effects as Stutzman's Freedom, comes with certain values embedded in it. And we can question these values. We can question: Is it a good thing that all of us continuously self-optimize ourselves to fit better into that society?

05:31
Or to give you another example: What about a piece of persuasive technology that convinces Muslim women to wear their headscarves? Is that a good or a bad technology in its intentions or in its effects? Well, that basically depends on the kind of values you bring to bear to make these kinds of judgments.

05:50
So that's a third question: What values do you use to judge?
05:53
And speaking of values: I've noticed that in the discussion about moral persuasion online and when I'm talking with people, more often than not, there is a weird bias. And that bias is that we're asking: Is this or that "still" ethical? Is it "still" permissible?

06:11
We're asking things like: Is this Oxfam donation form, where the regular monthly donation is the preset default, and people, maybe without intending it, are encouraged or nudged into giving a regular donation instead of a one-time donation, is that "still" permissible? Is it "still" ethical? We're fishing at the low end.
06:30
But in fact, that question, "Is it 'still' ethical?" is just one way of looking at ethics. Because if you look at the beginning of ethics in Western culture, you see a very different idea of what ethics also could be. For Aristotle, ethics was not about the question, "Is that still good, or is it bad?" Ethics was about the question of how to live life well. And he put that in the word "arête," which we, from Ancient Greek, translate as "virtue." But really, it means "excellence." It means living up to your own full potential as a human being.

07:06
And that is an idea that, I think, Paul Richard Buchanan put nicely in a recent essay, where he said, "Products are vivid arguments about how we should live our lives." Our designs are not ethical or unethical in that they're using ethical or unethical means of persuading us. They have a moral component just in the kind of vision and the aspiration of the good life that they present to us.
07:31
And if you look into the designed environment around us with that kind of lens, asking, "What is the vision of the good life that our products, our design, present to us?", then you often get the shivers, because of how little we expect of each other, of how little we actually seem to expect of our life, and what the good life looks like.

07:53
So that's a fourth question I'd like to leave you with: What vision of the good life do your designs convey?
08:01
And speaking of design, you'll notice that I already broadened the discussion, because it's not just persuasive technology that we're talking about here, it's any piece of design that we put out here in the world. I don't know whether you know the great communication researcher Paul Watzlawick who, back in the '60s, made the argument that we cannot not communicate. Even if we choose to be silent, we chose to be silent, and we're communicating something by choosing to be silent.

08:28
And in the same way that we cannot not communicate, we cannot not persuade: whatever we do or refrain from doing, whatever we put out there as a piece of design, into the world, has a persuasive component. It tries to affect people. It puts a certain vision of the good life out there in front of us, which is what Peter-Paul Verbeek, the Dutch philosopher of technology, says. No matter whether we as designers intend it or not, we materialize morality. We make certain things harder and easier to do. We organize the existence of people. We put a certain vision of what good or bad or normal or usual is in front of people, by everything we put out there in the world.
09:11
Even something as innocuous as a set of school chairs is a persuasive technology, because it presents and materializes a certain vision of the good life -- a good life in which teaching and learning and listening is about one person teaching, the others listening; in which it is about learning-is-done-while-sitting; in which you learn for yourself; in which you're not supposed to change these rules, because the chairs are fixed to the ground.

09:38
And even something as innocuous as a single-design chair, like this one by Arne Jacobsen, is a persuasive technology, because, again, it communicates an idea of the good life: a good life -- a life that you, as a designer, consent to by saying, "In a good life, goods are produced as sustainably or unsustainably as this chair. Workers are treated as well or as badly as the workers were treated that built that chair." The good life is a life where design is important because somebody obviously took the time and spent the money for that kind of well-designed chair; where tradition is important, because this is a traditional classic and someone cared about this; and where there is something as conspicuous consumption, where it is OK and normal to spend a humongous amount of money on such a chair, to signal to other people what your social status is.
10:24
So these are the kinds of layers, the kinds of questions I wanted to lead you through today; the question of: What are the intentions that you bring to bear when you're designing something? What are the effects, intended and unintended, that you're having? What are the values you're using to judge those? What are the virtues, the aspirations that you're actually expressing in that? And how does that apply, not just to persuasive technology, but to everything you design?
10:51
Do we stop there? I don't think so. I think that all of these things are eventually informed by the core of all of this, and this is nothing but life itself. Why, when the question of what the good life is informs everything that we design, should we stop at design and not ask ourselves: How does it apply to our own life? "Why should the lamp or the house be an art object, but not our life?" as Michel Foucault puts it.
11:20
Just to give you a practical example of Buster Benson. This is Buster setting up a pull-up machine at the office of his new start-up, Habit Labs, where they're trying to build other applications like "Health Month" for people. And why is he building a thing like this? Well, here is the set of axioms that Habit Labs, Buster's start-up, put up for themselves on how they wanted to work together as a team when they're building these applications -- a set of moral principles they set themselves for working together -- one of them being, "We take care of our own health and manage our own burnout."

11:54
Because ultimately, how can you ask yourselves and how can you find an answer on what vision of the good life you want to convey and create with your designs without asking the question: What vision of the good life do you yourself want to live?

12:11
And with that, I thank you.

12:14
(Applause)