Sebastian Deterding: What your designs say about you

33,131 views · 2012-05-31

TED



ืžืชืจื’ื: Shlomo Adam ืžื‘ืงืจ: Ido Dekkers
00:15 We are today talking about moral persuasion: What is moral and immoral in trying to change people's behaviors by using technology and using design? And I don't know what you expect, but when I was thinking about that issue, I early on realized what I'm not able to give you are answers.

00:33 I'm not able to tell you what is moral or immoral, because we're living in a pluralist society. My values can be radically different from your values, which means that what I consider moral or immoral based on that might not necessarily be what you consider moral or immoral.

00:50 But I also realized there is one thing that I could give you, and that is what this guy behind me gave the world -- Socrates. It is questions.

00:58 What I can do and what I would like to do with you is give you, like that initial question, a set of questions to figure out for yourselves, layer by layer, like peeling an onion, getting at the core of what you believe is moral or immoral persuasion.

01:15 And I'd like to do that with a couple of examples of technologies where people have used game elements to get people to do things.

01:25 So it's at first a very simple, very obvious question I would like to give you: What are your intentions if you are designing something?

01:32 And obviously, intentions are not the only thing, so here is another example for one of these applications. There are a couple of these kinds of eco dashboards right now -- dashboards built into cars -- which try to motivate you to drive more fuel-efficiently. This here is Nissan's MyLeaf, where your driving behavior is compared with the driving behavior of other people, so you can compete for who drives a route the most fuel-efficiently.
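As a concrete illustration of that game mechanic, here is a minimal sketch of the kind of leaderboard logic such an eco dashboard might run. The trip data, names, and ranking function are hypothetical, not Nissan's actual implementation.

```python
# Hypothetical sketch: rank drivers on a shared route by fuel efficiency,
# the way an eco-dashboard leaderboard might.
from dataclasses import dataclass

@dataclass
class Trip:
    driver: str
    distance_km: float
    fuel_liters: float

    @property
    def efficiency(self) -> float:
        """Kilometers driven per liter of fuel consumed."""
        return self.distance_km / self.fuel_liters

def leaderboard(trips: list[Trip]) -> list[tuple[str, float]]:
    """Most fuel-efficient driver first."""
    return sorted(((t.driver, t.efficiency) for t in trips),
                  key=lambda pair: pair[1], reverse=True)

trips = [Trip("A", 12.4, 0.90), Trip("B", 12.4, 1.10), Trip("C", 12.4, 0.85)]
for rank, (driver, eff) in enumerate(leaderboard(trips), start=1):
    print(f"{rank}. driver {driver}: {eff:.1f} km/l")
```

Note that the mechanic rewards only the metric it can measure, fuel per route, which is exactly how the perverse incentive described next can arise.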
01:56 And these things are very effective, it turns out -- so effective that they motivate people to engage in unsafe driving behaviors, like not stopping at a red light, because that way you have to stop and restart the engine, and that would use quite some fuel, wouldn't it?

02:10 So despite this being a very well-intended application, obviously there was a side effect of that.

02:17 Here's another example for one of these side effects. Commendable: a site that allows parents to give their kids little badges for doing the things that parents want their kids to do, like tying their shoes.
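The badge mechanic itself is almost trivially simple; a sketch along these lines (task and badge names invented for illustration) captures essentially all of it, which makes the subtler long-term effect discussed next all the more striking.

```python
# Hypothetical sketch of a Commendable-style badge system: completing a
# parent-defined task awards the child a publicly visible badge.
TASKS = {"tie your shoes": "Shoelace Star",       # invented names
         "brush your teeth": "Sparkling Smile"}

badges_earned: dict[str, list[str]] = {}          # child -> badges (public)

def complete_task(child: str, task: str) -> None:
    """Record a finished task and hand out its badge."""
    badge = TASKS[task]
    badges_earned.setdefault(child, []).append(badge)
    print(f"{child} earned the '{badge}' badge!")

complete_task("Maya", "tie your shoes")
```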
02:28 And at first that sounds very nice, very benign, well-intended. But it turns out, if you look into research on people's mindset, caring about outcomes, caring about public recognition, caring about these kinds of public tokens of recognition is not necessarily very helpful for your long-term psychological well-being. It's better if you care about learning something. It's better when you care about yourself than how you appear in front of other people.

02:56 So that kind of motivational tool that is used actually, in and of itself, has a long-term side effect, in that every time we use a technology that uses something like public recognition or status, we're actually positively endorsing this as a good and normal thing to care about -- that way, possibly having a detrimental effect on the long-term psychological well-being of ourselves as a culture.

03:20 So that's a second, very obvious question: What are the effects of what you're doing -- the effects you're having with the device, like less fuel, as well as the effects of the actual tools you're using to get people to do things -- public recognition?

03:35 Now is that all -- intention, effect? Well, there are some technologies which obviously combine both: both good long-term and short-term effects and a positive intention, like Fred Stutzman's "Freedom," where the whole point of that application is -- well, we're usually so bombarded with constant requests by other people, with this device, you can shut off the Internet connectivity of your PC of choice for a pre-set amount of time, to actually get some work done.
04:01 And I think most of us will agree that's something well-intended, and also has good consequences. In the words of Michel Foucault, it is a "technology of the self." It is a technology that empowers the individual to determine its own life course, to shape itself.

04:17 But the problem is, as Foucault points out, that every technology of the self has a technology of domination as its flip side. As you see in today's modern liberal democracies, the society, the state, not only allows us to determine our self, to shape our self, it also demands it of us. It demands that we optimize ourselves, that we control ourselves, that we self-manage continuously, because that's the only way in which such a liberal society works.

04:48 These technologies want us to stay in the game that society has devised for us. They want us to fit in even better. They want us to optimize ourselves to fit in.

05:01 Now, I don't say that is necessarily a bad thing; I just think that this example points us to a general realization, and that is: no matter what technology or design you look at, even something we consider as well-intended and as good in its effects as Stutzman's Freedom, comes with certain values embedded in it.

05:22 And we can question these values. We can question: Is it a good thing that all of us continuously self-optimize ourselves to fit better into that society?

05:31 Or to give you another example: What about a piece of persuasive technology that convinces Muslim women to wear their headscarves? Is that a good or a bad technology in its intentions or in its effects? Well, that basically depends on the kind of values you bring to bear to make these kinds of judgments.

05:50 So that's a third question: What values do you use to judge?

05:53 And speaking of values: I've noticed that in the discussion about moral persuasion online and when I'm talking with people, more often than not, there is a weird bias. And that bias is that we're asking: Is this or that "still" ethical? Is it "still" permissible?

06:11 We're asking things like: Is this Oxfam donation form, where the regular monthly donation is the preset default, and people, maybe without intending it, are encouraged or nudged into giving a regular donation instead of a one-time donation, is that "still" permissible? Is it "still" ethical?
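It is worth seeing how small the design decision under debate actually is: the nudge lives entirely in one preselected default. A hypothetical form definition (field names invented, not Oxfam's) makes that concrete.

```python
# Hypothetical sketch of a donation form whose preset default nudges donors
# toward a recurring gift; the entire "persuasion" is one default value.
FREQUENCIES = ("one-time", "monthly")

def donation_form(default: str = "monthly") -> dict:
    """Initial form state; many donors never change the preselection."""
    assert default in FREQUENCIES
    return {"amount_eur": 10, "frequency": default}

print(donation_form())                      # preselects "monthly"
print(donation_form(default="one-time"))    # the non-nudging alternative
```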
06:28 We're fishing at the low end. But in fact, that question, "Is it 'still' ethical?" is just one way of looking at ethics. Because if you look at the beginning of ethics in Western culture, you see a very different idea of what ethics also could be.

06:43 For Aristotle, ethics was not about the question, "Is that still good, or is it bad?" Ethics was about the question of how to live life well. And he put that in the word "arête," which we, from [Ancient Greek], translate as "virtue." But really, it means "excellence." It means living up to your own full potential as a human being.

07:06 And that is an idea that, I think, Paul Richard Buchanan put nicely in a recent essay, where he said, "Products are vivid arguments about how we should live our lives."

07:16 Our designs are not ethical or unethical in that they're using ethical or unethical means of persuading us. They have a moral component just in the kind of vision and the aspiration of the good life that they present to us.

07:31 And if you look into the designed environment around us with that kind of lens, asking, "What is the vision of the good life that our products, our design, present to us?", then you often get the shivers, because of how little we expect of each other, of how little we actually seem to expect of our life, and what the good life looks like.

07:53 So that's a fourth question I'd like to leave you with: What vision of the good life do your designs convey?

08:01 And speaking of design, you'll notice that I already broadened the discussion, because it's not just persuasive technology that we're talking about here, it's any piece of design that we put out here in the world.

08:15 I don't know whether you know the great communication researcher Paul Watzlawick who, back in the '60s, made the argument that we cannot not communicate. Even if we choose to be silent, we chose to be silent, and we're communicating something by choosing to be silent.

08:28 And in the same way that we cannot not communicate, we cannot not persuade: whatever we do or refrain from doing, whatever we put out there as a piece of design, into the world, has a persuasive component. It tries to affect people. It puts a certain vision of the good life out there in front of us, which is what Peter-Paul Verbeek, the Dutch philosopher of technology, says.

08:51 No matter whether we as designers intend it or not, we materialize morality. We make certain things harder and easier to do. We organize the existence of people. We put a certain vision of what good or bad or normal or usual is in front of people, by everything we put out there in the world.

09:11 Even something as innocuous as a set of school chairs is a persuasive technology, because it presents and materializes a certain vision of the good life -- a good life in which teaching and learning and listening is about one person teaching, the others listening; in which it is about learning-is-done-while-sitting; in which you learn for yourself; in which you're not supposed to change these rules, because the chairs are fixed to the ground.

09:38 And even something as innocuous as a single-design chair, like this one by Arne Jacobsen, is a persuasive technology, because, again, it communicates an idea of the good life: a good life -- a life that you, as a designer, consent to by saying, "In a good life, goods are produced as sustainably or unsustainably as this chair. Workers are treated as well or as badly as the workers were treated that built that chair."

10:03 The good life is a life where design is important because somebody obviously took the time and spent the money for that kind of well-designed chair; where tradition is important, because this is a traditional classic and someone cared about this; and where there is something as conspicuous consumption, where it is OK and normal to spend a humongous amount of money on such a chair, to signal to other people what your social status is.

10:24 So these are the kinds of layers, the kinds of questions I wanted to lead you through today; the question of: What are the intentions that you bring to bear when you're designing something? What are the effects, intended and unintended, that you're having? What are the values you're using to judge those? What are the virtues, the aspirations that you're actually expressing in that? And how does that apply, not just to persuasive technology, but to everything you design?

10:51 Do we stop there? I don't think so. I think that all of these things are eventually informed by the core of all of this, and this is nothing but life itself. Why, when the question of what the good life is informs everything that we design, should we stop at design and not ask ourselves: How does it apply to our own life? "Why should the lamp or the house be an art object, but not our life?" as Michel Foucault puts it.

11:20 Just to give you a practical example of Buster Benson. This is Buster setting up a pull-up machine at the office of his new start-up, Habit Labs, where they're trying to build other applications like "Health Month" for people. And why is he building a thing like this? Well, here is the set of axioms that Habit Labs, Buster's start-up, put up for themselves on how they wanted to work together as a team when they're building these applications -- a set of moral principles they set themselves for working together -- one of them being, "We take care of our own health and manage our own burnout."

11:54 Because ultimately, how can you ask yourselves and how can you find an answer on what vision of the good life you want to convey and create with your designs without asking the question: What vision of the good life do you yourself want to live?

12:11 And with that, I thank you.

12:14 (Applause)
ืขืœ ืืชืจ ื–ื”

ืืชืจ ื–ื” ื™ืฆื™ื’ ื‘ืคื ื™ื›ื ืกืจื˜ื•ื ื™ YouTube ื”ืžื•ืขื™ืœื™ื ืœืœื™ืžื•ื“ ืื ื’ืœื™ืช. ืชื•ื›ืœื• ืœืจืื•ืช ืฉื™ืขื•ืจื™ ืื ื’ืœื™ืช ื”ืžื•ืขื‘ืจื™ื ืขืœ ื™ื“ื™ ืžื•ืจื™ื ืžื”ืฉื•ืจื” ื”ืจืืฉื•ื ื” ืžืจื—ื‘ื™ ื”ืขื•ืœื. ืœื—ืฅ ืคืขืžื™ื™ื ืขืœ ื”ื›ืชื•ื‘ื™ื•ืช ื‘ืื ื’ืœื™ืช ื”ืžื•ืฆื’ื•ืช ื‘ื›ืœ ื“ืฃ ื•ื™ื“ืื• ื›ื“ื™ ืœื”ืคืขื™ืœ ืืช ื”ืกืจื˜ื•ืŸ ืžืฉื. ื”ื›ืชื•ื‘ื™ื•ืช ื’ื•ืœืœื•ืช ื‘ืกื ื›ืจื•ืŸ ืขื ื”ืคืขืœืช ื”ื•ื•ื™ื“ืื•. ืื ื™ืฉ ืœืš ื”ืขืจื•ืช ืื• ื‘ืงืฉื•ืช, ืื ื ืฆื•ืจ ืื™ืชื ื• ืงืฉืจ ื‘ืืžืฆืขื•ืช ื˜ื•ืคืก ื™ืฆื™ืจืช ืงืฉืจ ื–ื”.

https://forms.gle/WvT1wiN1qDtmnspy7