Sebastian Deterding: What your designs say about you

33,106 views ・ 2012-05-31

TED



Translator: Xiang Li  Reviewer: Yu-Sheng Lin
00:15
We are today talking about moral persuasion: What is moral and immoral in trying to change people's behaviors by using technology and using design? And I don't know what you expect, but when I was thinking about that issue, I early on realized what I'm not able to give you are answers.

00:33
I'm not able to tell you what is moral or immoral, because we're living in a pluralist society. My values can be radically different from your values, which means that what I consider moral or immoral based on that might not necessarily be what you consider moral or immoral. But I also realized there is one thing that I could give you, and that is what this guy behind me gave the world -- Socrates. It is questions.

00:58
What I can do and what I would like to do with you is give you, like that initial question, a set of questions to figure out for yourselves, layer by layer, like peeling an onion, getting at the core of what you believe is moral or immoral persuasion.
01:15
And I'd like to do that with a couple of examples of technologies where people have used game elements to get people to do things. So it's at first a very simple, very obvious question I would like to give you: What are your intentions if you are designing something?
01:32
And obviously, intentions are not the only thing, so here is another example for one of these applications. There are a couple of these kinds of Eco dashboards right now -- dashboards built into cars -- which try to motivate you to drive more fuel-efficiently. This here is Nissan's MyLeaf, where your driving behavior is compared with the driving behavior of other people, so you can compete for who drives a route the most fuel-efficiently.

01:56
And these things are very effective, it turns out -- so effective that they motivate people to engage in unsafe driving behaviors, like not stopping at a red light, because that way you have to stop and restart the engine, and that would use quite some fuel, wouldn't it? So despite this being a very well-intended application, obviously there was a side effect of that.
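The game element in these dashboards is essentially a leaderboard: rank drivers on a shared route by fuel efficiency and show each driver where they stand. As a toy illustration (the names and numbers are invented, not Nissan's actual telemetry or scoring), such a ranking could be sketched like this:

```python
# Toy sketch of a fuel-efficiency leaderboard, the game element described
# above. Drivers on the same route are ranked by kilometers per liter.
# All driver names and figures are made up for illustration.

def leaderboard(trips):
    """trips: list of (driver, km_driven, liters_used) tuples for one route."""
    ranked = sorted(trips, key=lambda t: t[1] / t[2], reverse=True)
    return [(driver, round(km / liters, 1)) for driver, km, liters in ranked]

trips = [("Ana", 12.0, 0.9), ("Ben", 12.0, 1.2), ("Chu", 12.0, 1.0)]
print(leaderboard(trips))  # most fuel-efficient driver first
```

Note that the score only encodes fuel use, which is exactly the talk's point: a driver who rolls through a red light to avoid restarting the engine climbs the ranking, because safety is invisible to the metric.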
02:17
Here's another example for one of these side effects. Commendable: a site that allows parents to give their kids little badges for doing the things that parents want their kids to do, like tying their shoes. And at first that sounds very nice, very benign, well-intended.

02:33
But it turns out, if you look into research on people's mindset, caring about outcomes, caring about public recognition, caring about these kinds of public tokens of recognition is not necessarily very helpful for your long-term psychological well-being. It's better if you care about learning something. It's better when you care about yourself than how you appear in front of other people.

02:56
So that kind of motivational tool that is used actually, in and of itself, has a long-term side effect, in that every time we use a technology that uses something like public recognition or status, we're actually positively endorsing this as a good and normal thing to care about -- that way, possibly having a detrimental effect on the long-term psychological well-being of ourselves as a culture.

03:20
So that's a second, very obvious question: What are the effects of what you're doing -- the effects you're having with the device, like less fuel, as well as the effects of the actual tools you're using to get people to do things -- public recognition?
03:35
Now is that all -- intention, effect? Well, there are some technologies which obviously combine both: both good long-term and short-term effects and a positive intention, like Fred Stutzman's "Freedom," where the whole point of that application is -- well, we're usually so bombarded with constant requests by other people, with this device, you can shut off the Internet connectivity of your PC of choice for a pre-set amount of time, to actually get some work done. And I think most of us will agree that's something well-intended, and also has good consequences.
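The mechanism here is simple: the user commits in advance to a stretch of time during which distraction is unavailable. As a hedged sketch of that idea only (this is not how the real Freedom app is implemented; the "blocking" below is simulated with a flag rather than by actually touching the network):

```python
# Toy sketch of a "Freedom"-style focus session: the user pre-commits to a
# duration, and until it expires the rest of the program treats the network
# as switched off. Purely illustrative; no real connectivity is changed.

import time

class FocusSession:
    def __init__(self, minutes):
        # The session ends a fixed amount of time after it is started.
        self.ends_at = time.monotonic() + minutes * 60

    def online_allowed(self):
        # False while the session is running, True once it has expired.
        return time.monotonic() >= self.ends_at

session = FocusSession(minutes=0)  # 0 minutes for demonstration: expires at once
print(session.online_allowed())
```

The design point the talk draws out next is that the constraint is self-imposed: the user, not the software vendor, decides when to be unreachable.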
04:07
In the words of Michel Foucault, it is a "technology of the self." It is a technology that empowers the individual to determine its own life course, to shape itself. But the problem is, as Foucault points out, that every technology of the self has a technology of domination as its flip side.

04:25
As you see in today's modern liberal democracies, the society, the state, not only allows us to determine our self, to shape our self, it also demands it of us. It demands that we optimize ourselves, that we control ourselves, that we self-manage continuously, because that's the only way in which such a liberal society works. These technologies want us to stay in the game that society has devised for us. They want us to fit in even better. They want us to optimize ourselves to fit in.
05:01
Now, I don't say that is necessarily a bad thing; I just think that this example points us to a general realization, and that is: no matter what technology or design you look at, even something we consider as well-intended and as good in its effects as Stutzman's Freedom, comes with certain values embedded in it. And we can question these values. We can question: Is it a good thing that all of us continuously self-optimize ourselves to fit better into that society?

05:31
Or to give you another example: What about a piece of persuasive technology that convinces Muslim women to wear their headscarves? Is that a good or a bad technology in its intentions or in its effects? Well, that basically depends on the kind of values you bring to bear to make these kinds of judgments. So that's a third question: What values do you use to judge?
05:53
And speaking of values: I've noticed that in the discussion about moral persuasion online and when I'm talking with people, more often than not, there is a weird bias. And that bias is that we're asking: Is this or that "still" ethical? Is it "still" permissible?

06:11
We're asking things like: Is this Oxfam donation form, where the regular monthly donation is the preset default, and people, maybe without intending it, are encouraged or nudged into giving a regular donation instead of a one-time donation, is that "still" permissible? Is it "still" ethical? We're fishing at the low end.
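The Oxfam form is an example of a preset default doing the persuading: whichever option is pre-selected captures everyone who doesn't actively change it. A toy simulation of that effect (the 0.6 stick-with-default rate and the population split are invented assumptions for illustration, not figures from the talk):

```python
# Toy simulation of default effects on a donation form. A fixed share of
# people keep whatever option is pre-selected; the rest choose according
# to their own preference. The 0.6 "stickiness" rate is an assumption.

def outcomes(default, stick_rate, active_choices):
    """Mix default-keepers with active choosers and count each option."""
    counts = {"monthly": 0, "one-time": 0}
    for choice in active_choices:
        counts[choice] += 1 - stick_rate   # share who choose for themselves
        counts[default] += stick_rate      # share who keep the default
    return {k: round(v) for k, v in counts.items()}

# Same population of underlying preferences, two different defaults:
prefs = ["one-time"] * 80 + ["monthly"] * 20
print(outcomes("monthly", 0.6, prefs))
print(outcomes("one-time", 0.6, prefs))
```

Flipping the default flips the outcome for the majority even though nobody's preferences changed, which is why the speaker finds "is it 'still' permissible?" too narrow a question to ask of such a design.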
06:30
But in fact, that question, "Is it 'still' ethical?" is just one way of looking at ethics. Because if you look at the beginning of ethics in Western culture, you see a very different idea of what ethics also could be. For Aristotle, ethics was not about the question, "Is that still good, or is it bad?" Ethics was about the question of how to live life well.

06:54
And he put that in the word "arête," which we, from Ancient Greek, translate as "virtue." But really, it means "excellence." It means living up to your own full potential as a human being.
07:06
And that is an idea that, I think, Paul Richard Buchanan put nicely in a recent essay, where he said, "Products are vivid arguments about how we should live our lives." Our designs are not ethical or unethical in that they're using ethical or unethical means of persuading us. They have a moral component just in the kind of vision and the aspiration of the good life that they present to us.

07:31
And if you look into the designed environment around us with that kind of lens, asking, "What is the vision of the good life that our products, our design, present to us?", then you often get the shivers, because of how little we expect of each other, of how little we actually seem to expect of our life, and what the good life looks like.

07:53
So that's a fourth question I'd like to leave you with: What vision of the good life do your designs convey?
08:01
And speaking of design, you'll notice that I already broadened the discussion, because it's not just persuasive technology that we're talking about here, it's any piece of design that we put out here in the world. I don't know whether you know the great communication researcher Paul Watzlawick who, back in the '60s, made the argument that we cannot not communicate. Even if we choose to be silent, we chose to be silent, and we're communicating something by choosing to be silent.

08:28
And in the same way that we cannot not communicate, we cannot not persuade: whatever we do or refrain from doing, whatever we put out there as a piece of design, into the world, has a persuasive component. It tries to affect people. It puts a certain vision of the good life out there in front of us, which is what Peter-Paul Verbeek, the Dutch philosopher of technology, says. No matter whether we as designers intend it or not, we materialize morality. We make certain things harder and easier to do. We organize the existence of people. We put a certain vision of what good or bad or normal or usual is in front of people, by everything we put out there in the world.
09:11
Even something as innocuous as a set of school chairs is a persuasive technology, because it presents and materializes a certain vision of the good life -- a good life in which teaching and learning and listening is about one person teaching, the others listening; in which learning is done while sitting; in which you learn for yourself; in which you're not supposed to change these rules, because the chairs are fixed to the ground.
09:38
And even something as innocuous as a single-design chair, like this one by Arne Jacobsen, is a persuasive technology, because, again, it communicates an idea of the good life: a good life -- a life that you, as a designer, consent to by saying, "In a good life, goods are produced as sustainably or unsustainably as this chair. Workers are treated as well or as badly as the workers were treated that built that chair."

10:03
The good life is a life where design is important, because somebody obviously took the time and spent the money for that kind of well-designed chair; where tradition is important, because this is a traditional classic and someone cared about this; and where there is such a thing as conspicuous consumption, where it is OK and normal to spend a humongous amount of money on such a chair, to signal to other people what your social status is.
10:24
So these are the kinds of layers, the kinds of questions I wanted to lead you through today: What are the intentions that you bring to bear when you're designing something? What are the effects, intended and unintended, that you're having? What are the values you're using to judge those? What are the virtues, the aspirations that you're actually expressing in that? And how does that apply, not just to persuasive technology, but to everything you design?
10:51
Do we stop there? I don't think so. I think that all of these things are eventually informed by the core of all of this, and this is nothing but life itself. Why, when the question of what the good life is informs everything that we design, should we stop at design and not ask ourselves: How does it apply to our own life? "Why should the lamp or the house be an art object, but not our life?" as Michel Foucault puts it.
11:20
Just to give you a practical example of Buster Benson: this is Buster setting up a pull-up machine at the office of his new start-up, Habit Labs, where they're trying to build other applications like "Health Month" for people. And why is he building a thing like this?

11:35
Well, here is the set of axioms that Habit Labs, Buster's start-up, put up for themselves on how they wanted to work together as a team when they're building these applications -- a set of moral principles they set themselves for working together -- one of them being, "We take care of our own health and manage our own burnout."

11:54
Because ultimately, how can you ask yourselves, and how can you find an answer on, what vision of the good life you want to convey and create with your designs, without asking the question: What vision of the good life do you yourself want to live?

12:11
And with that, I thank you.

12:14
(Applause)