Sebastian Deterding: What your designs say about you

33,131 views · 2012-05-31

TED



Translator: Seungbum Son · Reviewer: Woo Hwang
00:15
We are today talking about moral persuasion:
00:18
What is moral and immoral in trying to change people's behaviors
00:22
by using technology and using design?
00:24
And I don't know what you expect,
00:26
but when I was thinking about that issue,
00:28
I early on realized what I'm not able to give you are answers.
00:33
I'm not able to tell you what is moral or immoral,
00:35
because we're living in a pluralist society.
00:38
My values can be radically different from your values,
00:42
which means that what I consider moral or immoral based on that
00:46
might not necessarily be what you consider moral or immoral.
00:50
But I also realized there is one thing that I could give you,
00:53
and that is what this guy behind me gave the world --
00:55
Socrates.
00:56
It is questions.
00:58
What I can do and what I would like to do with you
01:01
is give you, like that initial question,
01:03
a set of questions to figure out for yourselves,
01:06
layer by layer, like peeling an onion,
01:10
getting at the core of what you believe is moral or immoral persuasion.
01:15
And I'd like to do that with a couple of examples of technologies
01:19
where people have used game elements to get people to do things.
01:25
So it's at first a very simple, very obvious question
01:28
I would like to give you:
01:29
What are your intentions if you are designing something?
01:32
And obviously, intentions are not the only thing,
01:36
so here is another example for one of these applications.
01:39
There are a couple of these kinds of Eco dashboards right now --
01:42
dashboards built into cars --
01:43
which try to motivate you to drive more fuel-efficiently.
01:46
This here is Nissan's MyLeaf,
01:48
where your driving behavior is compared with the driving behavior
01:51
of other people,
01:52
so you can compete for who drives a route the most fuel-efficiently.
01:56
And these things are very effective, it turns out --
01:58
so effective that they motivate people to engage in unsafe driving behaviors,
02:02
like not stopping at a red light,
02:04
because that way you have to stop and restart the engine,
02:07
and that would use quite some fuel, wouldn't it?
02:10
So despite this being a very well-intended application,
02:14
obviously there was a side effect of that.
02:17
Here's another example for one of these side effects.
02:19
Commendable: a site that allows parents to give their kids little badges
02:24
for doing the things that parents want their kids to do,
02:27
like tying their shoes.
02:28
And at first that sounds very nice,
02:30
very benign, well-intended.
02:33
But it turns out, if you look into research on people's mindset,
02:36
caring about outcomes,
02:38
caring about public recognition,
02:40
caring about these kinds of public tokens of recognition
02:44
is not necessarily very helpful
02:46
for your long-term psychological well-being.
02:48
It's better if you care about learning something.
02:51
It's better when you care about yourself
02:52
than how you appear in front of other people.
02:56
So that kind of motivational tool that is used actually, in and of itself,
03:01
has a long-term side effect,
03:03
in that every time we use a technology
03:04
that uses something like public recognition or status,
03:08
we're actually positively endorsing this
03:10
as a good and normal thing to care about --
03:13
that way, possibly having a detrimental effect
03:16
on the long-term psychological well-being of ourselves as a culture.
03:20
So that's a second, very obvious question:
03:23
What are the effects of what you're doing --
03:25
the effects you're having with the device, like less fuel,
03:29
as well as the effects of the actual tools you're using
03:32
to get people to do things --
03:34
public recognition?
03:35
Now is that all -- intention, effect?
03:38
Well, there are some technologies which obviously combine both.
03:41
Both good long-term and short-term effects
03:44
and a positive intention like Fred Stutzman's "Freedom,"
03:47
where the whole point of that application is --
03:49
well, we're usually so bombarded with constant requests by other people,
03:53
with this device,
03:54
you can shut off the Internet connectivity of your PC of choice
03:58
for a pre-set amount of time,
03:59
to actually get some work done.
04:01
And I think most of us will agree that's something well-intended,
04:04
and also has good consequences.
04:07
In the words of Michel Foucault,
04:08
it is a "technology of the self."
04:10
It is a technology that empowers the individual
04:13
to determine its own life course,
04:15
to shape itself.
04:17
But the problem is, as Foucault points out,
04:20
that every technology of the self
04:22
has a technology of domination as its flip side.
04:25
As you see in today's modern liberal democracies,
04:30
the society, the state, not only allows us to determine our self,
04:35
to shape our self,
04:36
it also demands it of us.
04:38
It demands that we optimize ourselves,
04:40
that we control ourselves,
04:42
that we self-manage continuously,
04:44
because that's the only way in which such a liberal society works.
04:48
These technologies want us to stay in the game
04:53
that society has devised for us.
04:55
They want us to fit in even better.
04:58
They want us to optimize ourselves to fit in.
05:01
Now, I don't say that is necessarily a bad thing;
05:05
I just think that this example points us to a general realization,
05:09
and that is: no matter what technology or design you look at,
05:13
even something we consider as well-intended
05:16
and as good in its effects as Stutzman's Freedom,
05:19
comes with certain values embedded in it.
05:22
And we can question these values.
05:24
We can question: Is it a good thing
05:26
that all of us continuously self-optimize ourselves
05:29
to fit better into that society?
05:31
Or to give you another example:
05:33
What about a piece of persuasive technology
05:35
that convinces Muslim women to wear their headscarves?
05:38
Is that a good or a bad technology
05:41
in its intentions or in its effects?
05:43
Well, that basically depends on the kind of values you bring to bear
05:47
to make these kinds of judgments.
05:50
So that's a third question:
05:51
What values do you use to judge?
05:53
And speaking of values:
05:55
I've noticed that in the discussion about moral persuasion online
05:58
and when I'm talking with people,
06:00
more often than not, there is a weird bias.
06:03
And that bias is that we're asking:
06:06
Is this or that "still" ethical?
06:09
Is it "still" permissible?
06:11
We're asking things like:
06:13
Is this Oxfam donation form,
06:15
where the regular monthly donation is the preset default,
06:18
and people, maybe without intending it,
06:20
are encouraged or nudged into giving a regular donation
06:24
instead of a one-time donation,
06:25
is that "still" permissible?
06:27
Is it "still" ethical?
06:28
We're fishing at the low end.
06:30
But in fact, that question, "Is it 'still' ethical?"
06:33
is just one way of looking at ethics.
06:35
Because if you look at the beginning of ethics in Western culture,
06:40
you see a very different idea of what ethics also could be.
06:43
For Aristotle, ethics was not about the question,
06:47
"Is that still good, or is it bad?"
06:50
Ethics was about the question of how to live life well.
06:54
And he put that in the word "arête,"
06:56
which we, from [Ancient Greek], translate as "virtue."
06:59
But really, it means "excellence."
07:00
It means living up to your own full potential as a human being.
07:06
And that is an idea that, I think,
07:08
Paul Richard Buchanan put nicely in a recent essay,
07:11
where he said, "Products are vivid arguments
07:13
about how we should live our lives."
07:16
Our designs are not ethical or unethical
07:18
in that they're using ethical or unethical means of persuading us.
07:23
They have a moral component
07:25
just in the kind of vision and the aspiration of the good life
07:29
that they present to us.
07:31
And if you look into the designed environment around us
07:34
with that kind of lens,
07:36
asking, "What is the vision of the good life
07:38
that our products, our design, present to us?",
07:41
then you often get the shivers,
07:43
because of how little we expect of each other,
07:46
of how little we actually seem to expect of our life,
07:49
and what the good life looks like.
07:53
So that's a fourth question I'd like to leave you with:
07:56
What vision of the good life do your designs convey?
08:01
And speaking of design,
08:02
you'll notice that I already broadened the discussion,
08:06
because it's not just persuasive technology that we're talking about here,
08:11
it's any piece of design that we put out here in the world.
08:15
I don't know whether you know
08:16
the great communication researcher Paul Watzlawick who, back in the '60s,
08:20
made the argument that we cannot not communicate.
08:22
Even if we choose to be silent, we chose to be silent,
08:25
and we're communicating something by choosing to be silent.
08:28
And in the same way that we cannot not communicate,
08:31
we cannot not persuade:
08:32
whatever we do or refrain from doing,
08:34
whatever we put out there as a piece of design, into the world,
08:39
has a persuasive component.
08:41
It tries to affect people.
08:43
It puts a certain vision of the good life out there in front of us,
08:47
which is what Peter-Paul Verbeek,
08:48
the Dutch philosopher of technology, says.
08:51
No matter whether we as designers intend it or not,
08:55
we materialize morality.
08:57
We make certain things harder and easier to do.
09:00
We organize the existence of people.
09:02
We put a certain vision
09:03
of what good or bad or normal or usual is
09:07
in front of people,
09:08
by everything we put out there in the world.
09:11
Even something as innocuous as a set of school chairs
09:14
is a persuasive technology,
09:16
because it presents and materializes a certain vision of the good life --
09:21
a good life in which teaching and learning and listening
09:24
is about one person teaching, the others listening;
09:27
in which it is about learning-is-done-while-sitting;
09:31
in which you learn for yourself;
09:32
in which you're not supposed to change these rules,
09:35
because the chairs are fixed to the ground.
09:38
And even something as innocuous as a single-design chair,
09:41
like this one by Arne Jacobsen,
09:43
is a persuasive technology,
09:45
because, again, it communicates an idea of the good life:
09:48
a good life -- a life that you, as a designer, consent to by saying,
09:53
"In a good life, goods are produced as sustainably or unsustainably
09:57
as this chair.
09:58
Workers are treated as well or as badly
10:00
as the workers were treated that built that chair."
10:03
The good life is a life where design is important
10:06
because somebody obviously took the time and spent the money
10:09
for that kind of well-designed chair;
10:10
where tradition is important,
10:12
because this is a traditional classic and someone cared about this;
10:15
and where there is something as conspicuous consumption,
10:18
where it is OK and normal to spend a humongous amount of money
10:21
on such a chair,
10:22
to signal to other people what your social status is.
10:24
So these are the kinds of layers, the kinds of questions
10:28
I wanted to lead you through today;
10:30
the question of: What are the intentions that you bring to bear
10:33
when you're designing something?
10:34
What are the effects, intended and unintended, that you're having?
10:38
What are the values you're using to judge those?
10:41
What are the virtues, the aspirations
10:42
that you're actually expressing in that?
10:45
And how does that apply,
10:47
not just to persuasive technology,
10:49
but to everything you design?
10:51
Do we stop there?
10:53
I don't think so.
10:55
I think that all of these things are eventually informed
10:59
by the core of all of this,
11:01
and this is nothing but life itself.
11:04
Why, when the question of what the good life is
11:07
informs everything that we design,
11:09
should we stop at design and not ask ourselves:
11:12
How does it apply to our own life?
11:14
"Why should the lamp or the house be an art object,
11:17
but not our life?"
11:18
as Michel Foucault puts it.
11:20
Just to give you a practical example of Buster Benson.
11:24
This is Buster setting up a pull-up machine
11:26
at the office of his new start-up, Habit Labs,
11:29
where they're trying to build other applications like "Health Month"
11:32
for people.
11:33
And why is he building a thing like this?
11:35
Well, here is the set of axioms
11:37
that Habit Labs, Buster's start-up, put up for themselves
11:41
on how they wanted to work together as a team
11:43
when they're building these applications --
11:45
a set of moral principles they set themselves
11:48
for working together --
11:49
one of them being,
11:50
"We take care of our own health and manage our own burnout."
11:54
Because ultimately, how can you ask yourselves
11:57
and how can you find an answer on what vision of the good life
12:01
you want to convey and create with your designs
12:04
without asking the question:
12:06
What vision of the good life do you yourself want to live?
12:11
And with that,
12:12
I thank you.
12:14
(Applause)