Art that reveals how technology frames reality | Jiabao Li

79,380 views ・ 2020-04-06

TED


μ•„λž˜ μ˜λ¬Έμžλ§‰μ„ λ”λΈ”ν΄λ¦­ν•˜μ‹œλ©΄ μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€.

Translation: Soyeon Kim · Review: Jihyeon J. Kim
00:12
I'm an artist and an engineer. And lately, I've been thinking a lot about how technology mediates the way we perceive reality. And it's being done in a super-invisible and nuanced way.

00:29
Technology is designed to shape our sense of reality by masking itself as the actual experience of the world. As a result, we are becoming unconscious and unaware that it is happening at all.
00:45
Take the glasses I usually wear, for example. These have become part of the way I ordinarily experience my surroundings. I barely notice them, even though they are constantly framing reality for me. The technology I am talking about is designed to do the same thing: change what we see and think, but go unnoticed.

01:07
Now, the only time I do notice my glasses is when something happens to draw my attention to them, like when they get dirty or my prescription changes.
01:18
So I asked myself, "As an artist, what can I create to draw the same kind of attention to the ways digital media -- like news organizations, social media platforms, advertising and search engines -- are shaping our reality?"

01:36
So I created a series of perceptual machines to help us defamiliarize and question the ways we see the world.
01:48
For example, nowadays, many of us have this kind of allergic reaction to ideas that are different from ours. We may not even realize that we've developed this kind of mental allergy.

02:04
So I created a helmet that creates this artificial allergy to the color red. It simulates this hypersensitivity by making red things look bigger when you are wearing it.

02:16
It has two modes: nocebo and placebo. In nocebo mode, it creates this sensorial experience of hyperallergy. Whenever I see red, the red expands.

02:30
It's similar to social media's amplification effect: when you look at something that bothers you, you tend to stick with like-minded people and exchange messages and memes, and you become even more angry. Sometimes, a trivial discussion gets amplified and blown way out of proportion. Maybe that's even why we are living in the politics of anger.
02:56
In placebo mode, it's an artificial cure for this allergy. Whenever you see red, the red shrinks. It's a palliative, like in digital media: when we encounter people with different opinions, we unfollow them, remove them completely from our feeds.

03:14
It cures this allergy by avoiding it. But this way of intentionally ignoring opposing ideas makes the human community hyperfragmented and separated.

03:27
The device inside the helmet reshapes reality and projects it into our eyes through a set of lenses to create an augmented reality.
03:35
I picked the color red because it's intense and emotional, it has high visibility, and it's political.

03:44
So what if we take a look at the last American presidential election map through the helmet?

03:49
(Laughter)

03:50
You can see that it doesn't matter if you're a Democrat or a Republican, because the mediation alters our perceptions. The allergy exists on both sides.
04:03
In digital media, what we see every day is often mediated, but it's also very nuanced. If we are not aware of this, we will keep being vulnerable to many kinds of mental allergies.

04:18
Our perception is not only part of our identities, but in digital media, it's also a part of the value chain. Our visual field is packed with so much information that our perception has become a commodity with real estate value.
04:38
Designs are used to exploit our unconscious biases, and algorithms favor content that reaffirms our opinions, so that every little corner of our field of view is being colonized to sell ads. Like when this little red dot comes out in your notifications: it grows and expands, and to your mind, it's huge.

05:00
So I started to think of ways to put a little dirt on my glasses, or change their lenses, and came up with another project. Now, keep in mind this is conceptual. It's not a real product. It's a web browser plug-in that could help us to notice the things that we would usually ignore.
05:20
Like the helmet, the plug-in reshapes reality, but this time, directly into the digital media itself. It shouts out the hidden filtered voices. What you should be noticing now will be bigger and vibrant, like here, this story about gender bias emerging from the sea of cats.

05:40
(Laughter)

05:42
The plug-in could dilute the things that are being amplified by an algorithm. Like, here in this comment section, there are lots of people shouting about the same opinions. The plug-in makes their comments super small.

05:57
(Laughter)

05:58
So now the amount of pixel presence they have on the screen is proportional to the actual value they are contributing to the conversation.

06:07
(Laughter)

06:11
(Applause)
06:16
The plug-in also shows the real estate value of our visual field and how much of our perception is being commoditized. Different from ad blockers, for every ad you see on the web page, it shows the amount of money you should be earning.

06:32
(Laughter)

06:34
We are living in a battlefield between reality and commercially distributed reality, so the next version of the plug-in could strike away that commercial reality and show you things as they really are.

06:47
(Laughter)

06:50
(Applause)
06:55
Well, you can imagine how many directions this could really go. Believe me, I know the risks are high if this were to become a real product. And I created this with good intentions, to train our perception and eliminate biases. But the same approach could be used with bad intentions, like forcing citizens to install a plug-in like that to control the public narrative. It's challenging to make it fair and personal without it just becoming another layer of mediation.
07:27
So what does all this mean for us? Even though technology is creating this isolation, we could use it to make the world connected again by breaking the existing model and going beyond it. By exploring how we interface with these technologies, we could step out of our habitual, almost machine-like behavior and finally find common ground between each other.

07:54
Technology is never neutral. It provides a context and frames reality. It's part of the problem and part of the solution. We could use it to uncover our blind spots and retrain our perception, and consequently, choose how we see each other.

08:13
Thank you.

08:15
(Applause)
이 μ›Ήμ‚¬μ΄νŠΈ 정보

이 μ‚¬μ΄νŠΈλŠ” μ˜μ–΄ ν•™μŠ΅μ— μœ μš©ν•œ YouTube λ™μ˜μƒμ„ μ†Œκ°œν•©λ‹ˆλ‹€. μ „ 세계 졜고의 μ„ μƒλ‹˜λ“€μ΄ κ°€λ₯΄μΉ˜λŠ” μ˜μ–΄ μˆ˜μ—…μ„ 보게 될 κ²ƒμž…λ‹ˆλ‹€. 각 λ™μ˜μƒ νŽ˜μ΄μ§€μ— ν‘œμ‹œλ˜λŠ” μ˜μ–΄ μžλ§‰μ„ 더블 ν΄λ¦­ν•˜λ©΄ κ·Έκ³³μ—μ„œ λ™μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€. λΉ„λ””μ˜€ μž¬μƒμ— 맞좰 μžλ§‰μ΄ μŠ€ν¬λ‘€λ©λ‹ˆλ‹€. μ˜κ²¬μ΄λ‚˜ μš”μ²­μ΄ μžˆλŠ” 경우 이 문의 양식을 μ‚¬μš©ν•˜μ—¬ λ¬Έμ˜ν•˜μ‹­μ‹œμ˜€.

https://forms.gle/WvT1wiN1qDtmnspy7