Why I draw with robots | Sougwen Chung

27,896 views ・ 2020-02-14

TED


μ•„λž˜ μ˜λ¬Έμžλ§‰μ„ λ”λΈ”ν΄λ¦­ν•˜μ‹œλ©΄ μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€.

00:00
Translator: Ivana Korom Reviewer: Camille Martínez
Korean translation: YoonJu Mangione Review: Yeowoon Yi
00:12
Many of us here use technology in our day-to-day.

00:16
And some of us rely on technology to do our jobs.

00:19
For a while, I thought of machines and the technologies that drive them

00:23
as perfect tools that could make my work more efficient and more productive.

00:28
But with the rise of automation across so many different industries,

00:31
I began to wonder:

00:33
If machines are starting to be able to do the work

00:35
traditionally done by humans,

00:37
what will become of the human hand?

00:40
How does our desire for perfection, precision and automation

00:44
affect our ability to be creative?

00:46
In my work as an artist and researcher, I explore AI and robotics

00:50
to develop new processes for human creativity.

00:54
For the past few years,

00:55
I've made work alongside machines, data and emerging technologies.

01:00
It's part of a lifelong fascination

01:02
about the dynamics of individuals and systems

01:04
and all the messiness that that entails.

01:07
It's how I'm exploring questions about where AI ends and we begin

01:12
and where I'm developing processes

01:13
that investigate potential sensory mixes of the future.

01:17
I think it's where philosophy and technology intersect.

01:20
Doing this work has taught me a few things.

01:23
It's taught me how embracing imperfection

01:26
can actually teach us something about ourselves.

01:29
It's taught me that exploring art

01:31
can actually help shape the technology that shapes us.

01:35
And it's taught me that combining AI and robotics

01:38
with traditional forms of creativity -- visual arts in my case --

01:41
can help us think a little bit more deeply

01:44
about what is human and what is the machine.

01:47
And it's led me to the realization

01:49
that collaboration is the key to creating the space for both

01:52
as we move forward.

01:54
It all started with a simple experiment with machines,

01:57
called "Drawing Operations Unit: Generation 1."

02:00
I call the machine "D.O.U.G." for short.

02:02
Before I built D.O.U.G.,

02:04
I didn't know anything about building robots.

02:07
I took some open-source robotic arm designs,

02:10
and I hacked together a system where the robot would match my gestures

02:13
and follow [them] in real time.

02:15
The premise was simple:

02:16
I would lead, and it would follow.

02:19
I would draw a line, and it would mimic my line.
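The lead-follow loop described here can be sketched in a few lines. This is not Chung's actual code (the talk gives no implementation details); it is a minimal toy, assuming the human gesture arrives as sampled pen positions and modeling the physical robot's error as small random offsets -- the on-screen simulation would be the same loop with `noise=0`.

```python
import random

def follow(gesture, noise=0.02):
    """Toy lead-follow loop: the 'robot' replays each pen position
    the human draws. In the on-screen simulation the copy is
    pixel-perfect; real servos add jitter, modeled here as small
    random offsets -- the slip and falter the talk describes.

    gesture: list of (x, y) pen positions sampled over time.
    """
    robot_path = []
    for x, y in gesture:
        robot_path.append((x + random.uniform(-noise, noise),
                           y + random.uniform(-noise, noise)))
    return robot_path

# a simple human-drawn curve, sampled at 11 points
human_line = [(t / 10.0, (t / 10.0) ** 2) for t in range(11)]
machine_line = follow(human_line)
print(len(machine_line))  # one robot point per human point
```

With `noise=0` the two paths coincide; any positive noise keeps the robot's line recognizably "interpreting" the human's without tracking it perfectly.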
02:22
So back in 2015, there we were, drawing for the first time,

02:26
in front of a small audience in New York City.

02:28
The process was pretty sparse --

02:31
no lights, no sounds, nothing to hide behind.

02:35
Just my palms sweating and the robot's new servos heating up.

02:38
(Laughs) Clearly, we were not built for this.

02:41
But something interesting happened, something I didn't anticipate.

02:45
See, D.O.U.G., in its primitive form, wasn't tracking my line perfectly.

02:49
While in the simulation that happened onscreen

02:52
it was pixel-perfect,

02:53
in physical reality, it was a different story.

02:56
It would slip and slide and punctuate and falter,

02:59
and I would be forced to respond.

03:01
There was nothing pristine about it.

03:03
And yet, somehow, the mistakes made the work more interesting.

03:06
The machine was interpreting my line but not perfectly.

03:09
And I was forced to respond.

03:10
We were adapting to each other in real time.

03:13
And seeing this taught me a few things.

03:15
It showed me that our mistakes actually made the work more interesting.

03:20
And I realized that, you know, through the imperfection of the machine,

03:24
our imperfections became what was beautiful about the interaction.

03:29
And I was excited, because it led me to the realization

03:32
that maybe part of the beauty of human and machine systems

03:36
is their shared inherent fallibility.

03:39
For the second generation of D.O.U.G.,

03:41
I knew I wanted to explore this idea.

03:43
But instead of an accident produced by pushing a robotic arm to its limits,

03:47
I wanted to design a system that would respond to my drawings

03:50
in ways that I didn't expect.

03:52
So, I used a visual algorithm to extract visual information

03:56
from decades of my digital and analog drawings.

03:59
I trained a neural net on these drawings

04:01
in order to generate recurring patterns in the work

04:04
that were then fed through custom software back into the machine.
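The talk names a neural net but not its architecture, so any concrete example is a stand-in. The sketch below swaps in a much simpler generative model -- a Markov chain over stroke-direction tokens -- purely to show the same train-then-generate loop: learn recurring patterns from a corpus of past drawings, then sample new sequences to feed back to the machine. The corpus, tokens, and function names here are all hypothetical.

```python
import random
from collections import defaultdict

def train_stroke_model(drawings):
    """Learn which stroke tends to follow which across a corpus of
    past drawings. Each drawing is a sequence of direction tokens.
    (A deliberately simple stand-in for the neural net in the talk.)"""
    transitions = defaultdict(list)
    for drawing in drawings:
        for a, b in zip(drawing, drawing[1:]):
            transitions[a].append(b)
    return transitions

def generate(transitions, start, length, rng=random):
    """Sample a new stroke sequence in the style of the corpus --
    the kind of output that would be fed back to the robot to draw."""
    seq = [start]
    for _ in range(length - 1):
        options = transitions.get(seq[-1])
        if not options:  # dead end: this stroke was never followed by another
            break
        seq.append(rng.choice(options))
    return seq

# hypothetical tagged corpus of past drawings
corpus = [
    ["up", "right", "arc", "right"],
    ["up", "right", "right", "down"],
    ["arc", "right", "down"],
]
model = train_stroke_model(corpus)
print(generate(model, "up", 5))
```

The point of the design is the feedback loop, not the model class: anything that can be trained on the back catalog and sampled live (here a Markov chain, in the talk a neural net) can play the "brain" role of D.O.U.G._2.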
04:07
I painstakingly collected as many of my drawings as I could find --

04:12
finished works, unfinished experiments and random sketches --

04:16
and tagged them for the AI system.

04:18
And since I'm an artist, I've been making work for over 20 years.

04:22
Collecting that many drawings took months --

04:24
it was a whole thing.

04:25
And here's the thing about training AI systems:

04:28
it's actually a lot of hard work.

04:31
A lot of work goes on behind the scenes.

04:33
But in doing the work, I realized a little bit more

04:35
about how the architecture of an AI is constructed.

04:39
And I realized it's not just made of models and classifiers

04:42
for the neural network,

04:43
but it's a fundamentally malleable and shapable system,

04:47
one in which the human hand is always present.

04:50
It's far from the omnipotent AI we've been told to believe in.

04:54
So I collected these drawings for the neural net.

04:56
And we realized something that wasn't previously possible.

05:00
My robot D.O.U.G. became a real-time interactive reflection

05:05
of the work I'd done through the course of my life.

05:07
The data was personal, but the results were powerful.

05:11
And I got really excited,

05:13
because I started thinking maybe machines don't need to be just tools,

05:17
but they can function as nonhuman collaborators.

05:21
And even more than that,

05:23
I thought maybe the future of human creativity

05:25
isn't in what it makes

05:27
but how it comes together to explore new ways of making.

05:31
So if D.O.U.G._1 was the muscle,

05:33
and D.O.U.G._2 was the brain,

05:35
then I like to think of D.O.U.G._3 as the family.

05:38
I knew I wanted to explore this idea of human-nonhuman collaboration at scale.

05:43
So over the past few months,

05:44
I worked with my team to develop 20 custom robots

05:47
that could work with me as a collective.

05:49
They would work as a group,

05:51
and together, we would collaborate with all of New York City.

05:54
I was really inspired by Stanford researcher Fei-Fei Li,

05:57
who said, "If we want to teach machines how to think,

05:59
we need to first teach them how to see."

06:01
It made me think of the past decade of my life in New York,

06:04
and how I'd been all watched over by these surveillance cameras around the city.

06:08
And I thought it would be really interesting

06:10
if I could use them to teach my robots to see.

06:12
So with this project,

06:14
I thought about the gaze of the machine,

06:16
and I began to think about vision as multidimensional,

06:20
as views from somewhere.

06:22
We collected video

06:24
from publicly available camera feeds on the internet

06:27
of people walking on the sidewalks,

06:28
cars and taxis on the road,

06:30
all kinds of urban movement.

06:33
We trained a vision algorithm on those feeds

06:35
based on a technique called "optical flow,"

06:38
to analyze the collective density,

06:40
direction, dwell and velocity states of urban movement.

06:44
Our system extracted those states from the feeds as positional data

06:48
and became pads for my robotic units to draw on.
06:51
Instead of a collaboration of one-to-one,

06:54
we made a collaboration of many-to-many.

06:57
By combining the vision of human and machine in the city,

07:01
we reimagined what a landscape painting could be.

07:03
Throughout all of my experiments with D.O.U.G.,

07:06
no two performances have ever been the same.

07:08
And through collaboration,

07:10
we create something that neither of us could have done alone:

07:13
we explore the boundaries of our creativity,

07:15
human and nonhuman working in parallel.

07:19
I think this is just the beginning.

07:22
This year, I've launched Scilicet,

07:24
my new lab exploring human and interhuman collaboration.

07:29
We're really interested in the feedback loop

07:31
between individual, artificial and ecological systems.

07:36
We're connecting human and machine output

07:38
to biometrics and other kinds of environmental data.

07:41
We're inviting anyone who's interested in the future of work, systems

07:45
and interhuman collaboration

07:47
to explore with us.

07:48
We know it's not just technologists that have to do this work

07:52
and that we all have a role to play.

07:54
We believe that by teaching machines

07:56
how to do the work traditionally done by humans,

07:59
we can explore and evolve our criteria

08:02
of what's made possible by the human hand.

08:04
And part of that journey is embracing the imperfections

08:08
and recognizing the fallibility of both human and machine,

08:12
in order to expand the potential of both.

08:14
Today, I'm still in pursuit of finding the beauty

08:17
in human and nonhuman creativity.

08:19
In the future, I have no idea what that will look like,

08:23
but I'm pretty curious to find out.

08:25
Thank you.

08:26
(Applause)
이 μ›Ήμ‚¬μ΄νŠΈ 정보

이 μ‚¬μ΄νŠΈλŠ” μ˜μ–΄ ν•™μŠ΅μ— μœ μš©ν•œ YouTube λ™μ˜μƒμ„ μ†Œκ°œν•©λ‹ˆλ‹€. μ „ 세계 졜고의 μ„ μƒλ‹˜λ“€μ΄ κ°€λ₯΄μΉ˜λŠ” μ˜μ–΄ μˆ˜μ—…μ„ 보게 될 κ²ƒμž…λ‹ˆλ‹€. 각 λ™μ˜μƒ νŽ˜μ΄μ§€μ— ν‘œμ‹œλ˜λŠ” μ˜μ–΄ μžλ§‰μ„ 더블 ν΄λ¦­ν•˜λ©΄ κ·Έκ³³μ—μ„œ λ™μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€. λΉ„λ””μ˜€ μž¬μƒμ— 맞좰 μžλ§‰μ΄ μŠ€ν¬λ‘€λ©λ‹ˆλ‹€. μ˜κ²¬μ΄λ‚˜ μš”μ²­μ΄ μžˆλŠ” 경우 이 문의 양식을 μ‚¬μš©ν•˜μ—¬ λ¬Έμ˜ν•˜μ‹­μ‹œμ˜€.

https://forms.gle/WvT1wiN1qDtmnspy7