Shape-shifting tech will change work as we know it | Sean Follmer

239,137 views ・ 2016-03-01

TED

00:12
We've evolved with tools, and tools have evolved with us. Our ancestors created these hand axes 1.5 million years ago, shaping them to not only fit the task at hand but also their hand.

00:26
However, over the years, tools have become more and more specialized. These sculpting tools have evolved through their use, and each one has a different form which matches its function. And they leverage the dexterity of our hands in order to manipulate things with much more precision.

00:45
But as tools have become more and more complex, we need more complex controls to control them. And so designers have become very adept at creating interfaces that allow you to manipulate parameters while you're attending to other things, such as taking a photograph and changing the focus or the aperture.

01:05
But the computer has fundamentally changed the way we think about tools because computation is dynamic. So it can do a million different things and run a million different applications. However, computers have the same static physical form for all of these different applications and the same static interface elements as well. And I believe that this is fundamentally a problem, because it doesn't really allow us to interact with our hands and capture the rich dexterity that we have in our bodies.
01:36
And my belief is that we need new types of interfaces that can capture these rich abilities that we have, that can physically adapt to us and allow us to interact in new ways. And so that's what I've been doing at the MIT Media Lab and now at Stanford.

01:53
So with my colleagues Daniel Leithinger and Hiroshi Ishii, we created inFORM, where the interface can actually come off the screen and you can physically manipulate it. Or you can visualize 3D information physically and touch it and feel it to understand it in new ways. Or you can interact through gestures and direct deformations to sculpt digital clay. Or interface elements can arise out of the surface and change on demand. And the idea is that for each individual application, the physical form can be matched to the application. And I believe this represents a new way that we can interact with information, by making it physical.
02:43
So the question is, how can we use this? Traditionally, urban planners and architects build physical models of cities and buildings to better understand them. So with Tony Tang at the Media Lab, we created an interface built on inFORM to allow urban planners to design and view entire cities. And now you can walk around it, but it's dynamic, it's physical, and you can also interact directly. Or you can look at different views, such as population or traffic information, but it's made physical.

03:14
We also believe that these dynamic shape displays can really change the ways that we remotely collaborate with people. So when we're working together in person, I'm not only looking at your face but I'm also gesturing and manipulating objects, and that's really hard to do when you're using tools like Skype. And so using inFORM, you can reach out from the screen and manipulate things at a distance. So we used the pins of the display to represent people's hands, allowing them to actually touch and manipulate objects at a distance.
03:50
And you can also manipulate and collaborate on 3D data sets, so you can gesture around them as well as manipulate them. And that allows people to collaborate on these new types of 3D information in a richer way than might be possible with traditional tools. And so you can also bring in existing objects, and those will be captured on one side and transmitted to the other. Or you can have an object that's linked between two places, so as I move a ball on one side, the ball moves on the other as well. And so we do this by capturing the remote user using a depth-sensing camera like a Microsoft Kinect.

04:28
Now, you might be wondering how this all works. Essentially, it's 900 linear actuators connected to mechanical linkages that allow motion down here to be propagated to the pins above. So it's not that complex compared to what's going on at CERN, but it did take a long time for us to build it.
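
To make that pipeline concrete: the talk describes a depth-sensing camera (such as a Microsoft Kinect) capturing the remote user, and 900 pins rendering that capture physically. Below is a minimal sketch of one plausible depth-to-pin mapping, assuming a 30 x 30 pin grid and made-up depth and travel ranges; it is an illustration, not the inFORM codebase.

```python
import numpy as np

# Hypothetical sketch (not the authors' code): map one depth frame from a
# Kinect-like sensor to target heights for a 30 x 30 grid of pins
# (900 actuators, as described in the talk). Units and ranges are assumptions.

GRID = 30                           # pins per side, 30 * 30 = 900
PIN_TRAVEL_MM = 100.0               # assumed maximum pin extension
NEAR_MM, FAR_MM = 500.0, 1500.0     # assumed useful depth range of the sensor

def depth_to_pin_heights(depth_mm: np.ndarray) -> np.ndarray:
    """Convert a depth image (millimetres, H x W) into a GRID x GRID height map."""
    h, w = depth_mm.shape
    # Downsample by averaging blocks so each pin covers a patch of the image.
    bh, bw = h // GRID, w // GRID
    blocks = depth_mm[:bh * GRID, :bw * GRID].reshape(GRID, bh, GRID, bw)
    coarse = blocks.mean(axis=(1, 3))
    # Closer objects (smaller depth values) should push pins up higher.
    norm = np.clip((FAR_MM - coarse) / (FAR_MM - NEAR_MM), 0.0, 1.0)
    return norm * PIN_TRAVEL_MM

# Example with a synthetic frame: a "hand" 600 mm away over a 1400 mm background.
frame = np.full((480, 640), 1400.0)
frame[200:280, 300:400] = 600.0
heights = depth_to_pin_heights(frame)   # 30 x 30 array of target heights in mm
```

Averaging a patch of the depth image per pin keeps the rendered hand smooth at the display's coarse resolution; the same kind of mapping could run in both directions for the linked-object case the talk mentions.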
04:47
And so we started with a single motor, a single linear actuator, and then we had to design a custom circuit board to control them. And then we had to make a lot of them. The problem with having 900 of something is that you have to do every step 900 times. And so that meant that we had a lot of work to do. So we sort of set up a mini-sweatshop in the Media Lab and brought undergrads in and convinced them to do "research" -- (Laughter) -- and had late nights watching movies, eating pizza and screwing in thousands of screws. You know -- research. (Laughter) But anyway, I think that we were really excited by the things that inFORM allowed us to do.

05:27
Increasingly, we're using mobile devices, and we interact on the go. But mobile devices, just like computers, are used for so many different applications. So you use them to talk on the phone, to surf the web, to play games, to take pictures or even a million different things. But again, they have the same static physical form for each of these applications. And so we wanted to know how we can take some of the same interactions that we developed for inFORM and bring them to mobile devices.

05:56
So at Stanford, we created this haptic edge display, which is a mobile device with an array of linear actuators that can change shape, so you can feel in your hand where you are as you're reading a book. Or you can feel in your pocket new types of tactile sensations that are richer than vibration. Or buttons can emerge from the side that allow you to interact where you want them to be. Or you can play games and have actual buttons. We were able to do this by embedding 40 small linear actuators inside the device, which allows you not only to touch them but also to back-drive them.
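
As a rough illustration of how a strip of 40 edge-mounted actuators could render "where you are in a book": the actuator count comes from the talk, but the travel range, bump shape, and function names below are my assumptions, not the published device code.

```python
import numpy as np

# Hypothetical sketch: render a reading position as a raised "bump" along a
# strip of 40 edge actuators. Travel range and bump width are assumptions.

NUM_ACTUATORS = 40
MAX_TRAVEL_MM = 2.0   # assumed per-pin travel on a mobile-scale display

def scroll_profile(progress: float, bump_width: float = 3.0) -> np.ndarray:
    """Return target heights (mm) marking how far through a document you are.

    progress: 0.0 (start of the book) .. 1.0 (end).
    """
    centre = progress * (NUM_ACTUATORS - 1)
    idx = np.arange(NUM_ACTUATORS)
    # Gaussian bump centred on the current position, so the marker feels smooth.
    heights = np.exp(-0.5 * ((idx - centre) / bump_width) ** 2)
    return heights * MAX_TRAVEL_MM

# Halfway through a book: actuators near the middle of the edge rise up.
targets = scroll_profile(0.5)
```

Because the actuators are back-drivable, the same array could also be read as input, for example by polling how far a user has pressed a raised region back down.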
06:36
But we've also looked at other ways to create more complex shape change. So we've used pneumatic actuation to create a morphing device where you can go from something that looks a lot like a phone ... to a wristband on the go. And so together with Ken Nakagaki at the Media Lab, we created this new high-resolution version that uses an array of servomotors to change from interactive wristband to a touch-input device to a phone. (Laughter)

07:10
And we're also interested in looking at ways that users can actually deform the interfaces to shape them into the devices that they want to use. So you can make something like a game controller, and then the system will understand what shape it's in and change to that mode.
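
One plausible way for such a system to "understand what shape it's in" is to compare joint or bend-sensor readings against stored template poses and switch modes on the closest match. This is an assumption about how it could work, not the speaker's stated method; the template names and angles below are made up for illustration.

```python
import numpy as np

# Hypothetical sketch: classify the device's current shape from joint-angle
# sensor readings by nearest-neighbour matching against stored template poses,
# then switch the interface into the corresponding mode.

TEMPLATES = {                      # joint angles in degrees, invented for illustration
    "flat_phone":      np.array([0.0, 0.0, 0.0, 0.0]),
    "wristband":       np.array([80.0, 80.0, 80.0, 80.0]),
    "game_controller": np.array([45.0, 0.0, 0.0, 45.0]),
}

def classify_shape(joint_angles: np.ndarray, threshold_deg: float = 30.0) -> str:
    """Return the name of the closest template, or 'unknown' if nothing is close."""
    best_name, best_dist = "unknown", float("inf")
    for name, template in TEMPLATES.items():
        dist = np.linalg.norm(joint_angles - template)
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist <= threshold_deg else "unknown"

# A user bends the two outer segments up: the system switches to controller mode.
mode = classify_shape(np.array([40.0, 5.0, 2.0, 43.0]))   # -> "game_controller"
```

The distance threshold is there so that in-between shapes do not trigger spurious mode switches while the user is still folding the device.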
07:26
So, where does this point? How do we move forward from here? I think, really, where we are today is in this new age of the Internet of Things, where we have computers everywhere -- they're in our pockets, they're in our walls, they're in almost every device that you'll buy in the next five years. But what if we stopped thinking about devices and thought instead about environments? How can we have smart furniture or smart rooms or smart environments or cities that can adapt to us physically, and allow us to collaborate with people in new ways and do new types of tasks?

08:03
So for Milan Design Week, we created TRANSFORM, which is an interactive, table-scale version of these shape displays that can move physical objects on the surface -- for example, reminding you to take your keys. But it can also transform to fit different ways of interacting. So if you want to work, then it can change to sort of set up your work system. And as you bring a device over, it creates all the affordances you need and brings other objects to help you accomplish those goals.

08:37
So, in conclusion, I really think that we need to think about a new, fundamentally different way of interacting with computers. We need computers that can physically adapt to us and adapt to the ways that we want to use them, and really harness the rich dexterity that we have in our hands and our ability to think spatially about information by making it physical. But looking forward, I think we need to go beyond this, beyond devices, to really think about new ways that we can bring people together, bring our information into the world, and think about smart environments that can adapt to us physically. So with that, I will leave you. Thank you very much.

(Applause)