The Rise of Personal Robots | Cynthia Breazeal | TED Talks

TED | 2011-02-08

ืžืชืจื’ื: amir Goldstein ืžื‘ืงืจ: Sigal Tifferet
00:15
Ever since I was a little girl, seeing "Star Wars" for the first time, I've been fascinated by this idea of personal robots. And as a little girl, I loved the idea of a robot that interacted with us much more like a helpful, trusted sidekick -- something that would delight us, enrich our lives and help us save a galaxy or two. I knew robots like that didn't really exist, but I knew I wanted to build them.

00:42
So 20 years pass -- I am now a graduate student at MIT studying artificial intelligence, the year is 1997, and NASA has just landed the first robot on Mars. But robots are still not in our homes, ironically. And I remember thinking about all the reasons why that was the case. But one really struck me. Robotics had really been about interacting with things, not with people -- certainly not in a social way that would be natural for us and would really help people accept robots into our daily lives. For me, that was the white space; that's what robots could not do yet. And so that year, I started to build this robot, Kismet, the world's first social robot. Three years later -- a lot of programming, working with other graduate students in the lab -- Kismet was ready to start interacting with people.

01:30
(Video) Scientist: I want to show you something.
Kismet: (Nonsense)
Scientist: This is a watch that my girlfriend gave me.
Kismet: (Nonsense)
Scientist: Yeah, look, it's got a little blue light in it too. I almost lost it this week.

01:44
Cynthia Breazeal: So Kismet interacted with people like kind of a non-verbal child or pre-verbal child, which I assume was fitting because it was really the first of its kind. It didn't speak language, but it didn't matter. This little robot was somehow able to tap into something deeply social within us -- and with that, the promise of an entirely new way we could interact with robots.

02:04
So over the past several years I've been continuing to explore this interpersonal dimension of robots, now at the Media Lab with my own team of incredibly talented students. And one of my favorite robots is Leonardo. We developed Leonardo in collaboration with Stan Winston Studio. And so I want to show you a special moment for me of Leo. This is Matt Berlin interacting with Leo, introducing Leo to a new object. And because it's new, Leo doesn't really know what to make of it. But sort of like us, he can actually learn about it from watching Matt's reaction.

02:33
(Video) Matt Berlin: Hello, Leo. Leo, this is Cookie Monster. Can you find Cookie Monster? Leo, Cookie Monster is very bad. He's very bad, Leo. Cookie Monster is very, very bad. He's a scary monster. He wants to get your cookies.

03:12
(Laughter)

03:14
CB: All right, so Leo and Cookie might have gotten off to a little bit of a rough start, but they get along great now.

03:22
So what I've learned through building these systems is that robots are actually a really intriguing social technology, where it's actually their ability to push our social buttons and to interact with us like a partner that is a core part of their functionality. And with that shift in thinking, we can now start to imagine new questions, new possibilities for robots that we might not have thought about otherwise.

03:47
But what do I mean when I say "push our social buttons?" Well, one of the things that we've learned is that, if we design these robots to communicate with us using the same body language, the same sort of non-verbal cues that people use -- like Nexi, our humanoid robot, is doing here -- what we find is that people respond to robots a lot like they respond to people. People use these cues to determine things like how persuasive someone is, how likable, how engaging, how trustworthy. It turns out it's the same for robots.

04:13
It's turning out now that robots are actually becoming a really interesting new scientific tool to understand human behavior. To answer questions like, how is it that, from a brief encounter, we're able to make an estimate of how trustworthy another person is? Mimicry's believed to play a role, but how? Is it the mimicking of particular gestures that matters? It turns out it's really hard to learn this or understand this from watching people because when we interact we do all of these cues automatically. We can't carefully control them because they're subconscious for us. But with the robot, you can. And so in this video here -- this is a video taken from David DeSteno's lab at Northeastern University. He's a psychologist we've been collaborating with. There's actually a scientist carefully controlling Nexi's cues to be able to study this question. And the bottom line is -- the reason why this works is because it turns out people just behave like people even when interacting with a robot.

05:03
So given that key insight, we can now start to imagine new kinds of applications for robots. For instance, if robots do respond to our non-verbal cues, maybe they would be a cool, new communication technology. So imagine this: What about a robot accessory for your cellphone? You call your friend, she puts her handset in a robot, and, bam! You're a MeBot -- you can make eye contact, you can talk with your friends, you can move around, you can gesture -- maybe the next best thing to really being there, or is it?

05:33
To explore this question, my student, Siggy Adalgeirsson, did a study where we brought human participants, people, into our lab to do a collaborative task with a remote collaborator. The task involved things like looking at a set of objects on the table, discussing them in terms of their importance and relevance to performing a certain task -- this ended up being a survival task -- and then rating them in terms of how valuable and important they thought they were. The remote collaborator was an experimenter from our group who used one of three different technologies to interact with the participants.

06:05
The first was just the screen. This is just like video conferencing today. The next was to add mobility -- so, have the screen on a mobile base. This is like, if you're familiar with any of the telepresence robots today -- this is mirroring that situation. And then the fully expressive MeBot.

06:21
So after the interaction, we asked people to rate their quality of interaction with the technology, with a remote collaborator through this technology, in a number of different ways. We looked at psychological involvement -- how much empathy did you feel for the other person? We looked at overall engagement. We looked at their desire to cooperate. And this is what we see when they use just the screen. It turns out, when you add mobility -- the ability to roll around the table -- you get a little more of a boost. And you get even more of a boost when you add the full expression. So it seems like this physical, social embodiment actually really makes a difference.

06:54
Now let's try to put this into a little bit of context. Today we know that families are living further and further apart, and that definitely takes a toll on family relationships and family bonds over distance. For me, I have three young boys, and I want them to have a really good relationship with their grandparents. But my parents live thousands of miles away, so they just don't get to see each other that often. We try Skype, we try phone calls, but my boys are little -- they don't really want to talk; they want to play.

07:20
So I love the idea of thinking about robots as a new kind of distance-play technology. I imagine a time not too far from now -- my mom can go to her computer, open up a browser and jack into a little robot. And as grandma-bot, she can now play, really play, with my sons, with her grandsons, in the real world with their real toys. I could imagine grandmothers being able to do social play with their granddaughters, with their friends, and to be able to share all kinds of other activities around the house, like sharing a bedtime story. And through this technology, being able to be an active participant in their grandchildren's lives in a way that's not possible today.

07:58
Let's think about some other domains, like maybe health. So in the United States today, over 65 percent of people are either overweight or obese, and now it's a big problem with our children as well. And we know that as you get older in life, if you're obese when you're younger, that can lead to chronic diseases that not only reduce your quality of life, but are a tremendous economic burden on our health care system.

08:19
But if robots can be engaging, if we like to cooperate with robots, if robots are persuasive, maybe a robot can help you maintain a diet and exercise program, maybe they can help you manage your weight. Sort of like a digital Jiminy -- as in the well-known fairy tale -- a kind of friendly, supportive presence that's always there to be able to help you make the right decision in the right way at the right time to help you form healthy habits.

08:44
So we actually explored this idea in our lab. This is a robot, Autom. Cory Kidd developed this robot for his doctoral work. And it was designed to be a robot diet-and-exercise coach. It had a couple of simple non-verbal skills it could do. It could make eye contact with you. It could share information looking down at a screen. You'd use a screen interface to enter information, like how many calories you ate that day, how much exercise you got. And then it could help track that for you. And the robot spoke with a synthetic voice to engage you in a coaching dialogue modeled after trainers and patients and so forth. And it would build a working alliance with you through that dialogue. It could help you set goals and track your progress, and it would help motivate you.

09:24
So an interesting question is, does the social embodiment really matter? Does it matter that it's a robot? Is it really just the quality of advice and information that matters? To explore that question, we did a study in the Boston area where we put one of three interventions in people's homes for a period of several weeks. One case was the robot you saw there, Autom. Another was a computer that ran the same touch-screen interface, ran exactly the same dialogues. The quality of advice was identical. And the third was just a pen and paper log, because that's the standard intervention you typically get when you start a diet-and-exercise program.

09:58
So one of the things we really wanted to look at was not how much weight people lost, but really how long they interacted with the robot. Because the challenge is not losing weight, it's actually keeping it off. And the longer you could interact with one of these interventions, well that's indicative, potentially, of longer-term success. So the first thing I want to look at is how long, how long did people interact with these systems. It turns out that people interacted with the robot significantly more, even though the quality of the advice was identical to the computer.

10:28
When we asked people to rate it in terms of the quality of the working alliance, people rated the robot higher and they trusted the robot more.

10:35
(Laughter)

10:37
And when you look at emotional engagement, it was completely different. People would name the robots. They would dress the robots.

10:45
(Laughter)

10:47
And even when we would come up to pick up the robots at the end of the study, they would come out to the car and say good-bye to the robots. They didn't do this with a computer.

10:54
The last thing I want to talk about today is the future of children's media. We know that kids spend a lot of time behind screens today, whether it's television or computer games or whatnot. My sons, they love the screen. They love the screen. But I want them to play; as a mom, I want them to play, like, real-world play.

11:12
And so I have a new project in my group I wanted to present to you today called Playtime Computing that's really trying to think about how we can take what's so engaging about digital media and literally bring it off the screen into the real world of the child, where it can take on many of the properties of real-world play.

11:29
So here's the first exploration of this idea, where characters can be physical or virtual, and where the digital content can literally come off the screen into the world and back. I like to think of this as the Atari Pong of this blended-reality play. But we can push this idea further. What if --

11:52
(Game) Nathan: Here it comes. Yay!

11:55
CB: -- the character itself could come into your world? It turns out that kids love it when the character becomes real and enters into their world. And when it's in their world, they can relate to it and play with it in a way that's fundamentally different from how they play with it on the screen. Another important idea is this notion of persistence of character across realities. So changes that children make in the real world need to translate to the virtual world. So here, Nathan has changed the letter A to the number 2. You can imagine maybe these symbols give the characters special powers when they go into the virtual world. So they are now sending the character back into that world. And now it's got number power.

12:32
And then finally, what I've been trying to do here is create a really immersive experience for kids, where they really feel like they are part of that story, a part of that experience. And I really want to spark their imaginations the way mine was sparked as a little girl watching "Star Wars." But I want to do more than that. I actually want them to create those experiences. I want them to be able to literally build their imagination into these experiences and make them their own.

12:56
So we've been exploring a lot of ideas in telepresence and mixed reality to literally allow kids to project their ideas into this space where other kids can interact with them and build upon them. I really want to come up with new ways of children's media that foster creativity and learning and innovation. I think that's very, very important.

13:16
So this is a new project. We've invited a lot of kids into this space, and they think it's pretty cool. But I can tell you, the thing that they love the most is the robot. What they care about is the robot.

13:30
Robots touch something deeply human within us. And so whether they're helping us to become creative and innovative, or whether they're helping us to feel more deeply connected despite distance, or whether they are our trusted sidekick who's helping us attain our personal goals in becoming our highest and best selves, for me, robots are all about people.

13:50
Thank you.

13:52
(Applause)
ืขืœ ืืชืจ ื–ื”

ืืชืจ ื–ื” ื™ืฆื™ื’ ื‘ืคื ื™ื›ื ืกืจื˜ื•ื ื™ YouTube ื”ืžื•ืขื™ืœื™ื ืœืœื™ืžื•ื“ ืื ื’ืœื™ืช. ืชื•ื›ืœื• ืœืจืื•ืช ืฉื™ืขื•ืจื™ ืื ื’ืœื™ืช ื”ืžื•ืขื‘ืจื™ื ืขืœ ื™ื“ื™ ืžื•ืจื™ื ืžื”ืฉื•ืจื” ื”ืจืืฉื•ื ื” ืžืจื—ื‘ื™ ื”ืขื•ืœื. ืœื—ืฅ ืคืขืžื™ื™ื ืขืœ ื”ื›ืชื•ื‘ื™ื•ืช ื‘ืื ื’ืœื™ืช ื”ืžื•ืฆื’ื•ืช ื‘ื›ืœ ื“ืฃ ื•ื™ื“ืื• ื›ื“ื™ ืœื”ืคืขื™ืœ ืืช ื”ืกืจื˜ื•ืŸ ืžืฉื. ื”ื›ืชื•ื‘ื™ื•ืช ื’ื•ืœืœื•ืช ื‘ืกื ื›ืจื•ืŸ ืขื ื”ืคืขืœืช ื”ื•ื•ื™ื“ืื•. ืื ื™ืฉ ืœืš ื”ืขืจื•ืช ืื• ื‘ืงืฉื•ืช, ืื ื ืฆื•ืจ ืื™ืชื ื• ืงืฉืจ ื‘ืืžืฆืขื•ืช ื˜ื•ืคืก ื™ืฆื™ืจืช ืงืฉืจ ื–ื”.

https://forms.gle/WvT1wiN1qDtmnspy7