The Rise of Personal Robots | Cynthia Breazeal | TED Talks

TED ใƒป 2011-02-08

Translated by: J J LEE ใƒป Reviewed by: Seongsu JEONG
00:15
Ever since I was a little girl seeing "Star Wars" for the first time, I've been fascinated by this idea of personal robots. And as a little girl, I loved the idea of a robot that interacted with us much more like a helpful, trusted sidekick -- something that would delight us, enrich our lives and help us save a galaxy or two. I knew robots like that didn't really exist, but I knew I wanted to build them.
00:42
So 20 years pass -- I am now a graduate student at MIT studying artificial intelligence, the year is 1997, and NASA has just landed the first robot on Mars. But robots are still not in our home, ironically. And I remember thinking about all the reasons why that was the case. But one really struck me. Robotics had really been about interacting with things, not with people -- certainly not in a social way that would be natural for us and would really help people accept robots into our daily lives. For me, that was the white space; that's what robots could not do yet.

01:16
And so that year, I started to build this robot, Kismet, the world's first social robot. Three years later -- a lot of programming, working with other graduate students in the lab -- Kismet was ready to start interacting with people.
01:30
(Video) Scientist: I want to show you something.
Kismet: (Nonsense)
Scientist: This is a watch that my girlfriend gave me.
Kismet: (Nonsense)
Scientist: Yeah, look, it's got a little blue light in it too. I almost lost it this week.
01:44
Cynthia Breazeal: So Kismet interacted with people like kind of a non-verbal child or pre-verbal child, which I assume was fitting because it was really the first of its kind. It didn't speak language, but it didn't matter. This little robot was somehow able to tap into something deeply social within us -- and with that, the promise of an entirely new way we could interact with robots.

02:04
So over the past several years I've been continuing to explore this interpersonal dimension of robots, now at the Media Lab with my own team of incredibly talented students. And one of my favorite robots is Leonardo. We developed Leonardo in collaboration with Stan Winston Studio. And so I want to show you a special moment for me of Leo. This is Matt Berlin interacting with Leo, introducing Leo to a new object. And because it's new, Leo doesn't really know what to make of it. But sort of like us, he can actually learn about it from watching Matt's reaction.
02:33
(Video) Matt Berlin: Hello, Leo. Leo, this is Cookie Monster. Can you find Cookie Monster? Leo, Cookie Monster is very bad. He's very bad, Leo. Cookie Monster is very, very bad. He's a scary monster. He wants to get your cookies.

(Laughter)
03:14
CB: All right, so Leo and Cookie might have gotten off to a little bit of a rough start, but they get along great now. So what I've learned through building these systems is that robots are actually a really intriguing social technology, where it's actually their ability to push our social buttons and to interact with us like a partner that is a core part of their functionality. And with that shift in thinking, we can now start to imagine new questions, new possibilities for robots that we might not have thought about otherwise.

03:47
But what do I mean when I say "push our social buttons"? Well, one of the things that we've learned is that, if we design these robots to communicate with us using the same body language, the same sort of non-verbal cues that people use -- like Nexi, our humanoid robot, is doing here -- what we find is that people respond to robots a lot like they respond to people. People use these cues to determine things like how persuasive someone is, how likable, how engaging, how trustworthy. It turns out it's the same for robots.

04:13
It's turning out now that robots are actually becoming a really interesting new scientific tool to understand human behavior. To answer questions like, how is it that, from a brief encounter, we're able to make an estimate of how trustworthy another person is? Mimicry's believed to play a role, but how? Is it the mimicking of particular gestures that matters? It turns out it's really hard to learn this or understand this from watching people, because when we interact we do all of these cues automatically. We can't carefully control them because they're subconscious for us. But with the robot, you can. And so in this video here -- this is a video taken from David DeSteno's lab at Northeastern University. He's a psychologist we've been collaborating with. There's actually a scientist carefully controlling Nexi's cues to be able to study this question. And the bottom line is -- the reason why this works is because it turns out people just behave like people, even when interacting with a robot.
05:03
So given that key insight, we can now start to imagine new kinds of applications for robots. For instance, if robots do respond to our non-verbal cues, maybe they would be a cool, new communication technology. So imagine this: What about a robot accessory for your cellphone? You call your friend, she puts her handset in a robot, and, bam! You're a MeBot -- you can make eye contact, you can talk with your friends, you can move around, you can gesture -- maybe the next best thing to really being there, or is it?

05:33
To explore this question, my student Siggy Adalgeirsson did a study where we brought human participants into our lab to do a collaborative task with a remote collaborator. The task involved things like looking at a set of objects on the table, discussing them in terms of their importance and relevance to performing a certain task -- this ended up being a survival task -- and then rating them in terms of how valuable and important they thought they were. The remote collaborator was an experimenter from our group who used one of three different technologies to interact with the participants. The first was just the screen; this is just like video conferencing today. The next was to add mobility -- so, have the screen on a mobile base. If you're familiar with any of the telepresence robots today, this is mirroring that situation. And then the fully expressive MeBot.

06:21
So after the interaction, we asked people to rate the quality of their interaction with the technology, and with the remote collaborator through this technology, in a number of different ways. We looked at psychological involvement -- how much empathy did you feel for the other person? We looked at overall engagement. We looked at their desire to cooperate. And this is what we see when they use just the screen. It turns out, when you add mobility -- the ability to roll around the table -- you get a little more of a boost. And you get even more of a boost when you add the full expression. So it seems like this physical, social embodiment actually really makes a difference.
06:54
Now let's try to put this into a little bit of context. Today we know that families are living further and further apart, and that definitely takes a toll on family relationships and family bonds over distance. For me, I have three young boys, and I want them to have a really good relationship with their grandparents. But my parents live thousands of miles away, so they just don't get to see each other that often. We try Skype, we try phone calls, but my boys are little -- they don't really want to talk; they want to play.

07:20
So I love the idea of thinking about robots as a new kind of distance-play technology. I imagine a time not too far from now -- my mom can go to her computer, open up a browser and jack into a little robot. And as grandma-bot, she can now play, really play, with my sons, with her grandsons, in the real world with their real toys. I could imagine grandmothers being able to do social plays with their granddaughters, with their friends, and to be able to share all kinds of other activities around the house, like sharing a bedtime story -- and, through this technology, being able to be an active participant in their grandchildren's lives in a way that's not possible today.
07:58
Let's think about some other domains, like maybe health. So in the United States today, over 65 percent of people are either overweight or obese, and now it's a big problem with our children as well. And we know that, as you get older in life, if you're obese when you're younger, that can lead to chronic diseases that not only reduce your quality of life, but are a tremendous economic burden on our health care system.

08:19
But if robots can be engaging, if we like to cooperate with robots, if robots are persuasive, maybe a robot can help you maintain a diet and exercise program, maybe they can help you manage your weight. Sort of like a digital Jiminy -- as in the well-known fairy tale -- a kind of friendly, supportive presence that's always there to be able to help you make the right decision in the right way at the right time to help you form healthy habits.

08:44
So we actually explored this idea in our lab. This is a robot, Autom. Cory Kidd developed this robot for his doctoral work. And it was designed to be a robot diet-and-exercise coach. It had a couple of simple non-verbal skills it could do. It could make eye contact with you. It could share information looking down at a screen. You'd use a screen interface to enter information, like how many calories you ate that day, how much exercise you got. And then it could help track that for you. And the robot spoke with a synthetic voice to engage you in a coaching dialogue modeled after trainers and patients and so forth. And it would build a working alliance with you through that dialogue. It could help you set goals and track your progress, and it would help motivate you.
09:24
So an interesting question is, does the social embodiment really matter? Does it matter that it's a robot? Is it really just the quality of advice and information that matters? To explore that question, we did a study in the Boston area where we put one of three interventions in people's homes for a period of several weeks. One case was the robot you saw there, Autom. Another was a computer that ran the same touch-screen interface and exactly the same dialogues. The quality of advice was identical. And the third was just a pen-and-paper log, because that's the standard intervention you typically get when you start a diet-and-exercise program.

09:58
So one of the things we really wanted to look at was not how much weight people lost, but really how long they interacted with the robot. Because the challenge is not losing weight, it's actually keeping it off. And the longer you could interact with one of these interventions, well, that's indicative, potentially, of longer-term success.

10:16
So the first thing I want to look at is how long people interacted with these systems. It turns out that people interacted with the robot significantly more, even though the quality of the advice was identical to the computer. When we asked people to rate it in terms of the quality of the working alliance, people rated the robot higher, and they trusted the robot more. (Laughter) And when you look at emotional engagement, it was completely different. People would name the robots. They would dress the robots. (Laughter) And even when we would come up to pick up the robots at the end of the study, they would come out to the car and say good-bye to the robots. They didn't do this with a computer.
10:54
The last thing I want to talk about today is the future of children's media. We know that kids spend a lot of time behind screens today, whether it's television or computer games or whatnot. My sons, they love the screen. They love the screen. But I want them to play; as a mom, I want them to play, like, real-world play. And so I have a new project in my group I wanted to present to you today, called Playtime Computing, that's really trying to think about how we can take what's so engaging about digital media and literally bring it off the screen into the real world of the child, where it can take on many of the properties of real-world play.
11:29
So here's the first exploration of this idea, where characters can be physical or virtual, and where the digital content can literally come off the screen into the world and back. I like to think of this as the Atari Pong of this blended-reality play. But we can push this idea further. What if --

11:52
(Game) Nathan: Here it comes. Yay!

11:55
CB: -- the character itself could come into your world? It turns out that kids love it when the character becomes real and enters into their world. And when it's in their world, they can relate to it and play with it in a way that's fundamentally different from how they play with it on the screen.

12:09
Another important idea is this notion of persistence of character across realities. So changes that children make in the real world need to translate to the virtual world. So here, Nathan has changed the letter A to the number 2. You can imagine maybe these symbols give the characters special powers when they go into the virtual world. So they are now sending the character back into that world. And now it's got number power.

12:32
And then finally, what I've been trying to do here is create a really immersive experience for kids, where they really feel like they are part of that story, a part of that experience. And I really want to spark their imaginations the way mine was sparked as a little girl watching "Star Wars." But I want to do more than that. I actually want them to create those experiences. I want them to be able to literally build their imagination into these experiences and make them their own.

12:56
So we've been exploring a lot of ideas in telepresence and mixed reality to literally allow kids to project their ideas into this space where other kids can interact with them and build upon them. I really want to come up with new kinds of children's media that foster creativity and learning and innovation. I think that's very, very important.

13:16
So this is a new project. We've invited a lot of kids into this space, and they think it's pretty cool. But I can tell you, the thing that they love the most is the robot. What they care about is the robot. Robots touch something deeply human within us. And so whether they're helping us to become creative and innovative, or whether they're helping us to feel more deeply connected despite distance, or whether they are our trusted sidekick who's helping us attain our personal goals in becoming our highest and best selves, for me, robots are all about people.

13:50
Thank you.

(Applause)
์ด ์›น์‚ฌ์ดํŠธ ์ •๋ณด

์ด ์‚ฌ์ดํŠธ๋Š” ์˜์–ด ํ•™์Šต์— ์œ ์šฉํ•œ YouTube ๋™์˜์ƒ์„ ์†Œ๊ฐœํ•ฉ๋‹ˆ๋‹ค. ์ „ ์„ธ๊ณ„ ์ตœ๊ณ ์˜ ์„ ์ƒ๋‹˜๋“ค์ด ๊ฐ€๋ฅด์น˜๋Š” ์˜์–ด ์ˆ˜์—…์„ ๋ณด๊ฒŒ ๋  ๊ฒƒ์ž…๋‹ˆ๋‹ค. ๊ฐ ๋™์˜์ƒ ํŽ˜์ด์ง€์— ํ‘œ์‹œ๋˜๋Š” ์˜์–ด ์ž๋ง‰์„ ๋”๋ธ” ํด๋ฆญํ•˜๋ฉด ๊ทธ๊ณณ์—์„œ ๋™์˜์ƒ์ด ์žฌ์ƒ๋ฉ๋‹ˆ๋‹ค. ๋น„๋””์˜ค ์žฌ์ƒ์— ๋งž์ถฐ ์ž๋ง‰์ด ์Šคํฌ๋กค๋ฉ๋‹ˆ๋‹ค. ์˜๊ฒฌ์ด๋‚˜ ์š”์ฒญ์ด ์žˆ๋Š” ๊ฒฝ์šฐ ์ด ๋ฌธ์˜ ์–‘์‹์„ ์‚ฌ์šฉํ•˜์—ฌ ๋ฌธ์˜ํ•˜์‹ญ์‹œ์˜ค.

https://forms.gle/WvT1wiN1qDtmnspy7