Adam Ostrow: After your final status update

62,570 views · 2011-08-01

TED



ืžืชืจื’ื: Ido Dekkers ืžื‘ืงืจ: Oren Szekatch

00:15
By the end of this year, there'll be nearly a billion people on this planet that actively use social networking sites. The one thing that all of them have in common is that they're going to die. While that might be a somewhat morbid thought, I think it has some really profound implications that are worth exploring.

00:34
What first got me thinking about this was a blog post authored earlier this year by Derek K. Miller, who was a science and technology journalist who died of cancer. And what Miller did was have his family and friends write a post that went out shortly after he died. Here's what he wrote in starting that out. He said, "Here it is. I'm dead, and this is my last post to my blog. In advance, I asked that once my body finally shut down from the punishments of my cancer, then my family and friends publish this prepared message I wrote -- the first part of the process of turning this from an active website to an archive."

01:09
Now, while as a journalist, Miller's archive may have been better written and more carefully curated than most, the fact of the matter is that all of us today are creating an archive that's something completely different than anything that's been created by any previous generation.

01:25
Consider a few stats for a moment. Right now there are 48 hours of video being uploaded to YouTube every single minute. There are 200 million Tweets being posted every day. And the average Facebook user is creating 90 pieces of content each month.

01:43
So when you think about your parents or your grandparents, at best they may have created some photos or home videos, or a diary that lives in a box somewhere. But today we're all creating this incredibly rich digital archive that's going to live in the cloud indefinitely, years after we're gone. And I think that's going to create some incredibly intriguing opportunities for technologists. Now to be clear, I'm a journalist and not a technologist, so what I'd like to do briefly is paint a picture of what the present and the future are going to look like.

02:14
Now we're already seeing some services that are designed to let us decide what happens to our online profile and our social media accounts after we die. One of them actually, fittingly enough, found me when I checked into a deli at a restaurant in New York on foursquare.

02:32
(Recording) Adam Ostrow: Hello.
Death: Adam?
AO: Yeah.
Death: Death can catch you anywhere, anytime, even at the Organic.
AO: Who is this?
Death: Go to ifidie.net before it's too late.
(Laughter)

02:53
Adam Ostrow: Kind of creepy, right? So what that service does, quite simply, is let you create a message or a video that can be posted to Facebook after you die. Another service right now is called 1,000 Memories. And what this lets you do is create an online tribute to your loved ones, complete with photos and videos and stories that they can post after you die.

03:14
But what I think comes next is far more interesting. Now a lot of you are probably familiar with Deb Roy who, back in March, demonstrated how he was able to analyze more than 90,000 hours of home video. I think as machines' ability to understand human language and process vast amounts of data continues to improve, it's going to become possible to analyze an entire life's worth of content -- the Tweets, the photos, the videos, the blog posts -- that we're producing in such massive numbers. And I think as that happens, it's going to become possible for our digital personas to continue to interact in the real world long after we're gone, thanks to the vastness of the amount of content we're creating and technology's ability to make sense of it all.

03:55
Now we're already starting to see some experiments here. One service called My Next Tweet analyzes your entire Twitter stream, everything you've posted onto Twitter, to make some predictions as to what you might say next. Well right now, as you can see, the results can be somewhat comical. You can imagine what something like this might look like five, 10 or 20 years from now as our technical capabilities improve.
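
The talk doesn't say how My Next Tweet actually works, but a minimal sketch of the kind of analysis such a service might run is a word-level Markov chain over your tweet history: record which words follow which, then random-walk that table to generate a plausible (and, as Ostrow notes, often comical) "next tweet." Everything below, including the sample tweets and function names, is illustrative:

```python
# Toy sketch of tweet prediction via a word-level Markov chain.
# build_chain() records which word follows which across the corpus;
# predict_tweet() random-walks that table to generate a "next tweet".
import random
from collections import defaultdict

def build_chain(tweets):
    """Map each word to the list of words observed to follow it."""
    chain = defaultdict(list)
    for tweet in tweets:
        words = tweet.split()
        for current, following in zip(words, words[1:]):
            chain[current].append(following)
    return chain

def predict_tweet(chain, start, max_words=12):
    """Walk the chain from a starting word until it dead-ends."""
    words = [start]
    while len(words) < max_words and chain[words[-1]]:
        words.append(random.choice(chain[words[-1]]))
    return " ".join(words)

# Invented sample data, standing in for a real Twitter stream.
past_tweets = [
    "social media is changing how we remember people",
    "social media is everywhere these days",
    "how we remember people is changing",
]
chain = build_chain(past_tweets)
print(predict_tweet(chain, "social"))
```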

04:17
Taking it a step further, MIT's media lab is working on robots that can interact more like humans. But what if those robots were able to interact based on the unique characteristics of a specific person, based on the hundreds of thousands of pieces of content that person produces in their lifetime?

04:34
Finally, think back to this famous scene from election night 2008 back in the United States, where CNN beamed a live hologram of hip hop artist will.i.am into their studio for an interview with Anderson Cooper. What if we were able to use that same type of technology to beam a representation of our loved ones into our living rooms -- interacting in a very lifelike way based on all the content they created while they were alive? I think that's going to become completely possible as the amount of data we're producing and technology's ability to understand it both expand exponentially.

05:08
Now in closing, I think what we all need to be thinking about is if we want that to become our reality -- and if so, what it means for a definition of life and everything that comes after it. Thank you very much.

(Applause)
ืขืœ ืืชืจ ื–ื”

ืืชืจ ื–ื” ื™ืฆื™ื’ ื‘ืคื ื™ื›ื ืกืจื˜ื•ื ื™ YouTube ื”ืžื•ืขื™ืœื™ื ืœืœื™ืžื•ื“ ืื ื’ืœื™ืช. ืชื•ื›ืœื• ืœืจืื•ืช ืฉื™ืขื•ืจื™ ืื ื’ืœื™ืช ื”ืžื•ืขื‘ืจื™ื ืขืœ ื™ื“ื™ ืžื•ืจื™ื ืžื”ืฉื•ืจื” ื”ืจืืฉื•ื ื” ืžืจื—ื‘ื™ ื”ืขื•ืœื. ืœื—ืฅ ืคืขืžื™ื™ื ืขืœ ื”ื›ืชื•ื‘ื™ื•ืช ื‘ืื ื’ืœื™ืช ื”ืžื•ืฆื’ื•ืช ื‘ื›ืœ ื“ืฃ ื•ื™ื“ืื• ื›ื“ื™ ืœื”ืคืขื™ืœ ืืช ื”ืกืจื˜ื•ืŸ ืžืฉื. ื”ื›ืชื•ื‘ื™ื•ืช ื’ื•ืœืœื•ืช ื‘ืกื ื›ืจื•ืŸ ืขื ื”ืคืขืœืช ื”ื•ื•ื™ื“ืื•. ืื ื™ืฉ ืœืš ื”ืขืจื•ืช ืื• ื‘ืงืฉื•ืช, ืื ื ืฆื•ืจ ืื™ืชื ื• ืงืฉืจ ื‘ืืžืฆืขื•ืช ื˜ื•ืคืก ื™ืฆื™ืจืช ืงืฉืจ ื–ื”.

https://forms.gle/WvT1wiN1qDtmnspy7