Beware online "filter bubbles" | Eli Pariser

1,595,990 views ใƒป 2011-05-02

TED



ืžืชืจื’ื: Yubal Masalker ืžื‘ืงืจ: Ido Dekkers
00:15
Mark Zuckerberg, a journalist was asking him a question about the news feed. And the journalist was asking him, "Why is this so important?" And Zuckerberg said, "A squirrel dying in your front yard may be more relevant to your interests right now than people dying in Africa." And I want to talk about what a Web based on that idea of relevance might look like.

00:40
So when I was growing up in a really rural area in Maine, the Internet meant something very different to me. It meant a connection to the world. It meant something that would connect us all together. And I was sure that it was going to be great for democracy and for our society. But there's this shift in how information is flowing online, and it's invisible. And if we don't pay attention to it, it could be a real problem.

01:10
So I first noticed this in a place I spend a lot of time -- my Facebook page. I'm progressive, politically -- big surprise -- but I've always gone out of my way to meet conservatives. I like hearing what they're thinking about; I like seeing what they link to; I like learning a thing or two. And so I was surprised when I noticed one day that the conservatives had disappeared from my Facebook feed. And what it turned out was going on was that Facebook was looking at which links I clicked on, and it was noticing that, actually, I was clicking more on my liberal friends' links than on my conservative friends' links. And without consulting me about it, it had edited them out. They disappeared.
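The mechanism Pariser describes — ranking feed items by observed click history and silently dropping low-engagement sources — can be sketched roughly as follows. This is a purely illustrative toy, with invented names and a made-up cutoff rule, not Facebook's actual algorithm:

```python
from collections import Counter

def personalize_feed(posts, click_history, keep_fraction=0.5):
    """Rank posts by how often the reader clicked each source, then
    silently drop the bottom of the ranking. Illustrative sketch only;
    the real feed algorithm is proprietary and far more complex.

    posts: list of (source, headline) tuples
    click_history: iterable of source names the reader clicked on
    """
    clicks = Counter(click_history)
    # Sources the reader clicks more float to the top...
    ranked = sorted(posts, key=lambda p: clicks[p[0]], reverse=True)
    cutoff = max(1, int(len(ranked) * keep_fraction))
    # ...and everything past the cutoff is "edited out" without notice.
    return ranked[:cutoff]

feed = [("liberal_friend", "Protest coverage"),
        ("conservative_friend", "Op-ed on policy"),
        ("liberal_friend", "Election analysis")]
history = ["liberal_friend", "liberal_friend", "liberal_friend",
           "conservative_friend"]
result = personalize_feed(feed, history)
print(result)
```

With this click history the conservative friend's post never survives the cutoff — the reader is never consulted and never sees what was removed, which is exactly the invisibility the talk is about.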
01:54
So Facebook isn't the only place that's doing this kind of invisible, algorithmic editing of the Web. Google's doing it too. If I search for something, and you search for something, even right now at the very same time, we may get very different search results. Even if you're logged out, one engineer told me, there are 57 signals that Google looks at -- everything from what kind of computer you're on to what kind of browser you're using to where you're located -- that it uses to personally tailor your query results. Think about it for a second: there is no standard Google anymore.
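The idea of tailoring results from per-user signals can be sketched as a re-ranking step: start from generic scores, then boost pages matching a user's signal profile. The signal names, tags, and weights below are invented for illustration — Google's actual 57 signals and their weights are not public:

```python
def rerank_results(results, signals):
    """Re-rank generic search results using per-user signal boosts.

    results: list of (url, base_score, tags) -- tags describe the page
    signals: dict mapping a tag to this user's boost for it
    Hypothetical sketch; not Google's real ranking.
    """
    def score(item):
        url, base, tags = item
        # Same base ranking for everyone, plus personal boosts.
        return base + sum(signals.get(tag, 0.0) for tag in tags)
    return sorted(results, key=score, reverse=True)

results = [("egypt-travel.example", 1.0, ["travel"]),
           ("egypt-protests.example", 1.0, ["news", "politics"])]
# Two users, same query, different inferred signal profiles:
scott = {"news": 0.5, "politics": 0.4}
daniel = {"travel": 0.8}
print(rerank_results(results, scott)[0][0])
print(rerank_results(results, daniel)[0][0])
```

The same query yields a protest-led page for one profile and a travel-led page for the other — the "Egypt" experiment described next in the talk.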
02:35
And you know, the funny thing about this is that it's hard to see. You can't see how different your search results are from anyone else's. But a couple of weeks ago, I asked a bunch of friends to Google "Egypt" and to send me screen shots of what they got. So here's my friend Scott's screen shot. And here's my friend Daniel's screen shot. When you put them side-by-side, you don't even have to read the links to see how different these two pages are. But when you do read the links, it's really quite remarkable. Daniel didn't get anything about the protests in Egypt at all in his first page of Google results. Scott's results were full of them. And this was the big story of the day at that time. That's how different these results are becoming.

03:21
So it's not just Google and Facebook either. This is something that's sweeping the Web. There are a whole host of companies that are doing this kind of personalization. Yahoo News, the biggest news site on the Internet, is now personalized -- different people get different things. Huffington Post, the Washington Post, the New York Times -- all flirting with personalization in various ways. And this moves us very quickly toward a world in which the Internet is showing us what it thinks we want to see, but not necessarily what we need to see. As Eric Schmidt said, "It will be very hard for people to watch or consume something that has not in some sense been tailored for them."

04:05
So I do think this is a problem. And I think, if you take all of these filters together, you take all these algorithms, you get what I call a filter bubble. And your filter bubble is your own personal, unique universe of information that you live in online. And what's in your filter bubble depends on who you are, and it depends on what you do. But the thing is that you don't decide what gets in. And more importantly, you don't actually see what gets edited out.

04:38
So one of the problems with the filter bubble was discovered by some researchers at Netflix. And they were looking at the Netflix queues, and they noticed something kind of funny that a lot of us probably have noticed, which is there are some movies that just sort of zip right up and out to our houses. They enter the queue, they just zip right out. So "Iron Man" zips right out, and "Waiting for Superman" can wait for a really long time. What they discovered was that in our Netflix queues there's this epic struggle going on between our future aspirational selves and our more impulsive present selves. You know we all want to be someone who has watched "Rashomon," but right now we want to watch "Ace Ventura" for the fourth time.

05:24
(Laughter)

05:27
So the best editing gives us a bit of both. It gives us a little bit of Justin Bieber and a little bit of Afghanistan. It gives us some information vegetables; it gives us some information dessert. And the challenge with these kinds of algorithmic filters, these personalized filters, is that, because they're mainly looking at what you click on first, it can throw off that balance. And instead of a balanced information diet, you can end up surrounded by information junk food.

05:59
What this suggests is actually that we may have the story about the Internet wrong. In a broadcast society -- this is how the founding mythology goes -- in a broadcast society, there were these gatekeepers, the editors, and they controlled the flows of information. And along came the Internet and it swept them out of the way, and it allowed all of us to connect together, and it was awesome. But that's not actually what's happening right now. What we're seeing is more of a passing of the torch from human gatekeepers to algorithmic ones. And the thing is that the algorithms don't yet have the kind of embedded ethics that the editors did.

06:43
So if algorithms are going to curate the world for us, if they're going to decide what we get to see and what we don't get to see, then we need to make sure that they're not just keyed to relevance. We need to make sure that they also show us things that are uncomfortable or challenging or important -- this is what TED does -- other points of view.

07:03
And the thing is, we've actually been here before as a society. In 1915, it's not like newspapers were sweating a lot about their civic responsibilities. Then people noticed that they were doing something really important. That, in fact, you couldn't have a functioning democracy if citizens didn't get a good flow of information, that the newspapers were critical because they were acting as the filter, and then journalistic ethics developed. It wasn't perfect, but it got us through the last century. And so now, we're kind of back in 1915 on the Web.

07:44
And we need the new gatekeepers to encode that kind of responsibility into the code that they're writing. I know that there are a lot of people here from Facebook and from Google -- Larry and Sergey -- people who have helped build the Web as it is, and I'm grateful for that. But we really need you to make sure that these algorithms have encoded in them a sense of the public life, a sense of civic responsibility. We need you to make sure that they're transparent enough that we can see what the rules are that determine what gets through our filters. And we need you to give us some control so that we can decide what gets through and what doesn't.

08:24
Because I think we really need the Internet to be that thing that we all dreamed of it being. We need it to connect us all together. We need it to introduce us to new ideas and new people and different perspectives. And it's not going to do that if it leaves us all isolated in a Web of one.

08:45
Thank you.

08:47
(Applause)
ืขืœ ืืชืจ ื–ื”

ืืชืจ ื–ื” ื™ืฆื™ื’ ื‘ืคื ื™ื›ื ืกืจื˜ื•ื ื™ YouTube ื”ืžื•ืขื™ืœื™ื ืœืœื™ืžื•ื“ ืื ื’ืœื™ืช. ืชื•ื›ืœื• ืœืจืื•ืช ืฉื™ืขื•ืจื™ ืื ื’ืœื™ืช ื”ืžื•ืขื‘ืจื™ื ืขืœ ื™ื“ื™ ืžื•ืจื™ื ืžื”ืฉื•ืจื” ื”ืจืืฉื•ื ื” ืžืจื—ื‘ื™ ื”ืขื•ืœื. ืœื—ืฅ ืคืขืžื™ื™ื ืขืœ ื”ื›ืชื•ื‘ื™ื•ืช ื‘ืื ื’ืœื™ืช ื”ืžื•ืฆื’ื•ืช ื‘ื›ืœ ื“ืฃ ื•ื™ื“ืื• ื›ื“ื™ ืœื”ืคืขื™ืœ ืืช ื”ืกืจื˜ื•ืŸ ืžืฉื. ื”ื›ืชื•ื‘ื™ื•ืช ื’ื•ืœืœื•ืช ื‘ืกื ื›ืจื•ืŸ ืขื ื”ืคืขืœืช ื”ื•ื•ื™ื“ืื•. ืื ื™ืฉ ืœืš ื”ืขืจื•ืช ืื• ื‘ืงืฉื•ืช, ืื ื ืฆื•ืจ ืื™ืชื ื• ืงืฉืจ ื‘ืืžืฆืขื•ืช ื˜ื•ืคืก ื™ืฆื™ืจืช ืงืฉืจ ื–ื”.

https://forms.gle/WvT1wiN1qDtmnspy7