How to use data to make a hit TV show | Sebastian Wernicke

133,338 views ・ 2016-01-27

TED


μ•„λž˜ μ˜λ¬Έμžλ§‰μ„ λ”λΈ”ν΄λ¦­ν•˜μ‹œλ©΄ μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€.

Translator: Ju Hye Lim    Reviewer: Gichung Lee
00:12
Roy Price is a man that most of you have probably never heard about, even though he may have been responsible for 22 somewhat mediocre minutes of your life on April 19, 2013. He may have also been responsible for 22 very entertaining minutes, but not very many of you. And all of that goes back to a decision that Roy had to make about three years ago.

00:35
So you see, Roy Price is a senior executive with Amazon Studios. That's the TV production company of Amazon. He's 47 years old, slim, spiky hair, describes himself on Twitter as "movies, TV, technology, tacos." And Roy Price has a very responsible job, because it's his responsibility to pick the shows, the original content that Amazon is going to make.

01:01
And of course that's a highly competitive space. I mean, there are so many TV shows already out there that Roy can't just choose any show. He has to find shows that are really, really great. So in other words, he has to find shows that are on the very right end of this curve here.

01:17
So this curve here is the rating distribution of about 2,500 TV shows on the website IMDB, and the rating goes from one to 10, and the height here shows you how many shows get that rating. So if your show gets a rating of nine points or higher, that's a winner. Then you have a top two percent show. That's shows like "Breaking Bad," "Game of Thrones," "The Wire," so all of these shows that are addictive, where, after you've watched a season, your brain is basically like, "Where can I get more of these episodes?" That kind of show.

01:50
On the left side, just for clarity, here on that end, you have a show called "Toddlers and Tiaras" --

(Laughter)

-- which should tell you enough about what's going on on that end of the curve.

02:03
Now, Roy Price is not worried about getting on the left end of the curve, because I think you would have to have some serious brainpower to undercut "Toddlers and Tiaras." So what he's worried about is this middle bulge here, the bulge of average TV, you know, those shows that aren't really good or really bad, they don't really get you excited. So he needs to make sure that he's really on the right end of this.

02:27
So the pressure is on, and of course it's also the first time that Amazon is even doing something like this, so Roy Price does not want to take any chances. He wants to engineer success. He needs a guaranteed success, and so what he does is, he holds a competition. So he takes a bunch of ideas for TV shows, and from those ideas, through an evaluation, they select eight candidates for TV shows, and then he just makes the first episode of each one of these shows and puts them online for free for everyone to watch.

02:59
And so when Amazon is giving out free stuff, you're going to take it, right? So millions of viewers are watching those episodes. What they don't realize is that, while they're watching their shows, actually, they are being watched. They are being watched by Roy Price and his team, who record everything. They record when somebody presses play, when somebody presses pause, what parts they skip, what parts they watch again. So they collect millions of data points, because they want to have those data points to then decide which show they should make.
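What "record everything" means in practice is just an event stream; here is a minimal sketch of that kind of playback logging in Python, with an invented schema (the field names, shows, and pipeline are illustrative assumptions, not Amazon's actual system):

# Toy playback log; all field names and values are invented for illustration.
from collections import Counter

events = [
    {"viewer": 1, "show": "pilot_a", "action": "play"},
    {"viewer": 1, "show": "pilot_a", "action": "pause"},
    {"viewer": 2, "show": "pilot_a", "action": "skip"},
    {"viewer": 2, "show": "pilot_b", "action": "rewatch"},
    {"viewer": 3, "show": "pilot_b", "action": "play"},
]

# Collapse the raw event stream into per-show engagement counts --
# the "millions of data points" of the talk, in miniature.
engagement = Counter((e["show"], e["action"]) for e in events)
for (show, action), count in engagement.most_common():
    print(show, action, count)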
03:30
And sure enough, so they collect all the data, they do all the data crunching, and an answer emerges, and the answer is, "Amazon should do a sitcom about four Republican US Senators." They did that show. So does anyone know the name of the show?

03:46
(Audience: "Alpha House.")

Yes, "Alpha House," but it seems like not too many of you here remember that show, actually, because it didn't turn out that great. It's actually just an average show, actually -- literally, in fact, because the average of this curve here is at 7.4, and "Alpha House" lands at 7.5, so a slightly above average show, but certainly not what Roy Price and his team were aiming for.
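Both numbers in that comparison, the 7.4 average and the nine-point "winner" threshold, fall straight out of the raw ratings; a minimal sketch, with an invented ratings list standing in for the real IMDB data:

# Sketch: the "bulge" (mean) and the "winner" tail (9.0+) of a ratings curve.
# The ratings below are invented stand-ins, not the actual IMDB dump.
import statistics

ratings = [6.8, 7.4, 9.1, 5.2, 7.5, 8.0, 6.1, 9.3, 7.2, 4.9]

mean = statistics.mean(ratings)
winner_share = sum(r >= 9.0 for r in ratings) / len(ratings)

print(f"average rating: {mean:.1f}")           # the middle bulge of the curve
print(f"share rated 9.0+: {winner_share:.0%}") # the top tail (about 2% on IMDB)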
04:10
Meanwhile, however, at about the same time, at another company, another executive did manage to land a top show using data analysis, and his name is Ted, Ted Sarandos, who is the Chief Content Officer of Netflix, and just like Roy, he's on a constant mission to find that great TV show, and he uses data as well to do that, except he does it a little bit differently.

04:31
So instead of holding a competition, what he did -- and his team of course -- was they looked at all the data they already had about Netflix viewers, you know, the ratings they give their shows, the viewing histories, what shows people like, and so on. And then they use that data to discover all of these little bits and pieces about the audience: what kinds of shows they like, what kind of producers, what kind of actors.
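In spirit, that discovery step is just grouping viewing records by attribute and seeing what each group scores; a minimal sketch over invented records (none of this is Netflix's real schema or data):

# Sketch: average audience rating per show attribute, from viewing records.
from collections import defaultdict

records = [  # invented viewing records
    {"genre": "political drama", "producer": "Fincher", "rating": 9},
    {"genre": "political drama", "producer": "Other",   "rating": 8},
    {"genre": "sitcom",          "producer": "Other",   "rating": 6},
    {"genre": "sitcom",          "producer": "Fincher", "rating": 7},
]

# For each attribute, group the ratings and report the group averages --
# the "little bits and pieces about the audience."
for attr in ("genre", "producer"):
    scores = defaultdict(list)
    for rec in records:
        scores[rec[attr]].append(rec["rating"])
    for value, rs in scores.items():
        print(f"{attr}={value}: {sum(rs) / len(rs):.1f}")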
04:52
And once they had all of these pieces together, they took a leap of faith, and they decided to license not a sitcom about four Senators but a drama series about a single Senator. You guys know the show?

(Laughter)

Yes, "House of Cards," and Netflix, of course, nailed it with that show, at least for the first two seasons.

(Laughter) (Applause)

05:17
"House of Cards" gets a 9.1 rating on this curve, so it's exactly where they wanted it to be.

05:24
Now, the question of course is, what happened here? So you have two very competitive, data-savvy companies. They connect all of these millions of data points, and then it works beautifully for one of them, and it doesn't work for the other one. So why? Because logic kind of tells you that this should be working all the time. I mean, if you're collecting millions of data points on a decision you're going to make, then you should be able to make a pretty good decision. You have 200 years of statistics to rely on. You're amplifying it with very powerful computers. The least you could expect is good TV, right?

05:57
And if data analysis does not work that way, then it actually gets a little scary, because we live in a time where we're turning to data more and more to make very serious decisions that go far beyond TV.

06:12
Does anyone here know the company Multi-Health Systems? No one. OK, that's good actually. OK, so Multi-Health Systems is a software company, and I hope that nobody here in this room ever comes into contact with that software, because if you do, it means you're in prison.

(Laughter)

06:31
If someone here in the US is in prison, and they apply for parole, then it's very likely that data analysis software from that company will be used in determining whether to grant that parole. So it's the same principle as Amazon and Netflix, but now instead of deciding whether a TV show is going to be good or bad, you're deciding whether a person is going to be good or bad. And mediocre TV, 22 minutes, that can be pretty bad, but more years in prison, I guess, even worse.

07:02
And unfortunately, there is actually some evidence that this data analysis, despite having lots of data, does not always produce optimum results. And that's not because a company like Multi-Health Systems doesn't know what to do with data. Even the most data-savvy companies get it wrong. Yes, even Google gets it wrong sometimes.

07:20
In 2009, Google announced that they were able, with data analysis, to predict outbreaks of influenza, the nasty kind of flu, by doing data analysis on their Google searches.
07:33
And it worked beautifully, and it made a big splash in the news, including the pinnacle of scientific success: a publication in the journal "Nature." It worked beautifully for year after year after year, until one year it failed. And nobody could even tell exactly why. It just didn't work that year, and of course that again made big news, including now a retraction of a publication from the journal "Nature."

07:58
So even the most data-savvy companies, Amazon and Google, they sometimes get it wrong. And despite all those failures, data is moving rapidly into real-life decision-making -- into the workplace, law enforcement, medicine. So we'd better make sure that data is helping.

08:19
Now, personally I've seen a lot of this struggle with data myself, because I work in computational genetics, which is also a field where lots of very smart people are using unimaginable amounts of data to make pretty serious decisions like deciding on a cancer therapy or developing a drug.

08:35
And over the years, I've noticed a sort of pattern or kind of rule, if you will, about the difference between successful decision-making with data and unsuccessful decision-making, and I find this a pattern worth sharing, and it goes something like this.

08:50
So whenever you're solving a complex problem, you're doing essentially two things. The first one is, you take that problem apart into its bits and pieces so that you can deeply analyze those bits and pieces, and then of course you do the second part. You put all of these bits and pieces back together again to come to your conclusion. And sometimes you have to do it over again, but it's always those two things: taking apart and putting back together again.

09:14
And now the crucial thing is that data and data analysis is only good for the first part. Data and data analysis, no matter how powerful, can only help you taking a problem apart and understanding its pieces. It's not suited to put those pieces back together again and then to come to a conclusion. There's another tool that can do that, and we all have it, and that tool is the brain. If there's one thing a brain is good at, it's taking bits and pieces back together again, even when you have incomplete information, and coming to a good conclusion, especially if it's the brain of an expert.

09:48
And that's why I believe that Netflix was so successful, because they used data and brains where they belong in the process. They use data to first understand lots of pieces about their audience that they otherwise wouldn't have been able to understand at that depth, but then the decision to take all these bits and pieces and put them back together again and make a show like "House of Cards," that was nowhere in the data. Ted Sarandos and his team made that decision to license that show, which also meant, by the way, that they were taking a pretty big personal risk with that decision.

10:18
And Amazon, on the other hand, they did it the wrong way around. They used data all the way to drive their decision-making, first when they held their competition of TV ideas, then when they selected "Alpha House" to make as a show. Which of course was a very safe decision for them, because they could always point at the data, saying, "This is what the data tells us." But it didn't lead to the exceptional results that they were hoping for.

10:42
So data is of course a massively useful tool to make better decisions, but I believe that things go wrong when data is starting to drive those decisions. No matter how powerful, data is just a tool, and to keep that in mind, I find this device here quite useful. Many of you will ...

(Laughter)

11:01
Before there was data, this was the decision-making device to use.

(Laughter)

11:07
Many of you will know this. This toy here is called the Magic 8 Ball, and it's really amazing, because if you have a decision to make, a yes or no question, all you have to do is you shake the ball, and then you get an answer -- "Most Likely" -- right here in this window in real time.
11:21
I'll have it out later for tech demos.

(Laughter)
11:24
Now, the thing is, of course -- so I've made some decisions in my life where, in hindsight, I should have just listened to the ball. But, you know, of course, if you have the data available, you want to replace this with something much more sophisticated, like data analysis to come to a better decision. But that does not change the basic setup. So the ball may get smarter and smarter and smarter, but I believe it's still on us to make the decisions if we want to achieve something extraordinary, on the right end of the curve.

11:54
And I find that a very encouraging message, in fact, that even in the face of huge amounts of data, it still pays off to make decisions, to be an expert in what you're doing and take risks. Because in the end, it's not data, it's risks that will land you on the right end of the curve.

12:19
Thank you.

(Applause)
이 μ›Ήμ‚¬μ΄νŠΈ 정보

이 μ‚¬μ΄νŠΈλŠ” μ˜μ–΄ ν•™μŠ΅μ— μœ μš©ν•œ YouTube λ™μ˜μƒμ„ μ†Œκ°œν•©λ‹ˆλ‹€. μ „ 세계 졜고의 μ„ μƒλ‹˜λ“€μ΄ κ°€λ₯΄μΉ˜λŠ” μ˜μ–΄ μˆ˜μ—…μ„ 보게 될 κ²ƒμž…λ‹ˆλ‹€. 각 λ™μ˜μƒ νŽ˜μ΄μ§€μ— ν‘œμ‹œλ˜λŠ” μ˜μ–΄ μžλ§‰μ„ 더블 ν΄λ¦­ν•˜λ©΄ κ·Έκ³³μ—μ„œ λ™μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€. λΉ„λ””μ˜€ μž¬μƒμ— 맞좰 μžλ§‰μ΄ μŠ€ν¬λ‘€λ©λ‹ˆλ‹€. μ˜κ²¬μ΄λ‚˜ μš”μ²­μ΄ μžˆλŠ” 경우 이 문의 양식을 μ‚¬μš©ν•˜μ—¬ λ¬Έμ˜ν•˜μ‹­μ‹œμ˜€.

https://forms.gle/WvT1wiN1qDtmnspy7