What to trust in a "post-truth" world | Alex Edmans

159,123 views ・ 2018-12-03

TED


μ•„λž˜ μ˜λ¬Έμžλ§‰μ„ λ”λΈ”ν΄λ¦­ν•˜μ‹œλ©΄ μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€.

λ²ˆμ—­: Hanwool Chung κ²€ν† : Tae Young Choi
00:13
Belle Gibson was a happy young Australian. She lived in Perth, and she loved skateboarding. But in 2009, Belle learned that she had brain cancer and four months to live. Two months of chemo and radiotherapy had no effect. But Belle was determined. She'd been a fighter her whole life. From age six, she had to cook for her brother, who had autism, and her mother, who had multiple sclerosis. Her father was out of the picture. So Belle fought, with exercise, with meditation and by ditching meat for fruit and vegetables. And she made a complete recovery.

00:50
Belle's story went viral. It was tweeted, blogged about, shared and reached millions of people. It showed the benefits of shunning traditional medicine for diet and exercise. In August 2013, Belle launched a healthy eating app, The Whole Pantry, downloaded 200,000 times in the first month.

01:13
But Belle's story was a lie. Belle never had cancer. People shared her story without ever checking if it was true.

01:24
This is a classic example of confirmation bias. We accept a story uncritically if it confirms what we'd like to be true. And we reject any story that contradicts it. How often do we see this in the stories that we share and we ignore? In politics, in business, in health advice.

01:47
The Oxford Dictionary's word of 2016 was "post-truth." And the recognition that we now live in a post-truth world has led to a much needed emphasis on checking the facts. But the punch line of my talk is that just checking the facts is not enough. Even if Belle's story were true, it would be just as irrelevant.

02:10
Why? Well, let's look at one of the most fundamental techniques in statistics. It's called Bayesian inference. And the very simple version is this: We care about "does the data support the theory?" Does the data increase our belief that the theory is true? But instead, we end up asking, "Is the data consistent with the theory?" But being consistent with the theory does not mean that the data supports the theory. Why? Because of a crucial but forgotten third term -- the data could also be consistent with rival theories. But due to confirmation bias, we never consider the rival theories, because we're so protective of our own pet theory.
02:58
Now, let's look at this for Belle's story. Well, we care about: Does Belle's story support the theory that diet cures cancer? But instead, we end up asking, "Is Belle's story consistent with diet curing cancer?" And the answer is yes. If diet did cure cancer, we'd see stories like Belle's. But even if diet did not cure cancer, we'd still see stories like Belle's. A single story in which a patient apparently self-cured just due to being misdiagnosed in the first place. Just like, even if smoking was bad for your health, you'd still see one smoker who lived until 100. (Laughter) Just like, even if education was good for your income, you'd still see one multimillionaire who didn't go to university. (Laughter)

03:56
So the biggest problem with Belle's story is not that it was false. It's that it's only one story. There might be thousands of other stories where diet alone failed, but we never hear about them. We share the outlier cases because they are new, and therefore they are news. We never share the ordinary cases. They're too ordinary, they're what normally happens. And that's the true 99 percent that we ignore. Just like in society, you can't just listen to the one percent, the outliers, and ignore the 99 percent, the ordinary.

04:34
Because that's the second example of confirmation bias. We accept a fact as data. The biggest problem is not that we live in a post-truth world; it's that we live in a post-data world. We prefer a single story to tons of data.

04:54
Now, stories are powerful, they're vivid, they bring it to life. They tell you to start every talk with a story. I did. But a single story is meaningless and misleading unless it's backed up by large-scale data.

05:11
But even if we had large-scale data, that might still not be enough. Because it could still be consistent with rival theories. Let me explain.

05:22
A classic study by psychologist Peter Wason gives you a set of three numbers and asks you to think of the rule that generated them. So if you're given two, four, six, what's the rule? Well, most people would think, it's successive even numbers. How would you test it? Well, you'd propose other sets of successive even numbers: 4, 6, 8 or 12, 14, 16. And Peter would say these sets also work. But knowing that these sets also work, knowing that perhaps hundreds of sets of successive even numbers also work, tells you nothing. Because this is still consistent with rival theories. Perhaps the rule is any three even numbers. Or any three increasing numbers.

06:14
And that's the third example of confirmation bias: accepting data as evidence, even if it's consistent with rival theories. Data is just a collection of facts. Evidence is data that supports one theory and rules out others.
06:34
So the best way to support your theory is actually to try to disprove it, to play devil's advocate. So test something, like 4, 12, 26. If you got a yes to that, that would disprove your theory of successive even numbers. Yet this test is powerful, because if you got a no, it would rule out "any three even numbers" and "any three increasing numbers." It would rule out the rival theories, but not rule out yours.
07:05
But most people are too afraid of testing the 4, 12, 26, because they don't want to get a yes and prove their pet theory to be wrong.

07:16
Confirmation bias is not only about failing to search for new data, but it's also about misinterpreting data once you receive it. And this applies outside the lab to important, real-world problems. Indeed, Thomas Edison famously said, "I have not failed, I have found 10,000 ways that won't work." Finding out that you're wrong is the only way to find out what's right.

07:46
Say you're a university admissions director and your theory is that only students with good grades from rich families do well. So you only let in such students. And they do well. But that's also consistent with the rival theory. Perhaps all students with good grades do well, rich or poor. But you never test that theory because you never let in poor students because you don't want to be proven wrong.
08:14
So, what have we learned? A story is not fact, because it may not be true. A fact is not data, it may not be representative if it's only one data point. And data is not evidence -- it may not be supportive if it's consistent with rival theories.

08:36
So, what do you do? When you're at the inflection points of life, deciding on a strategy for your business, a parenting technique for your child or a regimen for your health, how do you ensure that you don't have a story but you have evidence?

08:56
Let me give you three tips. The first is to actively seek other viewpoints. Read and listen to people you flagrantly disagree with. Ninety percent of what they say may be wrong, in your view. But what if 10 percent is right? As Aristotle said, "The mark of an educated man is the ability to entertain a thought without necessarily accepting it." Surround yourself with people who challenge you, and create a culture that actively encourages dissent.

09:31
Some banks suffered from groupthink, where staff were too afraid to challenge management's lending decisions, contributing to the financial crisis. In a meeting, appoint someone to be devil's advocate against your pet idea. And don't just hear another viewpoint -- listen to it, as well. As psychologist Stephen Covey said, "Listen with the intent to understand, not the intent to reply." A dissenting viewpoint is something to learn from, not to argue against.

10:07
Which takes us to the other forgotten terms in Bayesian inference. Because data allows you to learn, but learning is only relative to a starting point. If you started with complete certainty that your pet theory must be true, then your view won't change -- regardless of what data you see. Only if you are truly open to the possibility of being wrong can you ever learn. As Leo Tolstoy wrote, "The most difficult subjects can be explained to the most slow-witted man if he has not formed any idea of them already. But the simplest thing cannot be made clear to the most intelligent man if he is firmly persuaded that he knows already."

10:56
Tip number two is "listen to experts." Now, that's perhaps the most unpopular advice that I could give you. (Laughter) British politician Michael Gove famously said that people in this country have had enough of experts. A recent poll showed that more people would trust their hairdresser -- (Laughter) or the man on the street than they would leaders of businesses, the health service and even charities. So we respect a teeth-whitening formula discovered by a mom, or we listen to an actress's view on vaccination. We like people who tell it like it is, who go with their gut, and we call them authentic. But gut feel can only get you so far. Gut feel would tell you never to give water to a baby with diarrhea, because it would just flow out the other end. Expertise tells you otherwise.

11:53
You'd never trust your surgery to the man on the street. You'd want an expert who spent years doing surgery and knows the best techniques. But that should apply to every major decision. Politics, business, health advice require expertise, just like surgery.

12:16
So then, why are experts so mistrusted? Well, one reason is they're seen as out of touch. A millionaire CEO couldn't possibly speak for the man on the street. But true expertise is founded on evidence. And evidence stands up for the man on the street and against the elites. Because evidence forces you to prove it. Evidence prevents the elites from imposing their own view without proof.

12:49
A second reason why experts are not trusted is that different experts say different things. For every expert who claimed that leaving the EU would be bad for Britain, another expert claimed it would be good. Half of these so-called experts will be wrong. And I have to admit that most papers written by experts are wrong. Or at best, make claims that the evidence doesn't actually support. So we can't just take an expert's word for it. In November 2016, a study on executive pay hit national headlines. Even though none of the newspapers who covered the study had even seen the study. It wasn't even out yet. They just took the author's word for it, just like with Belle.

13:38
Nor does it mean that we can just handpick any study that happens to support our viewpoint -- that would, again, be confirmation bias. Nor does it mean that if seven studies show A and three show B, that A must be true. What matters is the quality, and not the quantity of expertise.

13:57
So we should do two things. First, we should critically examine the credentials of the authors. Just like you'd critically examine the credentials of a potential surgeon. Are they truly experts in the matter, or do they have a vested interest? Second, we should pay particular attention to papers published in the top academic journals.

14:24
Now, academics are often accused of being detached from the real world. But this detachment gives you years to spend on a study. To really nail down a result, to rule out those rival theories, and to distinguish correlation from causation. And academic journals involve peer review, where a paper is rigorously scrutinized (Laughter) by the world's leading minds. The better the journal, the higher the standard. The most elite journals reject 95 percent of papers.

14:59
Now, academic evidence is not everything. Real-world experience is critical, also. And peer review is not perfect, mistakes are made. But it's better to go with something checked than something unchecked. If we latch onto a study because we like the findings, without considering who it's by or whether it's even been vetted, there is a massive chance that that study is misleading.

15:26
And those of us who claim to be experts should recognize the limitations of our analysis. Very rarely is it possible to prove or predict something with certainty, yet it's so tempting to make a sweeping, unqualified statement. It's easier to turn into a headline or to be tweeted in 140 characters.

15:48
But even evidence may not be proof. It may not be universal, it may not apply in every setting. So don't say, "Red wine causes longer life," when the evidence is only that red wine is correlated with longer life. And only then in people who exercise as well.
16:11
Tip number three is "pause before sharing anything." The Hippocratic oath says, "First, do no harm." What we share is potentially contagious, so be very careful about what we spread. Our goal should not be to get likes or retweets. Otherwise, we only share the consensus; we don't challenge anyone's thinking. Otherwise, we only share what sounds good, regardless of whether it's evidence.

16:42
Instead, we should ask the following: If it's a story, is it true? If it's true, is it backed up by large-scale evidence? If it is, who is it by, what are their credentials? Is it published, how rigorous is the journal? And ask yourself the million-dollar question: If the same study was written by the same authors with the same credentials but found the opposite results, would you still be willing to believe it and to share it?

17:13
Treating any problem -- a nation's economic problem or an individual's health problem -- is difficult. So we must ensure that we have the very best evidence to guide us. Only if it's true can it be fact. Only if it's representative can it be data. Only if it's supportive can it be evidence. And only with evidence can we move from a post-truth world to a pro-truth world.

17:44
Thank you very much.

17:45
(Applause)
이 μ›Ήμ‚¬μ΄νŠΈ 정보

이 μ‚¬μ΄νŠΈλŠ” μ˜μ–΄ ν•™μŠ΅μ— μœ μš©ν•œ YouTube λ™μ˜μƒμ„ μ†Œκ°œν•©λ‹ˆλ‹€. μ „ 세계 졜고의 μ„ μƒλ‹˜λ“€μ΄ κ°€λ₯΄μΉ˜λŠ” μ˜μ–΄ μˆ˜μ—…μ„ 보게 될 κ²ƒμž…λ‹ˆλ‹€. 각 λ™μ˜μƒ νŽ˜μ΄μ§€μ— ν‘œμ‹œλ˜λŠ” μ˜μ–΄ μžλ§‰μ„ 더블 ν΄λ¦­ν•˜λ©΄ κ·Έκ³³μ—μ„œ λ™μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€. λΉ„λ””μ˜€ μž¬μƒμ— 맞좰 μžλ§‰μ΄ μŠ€ν¬λ‘€λ©λ‹ˆλ‹€. μ˜κ²¬μ΄λ‚˜ μš”μ²­μ΄ μžˆλŠ” 경우 이 문의 양식을 μ‚¬μš©ν•˜μ—¬ λ¬Έμ˜ν•˜μ‹­μ‹œμ˜€.

https://forms.gle/WvT1wiN1qDtmnspy7