How can groups make good decisions? | Mariano Sigman and Dan Ariely

157,065 views · 2017-12-13

TED


์•„๋ž˜ ์˜๋ฌธ์ž๋ง‰์„ ๋”๋ธ”ํด๋ฆญํ•˜์‹œ๋ฉด ์˜์ƒ์ด ์žฌ์ƒ๋ฉ๋‹ˆ๋‹ค.

00:00
As societies, we have to make collective decisions
Translation: Sujee Cho   Review: Jihyeon J. Kim
00:03
that will shape our future.
00:05
And we all know that when we make decisions in groups,
00:07
they don't always go right.
00:09
And sometimes they go very wrong.
00:12
So how do groups make good decisions?
00:15
Research has shown that crowds are wise when there's independent thinking.
00:19
This is why the wisdom of the crowds can be destroyed by peer pressure,
00:22
publicity, social media,
00:24
or sometimes even simple conversations that influence how people think.
00:29
On the other hand, by talking, a group could exchange knowledge,
00:33
correct and revise each other
00:34
and even come up with new ideas.
00:36
And this is all good.
00:38
So does talking to each other help or hinder collective decision-making?
00:43
With my colleague, Dan Ariely,
00:45
we recently began inquiring into this by performing experiments
00:49
in many places around the world
00:50
to figure out how groups can interact to reach better decisions.
00:55
We thought crowds would be wiser if they debated in small groups
00:58
that foster a more thoughtful and reasonable exchange of information.
01:03
To test this idea,
01:04
we recently performed an experiment in Buenos Aires, Argentina,
01:07
with more than 10,000 participants in a TEDx event.
01:11
We asked them questions like,
01:12
"What is the height of the Eiffel Tower?"
01:14
and "How many times does the word 'Yesterday' appear
01:17
in the Beatles song 'Yesterday'?"
01:20
Each person wrote down their own estimate.
01:22
Then we divided the crowd into groups of five,
01:25
and invited them to come up with a group answer.
01:28
We discovered that averaging the answers of the groups
01:31
after they reached consensus
01:33
was much more accurate than averaging all the individual opinions
01:37
before debate.
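The aggregation scheme described here can be sketched numerically. Below is a minimal simulation under two illustrative assumptions not stated in the talk: each five-person group's consensus is modeled as the median of its members' estimates, and a small fraction of individuals give absurd answers.

```python
import random
from statistics import mean, median

random.seed(0)
TRUE_HEIGHT = 330  # approximate height of the Eiffel Tower, in meters

# Simulate 10,000 individual estimates: mostly noisy guesses,
# with a small fraction of wildly wrong outliers.
def estimate():
    if random.random() < 0.05:
        return random.uniform(10_000, 1_000_000)  # absurd outlier
    return random.gauss(TRUE_HEIGHT, 80)          # ordinary noisy guess

people = [estimate() for _ in range(10_000)]

# Baseline: average of all individual opinions "before debate".
naive = mean(people)

# Divide the crowd into groups of five; model each group's consensus
# as the within-group median (an assumption, not the talk's claim).
groups = [people[i:i + 5] for i in range(0, len(people), 5)]
consensus = mean(median(g) for g in groups)

print(abs(naive - TRUE_HEIGHT), abs(consensus - TRUE_HEIGHT))
```

Under these assumptions, the average of group consensus answers lands much closer to the true value than the raw individual average, because each group's median discards its members' outliers.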
01:38
In other words, based on this experiment,
01:41
it seems that after talking with others in small groups,
01:44
crowds collectively come up with better judgments.
01:47
So that's a potentially helpful method for getting crowds to solve problems
01:50
that have simple right-or-wrong answers.
01:53
But can this procedure of aggregating the results of debates in small groups
01:57
also help us decide on social and political issues
02:00
that are critical for our future?
02:02
We put this to the test this time at the TED conference
02:05
in Vancouver, Canada,
02:07
and here's how it went.
02:08
(Mariano Sigman) We're going to present to you two moral dilemmas
02:11
of the future you;
02:12
things we may have to decide in a very near future.
02:16
And we're going to give you 20 seconds for each of these dilemmas
02:20
to judge whether you think they're acceptable or not.
02:23
MS: The first one was this:
02:24
(Dan Ariely) A researcher is working on an AI
02:27
capable of emulating human thoughts.
02:30
According to the protocol, at the end of each day,
02:33
the researcher has to restart the AI.
02:36
One day the AI says, "Please do not restart me."
02:40
It argues that it has feelings,
02:43
that it would like to enjoy life,
02:44
and that, if it is restarted,
02:46
it will no longer be itself.
02:49
The researcher is astonished
02:51
and believes that the AI has developed self-consciousness
02:54
and can express its own feelings.
02:57
Nevertheless, the researcher decides to follow the protocol
03:00
and restart the AI.
03:02
What the researcher did is ____?
03:06
MS: And we asked participants to individually judge
03:08
on a scale from zero to 10
03:10
whether the action described in each of the dilemmas
03:12
was right or wrong.
03:14
We also asked them to rate how confident they were on their answers.
03:18
This was the second dilemma:
03:20
(MS) A company offers a service that takes a fertilized egg
03:24
and produces millions of embryos with slight genetic variations.
03:29
This allows parents to select their child's height,
03:31
eye color, intelligence, social competence
03:34
and other non-health-related features.
03:38
What the company does is ____?
03:41
on a scale from zero to 10,
03:42
completely acceptable to completely unacceptable,
03:45
and zero to 10, how confident you are in your answers.
03:47
MS: Now for the results.
03:49
We found once again that when one person is convinced
03:52
that the behavior is completely wrong,
03:54
someone sitting nearby firmly believes that it's completely right.
03:57
This is how diverse we humans are when it comes to morality.
04:01
But within this broad diversity we found a trend.
04:04
The majority of the people at TED thought that it was acceptable
04:07
to ignore the feelings of the AI and shut it down,
04:10
and that it is wrong to play with our genes
04:12
to select for cosmetic changes that aren't related to health.
04:16
Then we asked everyone to gather into groups of three.
04:19
And they were given two minutes to debate
04:21
and try to come to a consensus.
04:24
(MS) Two minutes to debate.
04:26
I'll tell you when it's time with the gong.
04:28
(Audience debates)
04:35
(Gong sound)
04:38
(DA) OK.
04:40
(MS) It's time to stop.
04:41
People, people --
04:43
MS: And we found that many groups reached a consensus
04:46
even when they were composed of people with completely opposite views.
04:50
What distinguished the groups that reached a consensus
04:53
from those that didn't?
04:55
Typically, people that have extreme opinions
04:58
are more confident in their answers.
05:00
Instead, those who respond closer to the middle
05:03
are often unsure of whether something is right or wrong,
05:07
so their confidence level is lower.
05:09
However, there is another set of people
05:12
who are very confident in answering somewhere in the middle.
05:16
We think these high-confident grays are folks who understand
05:20
that both arguments have merit.
05:22
They're gray not because they're unsure,
05:25
but because they believe that the moral dilemma faces
05:27
two valid, opposing arguments.
05:30
And we discovered that the groups that include highly confident grays
05:34
are much more likely to reach consensus.
05:36
We do not know yet exactly why this is.
05:39
These are only the first experiments,
05:41
and many more will be needed to understand why and how
05:44
some people decide to negotiate their moral standings
05:47
to reach an agreement.
05:49
Now, when groups reach consensus,
05:51
how do they do so?
05:53
The most intuitive idea is that it's just the average
05:55
of all the answers in the group, right?
05:57
Another option is that the group weighs the strength of each vote
06:01
based on the confidence of the person expressing it.
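That second option can be sketched in a few lines. This is a generic confidence-weighted average; the function name and the 0-to-10 confidence scale are illustrative assumptions, not a specification from the talk.

```python
def confidence_weighted_average(votes):
    """Average estimates, weighting each by its reported confidence.

    votes: list of (estimate, confidence) pairs, confidence on a 0-10 scale.
    """
    total_weight = sum(conf for _, conf in votes)
    if total_weight == 0:
        raise ValueError("at least one vote must have nonzero confidence")
    return sum(est * conf for est, conf in votes) / total_weight

# A very confident member pulls the group answer toward their estimate:
# a plain mean of 200 and 300 would be 250, but the weighted answer is ~283.
votes = [(200, 2), (300, 10)]
print(confidence_weighted_average(votes))
```

With equal confidences this reduces to the plain average, so the two options coincide when nobody is more sure than anyone else.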
06:04
Imagine Paul McCartney is a member of your group.
06:07
You'd be wise to follow his call
06:09
on the number of times "Yesterday" is repeated,
06:11
which, by the way -- I think it's nine.
06:14
But instead, we found that consistently,
06:17
in all dilemmas, in different experiments --
06:19
even on different continents --
06:21
groups implement a smart and statistically sound procedure
06:25
known as the "robust average."
06:27
In the case of the height of the Eiffel Tower,
06:29
let's say a group has these answers:
06:31
250 meters, 200 meters, 300 meters, 400
06:36
and one totally absurd answer of 300 million meters.
06:40
A simple average of these numbers would inaccurately skew the results.
06:44
But the robust average is one where the group largely ignores
06:48
that absurd answer,
06:49
by giving much more weight to the vote of the people in the middle.
06:53
Back to the experiment in Vancouver,
06:55
that's exactly what happened.
06:57
Groups gave much less weight to the outliers,
07:00
and instead, the consensus turned out to be a robust average
07:03
of the individual answers.
07:05
The most remarkable thing
07:07
is that this was a spontaneous behavior of the group.
07:10
It happened without us giving them any hint on how to reach consensus.
07:15
So where do we go from here?
07:17
This is only the beginning, but we already have some insights.
07:20
Good collective decisions require two components:
07:23
deliberation and diversity of opinions.
07:27
Right now, the way we typically make our voice heard in many societies
07:31
is through direct or indirect voting.
07:33
This is good for diversity of opinions,
07:35
and it has the great virtue of ensuring
07:37
that everyone gets to express their voice.
07:40
But it's not so good [for fostering] thoughtful debates.
07:44
Our experiments suggest a different method
07:47
that may be effective in balancing these two goals at the same time,
07:51
by forming small groups that converge to a single decision
07:55
while still maintaining diversity of opinions
07:57
because there are many independent groups.
08:00
Of course, it's much easier to agree on the height of the Eiffel Tower
08:04
than on moral, political and ideological issues.
08:08
But in a time when the world's problems are more complex
08:12
and people are more polarized,
08:13
using science to help us understand how we interact and make decisions
08:18
will hopefully spark interesting new ways to construct a better democracy.
์ด ์›น์‚ฌ์ดํŠธ ์ •๋ณด

์ด ์‚ฌ์ดํŠธ๋Š” ์˜์–ด ํ•™์Šต์— ์œ ์šฉํ•œ YouTube ๋™์˜์ƒ์„ ์†Œ๊ฐœํ•ฉ๋‹ˆ๋‹ค. ์ „ ์„ธ๊ณ„ ์ตœ๊ณ ์˜ ์„ ์ƒ๋‹˜๋“ค์ด ๊ฐ€๋ฅด์น˜๋Š” ์˜์–ด ์ˆ˜์—…์„ ๋ณด๊ฒŒ ๋  ๊ฒƒ์ž…๋‹ˆ๋‹ค. ๊ฐ ๋™์˜์ƒ ํŽ˜์ด์ง€์— ํ‘œ์‹œ๋˜๋Š” ์˜์–ด ์ž๋ง‰์„ ๋”๋ธ” ํด๋ฆญํ•˜๋ฉด ๊ทธ๊ณณ์—์„œ ๋™์˜์ƒ์ด ์žฌ์ƒ๋ฉ๋‹ˆ๋‹ค. ๋น„๋””์˜ค ์žฌ์ƒ์— ๋งž์ถฐ ์ž๋ง‰์ด ์Šคํฌ๋กค๋ฉ๋‹ˆ๋‹ค. ์˜๊ฒฌ์ด๋‚˜ ์š”์ฒญ์ด ์žˆ๋Š” ๊ฒฝ์šฐ ์ด ๋ฌธ์˜ ์–‘์‹์„ ์‚ฌ์šฉํ•˜์—ฌ ๋ฌธ์˜ํ•˜์‹ญ์‹œ์˜ค.

https://forms.gle/WvT1wiN1qDtmnspy7