Are computers always right? 6 Minute English

257,391 views ・ 2018-07-19

BBC Learning English


μ•„λž˜ μ˜λ¬Έμžλ§‰μ„ λ”λΈ”ν΄λ¦­ν•˜μ‹œλ©΄ μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€. λ²ˆμ—­λœ μžλ§‰μ€ 기계 λ²ˆμ—­λ©λ‹ˆλ‹€.

00:06
Catherine: Hello and welcome to 6 Minute English. I'm Catherine.

00:10
Rob: And hello, I'm Rob.

00:11
Catherine: Today we have another technology topic.

00:13
Rob: Oh good! I love technology. It makes things easier, it's fast and means I can have gadgets.

00:19
Catherine: Do you think that technology can actually do things better than humans?

00:23
Rob: For some things, yes. I think cars that drive themselves will be safer than humans, but that will take away some of the pleasure of driving. So I guess it depends on what you mean by 'better'.

00:35
Catherine: Good point, Rob. And that actually ties in very closely with today's topic, which is technochauvinism.

00:42
Rob: What's that?
00:44
Catherine: We'll find out shortly, Rob, but before we do, today's quiz question. Artificial Intelligence, or A.I., is an area of computer science that develops the ability of computers to learn to do things like solve problems or drive cars without crashing. But in what decade was the term 'Artificial Intelligence' coined? Was it: a) the 1940s, b) the 1950s or c) the 1960s?

01:11
Rob: I think it's quite a new expression, so I'll go for c) the 1960s.

01:16
Catherine: Good luck with that, Rob, and we'll give you the answer later in the programme. Now, let's get back to our topic of technochauvinism.
01:24
Rob: I know what a chauvinist is. It's someone who thinks that their country or race or sex is better than others. But how does this relate to technology?

01:33
Catherine: We're about to find out. Meredith Broussard is Professor of Journalism at New York University and she's written a book called Artificial Unintelligence. She appeared on the BBC Radio 4 programme More or Less to talk about it. Listen carefully and find out her definition of technochauvinism.

01:53
Meredith Broussard: Technochauvinism is the idea that technology is always the highest and best solution. So somehow over the past couple of decades we got into the habit of thinking that doing something with a computer is always the best and most objective way to do something, and that's simply not true. Computers are not objective; they are proxies for the people who make them.
02:20
Catherine: What is Meredith Broussard's definition of technochauvinism?

02:24
Rob: It's this idea that using technology is better than not using technology.

02:29
Catherine: She says that we have this idea that a computer is objective. Something that is objective is neutral: it doesn't have an opinion, it's fair and it's unbiased - so it's the opposite of being a chauvinist. But Meredith Broussard says this is not true.
02:46
Rob: She argues that computers are not objective. They are proxies for the people that make them. You might know the word proxy from when you are using your computer in one country and want to look at something that is only available in a different country. You can use a piece of software called a proxy to do that.
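For the technically curious, here is a minimal sketch of what Rob describes, using Python's widely used "requests" library. The proxy address is a hypothetical placeholder, not a real server.

    import requests

    # The request goes to the proxy server first, so the website sees the
    # proxy's location rather than yours. 203.0.113.10 is a placeholder
    # address from the documentation range, used here as an assumption.
    proxies = {
        "http": "http://203.0.113.10:8080",
        "https": "http://203.0.113.10:8080",
    }

    response = requests.get("https://example.com", proxies=proxies)
    print(response.status_code)  # 200 if the page loaded via the proxy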
03:04
Catherine: But a proxy is also a person or a thing that carries out your wishes and your instructions for you. So computers are only as smart or as objective as the people that programme them. Computers are proxies for their programmers. Broussard says that believing too much in Artificial Intelligence can make the world worse. Let's hear a bit more. This time, find out what serious problems in society she thinks may be reflected in AI.
03:37
Meredith Broussard: It's a nuanced problem. What we have is data on the world as it is, and we have serious problems with racism, sexism, classism, ageism in the world right now, so there is no such thing as perfect data. We also have a problem inside the tech world, where the creators of algorithms do not have sufficient awareness of social issues such that they can make good technology that gets us closer to a world as it should be.
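Broussard's point about "data on the world as it is" can be sketched in a few lines of Python. The hiring records below are invented for illustration; the point is that a model fitted to a biased history simply repeats that history.

    # Toy "training data": past hiring decisions that happen to favour group A.
    past_hires = [
        {"group": "A", "hired": True},
        {"group": "A", "hired": True},
        {"group": "A", "hired": True},
        {"group": "B", "hired": True},
        {"group": "B", "hired": False},
        {"group": "B", "hired": False},
    ]

    def predicted_hire_rate(group):
        # A naive model: predict each group's historical rate going forward.
        records = [r for r in past_hires if r["group"] == group]
        return sum(r["hired"] for r in records) / len(records)

    print(predicted_hire_rate("A"))  # 1.0  - the past skew becomes the forecast
    print(predicted_hire_rate("B"))  # 0.333...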
04:11
Rob: She said that society has problems with racism, sexism, classism and ageism.

04:17
Catherine: And she says it's a nuanced problem. A nuanced problem is not simple, but it does have small and important areas which may be hard to spot, and they need to be considered.
04:29
Rob: And she also talked about algorithms used to program these technological systems. An algorithm is a set of instructions that computers use to perform their tasks. Essentially, it's the rules that they use to come up with their answers, and Broussard believes that technology will reflect the views of those who create the algorithms.
04:48
Catherine: Next time you're using a piece of software or your favourite app, you might find yourself wondering whether it's a useful tool or whether it contains these little nuances that reflect the views of the developer.
05:00
Rob: Right, Catherine. How about the answer to this week's question then?

05:04
Catherine: I asked in which decade the term 'Artificial Intelligence' was coined. Was it the 40s, the 50s or the 60s?

05:11
Rob: And I said the 60s.

05:12
Catherine: But it was actually the 1950s. Never mind, Rob. Let's review today's vocabulary.
05:18
Rob: Well, we had a chauvinist - that's someone who believes their country, race or sex is better than any others.

05:25
Catherine: And this gives us technochauvinism, the belief that a technological solution is always a better solution to a problem.

05:33
Rob: Next - someone or something that is objective is neutral, fair and balanced.

05:38
Catherine: A proxy is a piece of software, but also someone who does something for you, on your behalf. A nuanced problem is a subtle one; it's not a simple case of right or wrong. In a nuanced problem there are small but important things that you need to consider.

05:56
Rob: And an algorithm is a set of software instructions for a computer system.

06:00
Catherine: Well, that's all we have time for today. Goodbye for now.

06:04
Rob: Bye bye!
이 μ›Ήμ‚¬μ΄νŠΈ 정보

이 μ‚¬μ΄νŠΈλŠ” μ˜μ–΄ ν•™μŠ΅μ— μœ μš©ν•œ YouTube λ™μ˜μƒμ„ μ†Œκ°œν•©λ‹ˆλ‹€. μ „ 세계 졜고의 μ„ μƒλ‹˜λ“€μ΄ κ°€λ₯΄μΉ˜λŠ” μ˜μ–΄ μˆ˜μ—…μ„ 보게 될 κ²ƒμž…λ‹ˆλ‹€. 각 λ™μ˜μƒ νŽ˜μ΄μ§€μ— ν‘œμ‹œλ˜λŠ” μ˜μ–΄ μžλ§‰μ„ 더블 ν΄λ¦­ν•˜λ©΄ κ·Έκ³³μ—μ„œ λ™μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€. λΉ„λ””μ˜€ μž¬μƒμ— 맞좰 μžλ§‰μ΄ μŠ€ν¬λ‘€λ©λ‹ˆλ‹€. μ˜κ²¬μ΄λ‚˜ μš”μ²­μ΄ μžˆλŠ” 경우 이 문의 양식을 μ‚¬μš©ν•˜μ—¬ λ¬Έμ˜ν•˜μ‹­μ‹œμ˜€.

https://forms.gle/WvT1wiN1qDtmnspy7