Fake News: Fact & Fiction - Episode 7: Can you trust online images?

43,055 views ・ 2023-10-24

BBC Learning English


μ•„λž˜ μ˜λ¬Έμžλ§‰μ„ λ”λΈ”ν΄λ¦­ν•˜μ‹œλ©΄ μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€. λ²ˆμ—­λœ μžλ§‰μ€ 기계 λ²ˆμ—­λ©λ‹ˆλ‹€.

00:05
Hello, I'm Hugo.
00:06
And I'm Sam.
00:07
And this is Fake News: Fact & Fiction from BBC Learning English.
00:11
In the programme today we're talking about images and we meet viral image debunker Paulo Ordoveza.
00:18
Things that you need to look out for when you're looking at a viral image:
00:21
you want to look for any signs of manipulation, tell-tale signs that it's been edited.
00:28
Before we get to that though, Sam, I'm looking forward to your vocabulary section.
00:32
What are you going to be telling us about today?
00:34
Today, Hugo, I am talking about the world of deceit and the words 'con', 'scam', 'phishing' and 'hoax'.
00:49
There have always been people who want to take our money,
00:53
and the Internet and social media have provided criminals with different opportunities to try and con us.
01:01
The verb 'to con' means to deceive, to trick or to cheat, and the noun 'a con' describes the method of tricking.
01:11
For example, on social media you sometimes see competitions with fantastic prizes
01:17
like a car or a boat, and all you have to do is like the post and share it to have a chance of winning.
01:26
Many of these competitions are cons.
01:29
They're not real and are designed to get us to share personal information,
01:35
which many people will do because they want to win the prize. But no one ever wins,
01:40
and the people who run the con have collected lots of personal data.
01:47
The word 'con' dates back to the mid-to-late 19th century and is a shortened version of 'confidence man',
01:56
which was the term for a person who was able to persuade people to give him money in return for
02:01
a service that was never given and was not what was promised.
02:06
This was known as a 'confidence trick', or a 'con trick'. And now just 'a con'.
02:15
A similar word for a 'con' is a 'scam'.
02:19
Scam is a much newer word though, from the late 1960s, and in most cases it can be used in the same way as a con.
02:27
Those online competitions are often scams, so be careful and beware of 'scammers'.
02:34
The people who are trying to trick you.
02:36
A good piece of advice is: if it looks too good to be true, it probably is.
02:44
A very modern type of scam is 'phishing'.
02:48
The term was coined in the 1990s and is a variation of the word 'fishing'.
02:54
The scammers are fishing for your personal details.
02:58
This is often done through emails which try to get you to log on to a web site that looks like your bank's
03:04
but is actually a fake site from which the phishers collect your log-in details.
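The bank-lookalike trick described here can be sketched in a few lines of code. This is only an illustration with a hypothetical bank domain (`mybank.com` is made up), and real mail filters and browsers use far more sophisticated signals than string similarity:

```python
from difflib import SequenceMatcher
from urllib.parse import urlparse

# Hypothetical "genuine" domain for illustration only.
KNOWN_DOMAINS = {"mybank.com"}

def looks_like_phishing(url: str, threshold: float = 0.75) -> bool:
    """Flag URLs whose hostname closely resembles, but is not, a known domain."""
    host = urlparse(url).hostname or ""
    for real in KNOWN_DOMAINS:
        if host == real or host.endswith("." + real):
            return False  # the genuine site, or a subdomain of it
        if SequenceMatcher(None, host, real).ratio() >= threshold:
            return True   # suspiciously similar lookalike hostname
    return False

print(looks_like_phishing("https://mybank.com/login"))   # False: genuine
print(looks_like_phishing("https://rnybank.com/login"))  # True: 'rn' imitates 'm'
```

The `rn`-for-`m` swap is a classic phishing trick: in many fonts the two are nearly indistinguishable, yet the hostname is an entirely different domain.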
03:11
Another type of scam that has been around for centuries but is also popular on the Internet is the 'hoax'.
03:19
A hoax is something that has been created to trick you into believing something that isn't true.
03:26
This could be a fake news story, but hoaxes are also common with photos and videos.
03:32
Take this one as an example.
03:35
If someone claims with authority that it's proof of the existence of the creature known as Bigfoot,
03:42
it's a hoax, because it's just a person in a hairy costume.
03:52
Thanks Sam, and that leads us on nicely to today's topic about images.
03:56
Yes. So there's an old saying that the camera doesn't lie. But I'm not so sure we believe that any more.
04:04
Well, the camera can only record what it sees. So in that case it can't lie.
04:09
But what we are told we're seeing in the image is not necessarily what the image actually is.
04:15
Yes, so the camera doesn't lie. But people do.
04:18
Yeah. Well, that's a good way to put it. And of course with the digital tools at our fingertips today
04:24
we can change, manipulate, or even create a completely digital image that looks like a real photograph.
04:31
Yeah. But now we're not talking about just putting a nice filter on our social media selfies, are we?
04:38
No. There are a few different ways images are used to tell a different story to the one that was photographed.
04:46
One thing we see a lot is real images. They're not fake or photoshopped, but
04:51
they are from a different time or place than the caption states.
04:54
So the caption on an image can also be misleading in the way it describes what
04:59
is happening in the photo, particularly if it generates strong emotion.
05:06
Yeah, well, here's an example.
05:08
Look at this picture. It might make you feel completely different emotions depending on how it's described.
05:14
"Amazing. I saw the police helping an injured protester!"
05:17
"Terrible. I saw the police attacking an innocent protester!"
05:22
So what do you think, Sam? Do you think the different captions tell a different story?
05:26
Definitely, yes. Those two captions elicit two completely different emotions, but which one is right?
05:33
Without a wider context, we wouldn't know which one was correct.
05:39
So that's not the only way images can be misleading.
05:42
It's not just captions. Images can be cropped, manipulated and edited
05:49
to add things that were never there or take away things that were there.
05:58
In fact, it's so easy to do this now that an aged relative of mine, who shall remain nameless,
06:05
removed an in-law from a family photo and replaced her with a bush.
06:12
That photo now hangs in the living room.
06:14
And unless you really examined it closely, you wouldn't realise at all that someone was missing.
06:20
I'm glad I'm not in your family, Sam!
06:23
Most mainstream media organisations are very careful about the images they use
06:27
and how they're described, particularly if those images have come from members of the public.
06:32
On social media though, anyone can publish an image and say anything about it.
06:38
And that's where people like today's guest come in.
06:41
Paulo Ordoveza is a viral image debunker.
06:44
He's known as Pic Pedant on Twitter and spends his free time verifying images
06:48
which seem to him to be not quite right.
06:51
It's an activity he took up while working as a contractor at NASA. We spoke to him earlier and asked why he does it.
07:01
Well, partly I'm just pedantic. I'm a stickler for accuracy.
07:06
But partly it's also that I get annoyed at seeing digital art, seeing manipulated images passed off as real.
07:14
And in a sense it's, it's unfair to the people who made them because you, you rob them of attribution, you rob them of context and credit.
07:23
And I want to see justice done in that sense.
07:27
But on a grander scale it's also that small lies can later lead to bigger lies.
07:33
I've seen, I've seen cases of viral image purveyors who just grab stuff off Reddit
07:41
that will later be cited in bigger contexts, and the damage they can do is amplified significantly by the nature of social media.
07:53
I've seen people go from historical photos and go into full-blown health conspiracies about the pandemic.
08:02
I've seen nature photos lead to eco-fascism.
08:05
So you know it's the kind of thing I'd like to see nipped in the bud before it leads to grander or more harmful falsehoods.
08:15
So he sees artists' work being stolen without credit.
08:19
Fake and edited images being published as real.
08:22
And also people who do this using their popularity on social media
08:25
to promote conspiracy theories and radical action.
08:28
But how do we know that an image is fake, copied or misleading? If you do this a lot, you can spot the signs easily.
08:35
But for those of us with less experience, here are a few tips from Pic Pedant.
08:40
For one thing, there's context.
08:42
You want to see if, is this image too good to be true? Is it too lucky?
08:47
Sometimes a photographer will get lucky, but sometimes that image is manipulated.
08:52
Also think about what emotions these images are eliciting in you.
08:59
What are they trying to make you feel, or think, or do?
09:03
What is the publisher's aim?
09:05
Are they trying to make money, or are they trying to become more popular, or
09:08
are they trying to manipulate your view of the world?
09:12
And things that you need to look out for when you're looking at a viral image:
09:17
you want to look for any signs of manipulation, tell-tale signs that it's been edited.
09:24
Sometimes the edges are too blurry, sometimes the edges are too sharp.
09:27
Sometimes the light sources are different and you'll see shadows coming, you'll see shadows in different directions.
09:34
Sometimes the perspective will be wrong, or you'll see artefacts of Photoshop clone stamping,
09:39
or you'll see repeating patterns that indicate that something has been, that something has been copied.
09:44
A few things to look out for there, some technical, some emotional.
09:48
We've seen throughout the series that many examples of fake news are designed to generate strong emotions
09:54
and photos are no different.
09:56
Yes. This doesn't mean that every emotional, interesting or spectacular image is suspicious.
10:03
The latest baby pictures from your family are probably not fake,
10:08
but with your critical thinking head on, when you see images that make you feel upset, angry or outraged,
10:16
it might be worth taking a deep breath and doing a bit of investigating before you think about sharing.
10:24
And how do you investigate an image? Here's Paulo Ordoveza again with a practical tip.
10:31
The first thing I do is plug the image into a reverse image search tool like TinEye,
10:36
or Google image search, which has a reverse image search if you click on the little camera icon in the search bar.
10:43
And that will show you repeat occurrences of the image across various web sites and how far back the repetitions go.
10:51
I have to warn you that the oldest or the largest image is not always necessarily the original image.
10:59
So you have to be able to tell from context, from the site,
11:02
if that is actually the photographer's or the digital artist's original image.
11:09
Also if, I found that if you search for the caption, if you do a phrase search by entering the caption in quotes into Google,
11:16
or DuckDuckGo as I use, you might find a Reddit post or some other social media post
11:22
where the image was first posted or where the image first became popular.
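The quoted-phrase search described here is easy to automate: wrapping the caption in double quotes tells the engine to match the exact phrase. A minimal sketch (the only assumption is DuckDuckGo's standard `?q=` query URL):

```python
from urllib.parse import quote_plus

def phrase_search_url(caption: str, engine: str = "https://duckduckgo.com/?q=") -> str:
    """Build a phrase-search URL: the quotes force an exact-phrase match."""
    return engine + quote_plus(f'"{caption}"')

url = phrase_search_url("violent demonstrations outside parliament")
print(url)
# https://duckduckgo.com/?q=%22violent+demonstrations+outside+parliament%22
```

`quote_plus` percent-encodes the quotation marks (`%22`) and turns spaces into `+`, so the URL is safe to paste or open directly.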
11:26
On certain photography sites you might be able to find the metadata for the image, or the EXIF data as it's called.
11:34
And that will tell you things like what camera was used to take the image, what lens they used, what lens angle,
11:42
what F-stop, what exposure, but it will also tell you what software the image has been through.
11:47
Now remember that Photoshop is not necessarily a sign that the image has been manipulated, or at least has been faked,
11:54
because sometimes a photographer will take an image into Photoshop and do some minor enhancements,
12:00
but that doesn't mean the image has been completely faked.
12:02
Sometimes it may mean that. Again, you have to consider context.
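EXIF metadata is stored in TIFF structures: a small header points at an "image file directory" (IFD) of tag/type/count/offset entries, and tag 0x010F holds the camera make. As a rough sketch of that layout only — real EXIF lives inside a JPEG's APP1 segment with many more tags, and in practice you would read it with a library such as Pillow or a tool like exiftool — here is a hand-built minimal little-endian TIFF and a parser for its one tag:

```python
import struct

def build_tiff_with_make(make: str) -> bytes:
    """Hand-craft a minimal little-endian TIFF holding one 'Make' entry (tag 0x010F)."""
    value = make.encode() + b"\x00"                          # ASCII values are NUL-terminated
    header = struct.pack("<2sHI", b"II", 42, 8)              # byte order, magic 42, IFD offset
    entry = struct.pack("<HHII", 0x010F, 2, len(value), 26)  # tag, type 2 (ASCII), count, data offset
    ifd = struct.pack("<H", 1) + entry + struct.pack("<I", 0)  # 1 entry, then next-IFD offset 0
    return header + ifd + value                              # value lands at byte 26

def read_make(tiff: bytes) -> str:
    """Walk the IFD and return the camera Make string."""
    assert tiff[:4] == b"II\x2a\x00", "not a little-endian TIFF"
    (ifd_offset,) = struct.unpack_from("<I", tiff, 4)
    (n,) = struct.unpack_from("<H", tiff, ifd_offset)
    for i in range(n):
        tag, typ, count, offset = struct.unpack_from("<HHII", tiff, ifd_offset + 2 + 12 * i)
        if tag == 0x010F and typ == 2:
            return tiff[offset:offset + count - 1].decode()  # drop the trailing NUL
    raise KeyError("Make tag not found")

print(read_make(build_tiff_with_make("Canon")))  # Canon
```

The same tag/type/count/offset pattern carries the lens, F-stop, exposure and software fields Paulo mentions, each under its own tag number.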
12:07
So search for the caption, that's a good tip.
12:10
Yeah, that's also a good thing to do if you see a controversial meme.
12:16
You know the ones: there's a photograph of a politician or another public figure
12:21
and a quote next to that image which is supposed to be something they said or was said about them.
12:28
Put that quote into a search engine to see what comes up and you'll be able to
12:33
see if it's a real quote or something that's made up.
12:37
That's very good advice. I do that all the time, and the other good suggestion is to do a reverse image search.
12:43
Are you familiar with that, Sam?
12:45
Yes. So that's where you search using an image rather than text
12:50
to see if that same image has been used before on the Internet.
12:56
Shall we do a little demonstration?
12:58
Yeah, let's do it.
12:59
So let's take this image.
13:01
Let's imagine someone has posted this picture online today with the caption:
13:06
"The media isn't covering this but there are violent demonstrations outside parliament, please share."
13:12
OK.
13:13
So I'm using a site called TinEye, which is one of the ones Paulo mentioned,
13:18
and I drag the image into the box here and then I let it do its thing.
13:24
It then shows me all the examples of this image it has found.
13:30
You can sort by date and see that this image has been present on the internet since at least 2012.
13:39
So it can't be from something that happened today.
13:42
The image is genuine but the description is misleading.
13:46
Thank you Sam, very useful tips there.
13:48
Now, would you please remind us of today's vocabulary?
13:53
Yes, of course.
13:54
I'm going to start by picking out a word that we've used a lot today,
13:58
which is 'manipulate'.
14:00
To manipulate an image means to change it or edit it in some way to deliberately mislead.
14:08
And, as we've discussed, you can also manipulate emotions.
14:12
Our guest Paulo said that he was 'pedantic', which I am about grammar,
14:18
which means being very concerned that things are correct, often things that other people are not that worried about.
14:26
And that he is 'a stickler for accuracy'.
14:29
When you are a 'stickler' for something, you expect or demand a certain level of behaviour.
14:36
If you see this phrase, you will see it mostly with the noun 'accuracy' or 'rules'.
14:44
Two words which mean nearly the same thing are 'con' and 'scam'.
14:48
These are dishonest schemes designed to trick people and take their money.
14:54
The people who carry out these schemes are 'con artists' and 'scammers'.
15:01
'Phishing' is a type of scam.
15:04
It's a trick designed to steal your banking information by getting you to log on to a fake web site with your real details.
15:13
And a 'hoax' is a kind of deception, something designed to make you believe something that isn't true.
15:21
Thank you Sam.
15:22
It's very easy to manipulate images today and images can be used to manipulate our emotions.
15:28
So be sceptical, be vigilant and share safely.
15:32
And remember that just because it's on the internet doesn't mean it's true.
15:37
And if something seems too good to be true,
15:40
it probably is. Goodbye.
15:42
Goodbye.
이 μ›Ήμ‚¬μ΄νŠΈ 정보

이 μ‚¬μ΄νŠΈλŠ” μ˜μ–΄ ν•™μŠ΅μ— μœ μš©ν•œ YouTube λ™μ˜μƒμ„ μ†Œκ°œν•©λ‹ˆλ‹€. μ „ 세계 졜고의 μ„ μƒλ‹˜λ“€μ΄ κ°€λ₯΄μΉ˜λŠ” μ˜μ–΄ μˆ˜μ—…μ„ 보게 될 κ²ƒμž…λ‹ˆλ‹€. 각 λ™μ˜μƒ νŽ˜μ΄μ§€μ— ν‘œμ‹œλ˜λŠ” μ˜μ–΄ μžλ§‰μ„ 더블 ν΄λ¦­ν•˜λ©΄ κ·Έκ³³μ—μ„œ λ™μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€. λΉ„λ””μ˜€ μž¬μƒμ— 맞좰 μžλ§‰μ΄ μŠ€ν¬λ‘€λ©λ‹ˆλ‹€. μ˜κ²¬μ΄λ‚˜ μš”μ²­μ΄ μžˆλŠ” 경우 이 문의 양식을 μ‚¬μš©ν•˜μ—¬ λ¬Έμ˜ν•˜μ‹­μ‹œμ˜€.

https://forms.gle/WvT1wiN1qDtmnspy7