Fake News: Fact & Fiction - Episode 5: Why do people share fake news?

38,828 views · 2023-10-10

BBC Learning English


ไธ‹ใฎ่‹ฑ่ชžๅญ—ๅน•ใ‚’ใƒ€ใƒ–ใƒซใ‚ฏใƒชใƒƒใ‚ฏใ™ใ‚‹ใจๅ‹•็”ปใ‚’ๅ†็”Ÿใงใใพใ™ใ€‚ ็ฟป่จณใ•ใ‚ŒใŸๅญ—ๅน•ใฏๆฉŸๆขฐ็ฟป่จณใงใ™ใ€‚

00:06
Hello and welcome to Fake News: Fact and Fiction, from BBC Learning English.
00:10
I'm Hugo.
00:11
And I'm Sam.
00:12
In the programme today,
00:13
we'll look at some of the reasons people share fake news.
00:16
We'll hear from disinformation researcher Samantha Bradshaw.
00:20
One reason why people share fake news that they know is fake,
00:24
has to do with this thing that we call identity signalling.
00:28
Identity signalling. Have you heard of that, Sam?
00:31
Not until I saw the script for this programme actually.
00:35
Ah, so make sure you keep watching to find out what identity signalling is
00:40
and how it's a part of the spread of fake news.
00:43
Before we get to that, we have some key vocabulary which you're going to
00:47
tell us about, Sam.
00:48
What do you have for us today?
00:49
Yes, so today I am talking about: agenda, actors and motives.
00:59
If you have to go to meetings, and who doesn't love a good meeting?
01:04
I hate them.
01:05
They're such a waste of time.
01:08
You will probably be familiar with agendas.
01:12
An agenda is a list of things that are going to be discussed
01:16
at the meeting.
01:17
But there is another meaning of the word 'agenda' that is important
01:22
in the area of fake news.
01:24
If an individual or organisation or a government 'has an agenda',
01:30
it means that there is something they want to achieve.
01:33
This could be clearly stated objectives.
01:37
But having an agenda can also suggest a secret or hidden aim.
01:42
Let's park the word agenda for the moment because there is another word
01:46
I want to talk about that has a very well-known meaning but also
01:51
another meaning which is perhaps not so well-known - actor.
01:56
It's probably a very familiar word.
01:59
Somebody who performs on stage or in films is an actor.
02:03
But the same word is used to describe a person, group or even
02:08
a state that has an agenda that's either politically or economically motivated
02:15
and they have influence or power that they use
02:17
to promote that agenda.
02:20
In this context you can hear the phrases:
02:23
state actors, political actors, and even - bad actors.
02:30
State actors are governments,
02:34
political actors are those with, you guessed it, political power and agendas
02:38
and bad actors - well that doesn't mean people who aren't very good at acting.
02:44
A bad actor uses power or influence in a dishonest way
02:50
to push a hidden agenda.
02:52
They're trying to achieve something but it's not clear what it is.
02:56
And it's not clear who is doing it.
02:59
For example, if a government from one country wants to influence an
03:03
election in another country, they may spread fake news stories
03:08
but hide the fact that they're doing it.
03:11
This could be described as the actions of a bad actor or
03:15
a bad state actor.
03:18
People who spread disinformation online may not be doing it because
03:22
they believe it but because it suits their agenda.
03:27
This spreading of disinformation can be very sophisticated and the
03:31
motives of those doing it are often not obvious.
03:36
Motive.
03:37
That's another useful word in this context.
03:41
If you like TV crime dramas, and who doesn't love a good crime drama?
03:46
- me, I never really understand them -
03:50
you will often hear the word 'motive'.
03:54
"He can't be guilty. He didn't have a motive."
03:59
A motive is a reason for doing something and not everyone's motives
04:04
for what they post online are genuine.
04:07
But how can you tell?
04:09
Let's go back to the studio to see if there are any more useful tips.
04:16
Thanks for that, Sam.
04:17
Really interesting, I was really impressed with your acting there.
04:20
Well, I may be a bad actor but I'm not a bad actor.
04:26
My motives are purely genuine.
04:28
Well that's an interesting point about fake news I think.
04:31
We've heard that fake news is produced by people who may want to
04:34
make money or have a political or social agenda.
04:37
Those are their motives,
04:39
but they can only be successful if we share those stories.
04:44
Absolutely. But why do people share news, stories, images that are fake?
04:49
Well researcher Samantha Bradshaw from the Oxford Internet Institute
04:53
at Oxford University shared her observations with us earlier.
04:58
People share fake news for a variety of reasons.
05:02
In many cases sometimes people don't even realise the stories
05:06
that they're sharing are fake.
05:08
They might find the content funny or humorous or they might think that
05:12
people in their network such as their friends or family or broader
05:16
acquaintances might enjoy this content.
05:20
Another reason people might share fake news has to do with
05:24
the emotional appeal of fake news because it tends to be very negative.
05:29
It tends to also elicit a reaction from people and they might be angry
05:34
and they might type and share and discuss what is being talked about
05:40
in this article because they find it so outrageous that they need
05:44
to share it.
05:45
The third reason why people might share fake news has to do with what
05:49
we call 'identity signalling' where people know that the information
05:54
being shared in this story is false.
05:56
But the purpose of sharing it is to signal to a broader community that
06:01
they share certain values that are reflected in that news story.
06:06
And so it gives people a sense of belonging and
06:12
fitting into a broader community of people.
06:14
It has much more to do with identity than the actual facts of the story.
06:20
So what do you make of that, Sam?
06:22
OK, well, I can definitely see how we might accidentally share
06:26
things that aren't true. For example, I sometimes see some
06:30
amazing photos that I find out later are actually photoshopped.
06:34
And things that might make me angry can make me reach for the share button.
06:39
But what stood out for me was this idea of identity signalling.
06:44
We might want to show that we belong to a particular group.
06:47
We share some beliefs and opinions and that it really doesn't matter
06:51
whether the content is true or not.
06:53
And this is something that the people who produce fake news know well
06:57
and they know how to make us share those stories.
06:59
Here's Samantha Bradshaw again.
07:02
So the purveyors of fake news will use a variety of strategies to
07:08
get people clicking and sharing their content.
07:12
One of these strategies has to do with just the style that fake news
07:17
stories are often written in.
07:19
They'll tend to use highly emotional language and content also that gets
07:26
people's emotions high.
07:28
And this is because psychologically we also like to share
07:33
this kind of information.
07:35
And this kind of content, especially highly emotive content,
07:39
will go much further because of the algorithms of the platforms.
07:44
And we also know that emotions like anger, outrage, fear are
07:49
much stronger emotions than things like happiness and joy.
07:53
And so that's why we see disinformation and fake news stories
07:57
go viral compared to facts and the boring truth and the mundane truth
08:03
behind our factual daily news briefs.
08:08
We of course still see our cat memes and cute animal videos going viral too
08:14
because joy and happiness is a strong emotion but it's not quite as strong
08:19
as things like anger and fear.
08:22
Now we know that many fake news stories are very emotive.
08:25
They generate strong emotions; the sense that something's not fair
08:29
or people or organisations are doing things they shouldn't and that can
08:32
make us look for other content to support that feeling.
08:36
Yeah but good to know the cute cat videos are being shared too.
08:41
And dogs, don't forget dogs!
08:44
But she also talked about algorithms,
08:47
the algorithms that social media companies use.
08:50
So Sam, what is an algorithm?
08:53
OK, thanks for that, Hugo!
08:55
I might get into trouble here from computer experts but let's have a go.
09:00
Have you ever lost a huge amount of time because, for example,
09:05
you've gone on YouTube to watch a video and then you see a list of
09:09
suggested related videos which start playing automatically?
09:14
All the time, all the time. I think we've all done that.
09:17
Absolutely. So you click on one video and then another
09:20
and another and another and then it's been hours and hours and hours.
09:25
Exactly. Yeah. OK. So what's this got to do with algorithms?
09:28
Well an algorithm is a set of instructions a computer system
09:33
follows to complete a particular task.
09:36
Social media companies use algorithms to make that list of
09:40
related videos just for you.
09:44
These are personalised based on your behaviour online and
09:48
they get us clicking.
09:50
This is because the companies want to keep you on their platforms for
09:54
as long as possible. That's how they make their money.
09:57
The longer you're on their site the more links you click the more
10:00
advertisements you see and the more cash they collect.
10:04
And the more we click on a particular kind of story, the more
10:08
the algorithms give us similar content.
10:11
Now there's nothing necessarily wrong with that because I guess
10:14
after all we're getting a free service which has to be paid for somehow.
10:20
But it does mean that we usually don't see things that
10:23
we don't like, that we don't agree with.
10:25
And this can make us feel that our view, our opinion
10:29
is the right one which everyone shares.
10:31
Which is why I think it's a good idea to try and look for different opinions.
10:35
So we don't end up thinking that everyone thinks the way we do.
10:39
So does that mean after watching all your cat videos you search for dog videos,
10:44
for a bit of balance?
10:46
I love cats and dogs equally actually but that's not exactly what I was thinking.
10:51
I'm just pulling your leg there.
10:52
Sam, please remind us of today's vocabulary.
10:55
Certainly, Hugo.
10:56
So we started with 'agenda' which is a plan of things to be done.
11:00
And if it's a hidden agenda, it means the reason
11:04
for someone's actions might be secret or hidden.
11:08
Then there was the word 'actor'.
11:10
In the context of fake news an actor is someone with an agenda that uses
11:15
power and influence to try and promote that agenda.
11:20
A 'motive' is a reason for doing something.
11:22
And like hidden agendas sometimes the motives for a particular action
11:27
on the Internet are not always clear and transparent.
11:31
Samantha Bradshaw talked about identity signalling.
11:35
This is the description of behaviour online when people share things
11:39
without being interested in whether they're true or not.
11:42
What's important is not how true it is but that it demonstrates that
11:46
they think in the same way or share the same beliefs as the group.
11:51
And finally 'algorithm'.
11:53
An algorithm is a set of software instructions that for example can
11:57
produce a personalised list of recommended videos based on what
12:01
an individual has searched for or watched in the past.
12:06
Thank you, Sam. That was very comprehensive.
12:08
That's all from us today. We look forward to your company again soon.
12:11
Goodbye. And stay safe.
12:14
Bye bye!