ADVANCED English Vocabulary in context | Learn English with the News

94,914 views ・ 2023-05-31

Learn English with Harry


00:00
Hi there, this is Harry, and welcome back to advanced English lessons with Harry, where I try to help you to get a better understanding of the English language, helping you with vocabulary, expressions, phrases, whatever it takes to help you to improve your conversational English, or indeed your business English. And from time to time, we include an article, a newsy article, something in the news. I read that for you, and then give you some words and expressions that have been used in the article and explain what they mean. So that's exactly what we're going to do in this particular lesson. And at the end of the lesson, if you need any further help with any further explanations or examples, then please write to me at www.englishlessonviaskype.com and I'll happily provide you with any information that you like or that you need.

00:47
Okay, so getting back to this advanced English lesson: it's all about an article. The article appeared in The Guardian newspaper, which is a really nice source of well-balanced articles that we like to read from time to time. And the heading of this particular article is: "Another warning about the AI apocalypse. I don't buy it." So it's not my opinion; it's the opinion of the journalist who wrote the particular article.

01:16
So he's just saying, or she is saying: another warning about the AI, the artificial intelligence, apocalypse; people think we're coming into a major, major catastrophe dealing with artificial intelligence. And the journalist says, "I don't buy it", meaning they don't necessarily agree with it. So what I'll do is read it once, just for you to get the gist of what it's about; then I'll read it a second time, putting some stress on certain words; and then I'll come back and give you an explanation as to the meaning of those particular words.

01:49
Okay, so let's start. And here's the text. I'll read it once for you, just to get the gist, and then a second time to get a better understanding. And then I'll come back and give you the particular words and expressions that I've highlighted, so you get the meaning and how we can use them.

02:05
So here's the first reading, just for the gist. "AI tools like ChatGPT are everywhere. It is the combination of computational power and availability of data that has led to a surge in AI technology. But the reason models such as ChatGPT and Bard have made such a spectacular splash is that they have hit our own homes, with around 100 million people currently using them. This has led to a very fraught public debate. It is predicted that a quarter of all jobs will be affected one way or another by AI, and some companies are holding back on recruitment to see which jobs can be automated. Fears about AI can move markets, as we saw yesterday when Pearson shares tumbled over concerns that AI would disrupt its business. And looming above the day-to-day debate are the sometimes apocalyptic warnings about the long-term dangers of AI technologies, often from loud and arguably authoritative voices belonging to executives and researchers who developed these technologies. So are people right to raise the spectre of apocalyptic AI-driven destruction? In my view, no. I agree that there are some sobering risks. But people are beginning to understand that these are socio-technical systems, that is, not just neutral tools but an inextricable bundle of code, data, subjective parameters, and people. AI's end users and the direction it develops aren't inevitable, and addressing the risks of AI isn't simply a question of stop or proceed. Companies are aware of these issues as they work on new systems. OpenAI, the company behind ChatGPT, sums them up pretty well. It recognises that while a lot has been done to root out racism and other forms of hate from ChatGPT's responses, manipulation and hallucination, which means producing content that is nonsensical or untruthful, essentially making stuff up, still happen. I'm confident that trial and error plus burgeoning research in this area will help."

04:34
Okay, so that's the article. Basically, the journalist is giving his opinion about the current surge of interest in artificial intelligence, in particular systems such as ChatGPT and Bard, which have been introduced in the last six months or so, and there are lots and lots of people around the world using them. But there are also frustrations, because the impact of ChatGPT and other artificial intelligence on current jobs is huge. And people, of course, are panicking and worried as to what the future will hold for jobs. So this particular journalist believes that balance is important: yes, these issues are there, but we shouldn't be overly concerned, and like everything else, we should learn to embrace these new developments. That happens to be my own opinion too: I think that's actually quite right, that it's senseless and difficult to push them aside and ban them, or try to ban them, because they will be there anyway. So we might as well embrace what is there and try to work for the good and the benefit of everyone.

05:41
Okay, so let me read it to you a second time, a little bit more slowly, okay, so that you can understand it. And then I'll give you the meaning of the words that I've highlighted.

05:51
"AI tools like ChatGPT are everywhere. It is the combination of computational power and availability of data that has led to a surge in AI technology. But the reason models such as ChatGPT and Bard have made such a spectacular splash is that they have hit our own homes, with around 100 million people currently using them. This has led to a very fraught public debate. It is predicted that a quarter of all jobs will be affected one way or another by AI, and some companies are holding back on recruitment to see which jobs can be automated. Fears about AI can move markets, as we saw yesterday when Pearson shares tumbled over concerns that AI would disrupt its business. And looming above the day-to-day debate are the sometimes apocalyptic warnings about the long-term dangers of AI technologies, often from loud and arguably authoritative voices belonging to executives and researchers who developed these technologies. So are people right to raise the spectre of apocalyptic AI-driven destruction? In my view, no. I agree that there are some sobering risks. But people are beginning to understand that these are socio-technical systems, that is, not just neutral tools but an inextricable bundle of code, data, subjective parameters, and people. AI's end users and the direction it develops aren't inevitable, and addressing the risks of AI isn't simply a question of stop or proceed. Companies are aware of these issues as they work on new systems. OpenAI, the company behind ChatGPT, sums them up pretty well. It recognises that while a lot has been done to root out racism and other forms of hate from ChatGPT's responses, manipulation and hallucination, which means producing content that is nonsensical or untruthful, essentially making stuff up, still happen. I'm confident that trial and error plus burgeoning research in this area will help."

08:19
Okay, so there's the article. As I explained before, it's just the argument that's going on these days: should we or should we not embrace, or try to ban, all of these new issues, this new media, these new platforms such as ChatGPT and Bard. Okay, so to the words that I've highlighted.

08:39
The first word is computational, computational. So just be careful with the pronunciation, okay? Computational. Computational means to do with calculating something. Okay. So when you calculate how much you need, it's a computation; you carried out a computation. And of course, all the algorithms behind these particular artificial intelligence platforms use computational power. Computational.

09:13
The second word is surge, and we have it here: "led to a surge in AI technology". A surge means a sudden increase. So if there's a power blackout in certain countries, it's often because there has been a surge in demand. In particularly cold winters, people switch on their heating; as a result, there's a surge in demand for electricity, and that might result in a blackout. So a surge is a sudden increase. Or if there's heavy rain and flooding in a river, there may be a surge of water, a sudden increase in the flow of the water that causes flooding. Surge. It's a noun, okay? And for pronunciation, take the word urge, and just put the s on the front of it: surge, a surge in demand. Okay.

10:06
The third word: fraught, fraught. Now think of the word taught: teach, taught. And now fraught. So get your pronunciation right. In the article, we have "this has led to a very fraught public debate". Fraught means very difficult, very niggly, very argumentative. It's not that everybody is in agreement with each other: people are arguing on one side, other people are arguing on the other side. So it is fraught; it means it is difficult. And we have this expression in English, fraught with difficulties: there are problems, so it's not an easy passage, it's not going to be smooth. A fraught public debate, or fraught public opinion.

10:56
Number four: to hold back on. In the article we read "companies are holding back on recruitment". So to hold back on something is to delay it. Okay, so companies are holding back on recruitment because they want to see whether they can get away without recruiting people and just use artificial intelligence; of course, that goes straight to the bottom line. So to hold back on something means to prevent it from being issued. So perhaps there's an article, and the publisher says, "No, let's hold back on publishing that article for a few days", meaning let's delay publishing that article for a few days. To hold back on.

11:36
And of course, if you do like this particular lesson, then please like the video, and if you can, subscribe to the channel, because it really, really helps.

11:47
To tumble over. Okay, now here we're talking about financial matters: the shares in a leading publishing company called Pearson. The shares "tumbled over concerns", so they tumbled, meaning they fell. If something tumbles, it falls; so the value of the shares fell over the concerns about artificial intelligence and its impact on the company's business. This is the reason why the shares fell: they tumbled over concerns that AI would, and here's the next word, disrupt their business.

12:23
So when something is disrupted, it doesn't stop completely, but it is made more difficult. So business is disrupted. For example, if there was a train strike, or a strike by doctors, then travelling on trains would be difficult because it would be disrupted: you're not sure whether your train is running or not. Or if you have to have an operation to get your knee replaced, or whatever it might be, and there are no doctors, then the schedule of operations will also be disrupted: some will happen, some will not. So when we disrupt something, we cause problems and difficulties; it doesn't stop entirely, but it's not easy to get things done. Not easy to travel during a train strike: there will be a lot of disruption. Not easy to have normal, everyday operations in the hospital when there's a doctors' strike: it will cause some disruption. Okay.

13:20
Now, the next word we have is looming. Okay? "And looming above the day-to-day debate." So when something looms, it's sort of hanging there, waiting to fall. For example, if you're looking out the window on a cold October or November day, there might often be some fog around; the fog looms over the streets, it sort of just hangs there and makes it very difficult for you to see through it. It looms. Or if something looms, it's just hanging there and you're waiting for something to happen. So if a factory or a business is going to close, or there are concerns or rumours that the company has financial difficulties, then this uncertainty looms over the city: whether people are going to lose their jobs. So looms: something that just hangs there. And in the article we read "and looming above the day-to-day debate", so something was in the air, hanging there over their heads, like a big weight waiting to fall. Okay, looms.

14:28
Number eight: the next word is sobering. In the article we read, "I agree that there are some sobering risks". Sober is when you're not affected by alcohol: you're sober. But sobering risks are risks that make you sit up and think. Okay, so in this article, they're talking about the possible apocalyptic destruction that artificial intelligence will cause. And the person writing the article says they don't agree with this, but there are some sobering risks, meaning there are some risks that you really need to think about; you can't just avoid them and put your head in the sand. So sobering risks: risks that are out there, that are real, and that you need to think about.

15:16
The next word: inextricable. Now here I just want to focus on the pronunciation first of all, okay: inextricable, inextricable, inextricable. Now, what does it mean? Inextricable means something that cannot be separated. Okay. In the article they say these are "not just neutral tools, but an inextricable bundle of code, data, subjective parameters, and people", meaning these things cannot be separated. It's all part of the same problem, the problem of artificial intelligence; all the issues are inextricable, they cannot be separated. Okay. In some countries we often talk about the inextricable link between state and church, where, you know, what the church does and what the state does are exactly the same. So, an inextricable link. Okay, so we have to be really careful about the pronunciation, and also to get the correct meaning. Inextricable.

16:32
The next word: inevitable. A little bit easier to pronounce: inevitable. And here in the article, they say that the use of artificial intelligence, at the end of the day, and the direction in which it develops, aren't inevitable. So inevitable means something that is going to happen no matter what. So, you know, the end of the day is always inevitable: we get up in the morning at seven or 8 am, and the surest thing is that we will go to bed when it's dark, at 10, 11, or 12. So the end of the day is inevitable; you can't stop it happening. In the article, they say this change isn't inevitable, meaning it doesn't have to be something that will happen: we can make changes. So inevitable is something that is always going to happen. As surely as night follows day, it's inevitable. We are born, we live, we die: it's inevitable. There are certain things in life that are inevitable: life, death, and taxes, as the famous quotation goes. People are born, people die, and people pay taxes. Inevitable, okay.

17:42
And the next expression we have is to root out something, okay. And here they say "to root out racism". So to root out something means to get rid of it, okay. So if you're digging in the garden and you dig up a big tree, you literally root it out, meaning you dig up all the roots that hold the tree in the ground. Or you've got a problem in your office, or you've got a problem in the school: you try to root out the cause of the problem. You know, if there's somebody who's been disruptive, you might tell them they need to find some other place to work. Or there's a pupil who's been disruptive: you might send him home from school for a few days or a few weeks, or tell the parents that this child is causing problems and that we have to root out bullying or racism within the school, and if the child doesn't improve their behaviour, then they will be asked to leave. So to root out something is to get rid of it: dig it up and throw it out. To root out.

18:45
And then number 12: to make up. The article talks about "making stuff up". We make up stories. Kids make up stories all the time: they make up little friends that don't really exist; they make up stories about why they didn't do their homework; they make up stories about why they don't want to go to school, because they've got a headache or a pain in the tummy. So people are constantly making up stuff, making up stories that are not really true, either to impress somebody or to get something that they want. To make up. In our article here, they're talking again about this whole ChatGPT issue, where they've tried to root out racism, and also this idea of making up stories that are not really true, but which people for some strange reason believe. Okay, so they want to root out this making stuff up, or making up stuff.

19:37
And then finally, we have another word that's quite difficult on the tongue for pronunciation: burgeoning, burgeoning. Now make sure you've got the g sound: burgeoning. Yeah. Okay. So it's like, again, the word urge, with a g: urge, and then burge, burgeoning. Okay, now burgeoning means growing quickly, or developing, or expanding. So here they're saying, "I am confident that trial and error plus burgeoning research", lots of increasing or growing research, "in this area will help". Okay, so we might have a burgeoning economy: an economy that started off small, but year by year is getting bigger and bigger and bigger. Burgeoning. So let me give you the pronunciation again: burgeoning, burgeoning, burgeoning. Okay, so a burgeoning economy, burgeoning research.

20:41
Okay, so there are our words, 13 in total. Let me run through them again quickly: computational, surge, fraught, to hold back on, to tumble over, to disrupt, to loom or looming, sobering, inextricable, inevitable, to root out, to make up stuff, and then finally, burgeoning.

21:16
Okay, so hopefully you've enjoyed that particular article: AI and all this issue about ChatGPT. ChatGPT is really top of the agenda these days: everybody is talking about it, you know, everybody is looking out for it, everybody's practising with it. We tried it ourselves here, just to see what we could get, and the results are quite, quite amazing. Okay, so if there's anything else that you require in relation to this article, or the words or the phrases or the expressions, as I said at the beginning, give me a call or send me an email, and we'll try to help you out. Okay, this is Harry, as always, thanking you for listening and for watching. And remember: join me for the next lesson.