What Makes Us Human in the Age of AI? A Psychologist and a Technologist Answer | TED Intersections

45,846 views

2024-09-10 ・ TED



Brian S. Lowery: If you could produce a more immersive social experience, now everybody's having their individual social experiences. What I worry about with AI, with VR, with all these kinds of technologies that are expanding, is that we all inhabit our own singular world. That is more frightening to me than the idea that we all converge on the same experience.

[Intersections]

[Brian S. Lowery: Social psychologist]

[Kylan Gibbs: Technologist]

BSL: So what makes a human a human?

(Laughter)

Kylan Gibbs: It's one of those questions, isn't it? There are two ways I would look at it: one from my personal life and one from my work life. One thing that's interesting is that there have been points when I've been spending four to five hours a day interacting with AI. And the interesting thing is what you notice. When you first start interacting with it, you think, oh, this is really realistic, the same way people who first had black-and-white TV thought, wow, this is like real life. But then, as you get used to it, you start to realize the things that make it less authentic. Something I realized with AI is that there are certain ways we interact that are just more spontaneous. There's something about the predictability of AI that teaches you about the spontaneity of being human: the ways we communicate, the naturalness, the contextual awareness, these little things that all add up. That's on the technical side.

On the other side, there's something in the shared experience of being human that I think differentiates it from other animals' experience. You have a traumatic moment in your life, and then you start to resonate with other people's. I feel like every time I've had something nearly catastrophic, it opened up a new door of empathy, and you start to be like, oh man, that really hurt. Or when you cry about something, you start to remember: this is what usually happens to me. I start crying about something, and then I think about all the things that I did for my mom or my grandma and the things that they felt. There's something in that kind of shared experience: we have these things that differentiate us, we're all different people, but there's something about those common feelings that it all arises from. Anyway, that's one thought.

BSL: I love that answer, and I want to say that you're not normal in that way. Here's why I don't think you're normal: people anthropomorphize anything. It doesn't even have to be that good, right? It doesn't have to be anywhere near as good as AI for people to treat it like it has some human character; people treat their cars like they're people. So I'm surprised that when you interact with it a lot, it feels less real to you.

KG: There's something about resolution. It's a way of seeing the world, and you keep increasing it; it's the same reason you can't look at TV that's not 4K now. Someone who worked on early VR was saying that the interesting thing about it was that when you stepped out of it, you'd think, oh, the real world is actually amazing, and it's really hard to recreate that in technology. I think the same is true for AI. Maybe for some people, when they interact with it, what they see is some commonality. But what I always notice is that this is very different from the conversations I have with my parents. Even when it says something similar, there's something off. It's those little things that I think will add up over time as people use AI more; they'll start to recognize them, even though I can't point at them exactly. What are those nuances, though, that make us human?

BSL: You just know it when you see it, and it's missing in AI. That's also interesting, because what you just suggested is that the more people use AI, the less real it's going to feel to them. Do you think that's what's going to happen?

KG: There's probably another case. It's the same way that your Instagram and Facebook feed isn't a real conversation. There are certainly people, kids especially, who look at those kinds of feeds and feel like, oh, that's a real representation of my friends or my favorite celebrities or whatever, when actually, I think, it's completely -- I shouldn't say completely -- largely false. And I do think something similar will happen with AI, where some people for sure will almost be captured. They will believe that it's the most realistic thing that exists and then start to compare people to that. But I think that if you have that degree of empathy, you'll be like, oh, there's something off here. It's the same way that even on a Zoom call there's something off. It's hard to pin down, but I'm not picking up all the signals, and it's the very little nuances that you probably just subtly pick up as well.

BSL: So you don't think the technology is going to advance quickly enough that it'll overcome those little things fast enough to capture all of us? You're not worried about that?

KG: I am definitely worried about that, mainly because I think for most people it's easy, right? The thing about AI, at least if you think about the chatbot styles, is that it's so beholden to what we want. And that's what a lot of people want, or need, in their life: a sense of control. AI gives you the sense that you can control this anthropomorphic thing. And honestly, one of my fears is that people get used to that. What does it mean when I get used to interacting with something that is beholden only to my views and interests, and then I go and interact with a human who has their own interests?

BSL: Do you think people want control? I think people want to be controlled.

KG: Maybe it's a form of control, though. To be controlled is a kind of predictability, I guess.

BSL: Yeah, people want the world to make sense, don't you think?

KG: Yes, yes, I think they also want the world to be ... There's something about preferring predictability over optimality. I've even felt it when you have a mental health moment, or you have friends who have mental health moments. The thing I've always seen as interesting is that your brain and your mind prefer to stay in a state that's familiar, even if it's worse. So if you're in a depressed state, you'd almost rather stick with that than break outside of it, right? There's something about things that are familiar rather than actually better. And, I don't know, there's a bias towards almost identifying with those kinds of states.

BSL: Yeah, there's research on this. One, it's called the status quo bias: people like things that are already there. And two, people like to have what they believe about themselves affirmed, if they really believe it, even if it's not positive. So that is true. So what does that look like in AI?

(Laughter)

KG: It's definitely interesting to me that people seem to love it: you talk to a lot of these things and they sound like computers, they sound like AI, but people love it because it's familiar, it's controllable. If you start to add lots of personalities and these kinds of things, it makes sense in context, but I found it interesting that as we started developing these AI systems that people interact with, they all have this kind of similar voice. It's a very "AI voice." You can tell that you're talking to an AI. Maybe that's intentional. But there is something there, where I think people have a preference for getting what they want from humans from humans, and from AI from AI. But that could blend. There are already lots of people in certain demographics who spend a lot of time on the internet and start to identify that as their favorite form of interacting with people. So I do think there's a reality where, as we move into the future, there will be people who bias towards that for whatever reason, whether it's the comfort of knowing that someone's not judging them or the format that it speaks to you with, and that will bias them towards preferring those types of interactions.

But on the other hand, I always think there'll be a distribution of people, and you'll have some people who really don't like it. Like I was saying, the more I interact with it now, the more I find it almost painful, because I pick up on so many of these issues that at a certain point I can't even use it. And you'd think, you know, I'm in the AI space, and I write 20-page docs. I don't need AI for a single bit of it, because it does remove that voice. I do also wonder, though, as people interact with it more, whether they'll either identify the differences or start to conform to the things that they're trained on with AI. It's the same as if you interact with your partner, for example, right? You start to be biased by the communication because you're talking so much.

BSL: You mean they're training you?

KG: They're training you. Your partner probably has a preferred way of communicating, and you get used to it, these kinds of things. So I do wonder whether, as people interact with AI more, they'll kind of all converge. That's actually probably one of my biggest fears about AI.

BSL: I'm concerned about the exact opposite. I'm going to shift a little bit. When we talk about AI, what you're describing is usually dyadic interactions: I'm interacting with one AI, one agent. But really, what people do is interact with multiple people, right? You interact in some community or some small-group setting. And I'm surprised that there's not more of that in AI. You're also in gaming. I don't really game, but my understanding is that a lot of gaming is about connecting with people; it's a community kind of experience. So there are two things. One, I'm really surprised that AI seems so focused on these one-on-one interactions as opposed to multiple AI agents creating a more immersive social experience.

KG: I love that you brought it up, because that's really what we do.

BSL: Good, so that's one. The other thing, the reason I worry less about convergence and more about divergence, is that if you could produce a more immersive social experience, now everybody's having their individual social experiences. What I worry about with AI, with VR, with all these kinds of technologies that are expanding what we can control about our social environment and about our physical perceptions in the environment, is that we all inhabit our own singular world. That is more frightening to me than the idea that we all converge on the same experience.

KG: Well, my mom's a grade-seven teacher, and one thing she said that's really interesting is that if you went back 20 years, everybody was watching the same TV shows, and they'd come to class and all be talking about them. Now everybody watches their own favorite YouTube channel. It's the siloing of reality. When we work with games, for example, one of the interesting things is that as people play through games, it's basically the same thing. You could have a million people go through a game, and there are some differences, but you're largely going to hit the same points. So one of the things we think about is, what does that mean for agency? The way we interact with media changes the way we feel agency in the world. If we see inert media that we can't change, it also gives you this sense that you can't change the world.

So, to your point, one of the things we want to do with games is figure out how to make it so that each person can actually influence the outcome, and as you add more agents into that, you see, OK, I interact with this one and it has a cascade effect. I love it. Even in some of the stuff we've done here, the magic actually happens when you do have those agents interacting, because then you're not just seeing that one-to-one interaction but the emergent effect of those interactions. And another thing: if the main controls you have on a computer are point-and-click or, in games, jump and shoot, we're trying to see what it means if social skills, interaction like this, are the way you actually interact with the games, the technology and the agents. That's a very different way of conversing than button presses, and I think it changes the way you sense agency in the world. Because I think the way most people change the world is by speaking and interacting with other humans, not by pressing buttons. I mean, arguably it's the case in some.

BSL: You know, the other thing that's interesting to me is that I don't think people have an understanding of the systems they exist in, right? People think of themselves as existing in individual relationships, and they have a harder time understanding system effects: I affect you, which affects your partner, which affects your partner's parents, right? That is a harder thing to grasp. But I think there's something fundamentally human about that. You are also impacted by all these different things going on. Like, we had a person come and put on our makeup, and now I'm looking beautiful and it's affecting everybody else around me.

(Laughter)

KG: It's glowing.

BSL: Exactly. How does that fit in? I just haven't heard people talk about it in that way, which is surprising to me, because that, I think, is what fundamentally makes humans human: interaction and complex social situations.

KG: And these nested systems all affect each other, right? You think your small activity doesn't affect whatever higher-level political stuff, but it all aggregates, and it's all interlinked as well. The AI thing is interesting too, because I often hear people talk about it as this evolution: you go from single cells to monkeys to humans to AI. Whereas you could flip it, where it's more like cells to organs to humans to AI. It's an overarching system, and because it's trained on us and we do these things, we actually influence that system; now that people are interacting with it, there's this interplay. That's interesting too: AI isn't this singular entity. It's more of an institution, or a system, almost, that is kind of overarching everything else.

BSL: And it's also weird because it's like our vision of who we are. When we talk about AGI, we don't even know what intelligence is, and we think we're going to produce something that has it. It's just an interesting situation where we talk about it, as you said, as natural evolution, but in fact we're creating it, and it's not clear that we know exactly what it is we're creating.

KG: I actually think one of the most interesting things is that we're starting to work on AI at a point where I still think we're figuring out ourselves. Neuroscience is very early in its days, and yet we're creating things that are based on our own intelligence, and we don't really understand even what's going on inside. So, to your point on what the effects are: we don't really know yet. Every year a new paper comes out and changes how people think about child rearing, how to bring up a child well, all those kinds of things. And now we're creating systems that will kind of be overarching other humans. What does that mean? I don't know. I happen to be in AI, and we happen to be at this point in time, but if we could pause for a second, I think it would be good: another few centuries of figuring out what we are and understanding that a little better before we created something in our image. Because it's kind of like taking a photograph and painting from it, right? You're not actually getting the person and painting them. There's something about the life that's missing there. So I do agree. I think we're honestly kind of premature, but I guess that's just how life goes: things come out when they naturally should.

BSL: So, you work in AI. What's the most exciting thing for you in AI? What's your hope for it?

KG: I think it's kind of back to that agency question. This is specific to my domain, but you read a news story, you read a book, you watch a movie, you watch a TV show. There's something about the communication we're having right now where I'm adapting to the things that you say, to your body language, to the people in the room we have here, all these things. And so when you have AI able to help that adaptation, you have that agency in the things that you interact with. I don't necessarily believe in fully personalized media, because I think we need a shared social context; the reason we watch a movie or a TV show is so that we can all talk about it, right? But there is something about the fact that we're all interacting with these inert objects. The way technology feels: you're on a screen, it doesn't change. You're in a movie, it doesn't change. You're watching Netflix, it doesn't change depending on what you do. And I think that changes the way we see our own agency in the world. So I hope one of the things AI does is open this door to agency in the way we interact with media and technology in general, such that we do notice the effect we have on systems. Because even if it's small, where I take a certain action and it completely changes an app or changes an experience, maybe that helps us learn that we have an effect on the social systems we're part of as well. So something to that effect.

BSL: So you want to make our agency more transparent. And do you think it does that? Because right now, I'm not sure it doesn't actually obfuscate our agency.

KG: No, I don't necessarily know. I agree. This is why media and games is the domain I mainly focus on, and I think it's interesting, especially because young people use it a lot. I've heard very veteran game developers say that how people interact with games kind of trains kids in how they should interact with the world. So even people who tend to be professional players in different games have different psychological profiles, because they bias towards certain ways of interacting with and seeing the world. The same way, I guess, if you trained in something academic, you have a different way of viewing it. So if we make games and media in a way where you feel that sort of social impact as well, maybe it opens the door to another realm of understanding. But yeah, I agree that a lot of the systems we have today maybe also give you a false sense of agency, like the AI systems we were talking about, where you feel like you're controlling this thing, whereas maybe it's also biasing you, and "controlling," having some influence over you as well.

BSL: So where do you think things are going? There's obviously a huge race over AI among some very, very well-resourced organizations, right? Microsoft and Google are maybe the biggest. And they are very quickly going to need to monetize it, because that's what those companies are designed to do. What do you foresee? I just look at social media as an example. When it first came out, people were really excited about it as a new way to connect with people and a way to stay connected to people you couldn't otherwise, to catch up with people you'd lost contact with, that sort of thing. And it changed into something else, in large part because of the way it was monetized: going to ads, focusing on attention. What's the trajectory of AI?

KG: You know, I'm taking guesses.

BSL: Yeah, of course, we're all taking guesses. I won't hold you to it, don't worry.

KG: I think the reality is what we were mentioning before about the challenges of scale. When you invest tens of billions of dollars in something, you need scale. The way AI is developed, and specifically even the types of models we're using, the economic model of it, is effectively: the more compute you have, the better models you can create; the better the models, the more usage you get; the more usage you get, the better. So it has, honestly, a somewhat monopolistic tendency, I think, in the way that even the architectures and the economics of it work. So I think it's almost inevitable that whatever AI systems are produced by these large organizations will be pushed to scale as quickly as possible. There are some pluses in that: sure, they're building in feedback loops, people can give their input, it biases the model. But at the same time, what does it mean when a single model is fit to a billion people, right? That's kind of what I meant about the converging effect: what happens when we are pushed to produce something that fits a billion people? There's a lot of diversity in there. We create these scaled systems that are trying to fit the whole planet. Does that work? So we're going to go through this phase where, yeah, you're going to have a billion people interacting with the same AI, and I don't know what the effect of that will be.

Even the monetization models now are kind of "you pay to use these things," which is maybe OK, but certainly ads will probably enter the equation. Also, what happens when you want attention and AI is much better at that than even the algorithms you have on YouTube and Instagram, and you can start to capture that attention? So I certainly think it's going to be an interesting little bit here now, as we see these huge organizations spending tens of billions of dollars, the choices they make to then monetize that, and what that means for how AI proliferates. I know a lot of the folks in those organizations, and their interests have never been in that domain. But at the same time, you're beholden to stock market interests and whatever it is, and then what happens? It shifts things, right? We're in a capitalist world, and that's ultimately what will change the incentives. So yeah, it's interesting.

I am interested, though: coming from your background, you have a very different stance on it. All this AI stuff is interesting, but, almost to your first question, what makes us human? As people, with technology in general and specifically with AI, where can people find the meaning in their life, the values that they find true? And how will that change, do you think, with the advent of these new technologies? Or how have you seen it change with the technologies we've already seen come to life?

BSL: This is going to sound like a funny answer. I think people are too worked up about technology, personally. We had this conversation: people have been using technology since we've been human. Paper was a huge invention. We talked about this. The printing press, huge invention. The computer, huge invention. The internet, huge invention. AI, great, another huge invention. And through all of that, I think what you see in a lot of the biggest technologies is the desire to connect with other human beings. I think what fundamentally makes us human is our connection to other human beings, our ability to engage with other human beings; consciousness and all these other things, I think, are necessary preconditions. But really, what makes us human is connection with other humans. And that is incredibly complex, and I don't think we're close, in terms of technology, to replicating it. Even what you described, you have this feeling of: this isn't right, this is off. And even if you felt like it was right, it would still be off in ways you didn't quite get. I don't think we're close.

Though because it's designed to pull our attention away from other things, I think it impedes our ability to do what we all kind of want to do, which is interact with each other. And it might change the way we interact with each other in a way that might feel less fulfilling. I think you see some of that in social interactions now. Some of that, I mean, recently maybe COVID was an issue. But people feel less comfortable in face-to-face interactions. Take dating: there's no serendipity of hanging out and you meet who you meet. You're using an algorithm to try to present options to you. That's a very different world. And that's prior to AI. I don't know how AI is going to further influence that.

KG: And I guess, just on the general point, how core do you think the need for connection is? In the sense that I've heard some parents say that, through COVID, their kids went through a major change, these regressions, different habits and those kinds of things, because they weren't connecting with people, and it's taken years to overcome that. So I do also wonder, whether through technology or things like COVID or just circumstances, could we lose that need for connection? Or even if we need it, we might lose the desire for it, and feel emotional trauma as a result, but still not go for it. How core do you think it is? And do you think we're safe in that kind of need?

BSL: So I'm going to give you the most extreme answer, which I think is the true one: you will cease to be human if you don't have a need for human connection. I think you will be a physical person, but you will literally break down as a human being. This is in part why social isolation, or solitary confinement, is considered inhumane. Because people literally break down; you will start to have hallucinations. You will break down mentally and physically absent human connection. So I don't think there's any possibility, in my mind, of losing the need. You may get less than you need, and that will have negative consequences for you. But I'm not worried about people not wanting to be around people.

KG: Are you worried that things like social media or AI could give you the sense that you're fulfilling that need without actually fulfilling it? It's totally true, right? Solitary confinement is a great example, because we need connection; without it we absolutely lose our sanity as well as our well-being. But maybe technology can manufacture the sense that we're fulfilling it, and then over time we see these mental health crises evolve as a result?

BSL: Yeah, that's a good question. I think it's unlikely, but I don't know. Honestly, I don't know. I'll talk about meaning for a second. I think of meaning as fundamentally tied to our need for connection to other people. I think sometimes we confuse our need for meaning with a desire for personal achievement: we chase personal achievement when what we're trying to do is generate meaning. So I think we can be confused, and we can have those needs displaced into less productive routes. But I don't think it's going away. Though, you know, I don't know that it's the healthiest.

KG: No, I'm totally aligned. Thank you, Brian, that was an awesome conversation.

BSL: It was great to talk to you. Really fantastic and super informative. Thank you.