What Makes Us Human in the Age of AI? A Psychologist and a Technologist Answer | TED Intersections

56,033 views ・ 2024-09-10

TED



Brian S. Lowery: If you could produce a more immersive social experience, now everybody's having their individual social experiences. Like now, what I worry about with AI, with VR, with all these kinds of technologies that are expanding, is that we all inhabit our own singular world. That is more frightening to me than, like, you know, that we all converged in the same experience.

[Intersections]

[Brian S. Lowery: Social psychologist]

[Kylan Gibbs: Technologist]

BSL: So what makes a human a human?

(Laughter)

Kylan Gibbs: It's one of those questions, isn't it? I mean, there's, like, two ways I would look at it. One is from my personal life and one is from my work life. One thing that's interesting is, like, there's been points when I've been spending four to five hours a day interacting with AI. And the interesting thing that happens in that is the things that you notice when you first start interacting with it: oh, this is really realistic. Similarly, when people first had black and white TV, they're like, wow, this is like real life. But then as you get used to it, you start to kind of realize the things that make it less authentic. And I think something that I realized with AI is there's certain ways that we interact that are just more spontaneous. There's something about the predictability of AI that teaches you about the spontaneity of being human. The ways they communicate, the naturalness, the contextual awareness. These little things that all add up. That's on the technical side.

On the other side, there's something of just the shared experience of being human that actually, I think, differentiates it from other animals' experience. You know, you have a traumatic moment in your life, and then you start to resonate with other people's. I feel like every time I've had something nearly catastrophic, it opened up a new door of empathy, and then you start to be like, oh man, that really hurt, you know? Or like when you cry about something, you're like, wow. And then you start to remember, like, this is what usually happens to me. I start crying about something, and then I think about all the things that I did for my mom or my grandma and the things that they felt. And I feel like there's something in that kind of, like, shared experience, where we have these things that differentiate us, we're all different people. But there's something about those common feelings that it all kind of arises from that. Anyway, that's one thought.

BSL: I love that answer, and I want to say that you're not normal in that way. Here's why I don't think you're normal. People anthropomorphize anything. It doesn't have to even be that good, right? It doesn't have to be anywhere near as good an AI for people to treat it like it has some human character; people treat their cars like they're people. So I'm surprised that when you interact with it a lot, it feels less real to you.

KG: There's something about resolution. It's like the way of seeing the world, and you kind of increase this. It's like the same reason you can't look at TV that's not 4K now. And I think it was someone who worked on early VR who was saying, you know, the interesting thing about it was, when you stepped out of it, you're like, oh, the real world is actually amazing. And it's actually really hard to recreate that in technology. And I think the same is true for AI. It's like, maybe for some people, when they interact with it, the thing that they see is some commonality. But the thing that I always notice is, this is very different from the conversations I have with my parents. Even when it says something similar, there's something off. It's those little things that, I think, over time will add up as people use AI more: they'll start to recognize, and I can't even point at them, like, what are those nuances, though, that make us human?

BSL: You just know it when you see it, and you're like, and it's missing in AI. I mean, that's also interesting, because what you just suggested is that the more people use AI, the less real it's going to feel to people. Do you think that's what's going to happen?

KG: I mean, there's probably another case, you know. It's the same way as, you know, your Instagram and Facebook feed isn't a real conversation. There are certainly people, kids especially, who would look at those kinds of feeds and feel like, oh, that's a real representation of my friends or my favorite celebrities or whatever I actually think, when it's like completely -- I shouldn't say completely -- largely false. And I do think something similar will happen with AI, where some people for sure will almost be captured. And they will believe that that's the most realistic thing that exists and then start to compare people to that. But I think that, you know, if you have that degree of empathy, you'll be like, oh, there's something off here. It's the same way even if you use a Zoom call, there's something off. It's hard to pick it up. But like, I'm not picking up all the signals, and it's the very little nuances that you probably just subtly pick up as well.

BSL: So you don't think that the technology is going to advance quickly enough, where it'll overcome those little things fast enough to capture all of us? You're not worried about that?

KG: I am definitely worried about that. Mainly because I think for most people it's easy, right? So the thing about AI is it's so beholden, at least if you think about, like, the chatbot styles, it's so beholden to what we want. And that's kind of like what people, I think, a lot of people want in their life, or they need: the sense of control. And the AI gives you the sense that, like, I can control this anthropomorphic thing. And honestly, one of my fears is that people get used to that. And what does it mean when I get used to interacting with something that is beholden to only my views and interests, and then I go and interact with a human who has their own interests?

BSL: Do you think people want control? I think people want to be controlled.

KG: Maybe it's a form of control, though. To be controlled is a predictability, I guess.

BSL: Yeah, people want the world to make sense, don't you think?

KG: Yes, yes, I think they also want the world to be ... There's something about, like, preferring predictability over optimality. So, like, I've even felt it when you have, you know, a mental health moment, and you have friends who have mental health moments. The thing that I've always seen as interesting is your brain and your mind prefer to stay in a state that's familiar, even if it's worse. So if you're in, like, a depressed state, you almost would rather stick in that than break outside of it, right? So there's something about things that are familiar rather than actually better. And I don't know, there's a bias towards, you know, you almost identifying then with those kinds of states.

BSL: Yeah, there's research on this. One, it's called the status quo bias. People like things that are already there. And two, people like to have what they believe about themselves affirmed, if they really believe it, even if it's not positive. So that is true. So, like, what does that look like in AI?

(Laughter)

KG: I mean, it's definitely interesting to me that people seem to love, like -- you talk to a lot of these things and they sound like computers, and they sound like AI, but people love it because it's kind of familiar, it's controllable. If you start to add lots of personalities and these kinds of things, it makes sense in context, but I found it interesting that as we started developing these AI systems that people interact with, they all have this kind of similar voice. And it's a very "AI voice." You can kind of tell that you're talking to an AI. Maybe that's intentional. But there is something there, where, like, I think people have a preference to getting what they want from humans, from humans, and from AI, from AI. But that could blend; there's already lots of, you know, people in certain demographics who spend a lot of time on the internet, and they start to identify, that's their favorite form of interacting with people. And so I do think that there's a reality where, as we move into the future, there will be people who bias towards that for whatever reasons. Whether it's the comfort of knowing that someone's not judging them, whether it's, like, the format that it speaks to you with, that will kind of bias towards preferring those types of interactions.

But on the other hand, I always think there'll be a distribution of people, and you'll have some people who really don't like it. And, you know, like I was saying, the more that I interact with it now, I find it almost painful, because I just pick up on so many of these issues that, you're like, I can't even use it at a certain point. And, you know, you'd think that, like, you know, I'm in the AI space, and I write 20-page docs. I don't need AI for a single bit of it, because it does remove that voice. And I do also wonder, though, as people interact with it more, will they either identify the differences or start to conform to the things that they're trained with AI. It's the same as if you interact with your partner, for example, right? You start to be biased by the communication because you're talking so much.

BSL: You mean they're training you?

KG: They're training you. Your partner is probably, like, you know, they have a preferred way of communicating. You get used to it, these kinds of things. So I do wonder if, as people interact with AI more, they'll kind of all converge. That's probably one of my biggest fears of AI, actually.

BSL: I'm concerned about the exact opposite. I'm going to shift a little bit. So when we talk about AI, what you're describing, it's usually, like, dyadic interactions. Like, I'm interacting with one AI, one agent. But really what people do is interact with multiple people, right? You interact in some community or some small group setting. And I'm surprised that there's not more of that in AI. So you're also in gaming. I don't really game, but my understanding is that a lot of the gaming is about connecting with the people, and it's a community kind of experience. So there's two things. One, I'm really surprised that AI seems so focused on these, like, one-on-one interactions as opposed to, like, multiple AI agents creating a more immersive social experience.

KG: I love that you brought it up, because that's really what we do.

BSL: Good, so that's one. Other thing, like, the reason I worry less about convergence and more about divergence is, if you could produce a more immersive social experience, now everybody's having their individual social experiences. Like now, what I worry about with AI, with VR, with all these kinds of technologies that are expanding what we can control about our social environment, about our physical perceptions in the environment, is that we all inhabit our own singular world. That is more frightening to me than, like, you know, that we all converged in the same experience.

KG: Well, my mom's a grade-seven teacher, and the one thing that she said that is really interesting is, if you went back like 20 years, everybody was watching the same TV shows, and they'd come to class and they'd all be talking about it. And now everybody watches their own favorite YouTube channel. And it's the siloing of reality.

Like, what we do is, when we work with games, for example, one of the interesting things is, like, as people play through games, it's basically the same thing. You could have a million people go through a game, and there are some differences, but you're largely going to hit the same points. And so one of the things that we think about is, what does that mean for agency? The way we interact with media changes the way that we feel agency in the world. So if we see inert media that we can't change, it also gives you this sense that you can't change the world. And so, to your point, one of the things that we want to do with games is, how do you make it so that each person can actually influence that outcome? And as you add more agents into that, you see, OK, I interact with this one and it has a cascade effect. I love it. I mean, even in some of the stuff we've done here, the magic actually happens when you do have those agents interacting, because then you're also not just seeing, like, that one-to-one interaction but the emergent effect of basically that interaction.

And another thing is, if your main controls that you have in the computer are, like, point-and-click or, in games, jump and shoot, we're trying to see, like, what does it mean if social skills, like interaction like this, are the ways that you actually interact with the games, the technology and the agents. That's a very different way of conversing or of dialogue than button presses. And I think that changes the way that you sense agents in the world. Because I think the way that most people change the world is by speaking and interacting with other humans, not by pressing buttons. I mean, arguably it's the case in some.

BSL: You know, the other thing that's interesting to me is I don't think people have an understanding of the systems they exist in, right? People think about themselves as existing in, like, individual relationships, and they have a harder time understanding system effects: like, I affect you, which affects your partner, which affects your partner's parents, right? That is a harder thing to grasp. But I think there's something that's fundamentally human about that. Like, you are also impacted by all these different things going on. Like, we had the person come and put on our makeup, and now I'm looking beautiful and it's affecting everybody else around me.

(Laughter)

KG: It's glowing.

BSL: Exactly. How does that fit in? I just haven't heard people talk about it in that way, which is surprising to me, because that, I think, is what fundamentally makes humans human. It's interaction and complex social situations.

KG: And these, like, nested systems. And like, they all affect each other, right? You think that your small activity doesn't affect whatever higher-level political stuff, but it's all aggregate. And it's all interlinking as well. I mean, the AI thing is interesting too, because I often hear people talk about it as, like, this evolution. You have, like, you know, singular cells to monkeys to humans to AI. Whereas, like, you could flip it, where it's more like, you know, cells to organs to human to AI. It's a system overarching everything, and just because it's trained on us and we do these things, we actually influence that system; now that people are interacting with it, it has this interplay. And that's interesting too, when it becomes, like, you know -- AI isn't this singular entity. It is more of an institution or a system, almost, that is kind of overarching everything else.

BSL: And it's also weird because it's like our vision of who we are. So when we talk about AGI, it's like we don't even know what intelligence is, and we think we're going to produce something that has it. It's just an interesting situation where we talk about it, as you said, as natural evolution, but in fact we're creating it, and it's not clear that we know exactly what it is we're creating.

KG: I actually think that one of the most interesting things is that we're starting to work on AI at a point where, like, I still think we're figuring out, you know, ourselves. Neuroscience is very early on in its days, and yet we're creating things that are, like, based on our own intelligence, and we don't really understand even what's going on inside. And so, to your point on what are the effects: we don't really know yet. Every year a new paper comes out and changes how people think about child rearing. Like how to bring up a child well, all those kinds of things. And now we're creating systems that will, you know, kind of be overarching other humans. What does that mean? I don't know. I do actually think -- I happen to be in AI, we happen to be at this point in time -- but if we could pause for a second, I think it would be good, another few centuries of figuring out what we are and understanding that a little bit better before we created something that was in our image. Because we're kind of just, you know -- it's kind of like taking a photograph and, like, painting it, right? You're not actually getting the person and painting it. There's something about the life that's missing there. So I do agree. I think that we're honestly kind of premature, but I think it's just how, I guess, you know, life goes: things come out when they naturally should.

BSL: So, I mean, you work in AI, so what's the most exciting thing for you in AI? What's your hope for it?

KG: I think it's kind of back to that agency question. So, I mean, you know, you read a news story, you read a book, you watch a movie, you watch a TV show -- this is specific to, like, my domain -- like, there's something about the communication that we're having right now where, like, I'm adapting to the things that you say, to your body language, all those kinds of things, right? To, like, the people in the room we have here, all these things. And so when you have ... AI able to, sort of, help that adaptation, so that you have that agency in the things that you interact with. I don't necessarily believe in fully personalized media, because I think we need, like, a shared social context. The reason we watch a movie or a TV show is because then we can all talk about it, right? But there is something about the fact that we're all interacting with these internet objects. And so the way that technology feels: you're on a screen, it doesn't change. You're in a movie, it doesn't change. You're watching Netflix, it doesn't change depending on what you do. And I think that changes the way we see our own agency in the world. And so I hope with AI that one of the things that it does is kind of open this door to agency in the way that we interact with media and technology in general, such that we do notice that effect that you have on systems. Because even if it's small, right, where I take a certain action and it completely changes an app or it changes an experience, maybe that helps us learn that we have an effect in the social systems as well that we're affecting. So, something to that effect.

BSL: So you want to make our agency more transparent. And do you think it does that? Because right now, I'm not sure it doesn't obfuscate our agency.

KG: No, I don't necessarily know. I agree. I mean, this is why I think also media and games is, you know, the domain I mainly focus on. And I think it's interesting, especially because young people use it a lot. And so I've heard very veteran game developers say how people interact with games kind of trains kids how they should interact with the world. So even people who tend to be professional players in different games have different psychological profiles, because they bias towards certain ways of interacting and seeing the world. The same way, I guess, if you trained in something academic, right, you have a different way of viewing it. And so if we make games and media in a way that you feel that sort of social impact as well, maybe, maybe it opens the door to, like, another realm of understanding. But, yeah, I agree that, like, a lot of the systems that we have today give you maybe a false sense also of agency, where, like we were talking about with the AI systems, you actually feel like you're controlling this thing, whereas maybe it's also biasing, you know, and "controlling," having some influence over you as well.

BSL: So where do you think things are going? So there's obviously a huge race among some very, very well-resourced organizations over AI, right? You know, Microsoft, Google, I mean, are the biggest, maybe. And they are very quickly going to need to monetize it, because this is what those companies are designed to do. Like, what do you foresee? Because I just look at social media as an example. I think, at the time when it first came out, people were really excited about it as a new way to connect with people, and a way to stay connected to people, you know, you couldn't otherwise; catch up with people you lost contact with, that sort of thing. And it changed into something else. In large part because of the way it was monetized, like going to ads, focus on attention. What's the trajectory of AI?

KG: You know, I'm taking guesses.

BSL: Yeah, of course, we're all taking guesses. I won't hold you to it, don't worry.

KG: I think that the reality is -- we were kind of mentioning before about the challenges of scale. And when you invest tens of billions of dollars in something, you need scale. And I think that's one of -- the way that AI is developed, and specifically even the types of models we're using, the economic model of it, which is effectively: the more compute you have, the better models you can create. The better models you can create, the more usage you get. The more usage you get, the better. So it has somewhat of a, honestly, like, monopolistic tendency, I think, in the way that actually even, like, the architectures and the economy of it work. And so I think it's almost inevitable that whatever AI systems are produced by these large organizations will be pushed to scale as quickly as possible. And there's some pluses in that, where, like, you know, sure, they're building in feedback loops, people can give their input, it biases it. But also, at the same time, what does it mean when a single model is fit to a billion people, right? So that's kind of what I meant about the converging effect: what happens when we are pushed to produce something that fits to a billion people? There's a lot of diversity in there. And so, you know, we create these scaled systems that are fitting with the whole, like, trying to fit the whole planet. Does that work?

And so, I think, you know, we're going to go through this phase where, like, yeah, you're going to have a billion people interacting with the same AI. And I don't know what the effect of that will be. Even the monetization models now are kind of, you pay to use these kinds of things, which are maybe OK, but certainly ads will probably enter the equation. Also, what happens when you want attention, and AI is much better at that than the algorithms you even have on YouTube and Instagram, and you can start to capture that attention? And so I certainly think it's going to be an interesting little bit here now, as we see these huge organizations spending tens of billions of dollars, and the choices that they make to then monetize that, and what that means for how AI proliferates. I know a lot of the folks in the organizations, and their interests have never been in that domain. But at the same time, you're beholden, you know, to stock market interests and whatever it is, then what happens? It shifts it, right? We're in a capitalist world. And that's kind of, like, you know, what ultimately will change the incentives. So, yeah, it's interesting.

I mean, I am interested in -- coming from your background, you have a very different stance on it. But, you know, all this AI stuff is interesting. But, you know, when you think, almost to your first question, what makes us human and, like, as people, just technology in general and specifically with AI, like, where can people find the meaning in their life, the values that they find true? And how will that change, do you think, I guess, with, like, the advent of these new technologies? Or how have you seen it change with the technologies we've already seen come to life?

BSL: This is going to sound like a funny answer. I think people are too worked up about technology, personally. I mean, you know, we had this conversation. I think, you know, people have been using technology since we've been human. So paper was a huge invention. We talked about this. The printing press, huge invention. Computer, huge invention. Internet, huge invention. AI, great, another huge invention. And through all of that, I think what you see in a lot of the biggest technologies is the desire to connect with other human beings. I think what fundamentally makes us human is our connection to other human beings, our ability to engage with other human beings; and, like, consciousness and all these other things I think are necessary preconditions. But really, what makes us human is connections with other humans. And that is incredibly complex. And I don't think we're close, in terms of technology, to replicating that. I mean, even what you described, it's like you have this feeling of, like, this isn't right, this is off. And even if you felt like it was right, it still would be off in ways you didn't quite get. I don't think we're close.

Though, because it's designed to pull our attention away from other things, I think it impedes our ability to do what we all kind of want to do, which is interact with each other. And it might change the way we interact with each other in a way that might feel less fulfilling. And I think you see some of that in social interactions now. Some of that, I mean, recently maybe, COVID was an issue. But, you know, people feeling less comfortable in face-to-face interactions. Like people dating: there's no serendipity in hanging out and you meet who you meet. It's like you're using an algorithm to try to present to you options. That's a very different world. So even that's prior to AI. And I don't know how AI is going to further influence that.

KG: And I guess, just even, like, the general point: how core do you think the need for connection is? In the sense that, you know, I've heard some parents say that, through COVID, their kids went through a major change, you know, these regressions, their different habits and these kinds of things, because they weren't connecting with people. And then it's taken years to overcome that. So I do also wonder, like, you know, whether it's through technology or things like COVID or just, like, circumstances, could we lose that need for connection? Or even if we need it, you know, we might lose the desire for it and feel emotional trauma as a result, but still not go for it. Like, how core do you think it is? And do you think we're safe in that kind of need?

BSL: So I'm going to give you the most extreme answer, which is, I think, the true one. That you will cease to be human if you don't have a need for human connection. Like, I think you will be a physical person, but you will literally break down as a human being. And this is why, in part -- social isolation or solitary confinement is considered inhumane. Because people literally break down; you will start to have hallucinations. You will break down mentally and physically absent human connection. So I don't think there's any possibility, in my mind, of losing the need. Like, you may get less than you need, and that will have negative consequences for you. But I'm not worried about people not wanting to be around people.

KG: Are you worried that, like, things like social media or AI or any of these things could give you that sense that you're fulfilling that need, but not actually fulfilling it? It's totally true, right? Solitary confinement is a great example, because we need it. We absolutely lose our sanity as well as, you know, our well-being. But maybe, you know, technology can manufacture the sense that we're fulfilling it. And then, over time, we see these mental health crises evolve as a result?

BSL: Yeah, that's a good question. I think it's unlikely, but I don't know. Honestly, I don't know. I'll talk about meaning for a second. And I think of that as fundamentally tied to our need for connection to other people. I think sometimes we confuse, for example, our need for meaning with a desire for personal achievement. That we chase personal achievement, and what we're trying to do is generate meaning. So I think we can be confused, and we can have those needs displaced into less productive routes. But I don't think it's going away. But, you know, I don't know that it's the healthiest.

KG: No, I'm totally aligned. Thank you, Brian, that was an awesome conversation.

BSL: It was great to talk to you. Really fantastic and super informative. Thank you.