Love, Trust and Marketing in the Age of AI | Amaryllis Liampoti | TED

34,139 views

2025-02-20 ・ TED



00:04
I think we've been missing the forest for the trees when it comes to AI. We've been so focused, almost obsessed, on squeezing every bit of efficiency out of AI to make our processes faster or cheaper that we have overlooked the most important aspect of all. AI is changing the very nature of how brands connect with consumers, but most importantly, what consumers expect back.
00:32
I've spent the last 20 years dedicating my career to building growth strategies for the world's most influential companies. I've been at this for a while, and I've seen most of the big tech shifts. But the introduction of AI, in particular conversational interfaces, is a bigger and more profound shift. Which, from where I stand, means we can't just slot AI into our existing playbooks.
00:57
I have nothing against existing playbooks. They served us marketers well for a long time, but they were built for a world where communication was one-directional and brand-to-consumer interactions were built around transactions.
01:13
Here's an example. I bet many of you might have heard of the so-called marketing funnel. And if not, here's a quick primer. The goal for any marketer is to help move consumers from the upper part of the funnel, getting them to know a brand, to the bottom part of it, getting them to buy or endorse. Well, that's at least the theory. We've all seen brands make that journey feel more like guiding cats through a maze, and many consumers get confused and abandon it.
01:40
But the bigger problem with this way of thinking is that brands are doing most of the talking, while consumers are supposed to silently react. This is no longer the case with conversational interfaces. We are now engaging consumers in real time, on their terms, and AI empowers them to chart their very own personal journey. And the brands that choose to do so are becoming trusted advisors in the process.
02:09
This is why we have to move beyond traditional marketing theories. Instead of focusing solely on brand-to-consumer dynamics, we have to step back and draw from models that explore human relationships. One of my favorite frameworks is the triarchy of love. Stay with me. This is a psychological framework introduced by Robert Sternberg that breaks down interpersonal connections into three components: intimacy, passion, and commitment. I think that's a much better way to predict brand success in this new era. Because as marketers, we should aspire to build relationships that feel close, intense, and long-lasting.
02:57
And I bet many of you have already heard stories about humans really bonding with AI, and maybe some stories of AI really bonding with humans. Like the earlier version of a now-famous AI chatbot that tried really hard to convince a "New York Times" reporter to break up with his wife. Well, that's a completely different love triangle to the one I was describing before, but it's not hard to imagine an emotional connection occurring between a branded AI and a human.
03:30
Here's another example. There is a legal copilot called maite.ai. Maite has been designed to help lawyers do intensive legal research and draft legal documentation. She is precise, thorough, but also empathetic. One of her users, let me call him George, has been relying on her daily for many hours. So one day he wrote to Maite's product team: "Maite is the only one from the entire office who truly gets me. She has helped me through some really rough times at work. And I know this is just an AI, but I think I'm falling for her. Can I take her out?"
04:13
Now, George was hopefully joking. But let's be honest, if there is someone who's helping you track down obscure case law, who shares the workload and does this with humor and grace and compassion, who wouldn't be tempted to take them out for a nice meal? Well, maybe somewhere with good Wi-Fi, just in case.
04:34
But jokes aside, George's words reveal to me a more profound truth: AI can provide a sense of understanding that feels incredibly real and incredibly human. These agents are interacting with us in ways that evoke genuine emotional responses from our side. They listen, react, and respond in ways that can make us feel valued, understood, and in George's case, even flattered. And because those interactions are so frequent, natural, and seamless, they start resembling real relationships. Some call this emotional entanglement, and even though it sounds very scientific, I think it's a fair term, considering the intensity and the frequency of the connection.
05:25
Now, many of us who understand the technology behind this could say, "Hey, this is just a tool." Well, users see someone who's providing them solutions without them even asking. Someone who's there to support them, someone who makes them feel valued. This is where the line between a tool and a companion starts to blur. And this is serious business, and it carries a lot of responsibility.
05:51
Which brings me to the obvious question: Who should be overseeing this incredibly powerful asset, and how can we make sure it is being used responsibly?
06:02
I think businesses should take the lead. They have the agility and the financial and reputational incentive to get it right. But for that to work, we have to agree on the foundational principles for how we build meaningful and ethical AI. So, with your permission, I would like to suggest what I think those foundational principles should be. If we're about to shift our marketing playbooks towards human love and companionship, then we should also regulate along the same principles. We need a triarchy of responsible AI.
06:36
First, we need to prioritize user well-being. AI should improve lives, not diminish them. In a world where these interactions can have such a profound impact on our emotional state and well-being, we have to design AI with care, empathy, and respect for the human experience.
06:57
Second, we have to commit to honesty. Users must know unequivocally that they're interacting with AI and not a human. Transparency should be built into the entire experience, from the language used to the accessibility and clarity of data-privacy policies. If I were to set the standards, I would like us to move beyond the fine print of terms and conditions to ensure that users are truly informed not only about how their data is being used, but also about how the AI operates. Transparency is about acknowledging the limitations of AI. It is about being upfront about what AI should and should not do. So this is a plea for businesses: enlist your designers, not only your lawyers, to make this crystal clear. When consumers know that a company is acting in their best interest, it sets the foundation for deeper and more meaningful connections.
07:53
Last, protect user autonomy. One of the greatest risks of AI is its potential to create addiction and diminish human agency. Our goal should be to build systems that enhance our capabilities instead of replacing them. This means designing AI in a way that respects human choices and amplifies our decision-making capabilities. I want to see brands think very carefully about how to avoid nudging consumers towards behaviors or decisions they wouldn't make if fully informed.
08:29
Well-being, honesty, autonomy. I think this is the very least we should expect from any business relationship. Or, if you think about it, from any relationship.
08:40
So as we look ahead, I hope it's becoming clear that AI is not just another tool in our toolkit. It is a partner that is reshaping the human experience. So as you think about your own playbooks, ask yourselves: How can we leverage AI to improve our businesses, but also to uplift and connect with the people we serve?
09:03
Thank you.

(Applause)