What tech companies know about your kids | Veronica Barassi

2020-07-03

TED



00:00
Transcriber: Leslie Gauthier Reviewer: Joanna Pietrulewicz
Translator: Shokhnur Akhmedov Reviewer: Nazarbek Nazarov
00:12
Every day, every week,
00:15
we agree to terms and conditions.
00:17
And when we do this,
00:18
we provide companies with the lawful right
00:21
to do whatever they want with our data
00:25
and with the data of our children.
00:28
Which makes us wonder:
00:31
how much data are we giving away about children,
00:34
and what are its implications?
00:38
I'm an anthropologist,
00:39
and I'm also the mother of two little girls.
00:42
And I started to become interested in this question in 2015
00:47
when I suddenly realized that there were vast --
00:49
almost unimaginable amounts of data traces
00:52
that are being produced and collected about children.
00:56
So I launched a research project,
00:58
which is called Child Data Citizen,
01:01
and I aimed at filling in the blank.
01:04
Now you may think that I'm here to blame you
01:07
for posting photos of your children on social media,
01:10
but that's not really the point.
01:12
The problem is way bigger than so-called "sharenting."
01:16
This is about systems, not individuals.
01:20
You and your habits are not to blame.
01:24
For the very first time in history,
01:27
we are tracking the individual data of children
01:30
from long before they're born --
01:32
sometimes from the moment of conception,
01:34
and then throughout their lives.
01:37
You see, when parents decide to conceive,
01:40
they go online to look for "ways to get pregnant,"
01:43
or they download ovulation-tracking apps.
01:47
When they do get pregnant,
01:49
they post ultrasounds of their babies on social media,
01:53
they download pregnancy apps
01:55
or they consult Dr. Google for all sorts of things,
01:58
like, you know --
02:00
for "miscarriage risk when flying"
02:02
or "abdominal cramps in early pregnancy."
02:05
I know because I've done it --
02:07
and many times.
02:10
And then, when the baby is born, they track every nap,
02:13
every feed,
02:14
every life event on different technologies.
02:18
And all of these technologies
02:19
transform the baby's most intimate behavioral and health data into profit
02:25
by sharing it with others.
02:28
So to give you an idea of how this works,
02:30
in 2019, the British Medical Journal published research that showed
02:35
that out of 24 mobile health apps,
02:39
19 shared information with third parties.
02:44
And these third parties shared information with 216 other organizations.
02:50
Of these 216 other fourth parties,
02:54
only three belonged to the health sector.
02:57
The other companies that had access to that data were big tech companies
03:02
like Google, Facebook or Oracle,
03:05
they were digital advertising companies
03:08
and there was also a consumer credit reporting agency.
03:13
So you get it right:
03:14
ad companies and credit agencies may already have data points on little babies.
03:21
But mobile apps, web searches and social media
03:23
are really just the tip of the iceberg,
03:27
because children are being tracked by multiple technologies
03:29
in their everyday lives.
03:31
They're tracked by home technologies and virtual assistants in their homes.
03:35
They're tracked by educational platforms
03:37
and educational technologies in their schools.
03:40
They're tracked by online records
03:41
and online portals at their doctor's office.
03:44
They're tracked by their internet-connected toys,
03:47
their online games
03:48
and many, many, many, many other technologies.
03:52
So during my research,
03:53
a lot of parents came up to me and they were like, "So what?
03:58
Why does it matter if my children are being tracked?
04:02
We've got nothing to hide."
04:04
Well, it matters.
04:07
It matters because today individuals are not only being tracked,
04:13
they're also being profiled on the basis of their data traces.
04:17
Artificial intelligence and predictive analytics are being used
04:21
to harness as much data as possible of an individual life
04:24
from different sources:
04:26
family history, purchasing habits, social media comments.
04:31
And then they bring this data together
04:33
to make data-driven decisions about the individual.
04:36
And these technologies are used everywhere.
04:40
Banks use them to decide loans.
04:42
Insurance uses them to decide premiums.
04:46
Recruiters and employers use them
04:48
to decide whether one is a good fit for a job or not.
04:52
Also the police and courts use them
04:55
to determine whether one is a potential criminal
04:59
or is likely to recommit a crime.
05:04
We have no knowledge or control
05:08
over the ways in which those who buy, sell and process our data
05:12
are profiling us and our children.
05:15
But these profiles can come to impact our rights in significant ways.
05:20
To give you an example,
05:25
in 2018 the "New York Times" published the news
05:29
that the data that had been gathered
05:31
through online college-planning services --
05:34
that are actually completed by millions of high school kids across the US
05:39
who are looking for a college program or a scholarship --
05:43
had been sold to educational data brokers.
05:47
Now, researchers at Fordham who studied educational data brokers
05:53
revealed that these companies profiled kids as young as two
05:58
on the basis of different categories:
06:01
ethnicity, religion, affluence,
06:05
social awkwardness
06:07
and many other random categories.
06:10
And then they sell these profiles together with the name of the kid,
06:15
their home address and the contact details
06:18
to different companies,
06:20
including trade and career institutions,
06:24
student loans
06:25
and student credit card companies.
06:28
To push the boundaries,
06:29
the researchers at Fordham asked an educational data broker
06:33
to provide them with a list of 14-to-15-year-old girls
06:39
who were interested in family planning services.
06:44
The data broker agreed to provide them the list.
06:46
So imagine how intimate and how intrusive that is for our kids.
06:52
But educational data brokers are really just an example.
06:56
The truth is that our children are being profiled in ways that we cannot control
07:01
but that can significantly impact their chances in life.
07:06
So we need to ask ourselves:
07:09
can we trust these technologies when it comes to profiling our children?
07:14
Can we?
07:17
My answer is no.
07:19
As an anthropologist,
07:21
I believe that artificial intelligence and predictive analytics can be great
07:24
to predict the course of a disease
07:26
or to fight climate change.
07:30
But we need to abandon the belief
07:31
that these technologies can objectively profile humans
07:35
and that we can rely on them to make data-driven decisions
07:38
about individual lives.
07:40
Because they can't profile humans.
07:43
Data traces are not the mirror of who we are.
07:46
Humans think one thing and say the opposite,
07:48
feel one way and act differently.
07:51
Algorithmic predictions or our digital practices
07:53
cannot account for the unpredictability and complexity of human experience.
08:00
But on top of that,
08:02
these technologies are always --
08:04
always --
08:06
in one way or another, biased.
08:09
You see, algorithms are by definition sets of rules or steps
08:14
that have been designed to achieve a specific result, OK?
08:18
But these sets of rules or steps cannot be objective,
08:21
because they've been designed by human beings
08:23
within a specific cultural context
08:25
and are shaped by specific cultural values.
08:28
So when machines learn,
08:30
they learn from biased algorithms,
08:33
and they often learn from biased databases as well.
08:37
At the moment, we're seeing the first examples of algorithmic bias.
08:41
And some of these examples are frankly terrifying.
08:46
This year, the AI Now Institute in New York published a report
08:50
that revealed that the AI technologies
08:53
that are being used for predictive policing
08:56
have been trained on "dirty" data.
09:00
This is basically data that had been gathered
09:03
during historical periods of known racial bias
09:07
and nontransparent police practices.
09:10
Because these technologies are being trained with dirty data,
09:14
they're not objective,
09:16
and their outcomes are only amplifying and perpetrating
09:20
police bias and error.
09:25
So I think we are faced with a fundamental problem
09:28
in our society.
09:30
We are starting to trust technologies when it comes to profiling human beings.
09:35
We know that in profiling humans,
09:38
these technologies are always going to be biased
09:41
and are never really going to be accurate.
09:43
So what we need now is actually a political solution.
09:46
We need governments to recognize that our data rights are our human rights.
09:52
(Applause and cheers)
09:59
Until this happens, we cannot hope for a more just future.
10:04
I worry that my daughters are going to be exposed
10:07
to all sorts of algorithmic discrimination and error.
10:11
You see the difference between me and my daughters
10:13
is that there's no public record out there of my childhood.
10:16
There's certainly no database of all the stupid things that I've done
10:20
and thought when I was a teenager.
10:23
(Laughter)
10:25
But for my daughters this may be different.
10:29
The data that is being collected from them today
10:32
may be used to judge them in the future
10:36
and can come to prevent their hopes and dreams.
10:40
I think that it's time.
10:42
It's time that we all step up.
10:43
It's time that we start working together
10:46
as individuals,
10:47
as organizations and as institutions,
10:50
and that we demand greater data justice for us
10:53
and for our children
10:54
before it's too late.
10:56
Thank you.
10:57
(Applause)