QAYTA SOZLANGAN TRANSFORMER MODELLAR YORDAMIDA INGLIZCHA-O‘ZBEKCHA MASHINA TARJIMASINI TAKOMILLASHTIRISH

Authors

  • U. Y. Tuliyev
  • A. M. Murodov

Keywords:

Large language models, Hugging Face, machine translation, Generative Pretrained Transformers, natural language processing

Abstract

Accurate, high-quality translation of natural-language text is one of the central 
tasks of natural language processing (NLP). Automated translation that is both fast 
and accurate through machine learning attracts considerable interest in the machine 
learning and artificial intelligence communities. In this study, we examine the use 
of Generative Pretrained Transformer (GPT) models for machine translation, drawing 
on a parallel corpus of Uzbek texts collected from local Uzbek internet sources, 
books, textbooks, and scholarly works. We fine-tune two models trained for Uzbek 
from the Hugging Face large language model (LLM) catalogue to perform machine 
translation, and we comparatively analyse their translation quality across several 
evaluation metrics. The selected models were fine-tuned in the Google Colab 
environment on an A100 graphics processing unit (GPU).
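The comparative evaluation described above relies on standard machine-translation quality metrics; BLEU is the most widely used of these. As a minimal illustration (not the paper's actual evaluation pipeline), the sketch below implements a simplified sentence-level BLEU in pure Python. The add-one smoothing and the example sentences are our own assumptions; in practice a library such as sacrebleu would be used.

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """Count all n-grams of length n in a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def bleu(candidate, reference, max_n=4):
    """Simplified sentence-level BLEU with brevity penalty (single reference)."""
    cand, ref = candidate.split(), reference.split()
    precisions = []
    for n in range(1, max_n + 1):
        cand_ngrams = ngrams(cand, n)
        ref_ngrams = ngrams(ref, n)
        # clipped n-gram matches between candidate and reference
        overlap = sum((cand_ngrams & ref_ngrams).values())
        total = max(sum(cand_ngrams.values()), 1)
        # add-one smoothing so one empty n-gram order does not zero the score
        precisions.append((overlap + 1) / (total + 1))
    # geometric mean of the modified n-gram precisions
    geo_mean = math.exp(sum(math.log(p) for p in precisions) / max_n)
    # brevity penalty: punish candidates shorter than the reference
    bp = 1.0 if len(cand) > len(ref) else math.exp(1 - len(ref) / max(len(cand), 1))
    return bp * geo_mean
```

A perfect match scores 1.0, and shorter or divergent candidates score lower; corpus-level BLEU, as used in MT evaluation, aggregates n-gram counts over all sentence pairs before combining them.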


Published

2025-08-02

How to cite

QAYTA SOZLANGAN TRANSFORMER MODELLAR YORDAMIDA INGLIZCHA-O‘ZBEKCHA MASHINA TARJIMASINI TAKOMILLASHTIRISH. (2025). ОБРАЗОВАНИЕ НАУКА И ИННОВАЦИОННЫЕ ИДЕИ В МИРЕ, 74(1), 284-291. https://scientific-jl.com/obr/article/view/25697