Murteza Hanoon, Tuama and Wahhab Muslim, Mashloosh and Yasir Mahmood, Younus (2024) Top Accurate Models for Handling Complex Arabic Linguistic Structures. American Journal of Engineering, Mechanics and Architecture, 2 (9). pp. 113-122. ISSN 2993-2637
Abstract
Arabic, a language rich in morphology but under-resourced and less syntactically explored than English, poses major hurdles for Arabic Natural Language Processing (NLP) applications such as Question Answering (QA), Named Entity Recognition (NER), and Sentiment Analysis (SA). However, recent advances in transformer-based models have demonstrated that language-specific BERT models, when pre-trained on large corpora, excel at Arabic comprehension, setting new benchmarks and producing outstanding outcomes across a wide range of NLP tasks. In this study, we present AraBERT, a BERT model built exclusively for Arabic, with the goal of replicating BERT's success in English. We compare AraBERT against Google's multilingual BERT and other state-of-the-art techniques. The results reveal that the newly designed AraBERT outperforms most existing approaches on Arabic NLP tasks.
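The morphological richness the abstract refers to is typically handled, before feeding text to a model such as AraBERT, by a light normalization pass over Arabic script (stripping diacritics, unifying alef and other letter variants). The sketch below is an illustrative example of such preprocessing, not code from the paper; the function name and the exact set of normalizations are assumptions chosen for demonstration.

```python
import re

# Arabic short-vowel marks (tashkeel, U+064B-U+0652) and the tatweel
# elongation character (U+0640), which carry no lexical content here.
DIACRITICS = re.compile(r"[\u064B-\u0652\u0640]")

def normalize_arabic(text: str) -> str:
    """Light normalization commonly applied before Arabic NLP models
    (an illustrative sketch, not the paper's pipeline)."""
    text = DIACRITICS.sub("", text)                         # drop diacritics/tatweel
    text = re.sub("[\u0622\u0623\u0625]", "\u0627", text)   # alef variants -> bare alef
    text = text.replace("\u0649", "\u064A")                 # alef maqsura -> ya
    text = text.replace("\u0629", "\u0647")                 # ta marbuta -> ha
    return text

# Example: a name written with hamza-on-alef and short vowels collapses
# to its undiacritized, normalized form.
print(normalize_arabic("\u0623\u064E\u062D\u0652\u0645\u064E\u062F"))
```

Normalizing this way reduces sparsity in the vocabulary, which matters for a morphologically rich language where the same word can surface in many orthographic variants.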
| Item Type: | Article |
| --- | --- |
| Subjects: | Q Science > QA Mathematics > QA75 Electronic computers. Computer science |
| Divisions: | Postgraduate > Master's of Islamic Education |
| Depositing User: | Journal Editor |
| Date Deposited: | 23 Dec 2024 05:06 |
| Last Modified: | 23 Dec 2024 05:06 |
| URI: | http://eprints.umsida.ac.id/id/eprint/14987 |