This survey paper reviews the landscape of Natural Language Processing (NLP) research and applications in the Arab world. It discusses the unique challenges posed by the Arabic language, such as its morphological complexity and dialectal diversity. The paper also presents a historical overview of Arabic NLP and surveys various research areas, including machine translation, sentiment analysis, and speech recognition. Why it matters: The survey provides a comprehensive resource for researchers and practitioners interested in the current state and future directions of Arabic NLP, a field critical for enabling AI technologies to serve Arabic-speaking communities.
This article surveys the landscape of Arabic Large Language Models (ALLMs), tracing their evolution from early text processing systems to sophisticated AI models. It highlights the unique challenges and opportunities in developing ALLMs for the 422 million Arabic speakers across 27 countries. The paper also examines the evaluation of ALLMs through benchmarks and public leaderboards. Why it matters: ALLMs can bridge technological gaps and empower Arabic-speaking communities by catering to their specific linguistic and cultural needs.
This study reviews the use of large language models (LLMs) for Arabic language processing, focusing on pre-trained models and their applications. It highlights the challenges in Arabic NLP due to the language's complexity and the relative scarcity of resources. The review also discusses how techniques like fine-tuning and prompt engineering enhance model performance on Arabic benchmarks. Why it matters: This overview helps consolidate research directions and benchmarks in Arabic NLP, guiding future development of LLMs tailored for the Arabic language and its diverse dialects.
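Prompt engineering, one of the techniques the review highlights, can be illustrated with a minimal sketch of few-shot prompt construction for Arabic sentiment classification. This is not code from the study; the instruction wording, example texts, and label set are hypothetical, chosen only to show the general pattern of assembling labeled demonstrations before the query.

```python
# Illustrative sketch (not from the survey): assembling a few-shot prompt
# for Arabic sentiment classification, a common prompt-engineering pattern.
# The demonstration texts and labels below are hypothetical.

FEW_SHOT_EXAMPLES = [
    ("الخدمة ممتازة والتوصيل سريع", "positive"),   # "Excellent service, fast delivery"
    ("المنتج سيء ولا أنصح به", "negative"),        # "Bad product, I don't recommend it"
]

def build_sentiment_prompt(text: str) -> str:
    """Build a few-shot classification prompt to send to an Arabic LLM."""
    # Task instruction in Arabic: "Classify the sentiment of the
    # following text as positive or negative."
    lines = ["صنّف مشاعر النص التالي إلى positive أو negative."]
    for example, label in FEW_SHOT_EXAMPLES:
        lines.append(f"النص: {example}\nالتصنيف: {label}")
    # End with the unlabeled query so the model completes the label.
    lines.append(f"النص: {text}\nالتصنيف:")
    return "\n\n".join(lines)

prompt = build_sentiment_prompt("التطبيق رائع وسهل الاستخدام")
print(prompt)
```

The resulting string would be passed as-is to a model's text-completion interface; benchmark papers typically vary the number of demonstrations and the instruction wording to measure their effect on accuracy.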
The paper introduces ArabianGPT, a suite of transformer-based language models designed specifically for Arabic, including versions with 0.1B and 0.3B parameters. A key component is the AraNizer tokenizer, tailored to Arabic morphology and script. Fine-tuning ArabianGPT-0.1B raised sentiment-analysis accuracy from 56% in the base model to 95%, and improved F1 scores in summarization. Why it matters: The models address the gap in native Arabic LLMs, offering better performance on Arabic NLP tasks through tailored architecture and tokenization.
This article discusses MBZUAI's efforts to advance Arabic-language AI through deep-learning-based language models. Key initiatives include Jais, a 13B-parameter Arabic LLM developed in collaboration with G42's Inception, and Atlas-Chat, which understands the Moroccan dialect. The university is also applying Arabic capabilities in practical AI solutions such as BiMediX2, a multimodal healthcare model that understands medical queries in both English and Arabic. Why it matters: These initiatives are crucial for preserving Arabic cultural heritage, enabling future discovery, and addressing linguistic challenges specific to Arabic in AI applications.
Abu Dhabi’s Technology Innovation Institute (TII) has launched Falcon-H1 Arabic, a new large language model based on a hybrid Mamba-Transformer architecture. The Falcon-H1 family comes in 3B, 7B, and 34B parameter sizes and outperforms existing models on the Open Arabic LLM Leaderboard (OALL). The model features improvements in data quality, dialect coverage, and long-context stability. Why it matters: This release strengthens the UAE's position in Arabic AI and provides a high-performing model tailored to the linguistic and cultural needs of the region.
The paper introduces Arabic Stable LM, a 1.6B parameter Arabic-centric language model, in both base and chat versions. The Arabic Stable LM 1.6B chat model achieves strong results on several benchmarks, outperforming models with up to 8x more parameters. The study also demonstrates the benefit of incorporating synthetic instruction tuning data through a large synthetic dialogue dataset. Why it matters: This work makes Arabic LLMs more accessible by reducing the parameter size while maintaining strong performance, facilitating deployment in resource-constrained environments.