GCC AI Research


Results for "Arabian Super Light"

ArabianGPT: Native Arabic GPT-based Large Language Model

arXiv ·

The paper introduces ArabianGPT, a suite of transformer-based language models designed specifically for Arabic, with 0.1B- and 0.3B-parameter versions. A key component is the AraNizer tokenizer, tailored to the morphology of Arabic script. Fine-tuned on sentiment analysis, ArabianGPT-0.1B reached 95% accuracy, up from 56% for the base model, and also improved F1 scores in summarization. Why it matters: The models address the gap in native Arabic LLMs, offering better performance on Arabic NLP tasks through tailored architecture and tokenization.
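AraNizer's design is detailed in the paper itself; as a rough illustration of why Arabic-specific tokenization matters at all, note that generic byte-level tokenizers start from UTF-8 bytes, and every letter in the Arabic Unicode block encodes to two bytes. A minimal sketch (not the AraNizer algorithm) shows the worst-case fragmentation for a vocabulary with no learned Arabic merges:

```python
# Sketch: the cost of byte-level tokenization for Arabic without
# Arabic-specific merges. Every character in the Arabic Unicode block
# (U+0600-U+06FF) encodes to 2 bytes in UTF-8, so a byte-level BPE
# vocabulary that has learned no Arabic merges can emit up to
# 2 tokens per letter. (Illustrative only, not AraNizer's design.)
word = "المدرسة"  # "the school": 7 Arabic letters
raw_bytes = word.encode("utf-8")

print(len(word))        # 7 characters
print(len(raw_bytes))   # 14 bytes -> worst-case token count with no merges
```

An Arabic-aware vocabulary collapses such sequences into far fewer subword units, which is the kind of gap a tailored tokenizer targets.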

UI-Level Evaluation of ALLaM 34B: Measuring an Arabic-Centric LLM via HUMAIN Chat

arXiv ·

This paper presents a UI-level evaluation of ALLaM-34B, an Arabic-centric LLM developed by SDAIA and deployed in the HUMAIN Chat service. The evaluation used a prompt pack spanning various Arabic dialects, code-switching, reasoning, and safety, with outputs scored by frontier LLM judges. Results indicate strong performance in generation, code-switching, Modern Standard Arabic (MSA) handling, and reasoning, along with improved dialect fidelity, positioning ALLaM-34B as a robust Arabic LLM suitable for real-world use.

Arabic Stable LM: Adapting Stable LM 2 1.6B to Arabic

arXiv ·

The paper introduces Arabic Stable LM, a 1.6B parameter Arabic-centric language model, in both base and chat versions. The Arabic Stable LM 1.6B chat model achieves strong results on several benchmarks, outperforming models with up to 8x more parameters. The study also demonstrates the benefit of incorporating synthetic instruction tuning data through a large synthetic dialogue dataset. Why it matters: This work makes Arabic LLMs more accessible by reducing the parameter size while maintaining strong performance, facilitating deployment in resource-constrained environments.

MobiLlama: Towards Accurate and Lightweight Fully Transparent GPT

arXiv ·

Researchers from MBZUAI have released MobiLlama, a fully transparent open-source 0.5 billion parameter Small Language Model (SLM). MobiLlama targets resource-constrained devices, aiming to deliver strong performance at reduced compute and memory cost. The full training data pipeline, code, model weights, and checkpoints are available on GitHub.

Technology Innovation Institute Announces Launch of NOOR, the World’s Largest Arabic NLP Model

TII ·

Technology Innovation Institute (TII) in Abu Dhabi, in collaboration with LightOn, has launched NOOR, a 10 billion parameter Arabic natural language processing (NLP) model. The model was trained on a large, high-quality cross-domain Arabic dataset including web data, books, poetry, news, and technical information. It enables applications in automated summarization, chatbots, and personalized marketing. Why it matters: NOOR represents a significant advancement in Arabic NLP, potentially enabling more sophisticated AI applications tailored to the Arabic language and regional needs.

Sadeed: Advancing Arabic Diacritization Through Small Language Model

arXiv ·

The paper introduces Sadeed, a fine-tuned decoder-only language model based on the Kuwain 1.5B model (Hennara et al.), for improved Arabic text diacritization. Sadeed is fine-tuned on high-quality diacritized datasets and achieves competitive results compared to larger proprietary models. The authors also introduce SadeedDiac-25, a new benchmark for fairer evaluation of Arabic diacritization across diverse text genres. Why it matters: This work advances Arabic NLP by providing both a competitive diacritization model and a more robust evaluation benchmark, facilitating further research and development in the field.
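Diacritization restores the short-vowel marks (harakat) that most written Arabic omits. A minimal sketch of the task's input/output relationship: the easy inverse direction, stripping marks, takes a few lines of standard-library Python, while restoring them is the hard modeling problem Sadeed addresses (this sketch is not Sadeed's method):

```python
import unicodedata

def strip_diacritics(text: str) -> str:
    """Drop Arabic diacritical marks, which Unicode classifies as
    nonspacing combining characters (category 'Mn')."""
    return "".join(ch for ch in text if unicodedata.category(ch) != "Mn")

diacritized = "كَتَبَ"                # "kataba" (he wrote), fully diacritized: 6 code points
bare = strip_diacritics(diacritized)  # "كتب": the undiacritized form a model receives

print(len(diacritized), len(bare))  # 6 3
```

A diacritization model learns the reverse mapping, from `bare` back to `diacritized`, where the correct marks depend on grammar and context rather than spelling alone.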

KAUST celebrates the Year of Light

KAUST ·

KAUST held an open day on December 3, 2015, to celebrate the International Year of Light. The event showcased technological developments in light research, especially photonics and LED-based technologies. Exhibits and demonstrations were provided by researchers from KAUST's CEMSE and PSE divisions, under the direction of Professor Boon Ooi. Why it matters: The event promoted understanding of achievements in light research and its applications in various sectors like communications, medicine, and energy.

Securing the Kingdom's energy future

KAUST ·

KAUST and GE have partnered to study the feasibility of using crude oils like Arabian Super Light (ASL) to power heavy-duty gas turbines. The collaboration aims to develop turbines capable of burning crude oil directly from the ground to meet Saudi Arabia's energy security needs. The research involves building a rig at KAUST's High Pressure Combustion Laboratory (HPCL) to conduct corrosion tests on turbine materials by burning ASL/AXL crude continuously for 2,000 hours. Why it matters: This partnership could reduce reliance on natural gas and offer an economically viable alternative fuel source, bolstering energy security in Saudi Arabia and potentially influencing turbine technology worldwide.