GCC AI Research


GCC AI Models

Large language models and AI systems built by institutions across the Gulf Cooperation Council. Updated as new models are released.

Falcon

πŸ‡¦πŸ‡ͺ TII Β· UAE

Open-source large language model family from the Technology Innovation Institute (TII) in Abu Dhabi. Falcon-40B was the first model from an Arab-world institution to top the Hugging Face Open LLM Leaderboard. The Falcon 3 series (2024) supports English, French, Spanish, and Portuguese.

Type: LLM Sizes: 7B, 40B, 180B, Falcon 2 (11B), Falcon 3 (1B–10B)

Jais

πŸ‡¦πŸ‡ͺ MBZUAI / Inception Β· UAE

Bilingual Arabic-English LLM developed jointly by Inception (a G42 company), MBZUAI, and Cerebras Systems. Trained on a large Arabic-English corpus with a custom tokenizer optimized for Arabic morphology. Jais-30B achieved state-of-the-art results on Arabic NLP benchmarks at the time of release.

Type: Arabic LLM Sizes: 13B, 30B
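
The tokenizer point can be illustrated with a stdlib-only sketch (this is not Jais's actual tokenizer): Arabic letters cost two bytes each in UTF-8, so a byte-level vocabulary tuned on Latin-script text starts from roughly twice as many raw symbols per character on Arabic, which is one motivation for Arabic-optimized tokenizers.

```python
# Illustrative only: average UTF-8 bytes per character for English
# vs. Arabic text. Arabic script (U+0600 block) encodes to 2 bytes
# per letter, while ASCII English encodes to 1.

def bytes_per_char(text: str) -> float:
    """Average number of UTF-8 bytes per character of `text`."""
    return len(text.encode("utf-8")) / len(text)

english = "natural language processing"
arabic = "معالجة اللغات الطبيعية"  # "natural language processing"

print(f"English: {bytes_per_char(english):.2f} bytes/char")
print(f"Arabic:  {bytes_per_char(arabic):.2f} bytes/char")
```

A byte-level BPE without Arabic-aware merges therefore spends more tokens per word on Arabic, raising cost and shrinking effective context; a tokenizer trained on Arabic morphology recovers much of that efficiency.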

AceGPT

πŸ‡ΈπŸ‡¦ KAUST Β· Saudi Arabia

Arabic-localized LLM from KAUST and CUHK-Shenzhen, built by further pre-training and instruction-tuning Llama 2 on Arabic data. Introduces an Arabic chat format aligned with regional preferences and cultural context. Evaluated on Arabic MMLU, ALGhafa, and custom Arabic benchmarks.

Type: Arabic LLM Sizes: 7B, 13B

ALLaM

πŸ‡ΈπŸ‡¦ SDAIA / IBM Β· Saudi Arabia

Arabic large language model developed by SDAIA in collaboration with IBM Research. Designed to support Saudi Vision 2030 digital transformation initiatives. ALLaM is trained on curated Arabic datasets reflecting Gulf Arabic dialect and Modern Standard Arabic.

Type: Arabic LLM Sizes: 7B

Fanar

πŸ‡ΆπŸ‡¦ QCRI / Qatar Foundation Β· Qatar

Arabic language model from Qatar Computing Research Institute (QCRI) and Qatar Foundation. Fanar is designed for Modern Standard Arabic and Gulf dialect, with strong performance on Arabic reasoning and comprehension benchmarks.

Type: Arabic LLM Sizes: 8B

Noor

πŸ‡¦πŸ‡ͺ TII / LightOn Β· UAE

Noor is an Arabic language model developed by the Technology Innovation Institute (Abu Dhabi) in partnership with LightOn, focused on large-scale Arabic text generation. At its 2022 release it was described as the largest Arabic NLP model then available.

Type: Arabic LLM Sizes: 10B

AraBERT

πŸ‡±πŸ‡§ AUB Β· Lebanon

AraBERT is a pre-trained BERT model for Arabic NLP, developed by researchers at the American University of Beirut. It is one of the most widely used Arabic NLP models, with broad adoption in Arabic text classification, named entity recognition, and question answering.

Type: Arabic PLM Sizes: Base, Large
Missing a model? Let us know β†’