Technology Innovation Institute (TII) will make its Falcon-H1 large language model available as an NVIDIA NIM microservice. Falcon-H1 features a hybrid Transformer–Mamba architecture supporting context windows of up to 256K tokens. Packaging the model as a NIM microservice is intended to give enterprises a plug-and-play component for building AI systems. Why it matters: This integration will simplify deployment and scaling of Falcon-H1 for enterprises, potentially accelerating the adoption of sovereign AI solutions in the region.
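NIM microservices generally expose an OpenAI-compatible HTTP API, so once the Falcon-H1 NIM is deployed it can be queried like any chat-completions endpoint. A minimal Python sketch of that interaction; the base URL, port, and model identifier below are illustrative assumptions, not values confirmed by this announcement:

```python
import json
import urllib.request

def build_chat_request(model: str, prompt: str, max_tokens: int = 256) -> dict:
    """Assemble an OpenAI-compatible chat-completions payload,
    the request format NIM microservices typically accept."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }

def query_nim(base_url: str, payload: dict) -> str:
    """POST the payload to a NIM endpoint and return the reply text."""
    req = urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

# Example (model id and port are assumptions for illustration):
# payload = build_chat_request("tiiuae/falcon-h1", "Summarize this clause.")
# print(query_nim("http://localhost:8000", payload))
```

Sticking to the standard chat-completions shape is what makes such a deployment "plug-and-play": existing OpenAI-client code can point at the NIM base URL without modification.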
Abu Dhabi’s Technology Innovation Institute (TII) has launched Falcon-H1 Arabic, a new large language model based on a hybrid Mamba-Transformer architecture. The Falcon-H1 family comes in 3B, 7B, and 34B parameter sizes and outperforms existing models on the Open Arabic LLM Leaderboard (OALL). The model features improvements in data quality, dialect coverage, and long-context stability. Why it matters: This release strengthens the UAE's position in Arabic AI and provides a high-performing model tailored to the linguistic and cultural needs of the region.
TII in Abu Dhabi has launched Falcon Arabic, the first Arabic-language model in the Falcon series and now the best-performing Arabic AI model in the region. TII also released Falcon H1, a new model designed for performance and portability that outperforms Meta's LLaMA and Alibaba's Qwen in the small-to-medium size category. Falcon Arabic is built on Falcon 3-7B and trained on a high-quality native Arabic dataset. Why it matters: These releases strengthen the UAE's position as a leader in Arabic-language AI and democratize access to high-performance AI models.
Technology Innovation Institute (TII) has launched Falcon H1R 7B, an open-source 7B parameter AI model with reasoning capabilities. It outperforms larger models like Microsoft Phi 4 Reasoning Plus 14B, Alibaba Qwen3 32B, and NVIDIA Nemotron H 47B on key benchmarks. The model uses a hybrid Transformer–Mamba architecture for improved accuracy and speed and is available on Hugging Face under the Falcon TII License. Why it matters: This release highlights the UAE's growing role in AI innovation by providing an efficient and accessible model for global research and development.
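Since the model ships on Hugging Face under the Falcon TII License, it should slot into a standard transformers workflow. A minimal loading sketch, assuming the transformers and accelerate libraries; the default repo id below is an illustrative guess at the naming scheme, and the authoritative checkpoint name is on the tiiuae organization's model card:

```python
def load_falcon(repo_id: str = "tiiuae/Falcon-H1R-7B"):
    """Load a Falcon checkpoint and its tokenizer from the Hugging Face Hub.

    The default repo_id is an assumption for illustration; check the
    tiiuae organization page for the exact name.
    """
    # Imported inside the function so the sketch can be inspected
    # without the heavyweight transformers/torch stack installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(repo_id)
    model = AutoModelForCausalLM.from_pretrained(
        repo_id,
        torch_dtype="auto",   # keep the checkpoint's native precision
        device_map="auto",    # spread layers across devices (needs accelerate)
    )
    return tokenizer, model
```

From there, generation follows the usual tokenize / `model.generate` / decode pattern common to all causal LMs on the Hub.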
The Technology Innovation Institute (TII) in Abu Dhabi has launched Falcon 3, a new series of open-source large language models. Falcon 3 models range in size from 1B to 10B parameters and have been trained on 14 trillion tokens. Falcon 3 achieved the top spot on Hugging Face's LLM leaderboard for models under 13 billion parameters. Why it matters: This release democratizes access to high-performance AI by enabling efficient operation on laptops and light infrastructure, solidifying the UAE's position as a leader in open-source AI development.
Technology Innovation Institute (TII) in the UAE has launched Falcon 180B, an open-access large language model with 180 billion parameters trained on 3.5 trillion tokens. Falcon 180B ranks first on the Hugging Face Open LLM Leaderboard for pretrained models, outperforming Meta's LLaMA 2 and nearing the performance of OpenAI's GPT-4 and Google's PaLM 2. The model is available for research and commercial use under the Falcon 180B TII License, which is based on Apache 2.0. Why it matters: This release strengthens the UAE's position in AI development and promotes open access to advanced AI technology, fostering innovation and collaboration.