GCC AI Research

Results for "o3-mini"

Falcon 3: UAE’s Technology Innovation Institute Launches World’s Most Powerful Small AI Models, Which Can Also Run on Light Infrastructure, Including Laptops

TII ·

The Technology Innovation Institute (TII) in Abu Dhabi has launched Falcon 3, a new family of open-source language models ranging from 1B to 10B parameters and trained on 14 trillion tokens. On release, Falcon 3 took the top spot on Hugging Face's LLM leaderboard for models under 13 billion parameters. Why it matters: By running efficiently on laptops and other light infrastructure, this release democratizes access to high-performance AI and solidifies the UAE's position as a leader in open-source AI development.
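The "runs on a laptop" claim is easy to sanity-check with back-of-envelope arithmetic: a model's weight footprint is roughly its parameter count times the bytes per weight. A minimal sketch (weights only; activations and KV cache add overhead on top):

```python
def model_memory_gb(n_params: float, bits_per_weight: int) -> float:
    """Approximate RAM needed to hold model weights alone."""
    return n_params * bits_per_weight / 8 / 1e9

# Falcon 3's largest variant has 10B parameters.
fp16 = model_memory_gb(10e9, 16)  # 16-bit weights: ~20 GB, workstation territory
q4 = model_memory_gb(10e9, 4)     # 4-bit quantized: ~5 GB, fits a typical laptop
```

The smaller 1B–7B variants shrink proportionally, which is why the family targets light infrastructure.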

Green Learning — New Generation Machine Learning and Applications

MBZUAI ·

In a recent talk at MBZUAI, Moncef Gabbouj of Tampere University discussed "Green Learning" and Operational Neural Networks (ONNs) as efficient alternatives to CNNs. ONNs replace the fixed multiply-and-sum of a convolutional neuron with learnable "nodal" and "pool" operators, and "generative neurons" expand each neuron's learning capacity further. Gabbouj presented Self-Organized ONNs (Self-ONNs) and their signal processing applications. Why it matters: Exploring more efficient AI models is crucial for sustainable development of AI in the region, as it addresses computational resource constraints and promotes broader accessibility.
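The generative-neuron idea can be sketched in a few lines: instead of one linear weight per input, the neuron learns one weight per power of the input, approximating an arbitrary nodal operator with a truncated Maclaurin series. This is an illustrative toy (Q = 2, a single neuron), not the Self-ONN implementation from the talk:

```python
import numpy as np

def generative_neuron(x, weights, bias=0.0):
    """Toy generative neuron: y = bias + sum over q of w_q . x**q.

    weights has shape (Q, n_inputs): one weight vector per power
    q = 1..Q. With Q = 1 this reduces to an ordinary linear neuron;
    higher Q lets the neuron learn nonlinear nodal operators.
    """
    Q = weights.shape[0]
    powers = np.stack([x ** (q + 1) for q in range(Q)])  # (Q, n_inputs)
    return bias + float(np.sum(weights * powers))

x = np.array([0.5, -0.2])
w = np.array([[1.0, 1.0],    # linear term (q = 1)
              [0.5, 0.5]])   # quadratic term (q = 2)
y = generative_neuron(x, w)  # 0.3 (linear) + 0.145 (quadratic) = 0.445
```

The extra per-power weights are what "expand neuron learning capacity" relative to a CNN's purely linear neurons.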

Making Autonomous Nano-drones Smarter to Scale New Heights

TII ·

ARRC researchers in collaboration with the University of Bologna and ETH Zürich have developed a CNN-based AI deck to enable autonomous navigation of a 27g nano-drone in unknown environments. The CNN allows the drone to recognize and avoid obstacles using only an onboard camera, running 10x faster and using 10x less memory than previous versions. The demo also featured a swarm of nano-drones flying in formation using ultra-wideband communication. Why it matters: This advancement could significantly enhance the capabilities of nano-drones for applications such as disaster response, where quick and efficient intervention is crucial.

Knowledge distillation and the greening of LLMs

MBZUAI ·

Researchers from MBZUAI, the University of British Columbia, and Monash University have created LaMini-LM, a collection of small language models distilled from ChatGPT. LaMini-LM is trained on a dataset of 2.58M instructions and can be deployed on consumer laptops and mobile devices. The smaller models perform nearly as well as much larger counterparts, and running them locally keeps sensitive data off third-party servers. Why it matters: This work enables the deployment of LLMs in resource-constrained environments and enhances data security by reducing reliance on cloud-based LLMs.
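For readers unfamiliar with distillation: the classic formulation trains a small student to match a large teacher's temperature-softened output distribution. LaMini-LM actually distills at the dataset level, fine-tuning on teacher-generated instruction responses rather than logits, so the sketch below only illustrates the underlying objective:

```python
import math

def softmax(logits, T=1.0):
    """Softmax with temperature T; higher T softens the distribution."""
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """KL divergence between softened teacher and student distributions,
    the classic knowledge-distillation objective (Hinton et al.)."""
    p = softmax(teacher_logits, T)  # soft teacher targets
    q = softmax(student_logits, T)  # student predictions
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

loss_same = distillation_loss([2.0, 0.5], [2.0, 0.5])  # identical outputs: ~0
loss_diff = distillation_loss([0.5, 2.0], [2.0, 0.5])  # mismatched: positive
```

Minimizing this loss (or, as in LaMini-LM, imitating the teacher's generated text) transfers much of the teacher's behavior into a model small enough for a laptop.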

The Autonomous Software Stack of the FRED-003C: The Development That Led to Full-Scale Autonomous Racing

arXiv ·

Researchers from the BME Formula Racing Team present the autonomous software stack of the FRED-003C, which enabled full-scale autonomous racing. The stack was developed for Formula Student Driverless competitions. The paper details the software pipeline, the hardware-software architecture, and the methods used for perception, localization, mapping, planning, and control. Why it matters: The stack underpinned the team's entry into the Abu Dhabi Autonomous Racing League, and openly sharing the system gives other student teams in the region a valuable starting point.
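The pipeline the paper enumerates can be sketched as a chain of stages. Everything below is an illustrative skeleton with placeholder logic; the FRED-003C's actual modules, interfaces, and algorithms are described in the paper itself:

```python
def perception(frame):
    """Detect track-marking cones in a sensor frame (stubbed)."""
    return frame["cones"]

def localization(odometry):
    """Estimate vehicle pose (x, y) from odometry (stubbed)."""
    return odometry["pose"]

def planning(cones):
    """Plan a target point: here, naively, the midpoint of the
    first pair of detected cones."""
    (x1, y1), (x2, y2) = cones[0], cones[1]
    return ((x1 + x2) / 2, (y1 + y2) / 2)

def control(target, pose):
    """Proportional steering toward the target's lateral offset."""
    k = 0.5  # steering gain, chosen arbitrarily for the sketch
    return k * (target[1] - pose[1])

# One tick of the loop: sense, localize, plan, actuate.
frame = {"cones": [(5.0, 1.5), (5.0, -1.5)]}
odometry = {"pose": (0.0, 0.0)}
steer = control(planning(perception(frame)), localization(odometry))
# the cone pair is symmetric about the centreline, so steering is zero
```

Real Formula Student stacks add a mapping stage between localization and planning and run these stages concurrently at different rates, which is the kind of architectural detail the paper documents.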