GCC AI Research


Results for "wingsuit"

Choosing to fly

KAUST

Climber, author, and wingsuit pilot Steph Davis spoke to the KAUST community on January 15 as part of the 2017 Winter Enrichment Program at King Abdullah University of Science and Technology (KAUST); the event was captured in photos by Lilit Hovhannisyan. Why it matters: Such enrichment programs can broaden the horizons of the KAUST community.

MonoRace: Winning Champion-Level Drone Racing with Robust Monocular AI

arXiv

The paper presents MonoRace, an onboard drone racing approach using a monocular camera and IMU. The system combines neural-network-based gate segmentation with a drone model for robust state estimation, along with offline optimization using gate geometry. MonoRace won the 2025 Abu Dhabi Autonomous Drone Racing Competition (A2RL), outperforming AI teams and human world champions, reaching speeds up to 100 km/h. Why it matters: This demonstrates a significant advancement in autonomous drone racing, achieving champion-level performance with a resource-efficient monocular system, validated in a real-world competition setting in the UAE.
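The paper does not reproduce its gate-segmentation post-processing here, but the step from a segmentation mask to usable gate corners can be illustrated with a common heuristic: take the mask pixels that are extreme along the image diagonals. A minimal NumPy sketch, where the mask size, gate shape, and function name are invented for illustration and stand in for the learned pipeline:

```python
import numpy as np

def gate_corners_from_mask(mask):
    """Approximate the four corners of a gate from a binary segmentation
    mask via the extreme-point heuristic: the pixels maximizing/minimizing
    x+y and x-y are taken as the corners. A simplified stand-in for a
    learned segmentation post-processing step, not MonoRace's actual code."""
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None                       # no gate pixels detected
    s = xs + ys                           # small at top-left, large at bottom-right
    d = xs - ys                           # large at top-right, small at bottom-left
    top_left     = (xs[np.argmin(s)], ys[np.argmin(s)])
    bottom_right = (xs[np.argmax(s)], ys[np.argmax(s)])
    top_right    = (xs[np.argmax(d)], ys[np.argmax(d)])
    bottom_left  = (xs[np.argmin(d)], ys[np.argmin(d)])
    return top_left, top_right, bottom_right, bottom_left

# Example: a 100x100 mask with a filled rectangular gate, x in [20, 70], y in [30, 80].
mask = np.zeros((100, 100), dtype=bool)
mask[30:81, 20:71] = True                 # rows index y, columns index x
corners = gate_corners_from_mask(mask)
```

Real gates are viewed at an angle and partially occluded, which is why the paper pairs segmentation with a drone model and offline optimization over known gate geometry rather than relying on raw pixel extremes.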

Race Against the Machine: a Fully-annotated, Open-design Dataset of Autonomous and Piloted High-speed Flight

arXiv

Researchers at the Technology Innovation Institute (TII) have released a fully-annotated dataset for autonomous drone racing, called "Race Against the Machine." The dataset includes high-resolution visual, inertial, and motion capture data from both autonomous and piloted flights, along with commands, control inputs, and corner-level labeling of drone racing gates. The specifications to recreate their flight platform using commercial off-the-shelf components and the Betaflight controller are also released. Why it matters: This comprehensive resource aims to support the development of new methods and establish quantitative comparisons for approaches in robotics and AI, democratizing drone racing research.

Robust Tightly-Coupled Filter-Based Monocular Visual-Inertial State Estimation and Graph-Based Evaluation for Autonomous Drone Racing

arXiv

This paper introduces ADR-VINS, a monocular visual-inertial state estimation framework based on an Error-State Kalman Filter (ESKF) designed for autonomous drone racing, integrating direct pixel reprojection errors from gate corners as innovation terms. It also introduces ADR-FGO, an offline Factor-Graph Optimization framework for generating high-fidelity reference trajectories for post-flight evaluation in GNSS-denied environments. Validated on the TII-RATM dataset, ADR-VINS achieved an average RMS translation error of 0.134 m and was successfully deployed in the A2RL Drone Championship Season 2. Why it matters: The framework provides a robust and efficient solution for drone state estimation in challenging racing environments, and enables performance evaluation without relying on external localization systems.
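To make "pixel reprojection errors as innovation terms" concrete, here is a heavily simplified sketch, not the paper's ESKF: the state is reduced to a 3-D camera position with identity orientation, the camera intrinsics are invented, and the measurement Jacobian is computed numerically. One observed gate corner then corrects the position estimate via a standard EKF update:

```python
import numpy as np

FX = FY = 400.0          # hypothetical focal lengths (pixels)
CX = CY = 320.0          # hypothetical principal point

def project(corner_w, cam_pos):
    """Ideal pinhole projection of a known gate corner (world frame),
    assuming identity camera orientation for simplicity."""
    p = corner_w - cam_pos                     # point in camera frame
    return np.array([FX * p[0] / p[2] + CX,
                     FY * p[1] / p[2] + CY])

def ekf_pixel_update(x, P, corner_w, z_px, R):
    """EKF update using the pixel reprojection error of one gate corner
    as the innovation; the Jacobian is obtained by finite differences."""
    z_hat = project(corner_w, x)
    innov = z_px - z_hat                       # 2-vector innovation
    H = np.zeros((2, 3))
    eps = 1e-6
    for j in range(3):                         # numerical Jacobian dz/dx
        dx = np.zeros(3); dx[j] = eps
        H[:, j] = (project(corner_w, x + dx) - z_hat) / eps
    S = H @ P @ H.T + R                        # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)             # Kalman gain
    return x + K @ innov, (np.eye(3) - K @ H) @ P

# Toy run: the prior position is offset from the truth; a single
# noise-free corner observation pulls the estimate toward the truth.
true_pos = np.array([0.0, 0.0, 0.0])
corner_w = np.array([1.0, 0.5, 5.0])           # gate corner 5 m ahead
z_px = project(corner_w, true_pos)
x0 = true_pos + np.array([0.3, -0.2, 0.1])
x1, P1 = ekf_pixel_update(x0, np.eye(3) * 0.25, corner_w, z_px, np.eye(2))
```

The real filter carries orientation, velocity, and IMU biases in an error-state formulation and uses analytic Jacobians, but the structure of the update is the same: the gap between predicted and detected corner pixels drives the state correction.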

Drift-Corrected Monocular VIO and Perception-Aware Planning for Autonomous Drone Racing

arXiv

This paper details the autonomous drone racing system developed for the Abu Dhabi Autonomous Racing League (A2RL) x Drone Champions League competition. The system fuses drift-corrected monocular Visual-Inertial Odometry (VIO) with YOLO-based gate detections, which supply global position measurements, in a Kalman filter. A perception-aware planner generates trajectories that balance speed against gate visibility. Why it matters: The system's podium finishes validate the effectiveness of monocular vision-based autonomous drone flight and showcase advancements in AI-powered robotics within the UAE.
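The drift-correction idea can be illustrated with a toy 1-D Kalman filter, using invented noise figures rather than the paper's tuning: raw VIO displacement integrates a small bias and drifts without bound, while periodic gate-based absolute position fixes pull the fused estimate back toward the truth:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy parameters (not from the paper): constant 1 m/s motion,
# biased VIO increments, and an absolute gate fix every 20 steps.
DT, STEPS = 0.05, 200
VIO_BIAS, VIO_STD, GATE_STD = 0.02, 0.01, 0.05
Q = 1e-3                        # per-step growth of position variance
R = GATE_STD ** 2               # gate-measurement noise variance

x_est, p_var = 0.0, 0.0         # fused position estimate and variance
true_pos, vio_pos = 0.0, 0.0

for k in range(STEPS):
    true_pos += 1.0 * DT
    # VIO increment: true displacement plus bias (drift) plus noise.
    d_vio = 1.0 * DT + VIO_BIAS * DT + rng.normal(0.0, VIO_STD * DT)
    vio_pos += d_vio            # raw VIO integrates the bias forever
    x_est += d_vio              # predict with the VIO displacement
    p_var += Q
    if (k + 1) % 20 == 0:       # gate detection: absolute position fix
        z = true_pos + rng.normal(0.0, GATE_STD)
        K = p_var / (p_var + R)           # scalar Kalman gain
        x_est += K * (z - x_est)
        p_var *= 1.0 - K

drift_raw = abs(vio_pos - true_pos)       # grows linearly with time
drift_fused = abs(x_est - true_pos)       # bounded by the gate fixes
```

The competition system does this in 3-D with YOLO detections of known gates providing the absolute fixes, but the fusion principle is the same: odometry drives the prediction, detections bound the drift.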