Gregory Chirikjian presented an overview of research on robot navigation in unstructured environments, drawing on computer vision, sensor technology, machine learning, and motion planning. The methods fuse multi-modal observations from RGB cameras, 3D LiDAR, and robot odometry for scene perception, and use deep reinforcement learning for planning. They have been integrated with wheeled, legged, and home-service robots and tested in crowded indoor scenes, home environments, and dense outdoor terrains. Why it matters: This research pushes the boundaries of robotics in complex environments, paving the way for more versatile and autonomous robots in the Middle East.
ARRC researchers, in collaboration with the University of Bologna and ETH Zürich, have developed a CNN running on an AI-deck module that enables autonomous navigation of a 27 g nano-drone in unknown environments. The CNN lets the drone recognize and avoid obstacles using only an onboard camera, while running 10x faster and using 10x less memory than previous versions. The demo also featured a swarm of nano-drones flying in formation, coordinated via ultra-wideband communication. Why it matters: This advancement could significantly enhance the capabilities of nano-drones for applications such as disaster response, where quick and efficient intervention is crucial.
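The camera-only obstacle-avoidance idea can be illustrated with a toy sketch. This is not the deployed network (whose architecture and weights are not given here), just the conv → ReLU → pool → dense → sigmoid pattern such collision classifiers typically follow, with random stand-in weights:

```python
import numpy as np

def conv2d(img, kernel):
    """Valid 2-D convolution (cross-correlation) of a single-channel image."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool(x, size=2):
    """Non-overlapping max pooling, dropping any remainder rows/columns."""
    h, w = x.shape
    h, w = h - h % size, w - w % size
    return x[:h, :w].reshape(h // size, size, w // size, size).max(axis=(1, 3))

def collision_probability(img, kernel, weights, bias):
    """Conv -> ReLU -> pool -> dense -> sigmoid: one scalar 'obstacle ahead' score."""
    feat = np.maximum(conv2d(img, kernel), 0.0)   # ReLU activation
    feat = max_pool(feat)                         # spatial downsampling
    logit = feat.ravel() @ weights + bias         # fully connected head
    return 1.0 / (1.0 + np.exp(-logit))           # sigmoid to (0, 1)

rng = np.random.default_rng(0)
img = rng.random((16, 16))              # stand-in for a grayscale camera frame
kernel = rng.standard_normal((3, 3))    # 16x16 -> conv -> 14x14 -> pool -> 7x7
weights = rng.standard_normal(7 * 7)
p = collision_probability(img, kernel, weights, bias=0.0)
print(p)  # a probability in (0, 1); the drone would steer away above a threshold
```

On real nano-drone hardware the equivalent network is quantized and runs on a dedicated low-power accelerator, which is where the speed and memory gains come from.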
This paper presents a decentralized multi-agent unmanned aerial system for searching, picking up, and relocating objects. It integrates multi-agent aerial exploration, object detection and tracking, and aerial gripping, and relies on global state estimation, reactive collision avoidance, and sweep planning for exploration. Why it matters: The system's successful deployment in demonstrations and competitions like MBZIRC highlights the potential of integrated robotic solutions for complex tasks such as search and rescue in the region.
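The sweep-planning step can be sketched as a simple boustrophedon (lawnmower) pattern over a rectangular search area. This is an illustrative minimal version, not the paper's planner; the `lane_spacing` parameter is a stand-in for the sensor footprint width:

```python
def sweep_waypoints(width, height, lane_spacing):
    """Generate a boustrophedon sweep over a width x height rectangle.

    Returns (x, y) waypoints that alternate direction on each lane so the
    vehicle covers the area in parallel strips without retracing.
    """
    waypoints = []
    y = 0.0
    left_to_right = True
    while y <= height:
        if left_to_right:
            waypoints += [(0.0, y), (width, y)]   # fly the lane left to right
        else:
            waypoints += [(width, y), (0.0, y)]   # fly the lane right to left
        left_to_right = not left_to_right
        y += lane_spacing                         # step to the next lane
    return waypoints

wps = sweep_waypoints(width=40.0, height=20.0, lane_spacing=10.0)
print(wps)
# -> [(0.0, 0.0), (40.0, 0.0), (40.0, 10.0), (0.0, 10.0), (0.0, 20.0), (40.0, 20.0)]
```

In a decentralized deployment each agent would be assigned a disjoint sub-rectangle, so the agents only need to share state estimates for collision avoidance, not a joint plan.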
The paper presents MonoRace, an onboard drone racing approach using a monocular camera and IMU. The system combines neural-network-based gate segmentation with a drone model for robust state estimation, along with offline optimization using gate geometry. MonoRace won the 2025 Abu Dhabi Autonomous Drone Racing Competition (A2RL), outperforming AI teams and human world champions, reaching speeds up to 100 km/h. Why it matters: This demonstrates a significant advancement in autonomous drone racing, achieving champion-level performance with a resource-efficient monocular system, validated in a real-world competition setting in the UAE.
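The core fusion idea behind combining a model-based prediction with a vision measurement can be shown with a scalar Kalman update. This is a generic textbook sketch, not MonoRace's actual estimator; the distances and variances below are made-up illustrative numbers:

```python
def fuse(pred, pred_var, meas, meas_var):
    """One scalar Kalman update: blend a model prediction with a vision
    measurement, weighting each by its inverse variance."""
    k = pred_var / (pred_var + meas_var)   # Kalman gain in [0, 1]
    est = pred + k * (meas - pred)         # pull prediction toward measurement
    est_var = (1.0 - k) * pred_var         # fused estimate is less uncertain
    return est, est_var

# Hypothetical numbers: the IMU-propagated model says the next gate is 5.0 m
# ahead (variance 1.0), while the segmentation network measures 4.2 m
# (variance 0.25, so it is trusted more).
est, var = fuse(pred=5.0, pred_var=1.0, meas=4.2, meas_var=0.25)
print(est, var)  # -> 4.36 0.2
```

The appeal for racing is that IMU integration alone drifts at high speed, while vision alone is noisy and intermittent; fusing the two keeps the state estimate usable between gate detections.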
This paper presents a fully autonomous micro aerial vehicle (MAV) developed to pop balloons using onboard sensing and computing. The system was evaluated at the Mohamed Bin Zayed International Robotics Challenge (MBZIRC) 2020. The MAV successfully popped all five balloons in under two minutes in each of the three competition runs. Why it matters: This demonstrates the potential of autonomous robotics and computer vision for real-world applications in challenging environments.
Giuseppe Loianno from NYU presented research on creating "Super Autonomous" robots (USARC) that are Unmanned, Small, Agile, Resilient, and Collaborative. The research focuses on learning models, control, and navigation policies for single and collaborative robots operating in challenging environments. The talk highlighted the potential of these robots in logistics, reconnaissance, and other time-sensitive tasks. Why it matters: This points to growing research interest in advanced robotics in the region, especially given the focus on smart cities and automation.
Team TII EuroRacing (TII-ER) developed a fully autonomous software stack for oval racing, enabling speeds above 75 m/s (270 km/h). The stack includes modules for perception, planning, control, vehicle dynamics modeling, simulation, telemetry, and safety. Using it, the team finished second and third in the first two Indy Autonomous Challenge events.
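The control module of such a stack has to convert a planned racing line into steering commands. A common baseline for this is pure pursuit on a bicycle model; the sketch below is a generic illustration with made-up geometry, not TII-ER's published controller:

```python
import math

def pure_pursuit_steering(x, y, heading, target_x, target_y, wheelbase):
    """Pure-pursuit steering angle toward a lookahead point on the racing line."""
    # Express the lookahead point in the vehicle's body frame.
    dx, dy = target_x - x, target_y - y
    local_x = math.cos(-heading) * dx - math.sin(-heading) * dy
    local_y = math.sin(-heading) * dx + math.cos(-heading) * dy
    ld2 = local_x ** 2 + local_y ** 2            # squared lookahead distance
    curvature = 2.0 * local_y / ld2              # arc through vehicle and target
    return math.atan(wheelbase * curvature)      # bicycle-model steering angle

# Hypothetical case: target 20 m ahead and 1 m to the left, 3 m wheelbase.
delta = pure_pursuit_steering(0.0, 0.0, 0.0, 20.0, 1.0, 3.0)
print(delta)  # small positive (leftward) steering angle
```

At 75 m/s the lookahead distance is typically scheduled with speed, since a fixed short lookahead makes the controller oscillate and a long one cuts corners.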