The Technology Innovation Institute (TII) has launched Falcon Perception, a new 600-million-parameter multimodal AI model. The model offers competitive performance in object segmentation, dense visual understanding, and document intelligence, rivalling larger systems such as Meta’s SAM3 and Alibaba’s Qwen while being significantly more efficient. Falcon Perception unifies image and language processing in a single architecture designed for real-world deployment in compute-constrained environments. Why it matters: This development positions the UAE among leading nations in advanced multimodal AI, which is crucial for applications in robotics, advanced manufacturing, and autonomous platforms.
FalconViz, a KAUST-based startup co-founded by alumnus Luca Passone, specializes in 3D surveying and mapping using unmanned aerial systems. Established in 2015, the company offers services such as topographical surveys, mining assessments, and flood modeling to clients in Saudi Arabia and beyond. KAUST provided FalconViz with early funding, training, and ongoing support. Why it matters: The success of FalconViz highlights KAUST's role in fostering technological innovation and entrepreneurship in Saudi Arabia, contributing to the Kingdom's growing technology sector.
FalconViz is a startup originating from KAUST that specializes in scanning and documenting the world. The company aims to provide new methods for documenting environments, benefitting cities and countries. Co-founder Neil Smith highlights the company's success as a demonstration of KAUST's original vision for startups. Why it matters: FalconViz represents a successful commercial venture emerging from Saudi Arabia's KAUST, showcasing the potential for technology and innovation within the Kingdom.
MBZUAI Professor Fahad Khan is working on a unified theory of machine visual intelligence. His goal is to enable AI systems to better understand and function in complex, chaotic visual environments. The aim is to improve real-world applications like smart cities, personalized healthcare, and autonomous vehicles. Why it matters: This research could significantly advance AI's ability to perceive and interact with the real world, especially in challenging environments common in the developing world.
Dezhen Song from Texas A&M University presented a talk on Co-Modality Active sensing and Perception (C-MAP) for robotics, covering sensor fusion for autonomous vehicles, augmented reality, and remote environmental monitoring. The talk highlighted lessons learned in sensor fusion, using autonomous motorcycles and NASA's Robonaut as examples. Recent work in robotic remote environmental monitoring, with a focus on subsurface void and pipeline mapping, was also discussed. Why it matters: This research explores sensor fusion techniques to enhance robot perception, which could improve the robustness and capabilities of autonomous systems developed and deployed in the Middle East, particularly in challenging environments.
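For readers unfamiliar with sensor fusion, the core idea can be illustrated with a minimal one-dimensional Kalman-style update that combines two noisy sensors. This sketch is purely illustrative and has no connection to C-MAP or the methods presented in the talk; all sensor names and values are hypothetical.

```python
# Minimal sketch: fusing two noisy range measurements of the same quantity.
# Values and sensor names (lidar, sonar) are invented for illustration only.

def fuse(est, var, meas, meas_var):
    """Fuse a current estimate (est, var) with a new measurement (meas, meas_var)."""
    k = var / (var + meas_var)      # Kalman gain: weight given to the new measurement
    fused = est + k * (meas - est)  # pull the estimate toward the measurement
    fused_var = (1 - k) * var       # fused uncertainty is lower than either input's
    return fused, fused_var

# A precise "lidar" reading (10.4 m, variance 0.04) fused with a
# noisy "sonar" reading (9.0 m, variance 1.0):
fused, fused_var = fuse(10.4, 0.04, 9.0, 1.0)
print(fused, fused_var)  # fused estimate sits close to the more reliable sensor
```

The key property shown here is that the fused variance is strictly smaller than that of either sensor alone, which is why combining modalities improves robustness even when one sensor is much noisier than the other.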