GCC AI Research

Results for "sensor fusion"

Co-Modality Active sensing and Perception (C-MAP) in Autonomous Vehicles, Augmented Reality, Remote Environmental Monitoring, and Robotic Grasping

MBZUAI ·

Dezhen Song from Texas A&M University presented a talk on Co-Modality Active sensing and Perception (C-MAP) for robotics, covering sensor fusion for autonomous vehicles, augmented reality, and remote environmental monitoring. The talk highlighted lessons learned in sensor fusion, using autonomous motorcycles and NASA Robonaut as examples. Recent work in robotic remote environment monitoring, especially subsurface void and pipeline mapping, was also discussed. Why it matters: This research explores sensor fusion techniques to enhance robot perception, which could improve the robustness and capabilities of autonomous systems developed and deployed in the Middle East, particularly in challenging environments.

OmniGen: Unified Multimodal Sensor Generation for Autonomous Driving

arXiv ·

The paper introduces OmniGen, a unified framework for generating aligned multimodal sensor data for autonomous driving using a shared Bird's Eye View (BEV) space. It uses a novel generalizable multimodal reconstruction method (UAE) to jointly decode LiDAR and multi-view camera data through volume rendering. The framework incorporates a Diffusion Transformer (DiT) with a ControlNet branch to enable controllable multimodal sensor generation, and the authors report strong generation quality and consistency across modalities.
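The ControlNet branch mentioned above follows a well-known pattern: a trainable copy of a frozen block injects a conditioning signal through a zero-initialized projection, so conditioning is a no-op at the start of training. A toy NumPy sketch of that idea (an illustration of the general ControlNet mechanism, not OmniGen's actual code; all names and dimensions here are assumptions):

```python
# Toy sketch of ControlNet-style conditioning on a single (linear) block.
# This illustrates the general mechanism, not the OmniGen paper's implementation.
import numpy as np

rng = np.random.default_rng(0)
d = 8

W_base = rng.standard_normal((d, d)) / np.sqrt(d)  # frozen base DiT block (stand-in)
W_ctrl = W_base.copy()                             # trainable copy, initialized from base
W_zero = np.zeros((d, d))                          # zero-init projection: the control branch
                                                   # contributes nothing until trained

def block(x, cond=None):
    h = x @ W_base                                 # frozen base path
    if cond is not None:
        # Control branch: mix in the conditioning signal (e.g. a BEV layout),
        # then project through the zero-initialized matrix.
        h = h + ((x + cond) @ W_ctrl) @ W_zero
    return h

x = rng.standard_normal((4, d))                    # 4 latent tokens
bev = rng.standard_normal((4, d))                  # hypothetical BEV-space control signal
# At initialization, conditioning leaves the output unchanged:
unconditioned, conditioned = block(x), block(x, cond=bev)
```

The zero-initialized projection is what makes the branch safe to bolt onto a pretrained model: gradients still flow into `W_ctrl`, but the frozen base behavior is preserved until the branch learns something useful.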

The next generation of sensing platforms

KAUST ·

KAUST held its third annual Sensor Initiative, hosting 70 delegates from KAUST and international institutions like MIT and UCLA. The interdisciplinary meeting focused on transforming sensor technologies and exploring applications. Researchers from KAUST and abroad presented on topics like chemical sensors and sustainable ecosystems. Why it matters: The initiative demonstrates KAUST's commitment to advancing sensor technology and fostering collaboration between local and international experts.

Foundations of Multisensory Artificial Intelligence

MBZUAI ·

Paul Liang from CMU presented on machine learning foundations for multisensory AI, discussing a theoretical framework for modality interactions. The talk covered cross-modal attention and multimodal transformer architectures, and applications in mental health, pathology, and robotics. Liang's research aims to enable AI systems to integrate and learn from diverse real-world sensory modalities. Why it matters: This highlights the growing importance of multimodal AI research and its potential for advancements across various sectors in the region, including healthcare and robotics.
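Cross-modal attention, one of the mechanisms the talk covered, lets tokens from one modality query features of another. A minimal self-contained NumPy sketch (illustrative only, not Liang's code; the modality names, dimensions, and random projections are assumptions):

```python
# Minimal cross-modal attention: queries from modality A (e.g. text) attend
# over keys/values from modality B (e.g. audio). Illustrative sketch only.
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_modal_attention(x_a, x_b, d_k=16, seed=0):
    """x_a: (n_a, d) features of modality A; x_b: (n_b, d) features of modality B.
    Returns (n_a, d_k): modality-A tokens enriched with modality-B context."""
    rng = np.random.default_rng(seed)
    d = x_a.shape[1]
    W_q = rng.standard_normal((d, d_k)) / np.sqrt(d)  # query projection (modality A)
    W_k = rng.standard_normal((d, d_k)) / np.sqrt(d)  # key projection (modality B)
    W_v = rng.standard_normal((d, d_k)) / np.sqrt(d)  # value projection (modality B)
    Q, K, V = x_a @ W_q, x_b @ W_k, x_b @ W_v
    attn = softmax(Q @ K.T / np.sqrt(d_k))            # (n_a, n_b) attention weights
    return attn @ V                                   # B-context, gathered per A-token

# Example: 5 text tokens attending over 8 audio frames, both embedded in 32 dims.
text = np.random.default_rng(1).standard_normal((5, 32))
audio = np.random.default_rng(2).standard_normal((8, 32))
fused = cross_modal_attention(text, audio)
```

Stacking such layers in both directions (A→B and B→A) is the basic building block of the multimodal transformer architectures the talk discussed.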

Imagine a city that thinks about your safety

KAUST ·

KAUST researchers have developed a dual-use wireless sensor system that monitors both traffic congestion and flood incidents in cities. The system combines ultrasonic range finders and infrared thermal sensors to provide real-time, accurate data on traffic flow and roadway flooding. Data is sent to central servers and assimilated with satellite data to form real-time maps and forecasts. Why it matters: This technology can provide up-to-the-minute warnings for flash floods and traffic, enabling rapid emergency response and potentially saving lives in urban environments.
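The dual-use idea above can be sketched in a few lines: a pole-mounted ultrasonic range finder measures the distance to the road surface, so a small rise suggests standing water while a large one suggests a passing vehicle, with the infrared reading as a cross-check. This is a hypothetical illustration of that logic, not the KAUST system's code; every threshold, field name, and the temperature heuristic are assumptions:

```python
# Hypothetical sketch of a dual-use roadside sensor's classification logic.
# All thresholds and field names are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Reading:
    ultrasonic_range_m: float  # distance from pole-mounted sensor down to road/water surface
    ir_temperature_c: float    # surface temperature from the infrared thermal sensor

def classify(reading, dry_road_range_m=5.0, flood_margin_m=0.10,
             vehicle_margin_m=1.0, wet_surface_max_c=33.0):
    """Classify one reading against a calibrated dry-road baseline range."""
    gap = dry_road_range_m - reading.ultrasonic_range_m
    if gap > vehicle_margin_m:
        return "vehicle"  # tall obstruction under the sensor -> traffic event
    if gap > flood_margin_m and reading.ir_temperature_c < wet_surface_max_c:
        return "flood"    # surface raised a few cm and cooler than hot asphalt -> water
    return "clear"

print(classify(Reading(4.85, 31.0)))  # surface ~15 cm above road, cool -> "flood"
print(classify(Reading(3.20, 35.0)))  # large gap -> "vehicle"
print(classify(Reading(5.00, 40.0)))  # baseline range -> "clear"
```

In a deployment like the one described, streams of such classifications would be what gets sent upstream and assimilated with satellite data into city-scale maps and forecasts.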

Sensing the world around us

KAUST ·

KAUST hosted the KAUST Sensor Initiative, convening experts in sensor development, material science, energy, communications, and data analysis. Live demonstrations showcased working prototypes, including a flexible sensor for monitoring the speed of dolphins developed by KAUST Ph.D. student Altynay Kaidarova. The initiative aims to advance a network of smarter, interactive physical IoT devices with embedded intelligent sensor technologies. Why it matters: This initiative highlights KAUST's role in fostering innovation in sensor technology and IoT, crucial for advancing smart infrastructure and environmental monitoring in the region.