GCC AI Research

Results for "AR/VR"

Immersive Analytics: Visualising Data in the Space Around Us

MBZUAI ·

The article discusses immersive analytics, which uses VR and AR to visualise data in 3D and embed it in the user's environment, and reviews systems and techniques from the Data Visualisation and Immersive Analytics Lab at Monash University, directed by Professor Tim Dwyer. It explores the concept of "embodied sensemaking" and its potential to improve how people work with complex data. Why it matters: Immersive analytics could significantly enhance data comprehension and decision-making across various sectors in the Middle East, where large-scale projects and smart city initiatives generate vast datasets.

Extended Reality on-the-move

MBZUAI ·

This article discusses the evolution of mobile extended reality (MEX) and its potential to revolutionize urban interaction. It highlights the convergence of augmented and virtual reality technologies for mobile usage. A novel approach to 3D models, characterized as urban situated models or “3D-plus-time” (4D.City), is introduced. Why it matters: The development of MEX and 4D.City could significantly enhance user experience and analog-digital convergence in urban environments, offering new possibilities for human-computer interaction.

Integrating Virtual Reality and Robotics: Enhancing Human and Robot Experiences in Assistive Technologies

MBZUAI ·

Tetsunari Inamura's talk explores using VR to collect human-robot interaction (HRI) data and to tailor assistive robotic functionalities to individual users. He discusses symbol emergence through multimodal interaction, interactive behavior generation via symbol manipulation, and VR-based data collection. The talk emphasizes long-term enhancement of human capabilities and avoiding over-reliance on technology. Why it matters: This research promotes independence and growth in human-robot interaction, potentially transforming assistive technologies in the region.

Reconstruction and Animation of Realistic Head Avatars

MBZUAI ·

Egor Zakharov from the AIT Lab at ETH Zurich will present research on creating controllable, detailed 3D head avatars from data captured on consumer-grade devices. The presentation will cover high-fidelity image-based facial reconstruction and animation, as well as video-based reconstruction of fine structures such as hairstyles. He will also showcase the integration of human-centric assets into virtual environments for real-time telepresence and entertainment. Why it matters: This research contributes to advancements in digital human modeling and telepresence, with applications in communication and gaming within the region.

Alumni Focus: Ronell Sicat

KAUST ·

KAUST alumnus Ronell Sicat (M.S. '10, Ph.D. '15) is developing immersive data visualization tools using augmented and virtual reality (AR/VR). After a postdoc at Harvard, Sicat returned to KAUST as a research scientist in the Visual Computing Center. Sicat developed a tool called DXR to help researchers prototype immersive visualizations. Why it matters: This highlights KAUST's role in fostering talent and innovation in AR/VR, a growing area with applications across various sectors in Saudi Arabia.

High-quality Neural Reconstruction in Real-world Scenes

MBZUAI ·

A researcher at the University of Oxford presented new findings on 3D neural reconstruction. The talk introduced a dataset of real-world video captures paired with accurate ground-truth 3D models, along with a novel joint optimization method that refines camera poses during the reconstruction process. Why it matters: High-quality 3D reconstruction has broad applicability to robotics and computer vision applications in the region.

KAUST students take game-changing AR tool to medical market

KAUST ·

KAUST students Daniya Boges and Dr. Corrado Calì developed an AR tool for medical applications, leading to the startup IntraVides. The project was supported by KAUST's Smart Health Initiative, which provided access to AR/VR facilities and seed funding through the KAUST Innovation Fund. The KAUST Entrepreneurship Center also helped incubate the idea from concept to business. Why it matters: This highlights KAUST's role in fostering innovation and entrepreneurship in healthcare through strategic investments in advanced technology and dedicated support programs.

Co-Modality Active sensing and Perception (C-MAP) in Autonomous Vehicles, Augmented Reality, Remote Environmental Monitoring, and Robotic Grasping

MBZUAI ·

Dezhen Song from Texas A&M University presented a talk on Co-Modality Active sensing and Perception (C-MAP) for robotics, covering sensor fusion for autonomous vehicles, augmented reality, and remote environmental monitoring. The talk highlighted lessons learned in sensor fusion, using autonomous motorcycles and NASA's Robonaut as examples. Recent work in robotic remote environment monitoring, especially subsurface void and pipeline mapping, was also discussed. Why it matters: This research explores sensor fusion techniques that enhance robot perception, which could improve the robustness and capabilities of autonomous systems developed and deployed in the Middle East, particularly in challenging environments.