This article discusses the evolution of mobile extended reality (MEX) and its potential to revolutionize urban interaction, highlighting the convergence of augmented and virtual reality technologies for mobile use. It introduces a novel approach to urban situated 3D models, described as "3D-plus-time" (4D.City). Why it matters: The development of MEX and 4D.City could significantly enhance user experience and analog-digital convergence in urban environments, offering new possibilities for human-computer interaction.
The article discusses immersive analytics, which uses VR and AR to visualize data in 3D and embed it in the user's environment, and reviews systems and techniques from the Data Visualisation and Immersive Analytics Lab at Monash University, directed by Professor Tim Dwyer. It explores the concept of "embodied sensemaking" and its potential to improve how people work with complex data. Why it matters: Immersive analytics could significantly enhance data comprehension and decision-making across various sectors in the Middle East, where large-scale projects and smart city initiatives generate vast datasets.
MBZUAI's Dr. Hao Li is working on using AI and 3D telepresence to transform communication, work, and education by replacing physical transportation with virtual teleportation. His research sits at the intersection of computer graphics, computer vision, and AI, focusing on virtual avatar creation and facial performance capture. Li aims to use AI to enable forms of communication that are not possible in person. Why it matters: This research has the potential to reduce carbon footprints by enabling remote work and virtual collaboration, while also positioning MBZUAI and the UAE as leaders in AI-driven metaverse technologies.
Tetsunari Inamura's talk explores using VR to collect HRI data and tailor assistive robotic functionalities to individual users. He discusses symbol emergence via multimodal interaction, interactive behavior generation through symbol manipulation, and VR for data collection. The talk emphasizes long-term human capability enhancement and avoiding over-reliance on technology. Why it matters: This research promotes independence and growth in human-robot interactions, potentially revolutionizing assistive technologies in the region.
MBZUAI's Metaverse Center is developing technologies for realistic avatar generation. Hao Li and colleagues presented a novel approach at CVPR 2024, collaborating with ETH Zurich, VinAI Research, and Pinscreen. The technology addresses the challenge of mapping 2D images to 3D avatars, accounting for poses, expressions, and views. Why it matters: Creating realistic and efficient avatar generation could improve user experience and accessibility in virtual environments across the Middle East.
Egor Zakharov from the ETH Zurich AIT lab will present research on creating controllable and detailed 3D head avatars using data from consumer-grade devices. The presentation will cover high-fidelity image-based facial reconstruction and animation, as well as video-based reconstruction of detailed structures such as hairstyles. He will showcase the integration of human-centric assets into virtual environments for real-time telepresence and entertainment. Why it matters: This research contributes to advancements in digital human modeling and telepresence, with applications in communication and gaming within the region.
MBZUAI's Metaverse Lab is developing AI algorithms for photorealistic virtual humans and dynamic environments. Hao Li, Director of the lab, envisions using the metaverse for immersive learning experiences related to history and culture. He is also working on tools to prevent deepfakes and other cyberthreats. Why it matters: This research at MBZUAI aims to advance AI and immersive technologies for education and address potential risks in the metaverse.