Eyal Ofek of Microsoft Research investigates how to augment users' senses and use scene understanding to create more inclusive workspaces, especially for remote work. His work involves designing applications that adapt to changing environments and are personalized to each user. Ofek's background spans computer vision, augmented reality, and leading research groups at Microsoft. Why it matters: This research aims to improve remote collaboration and adapt technology to individual user needs, which could enhance productivity and inclusivity in the evolving work landscape of the GCC region.
This article discusses a talk on "Assistive Augmentation," the design of human-computer interfaces that augment human abilities. Examples include "AiSee" for blind users, "Prospero" for memory training, and "MuSS-Bits," which lets deaf users feel music. Suranga Nanayakkara of the National University of Singapore will present the talk, drawing on insights from psychology, human-centered machine learning, and design thinking. Why it matters: Such assistive technologies can significantly improve the quality of life for individuals with disabilities and extend human capabilities.
A long-standing vision in computer science is that computing devices become proactive assistants, enhancing many aspects of life through user digitization. Today's devices capture only coarse digital representations of their users, leaving significant room for improvement. Karan, a Ph.D. candidate at CMU, develops technologies that let consumer devices capture richer user representations without sacrificing practicality. Why it matters: Advancements in user digitization can lead to improved extended reality experiences, health tracking, and more productive work environments, enhancing the utility of consumer devices.
This article discusses the evolution of mobile extended reality (MEX) and its potential to revolutionize urban interaction, highlighting the convergence of augmented and virtual reality technologies for mobile use. It introduces a novel approach to 3D models, characterized as urban situated models, or "3D-plus-time" (4D.City). Why it matters: The development of MEX and 4D.City could significantly enhance user experience and the convergence of physical and digital worlds in urban environments, offering new possibilities for human-computer interaction.
The article discusses immersive analytics, which uses VR and AR to visualize data in 3D and embed it in the user's environment, reviewing systems and techniques from the Data Visualisation and Immersive Analytics Lab at Monash University, directed by Professor Tim Dwyer. It explores the concept of "embodied sensemaking" and its potential to improve how people work with complex data. Why it matters: Immersive analytics could significantly enhance data comprehension and decision-making across various sectors in the Middle East, where large-scale projects and smart city initiatives generate vast datasets.