GCC AI Research

Results for "accessibility"

Humanizing Technology with Assistive Augmentations

MBZUAI

This article previews a talk on "Assistive Augmentation," the design of human-computer interfaces that extend human abilities. Suranga Nanayakkara of the National University of Singapore will present the talk, drawing on insights from psychology, human-centered machine learning, and design thinking. Examples include 'AiSee' for blind users, 'Prospero' for memory training, and 'MuSS-Bits', which lets deaf users feel music. Why it matters: Such assistive technologies can significantly improve quality of life for individuals with disabilities and extend human capabilities.

AI-Enabled Technologies for People with Disabilities: Some Key Research and Privacy/Security Challenges

MBZUAI

The article discusses the potential of AI-enabled assistive technologies to empower People with Disabilities (PWD), noting that over one billion people worldwide live with some form of disability. It highlights examples such as communication tools, assistive robots, and smart visual aids, and emphasizes the need to address security and privacy concerns. The author, Ishfaq Ahmad of the University of Texas at Arlington, notes that as the global population grows, more than two billion people will need assistive products by 2030. Why it matters: The piece advocates using AI to tackle critical human rights issues and to improve the lives of a significant share of the global population as disability rates rise.

AI for all: Unlocking an inclusive future with technology

MBZUAI

The Special Olympics Global Center Summit in Abu Dhabi convened 300 advocates to discuss social inclusion for individuals with intellectual disabilities. A panel including MBZUAI's Elizabeth Churchill highlighted AI's role in inclusive technology design, especially in education. Churchill noted that AI can personalize learning through tailored study regimens, emotion detection, and analysis of cognitive patterns. Why it matters: AI-driven personalization has the potential to transform education and accessibility for children of determination and other underrepresented groups in the region.

Making human-machine conversation more lifelike than ever at GITEX

MBZUAI

MBZUAI researchers demonstrated a low-latency, multilingual, multimodal AI system at GITEX that integrates speech, text, and visual capabilities for more lifelike human-machine conversation. The demo, led by Dr. Hisham Cholakkal, includes a mobile app in which users can point their camera at an object, ask questions, and receive spoken answers in multiple languages. The team is also integrating the model into a robot dog that responds to voice commands. Why it matters: This work addresses key challenges in deploying LLMs in real-world applications in the Middle East, such as multilingual support and real-time responsiveness.