GCC AI Research


Results for "embodied AI"

Key Research in Embodied AI

MBZUAI ·

Dr. Hao Dong from Peking University presented research addressing the challenge of limited large-scale training data in embodied AI, particularly for manipulation, task planning, and navigation. The presentation covered learning in simulation and the use of large models. Dr. Dong is a chief scientist of China's National Key Research and Development Program and an area chair/associate editor for NeurIPS, CVPR, AAAI, and ICRA. Why it matters: Overcoming data scarcity is crucial for advancing embodied AI research and enabling more sophisticated robotic applications in the region.

Towards embodied multi-modal visual understanding

MBZUAI ·

Ivan Laptev from INRIA Paris gave a talk at MBZUAI on embodied multi-modal visual understanding, covering advances in video understanding tasks such as question answering and captioning. The talk highlighted recent work on vision-language navigation and manipulation. He argued that detailed understanding of the physical world through vision is still in its early stages, and discussed open research directions in robotics and video generation. Why it matters: The discussion of robotics applications and future research directions in embodied AI could influence the direction of AI research and development in the UAE, particularly at MBZUAI.

Embodied Robot Skills and Good Old Fashioned Engineering

MBZUAI ·

Michael Yu Wang, Chair Professor and Founding Dean of the School of Engineering at Great Bay University, argues for combining "good old-fashioned engineering" (GOFE) with learning-based approaches such as LLMs for robot skill acquisition, particularly in manipulation. He proposes a modular framework that integrates engineering principles with learning, drawing inspiration from human hand-eye coordination and tactile perception. Wang emphasizes the need to address the engineering characteristics of robot tactile sensors, such as spatial and temporal resolution, to achieve human-like robot manipulation skills. Why it matters: This perspective highlights the importance of hybrid approaches combining traditional engineering with modern AI for advancing robotics, especially in complex manipulation tasks relevant to industries in the GCC region.

Vision and insight: Charting the course of embodied AI with Ian Reid

MBZUAI ·

MBZUAI Professor Ian Reid discusses his career in embodied AI, from early work on active vision at Oxford to current research. He highlights three key developments: cameras as geometric sensors, visual SLAM, and advancements in robot navigation. Reid distinguishes embodied AI from systems like ChatGPT, emphasizing its need for understanding and interaction with the physical world. Why it matters: The insights from a leading expert underscore the importance of embodied AI as the next frontier in intelligent systems and robotics in the region.

A next step for embodied agents: Ivan Laptev on world models

MBZUAI ·

MBZUAI Professor Ivan Laptev is working to bridge the gap between data-driven AI systems and embodied agents (robots). He notes challenges in robotics including data scarcity, the need to generate new data through actions, and the requirement for real-time operation. Laptev aims to transfer innovations from computer vision to robotics, addressing these challenges to improve robots' ability to interpret and respond to the complexities of the real world. Why it matters: Overcoming these hurdles is crucial for advancing robotics and enabling robots to effectively interact with and navigate dynamic real-world environments.

How AI is building a whole new you

MBZUAI ·

MBZUAI researchers are working on digital twin technology that can replicate human beings in detail, with real-time data flowing between the physical original and its virtual counterpart. The project aims to extend digital twins from objects to organic entities such as humans, plants, and animals. The technology mines data from cameras, sensors, wearables, and other sources to predict health issues before they arise. Why it matters: This research has the potential to transform healthcare by enabling the prediction and prevention of health issues.

Super-aligned Machine Intelligence via a Soft Touch

MBZUAI ·

Song Chaoyang from the Southern University of Science and Technology (SUSTech) presented research on Vision-Based Tactile Sensing (VBTS) for robot learning, combining soft robotic design with learning algorithms to achieve state-of-the-art performance in tactile perception. Their VBTS solution demonstrates durability across up to 1 million test cycles and produces multi-modal outputs from a single, vision-based input, enabling applications such as amphibious tactile grasping and industrial welding. The talk also highlighted the DeepClaw system for capturing human demonstration actions, with the aim of a universal interaction interface. Why it matters: This research advances embodied intelligence by improving robot dexterity and adaptability through enhanced tactile sensing, which is crucial for complex manipulation tasks in sectors such as manufacturing and healthcare within the region.

New Physical AI Framework Enables Rapid Learning of Complex Skills in Robotics

MBZUAI ·

MBZUAI researchers have developed "Tactile Skills," a new embodied AI framework that enables robots to rapidly learn complex tactile tasks. The framework combines expert process knowledge with reusable tactile control and adaptation components, reducing reliance on extensive datasets. Tested on 28 industrial tasks, the robots achieved a nearly 100% success rate and demonstrated adaptability to changing conditions. Why it matters: This breakthrough offers a practical and scalable approach to robotic automation, potentially transforming robots into adaptable assistants across diverse industries in the GCC.