GCC AI Research

Results for "tactile skills"

New Physical AI Framework Enables Rapid Learning of Complex Skills in Robotics

MBZUAI

MBZUAI researchers have developed "Tactile Skills," a new embodied AI framework that enables robots to rapidly learn complex tactile tasks. The framework combines expert process knowledge with reusable tactile control and adaptation components, reducing reliance on extensive training datasets. In tests across 28 industrial tasks, the robots achieved nearly 100% success rates and adapted to changing conditions. Why it matters: This breakthrough offers a practical, scalable approach to robotic automation, potentially transforming robots into adaptable assistants across diverse industries in the GCC.
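The modular design described above — reusable tactile-control primitives sequenced according to expert process knowledge — can be illustrated with a short sketch. Everything here (class names, skill names, parameters) is hypothetical and not taken from the paper; it only shows how a small library of primitives might be composed into many tasks instead of training each task from scratch.

```python
# Hypothetical sketch of composing reusable tactile skill primitives.
# Skill names and parameters are illustrative, not from the Tactile Skills paper.

from dataclasses import dataclass, field
from typing import Callable, Dict, List


@dataclass
class TactileSkill:
    """A reusable control primitive with tunable adaptation parameters."""
    name: str
    execute: Callable[[dict], bool]  # returns True on success
    params: dict = field(default_factory=dict)


class SkillLibrary:
    """Registry of primitives shared across many tasks."""

    def __init__(self) -> None:
        self._skills: Dict[str, TactileSkill] = {}

    def register(self, skill: TactileSkill) -> None:
        self._skills[skill.name] = skill

    def plan(self, process_steps: List[str]) -> List[TactileSkill]:
        """Map an expert process description onto stored primitives."""
        return [self._skills[step] for step in process_steps]


def run_task(library: SkillLibrary, process_steps: List[str]) -> bool:
    """Execute each primitive in order; fail if any step fails."""
    return all(skill.execute(skill.params) for skill in library.plan(process_steps))


# Toy usage: a two-step insertion task built from reusable primitives.
lib = SkillLibrary()
lib.register(TactileSkill("approach", lambda p: True, {"speed": 0.05}))
lib.register(TactileSkill("insert", lambda p: p["max_force"] > 0, {"max_force": 5.0}))
print(run_task(lib, ["approach", "insert"]))  # True
```

The point of the sketch is the reuse: a new task needs only a new step sequence, not new data-hungry training for every primitive.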

Tools of the trade: teaching robots to learn manual skills

MBZUAI

MBZUAI Professor Sami Haddadin and his team developed a new framework called Tactile Skills that teaches robots manual skills through touch and trial and error. The framework aims to close the gap between AI's advances in language and image generation and robots' limited ability to learn basic physical tasks. The research, published in Nature Machine Intelligence, focuses on enabling robots to perform manipulation skills at industrial quality with low energy and compute demands. Why it matters: This research could lead to robots capable of performing household maintenance, industrial tasks, and even assisting in medical or rehabilitation settings, potentially easing labor shortages across sectors in the region and beyond.

Tactile robots: building the machine and learning the self

MBZUAI

Sami Haddadin from the Technical University of Munich (TUM) discusses a shift in robotics towards machines that autonomously develop their own blueprints and controls. He highlights advancements driven by human-centered design, soft control, and model-based machine learning, enabling human-robot collaboration in manufacturing and healthcare. Haddadin also presents progress towards autonomous machine design and modular control architectures for complex manipulation tasks. Why it matters: This research has implications for advancing robotics and AI in the GCC region, especially in manufacturing and healthcare, by enabling safer and more efficient human-robot collaboration.

The intelligence of the hand

MBZUAI

Lorenzo Jamone from Queen Mary University of London gave a talk on cognitive robotics, focusing on tactile exploration and manipulation by robots. The talk covered combining biology, engineering, and AI to build advanced robotic systems. Jamone directs the CRISP group and has over 100 publications in cognitive robotics. Why it matters: This highlights the ongoing research into more sophisticated robotic systems that can interact with complex environments, an area crucial for future applications in manufacturing and human-robot collaboration in the GCC.

Super-aligned Machine Intelligence via a Soft Touch

MBZUAI

Song Chaoyang from the Southern University of Science and Technology (SUSTech) presented research on Vision-Based Tactile Sensing (VBTS) for robot learning, combining soft robotic design with learning algorithms to achieve state-of-the-art performance in tactile perception. Their VBTS solution remains robust over 1 million test cycles and derives multi-modal outputs from a single vision-based input, enabling applications such as amphibious tactile grasping and industrial welding. The talk also highlighted the DeepClaw system for capturing human demonstration actions, aiming toward a universal interaction interface. Why it matters: This research advances embodied intelligence by improving robot dexterity and adaptability through enhanced tactile sensing, which is crucial for complex manipulation tasks in sectors such as manufacturing and healthcare within the region.
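The core VBTS idea — several output modalities derived from one tactile "image" — can be sketched in a few lines. The arithmetic below is a deliberately toy stand-in (peak pixel for contact location, summed intensity as a force proxy) and is not SUSTech's actual processing pipeline:

```python
# Toy illustration of the VBTS idea: multiple readouts from a single
# tactile intensity map. Not the actual SUSTech algorithm.

from typing import List, Tuple


def tactile_readout(image: List[List[float]]) -> Tuple[Tuple[int, int], float]:
    """Return (contact_location, force_proxy) from one deformation-intensity map."""
    # Contact location: the pixel of peak deformation intensity.
    loc = max(
        ((r, c) for r in range(len(image)) for c in range(len(image[0]))),
        key=lambda rc: image[rc[0]][rc[1]],
    )
    # Force proxy: total intensity over the whole sensing pad.
    force = sum(sum(row) for row in image)
    return loc, force


# Toy 3x3 deformation map with a press near the centre.
img = [[0.0, 0.1, 0.0],
       [0.1, 0.9, 0.1],
       [0.0, 0.1, 0.0]]
print(tactile_readout(img))  # location (1, 1); force proxy sums to ~1.3
```

In a real VBTS system the "image" is a camera view of a deforming elastomer and the readouts are learned rather than hand-coded, but the one-input/many-outputs structure is the same.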

Embodied Robot Skills and Good Old Fashioned Engineering

MBZUAI

Michael Yu Wang, Chair Professor and Founding Dean of the School of Engineering at Great Bay University, argues for combining "good old-fashioned engineering" (GOFE) with learning-based approaches such as LLMs for robot skill acquisition, particularly in manipulation. He proposes a modular framework that integrates engineering principles with learning, drawing inspiration from human hand-eye coordination and tactile perception. Wang emphasizes the need to address the engineering characteristics of robot tactile sensors, such as spatial and temporal resolution, to achieve human-like robot manipulation skills. Why it matters: This perspective highlights the importance of hybrid approaches combining traditional engineering with modern AI for advancing robotics, especially in complex manipulation tasks relevant to industries in the GCC region.

An artificial skin that can feel

KAUST

KAUST Ph.D. candidate Ahmed Alfadhel won the IEEE best research paper award for his work on artificial skin. The artificial skin design uses a flexible magnetic nano-composite cilia surface with a magnetic field sensing element. The device exhibits unprecedented flexibility due to the embedding of magnetic cilia and the sensing element in a polymeric surface. Why it matters: This research enables the development of cheaper, more versatile tactile sensors for health monitoring, robotics, and prosthetics, potentially advancing personalized healthcare and human-machine interfaces in the region.

Sensing the future

KAUST

KAUST researchers Yichen Cai and Jie Shen, led by Dr. Vincent Tung, are developing electronic skin (e-skin) using 2D materials like MXenes. Their research, published in Science Advances, focuses on mimicking human skin functions like sensing and adapting to stimuli. The team leverages the unique properties of 2D materials to create flexible and efficient electronic systems for next-generation electronics. Why it matters: This work advances materials science in the region, potentially enabling breakthroughs in flexible electronics, healthcare monitoring, and robotics.