Song Chaoyang from the Southern University of Science and Technology (SUSTech) presented research on Vision-Based Tactile Sensing (VBTS) for robot learning, combining soft robotic design with learning algorithms to achieve state-of-the-art performance in tactile perception. The team's VBTS solution demonstrates robustness over 1 million test cycles and derives multi-modal outputs from a single vision-based input, enabling applications such as amphibious tactile grasping and industrial welding. The talk also highlighted the DeepClaw system for capturing human demonstration actions, aiming toward a universal interaction interface. Why it matters: This research advances embodied intelligence by improving robot dexterity and adaptability through enhanced tactile sensing, which is crucial for complex manipulation tasks in various sectors such as manufacturing and healthcare within the region.
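Vision-based tactile sensors typically image the deformation of a soft elastomer membrane and derive several tactile quantities from that single camera stream. The following is a minimal sketch of that multi-modal idea; the function name, threshold, and stiffness constant are illustrative assumptions, not details of SUSTech's system:

```python
import numpy as np

def tactile_modalities(ref_img: np.ndarray, touch_img: np.ndarray,
                       threshold: float = 0.1, stiffness: float = 2.5):
    """Derive several tactile modalities from one camera image pair.

    ref_img / touch_img: grayscale images (H, W) in [0, 1] of the sensor's
    elastomer surface before and during contact. `stiffness` is a made-up
    calibration constant (force per unit mean deformation).
    """
    # Per-pixel deformation proxy: brightness change of the membrane.
    deformation = np.abs(touch_img - ref_img)

    # Modality 1: contact geometry as a binary mask.
    contact_mask = deformation > threshold

    # Modality 2: scalar normal-force estimate from total deformation.
    if contact_mask.any():
        force_estimate = stiffness * deformation[contact_mask].sum() / deformation.size
        # Modality 3: contact centroid (useful for slip / pose tracking).
        ys, xs = np.nonzero(contact_mask)
        centroid = (float(ys.mean()), float(xs.mean()))
    else:
        force_estimate, centroid = 0.0, None
    return contact_mask, force_estimate, centroid
```

The point of the sketch is that one optical input yields geometry, force, and contact-location channels at once, which is what makes a single camera behind a soft skin a multi-modal sensor.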
Sami Haddadin from the Technical University of Munich (TUM) discussed a shift in robotics towards machines that autonomously develop their own blueprints and controls. He highlighted advancements driven by human-centered design, soft control, and model-based machine learning, enabling human-robot collaboration in manufacturing and healthcare. Haddadin also presented progress towards autonomous machine design and modular control architectures for complex manipulation tasks. Why it matters: This research has implications for advancing robotics and AI in the GCC region, especially in manufacturing and healthcare, by enabling safer and more efficient human-robot collaboration.
Lorenzo Jamone from Queen Mary University of London presented on cognitive robotics, focusing on tactile exploration and manipulation by robots. The talk covered combining biology, engineering, and AI for advanced robotic systems. Jamone directs the CRISP group and has over 100 publications in cognitive robotics. Why it matters: This highlights the ongoing research into more sophisticated robotic systems that can interact with complex environments, an area crucial for future applications in manufacturing and human-robot collaboration in the GCC.
KAUST Ph.D. candidate Ahmed Alfadhel won the IEEE best research paper award for his work on artificial skin. The artificial skin design uses a flexible magnetic nano-composite cilia surface with a magnetic field sensing element. The device exhibits unprecedented flexibility due to the embedding of magnetic cilia and the sensing element in a polymeric surface. Why it matters: This research enables the development of cheaper, more versatile tactile sensors for health monitoring, robotics, and prosthetics, potentially advancing personalized healthcare and human-machine interfaces in the region.
KAUST researchers Yichen Cai and Jie Shen, led by Dr. Vincent Tung, are developing electronic skin (e-skin) using 2D materials like MXenes. Their research, published in Science Advances, focuses on mimicking human skin functions like sensing and adapting to stimuli. The team leverages the unique properties of 2D materials to create flexible and efficient electronic systems for next-generation electronics. Why it matters: This work advances materials science in the region, potentially enabling breakthroughs in flexible electronics, healthcare monitoring, and robotics.
MBZUAI researchers have developed "Tactile Skills," a new embodied AI framework enabling robots to rapidly learn complex tactile tasks. The framework combines expert process knowledge with reusable tactile control and adaptation components, reducing reliance on extensive datasets. Tested on 28 industrial tasks, the robots achieved nearly 100% success, demonstrating adaptability to changing conditions. Why it matters: This breakthrough offers a practical and scalable approach to robotic automation, potentially transforming robots into adaptable assistants across diverse industries in the GCC.
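The paper's internals are not described here, so the following is only a hypothetical sketch of the general pattern the summary describes: composing reusable tactile primitives into a skill, with task parameters supplied by expert process knowledge rather than learned from large datasets. All class names, primitives, and thresholds are invented for illustration:

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class TactilePrimitive:
    """A reusable tactile control component: succeeds or fails on sensed state."""
    name: str
    execute: Callable[[Dict[str, float]], bool]

@dataclass
class TactileSkill:
    """A skill composed from a sequence of primitives."""
    primitives: List[TactilePrimitive] = field(default_factory=list)

    def run(self, state: Dict[str, float]) -> bool:
        # Execute primitives in order; stop at the first failure so a
        # higher-level adaptation layer could retry with new parameters.
        return all(p.execute(state) for p in self.primitives)

# Toy "insertion" skill: approach until close, then push until the contact
# force exceeds a threshold drawn from expert process knowledge (5 N here
# is an invented value).
approach = TactilePrimitive("approach", lambda s: s["distance"] < 0.01)
push = TactilePrimitive("push", lambda s: s["force"] > 5.0)
insertion = TactileSkill([approach, push])
```

The design choice the summary hints at is that new tasks reuse the same primitive library with different expert-supplied parameters, which is why little task-specific data is needed.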
MBZUAI Professor Sami Haddadin and his team developed a new framework called Tactile Skills to teach robots manual skills through touch and trial-and-error learning. The framework aims to close the gap between robots' ability to learn basic physical tasks and AI's advances in language and image generation. The research, published in Nature Machine Intelligence, focuses on enabling robots to perform manipulation skills at industrial levels with low energy and compute demands. Why it matters: This research could lead to robots capable of performing household maintenance, industrial tasks, and even assisting in medical or rehabilitation settings, potentially solving labor shortages in various sectors in the region and beyond.
Dezhen Song from Texas A&M University presented a talk on Co-Modality Active sensing and Perception (C-MAP) for robotics, covering sensor fusion for autonomous vehicles, augmented reality, and remote environmental monitoring. The talk highlighted lessons learned in sensor fusion, using autonomous motorcycles and NASA's Robonaut as examples, and discussed recent work in robotic remote environment monitoring, particularly subsurface void and pipeline mapping. Why it matters: This research explores sensor fusion techniques to enhance robot perception, which could improve the robustness and capabilities of autonomous systems developed and deployed in the Middle East, particularly in challenging environments.
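A standard building block in sensor-fusion pipelines like those discussed is combining two independent noisy measurements of the same quantity by inverse-variance weighting (the static case of a Kalman update). This toy example illustrates the generic principle, not C-MAP itself:

```python
def fuse_measurements(z1: float, var1: float, z2: float, var2: float):
    """Inverse-variance weighted fusion of two independent measurements
    of the same quantity (e.g. range from lidar and from a camera).
    Returns the fused estimate and its reduced variance.
    """
    # Weight each sensor by how noisy the *other* one is.
    w1 = var2 / (var1 + var2)
    w2 = var1 / (var1 + var2)
    fused = w1 * z1 + w2 * z2
    # Fused variance is always <= min(var1, var2): fusion never hurts
    # when the noise models are correct.
    fused_var = (var1 * var2) / (var1 + var2)
    return fused, fused_var
```

For example, fusing a 10 m lidar reading and a 12 m camera reading of equal variance yields 11 m with half the variance of either sensor alone; the multi-modal fusion work summarized above generalizes this idea to heterogeneous, actively controlled sensors.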