GCC AI Research

Results for "Tool Manipulation"

Team NimbRo at MBZIRC 2017: Autonomous Valve Stem Turning using a Wrench

arXiv

Team NimbRo's robot Mario won MBZIRC 2017 Challenge 2 by autonomously manipulating a valve stem using a wrench. The robot locomotes on an omnidirectional base and detects the manipulation panel in 3D laser scans. A deep neural network detects and selects the correct tool from grayscale images, and adapted motion primitives turn the valve stem. Why it matters: This work demonstrates advanced robotic manipulation capabilities relevant for industrial automation and hazardous environment operations in the region.

Tools of the trade: teaching robots to learn manual skills

MBZUAI

MBZUAI Professor Sami Haddadin and his team developed a framework called Tactile Skills that teaches robots manual skills through touch and trial and error. The framework addresses the gap between AI's advances in language and image generation and robots' limited ability to learn basic physical tasks. The research, published in Nature Machine Intelligence, focuses on enabling robots to perform manipulation skills at industrial levels with low energy and compute demands. Why it matters: This research could lead to robots capable of performing household maintenance, industrial tasks, and even assisting in medical or rehabilitation settings, potentially easing labor shortages in various sectors in the region and beyond.

MATRIX: Multimodal Agent Tuning for Robust Tool-Use Reasoning

arXiv

Researchers introduce MATRIX, a vision-centric agent tuning framework for robust tool-use reasoning in vision-language models (VLMs). The framework includes M-TRACE, a dataset of 28.5K multimodal tasks with 177K verified trajectories, and Pref-X, a set of 11K automatically generated preference pairs. Experiments show MATRIX consistently outperforms open- and closed-source VLMs across three benchmarks.