GCC AI Research

Results for "rPPG"

Towards Practical Remote Photoplethysmography Detector

MBZUAI ·

Pong C Yuen from Hong Kong Baptist University will present a talk on remote photoplethysmography (rPPG) detection. The talk will review the development of rPPG detection, share recent research, and discuss future directions. rPPG is a non-contact technique that recovers physiological signals, such as heart rate, from ordinary video, with applications in computer vision and healthcare. Why it matters: Advances in rPPG could enable new remote patient monitoring and diagnostic tools in the region, reducing the need for physical contact.
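
The core idea is easy to sketch: subtle, pulse-synchronous color fluctuations in facial video carry the heart rate. The minimal example below follows the classic green-channel approach rather than any method from the talk; `frames` (a stack of RGB face crops) and `fps` are assumed inputs.

```python
# Minimal rPPG heart-rate sketch (classic green-channel approach,
# not the speaker's method). `frames` is an iterable of RGB face
# crops sampled at `fps` frames per second.
import numpy as np
from scipy.signal import butter, filtfilt

def estimate_heart_rate(frames, fps):
    """Estimate pulse rate (BPM) from a stack of RGB face crops."""
    # 1. Spatially average the green channel into a 1-D time signal.
    signal = np.array([f[..., 1].mean() for f in frames], dtype=np.float64)
    signal -= signal.mean()

    # 2. Band-pass to the plausible pulse band (0.7-4.0 Hz = 42-240 BPM).
    b, a = butter(3, [0.7, 4.0], btype="band", fs=fps)
    filtered = filtfilt(b, a, signal)

    # 3. Take the dominant spectral frequency as the pulse rate.
    spectrum = np.abs(np.fft.rfft(filtered))
    freqs = np.fft.rfftfreq(len(filtered), d=1.0 / fps)
    return freqs[np.argmax(spectrum)] * 60.0  # Hz -> beats per minute
```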

RP-SAM2: Refining Point Prompts for Stable Surgical Instrument Segmentation

arXiv ·

Researchers from MBZUAI introduced RP-SAM2, a method that improves surgical instrument segmentation by refining point prompts for more stable results. RP-SAM2 uses a novel shift block and a compound loss function to reduce sensitivity to point prompt placement, which is especially valuable in data-constrained settings. Experiments on the Cataract1k and CaDIS datasets show that RP-SAM2 improves segmentation accuracy and reduces variance compared with SAM2, with code available on GitHub.
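
The instability RP-SAM2 targets can be quantified directly: jitter the click location by a few pixels and measure how consistent the predicted masks remain. A minimal sketch, assuming a generic `predictor(image, point)` callable as a stand-in for a SAM2-style interface (this is not RP-SAM2's actual API):

```python
# Sketch of measuring point-prompt sensitivity: jitter the click
# location and report how consistent the resulting masks are.
# `predictor` is a hypothetical stand-in for any point-promptable
# segmenter returning a boolean mask.
import numpy as np

def mask_iou(a, b):
    inter = np.logical_and(a, b).sum()
    union = np.logical_or(a, b).sum()
    return inter / union if union else 1.0

def prompt_stability(predictor, image, point, jitter_px=5, trials=20, seed=0):
    """Mean pairwise IoU of masks produced from jittered copies of `point`."""
    rng = np.random.default_rng(seed)
    masks = []
    for _ in range(trials):
        dx, dy = rng.integers(-jitter_px, jitter_px + 1, size=2)
        masks.append(predictor(image, (point[0] + dx, point[1] + dy)))
    ious = [mask_iou(masks[i], masks[j])
            for i in range(len(masks)) for j in range(i + 1, len(masks))]
    return float(np.mean(ious))  # 1.0 = perfectly stable; lower = prompt-sensitive
```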

Metaverse healthcare in red, green, and blue

MBZUAI ·

Researchers at MBZUAI developed a method to measure vital signs using webcams by analyzing color intensity changes caused by facial blood flow. They built a digital twin system that uses machine learning to combine heart rate, respiratory rate, and blood oxygen level measurements. The system displays real-time vital sign information, enabling remote patient triage. Why it matters: This research contributes to the advancement of telemedicine, potentially improving healthcare access in underserved regions and aligning with UN Sustainable Development Goal 3 (good health and well-being).
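
The blood-oxygen component of camera-based systems typically rests on the pulse-oximetry "ratio of ratios" idea: compare the pulsatile (AC) to the steady (DC) intensity in two color channels and map that ratio to SpO2 through a calibration curve. A rough sketch with placeholder calibration constants; this is not the MBZUAI system's published model:

```python
# Textbook "ratio of ratios" SpO2 sketch from two camera color channels.
# A and B are placeholder calibration constants; real devices fit them
# against a reference oximeter.
import numpy as np

def spo2_ratio_of_ratios(red, blue, A=100.0, B=5.0):
    """Estimate SpO2 (%) from red/blue intensity traces of a face ROI."""
    def ac_dc(channel):
        dc = np.mean(channel)   # steady baseline absorption
        ac = np.std(channel)    # pulsatile component
        return ac / dc
    R = ac_dc(np.asarray(red, float)) / ac_dc(np.asarray(blue, float))
    return A - B * R            # linear calibration curve
```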

Co-Modality Active sensing and Perception (C-MAP) in Autonomous Vehicles, Augmented Reality, Remote Environmental Monitoring, and Robotic Grasping

MBZUAI ·

Dezhen Song from Texas A&M University presented a talk on Co-Modality Active sensing and Perception (C-MAP) for robotics, covering sensor fusion for autonomous vehicles, augmented reality, and remote environmental monitoring. The talk highlighted lessons learned in sensor fusion, using autonomous motorcycles and NASA's Robonaut as examples, and discussed recent work in robotic remote environmental monitoring, particularly subsurface void and pipeline mapping. Why it matters: This research explores sensor fusion techniques to enhance robot perception, which could improve the robustness and capabilities of autonomous systems developed and deployed in the Middle East, particularly in challenging environments.
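
A one-function illustration of why fusing modalities pays off: the inverse-variance-weighted combination of two independent measurements is always at least as certain as the better sensor alone. This is a generic estimation identity, not the C-MAP algorithm, and the sensor values below are made up:

```python
# Generic illustration of multi-sensor fusion: inverse-variance
# weighting of two independent measurements of the same quantity.
# Not the C-MAP algorithm.

def fuse(z1, var1, z2, var2):
    """Fuse two independent estimates; returns (estimate, variance)."""
    w1, w2 = 1.0 / var1, 1.0 / var2
    estimate = (w1 * z1 + w2 * z2) / (w1 + w2)
    variance = 1.0 / (w1 + w2)           # always <= min(var1, var2)
    return estimate, variance

# e.g. lidar range 10.2 m (var 0.04) + radar range 9.8 m (var 0.25)
print(fuse(10.2, 0.04, 9.8, 0.25))      # -> (~10.14, ~0.034)
```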