MBZUAI researchers presented "TransRadar" at WACV, a study proposing new uses of radar for object identification. Led by Yahia Dalbah, the work explores radar-based perception for autonomous vehicles. TransRadar uses an adaptive-directional transformer for real-time multi-view radar semantic segmentation. Why it matters: by enhancing radar's object recognition capabilities, this research could improve the reliability of autonomous systems in adverse conditions, where cameras often struggle.
MBZUAI researchers introduce TerraFM, a scalable self-supervised learning model for Earth observation that uses Sentinel-1 and Sentinel-2 imagery. The model unifies radar and optical inputs through modality-specific patch embeddings and adaptive cross-attention fusion. TerraFM achieves strong generalization on classification and segmentation tasks, outperforming prior models on GEO-Bench and Copernicus-Bench.
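The fusion mechanism described above can be illustrated with a minimal sketch: each modality gets its own patch projection into a shared token space, and one modality's tokens attend to the other's via cross-attention. This is an illustrative single-head, numpy-only toy, not the actual TerraFM architecture; all dimensions and weight initializations are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def patch_embed(x, W):
    """Project flattened patches into a shared token space."""
    return x @ W

def cross_attention(q_tokens, kv_tokens, d):
    """Single-head cross-attention: queries from one modality,
    keys/values from the other."""
    scores = q_tokens @ kv_tokens.T / np.sqrt(d)
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    attn = np.exp(scores)
    attn /= attn.sum(axis=-1, keepdims=True)
    return attn @ kv_tokens

# Hypothetical sizes: 16 patches per modality, shared embedding dim 32.
n_patches, d_radar, d_optical, d_model = 16, 48, 40, 32
radar = rng.standard_normal((n_patches, d_radar))      # Sentinel-1 (SAR) patches
optical = rng.standard_normal((n_patches, d_optical))  # Sentinel-2 patches

# Modality-specific projections into the shared embedding space.
W_radar = rng.standard_normal((d_radar, d_model)) / np.sqrt(d_radar)
W_optical = rng.standard_normal((d_optical, d_model)) / np.sqrt(d_optical)

r_tok = patch_embed(radar, W_radar)
o_tok = patch_embed(optical, W_optical)

# Optical tokens attend to radar tokens; fuse with a residual sum.
fused = o_tok + cross_attention(o_tok, r_tok, d_model)
print(fused.shape)  # (16, 32)
```

The point of the modality-specific embeddings is that radar and optical inputs have different channel counts and statistics, so each needs its own projection before the two token streams can interact in one attention operation.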
This paper introduces a convolutional transformer model for classifying tomato maturity, along with KUTomaData, a new UAE-sourced dataset for training segmentation and classification models. The model combines CNNs with transformers and was evaluated on KUTomaData and two public datasets, achieving state-of-the-art performance and outperforming existing methods by significant margins in mAP across all three.
MBZUAI student Fatima Ahmed Khalil Mohamed Alkhoori is researching machine learning techniques to improve traffic sign recognition for autonomous vehicles. Her work focuses on using transformer architectures to help autonomous vehicles accurately recognize traffic signs under varying environmental conditions. The research addresses challenges such as viewing angle, lighting variations, and shadows that can confuse conventional models. Why it matters: This research contributes to safe and effective autonomous vehicle navigation, aligning with the UAE's vision of a world-class transportation system.
Researchers introduce TomFormer, a transformer-based model for accurate and early detection of tomato leaf diseases, with the goal of deployment on the Hello Stretch robot for real-time diagnosis. TomFormer combines a vision transformer with a CNN, achieving state-of-the-art results on the KUTomaDATA, PlantDoc, and PlantVillage datasets. KUTomaDATA was collected from a greenhouse in Abu Dhabi, UAE.
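The CNN-plus-transformer pairing described above follows a common pattern: convolutions extract local texture features, which are then flattened into patch tokens that self-attention relates globally. The sketch below is a toy numpy illustration of that pipeline, not TomFormer itself; the image size, kernel, and token layout are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def conv2d(img, kernel):
    """Valid-mode 2D convolution: the CNN stage extracts local texture features."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def self_attention(tokens):
    """Single-head self-attention: the transformer stage relates distant patches."""
    d = tokens.shape[-1]
    scores = tokens @ tokens.T / np.sqrt(d)
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    attn = np.exp(scores)
    attn /= attn.sum(axis=-1, keepdims=True)
    return attn @ tokens

leaf = rng.standard_normal((10, 10))                # toy single-channel leaf image
feat = conv2d(leaf, rng.standard_normal((3, 3)))    # 8x8 local feature map
tokens = feat.reshape(16, 4)                        # 16 patch tokens, 4 features each
ctx = self_attention(tokens)                        # globally contextualised tokens
pooled = ctx.mean(axis=0)                           # pooled representation for a classifier head
print(pooled.shape)  # (4,)
```

The design rationale is that convolutions are cheap and good at local patterns (lesion texture, leaf edges), while self-attention captures long-range context (disease spread across the whole leaf) that pure CNNs handle less directly.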