GCC AI Research

FedML – Building Open and Collaborative Machine Learning Anywhere at Any Scale

MBZUAI · Notable

Summary

MBZUAI hosted a panel discussion in collaboration with the Manara Center for Coexistence and Dialogue. Chaoyang He, co-founder of FedML, presented on federated learning (FL), covering privacy/security, resource constraints, label scarcity, and scalable system design. FedML is a platform for zero-code, cross-platform, secure federated learning across industries like healthcare and finance. Why it matters: Federated learning is an important subfield for the GCC region, allowing privacy-preserving model training across distributed data sources.


Related

Building Planetary-Scale Collaborative Intelligence

MBZUAI ·

Sai Praneeth Karimireddy from UC Berkeley presented a talk on building planetary-scale collaborative intelligence, highlighting the challenges of using distributed data in machine learning due to data silos and ethical-legal restrictions. He proposed collaborative systems like federated learning as a solution to bring together distributed data while respecting privacy. The talk addressed the need for efficiency, reliability, and management of divergent goals in these systems, suggesting the use of tools from optimization, statistics, and economics. Why it matters: Collaborative AI systems can unlock valuable distributed data in the region, especially in sensitive sectors like healthcare, while ensuring privacy and addressing ethical concerns.

Enabling Fast, Robust, and Personalized Federated Learning

MBZUAI ·

A talk at MBZUAI discussed federated learning, a distributed machine learning approach that trains models across devices while keeping data localized. The presentation covered a straggler-resilient federated learning scheme that uses adaptive node participation to tackle system heterogeneity. It also presented a robust optimization formulation for addressing data heterogeneity and a new algorithm for personalizing learned models. Why it matters: Federated learning is crucial for AI applications involving decentralized data sources, and research on improving its robustness and personalization is essential for real-world deployment in the region.
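The "train over devices while keeping data localized" pattern these talks describe is typically realized by federated averaging (FedAvg): each client runs a few local gradient steps on its own data and only the resulting model weights travel to the server, which averages them. The sketch below is a minimal illustration of that loop for a toy 1-D linear model; the function names and data are illustrative, not drawn from FedML or the talk itself.

```python
# Minimal FedAvg sketch: clients train locally, the server averages weights.
# Toy 1-D linear model y = w * x; all names and data are illustrative.

def local_sgd_step(w, data, lr=0.1):
    """One gradient step of squared-error loss on a single client's data."""
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    return w - lr * grad

def fed_avg(client_datasets, rounds=50, local_steps=5):
    """Server loop: broadcast the global weight, collect locally trained
    weights, and average them. Raw data never leaves a client."""
    w_global = 0.0
    for _ in range(rounds):
        local_weights = []
        for data in client_datasets:
            w = w_global                      # start from the broadcast model
            for _ in range(local_steps):
                w = local_sgd_step(w, data)   # local training on-device
            local_weights.append(w)           # only the weight is shared
        w_global = sum(local_weights) / len(local_weights)  # FedAvg step
    return w_global

# Two clients whose data both follow y = 3x: the averaged model recovers w ≈ 3.
clients = [[(1.0, 3.0), (2.0, 6.0)], [(3.0, 9.0), (0.5, 1.5)]]
w_final = fed_avg(clients)
```

Real systems add secure aggregation, client sampling, and handling of non-IID data on top of this skeleton, which is where the heterogeneity and straggler-resilience work summarized above comes in.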

Frontiers of federation at the AI Quorum

MBZUAI ·

MBZUAI hosted the Second Workshop on Collaborative Learning as part of the AI Quorum in Abu Dhabi, focusing on collaborative and federated learning for sustainable development. Researchers discussed applications in medicine, biology, ecological conservation, and humanitarian aid. Eric Xing highlighted the potential of large biology models, similar to LLMs, to revolutionize biological data analysis. Why it matters: This workshop underscores the UAE's commitment to advancing AI research in crucial sectors like healthcare and sustainability through collaborative learning approaches.

DaringFed: A Dynamic Bayesian Persuasion Pricing for Online Federated Learning under Two-sided Incomplete Information

arXiv ·

This paper introduces DaringFed, a novel dynamic Bayesian persuasion pricing mechanism for online federated learning (OFL) that addresses the challenge of two-sided incomplete information (TII) about resources. It formulates the interaction between the server and clients as a dynamic signaling and pricing allocation problem within a Bayesian persuasion game, and proves the existence of a unique Bayesian persuasion Nash equilibrium. Evaluations on real and synthetic datasets show that DaringFed improves accuracy, convergence speed, and the server's utility.