Soufiane Hayou of the National University of Singapore presented a talk at MBZUAI on principled scaling of neural networks, covering how mathematical results can be leveraged to scale networks efficiently. He obtained his PhD in statistics from Oxford in 2021. Why it matters: Understanding neural network scaling is crucial for developing more efficient and powerful AI models in the region.
MBZUAI is developing the AI Operating System (AIOS) to reduce the energy, time, and talent costs of AI computing. AIOS aims to make AI models smaller, faster, and more efficient, reducing reliance on expensive hardware. It also enables cost-aware model tuning and standardizes AI modules for reliable operation. Why it matters: By addressing the environmental impact and resource demands of AI, AIOS could promote more sustainable and accessible AI development in the region and globally.
MBZUAI Assistant Professor Qirong Ho is researching AI operating systems to standardize algorithms and enable non-experts to create AI applications reliably. He emphasizes that countries mastering mass production of AI systems will benefit most from the Fourth Industrial Revolution. Ho is co-founder and CTO at Petuum Inc., an AI startup creating standardized building blocks for affordable and scalable AI production. Why it matters: This research aims to democratize AI development and promote widespread adoption across industries in the UAE and beyond.
MBZUAI's AI Quorum launched its second workshop, "Building Ecosystems for AI at Scale," focusing on AI scalability and business applications. The workshop, the first held under CASL, aims to define steps for organizations to become self-sufficient with AI and to explore new use cases. Speakers include MBZUAI faculty and researchers from CMU, Stanford, KAUST, UC Berkeley, and Google. Why it matters: The workshop highlights the UAE's growing role in fostering AI innovation and bridging the gap between academic research and industry applications in the region.
Scale AI has partnered with Qatar Foundation to foster artificial intelligence innovation and talent development within Qatar. The collaboration aims to strengthen Qatar's AI ecosystem by leveraging Scale AI's expertise in data annotation and machine learning. This initiative will provide resources and training to local talent, supporting the growth of AI capabilities in the country. Why it matters: This partnership is a strategic move to bolster Qatar's national AI agenda, cultivate a skilled workforce, and drive technological advancement in the region.
KAUST is hosting a workshop on distributed training in November 2025, led by Professors Peter Richtarik and Marco Canini, focusing on scaling large models like LLMs and ViTs. Richtarik's team recently solved a 75-year-old problem in asynchronous optimization, developing time-optimal stochastic gradient descent algorithms. This research improves the speed and reliability of large model training and supports applications in distributed and federated learning. Why it matters: KAUST's focus on scalable AI and federated learning contributes to Saudi Arabia's Vision 2030 goals and addresses critical challenges in AI deployment and data privacy.
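To make the idea of asynchronous optimization concrete, here is a minimal, illustrative sketch of classic lock-free asynchronous SGD (in the spirit of Hogwild!), where several workers read shared parameters, compute possibly stale stochastic gradients, and apply updates without global synchronization. This is a generic toy example on a one-dimensional quadratic objective, not the time-optimal algorithm from Richtarik's team.

```python
import threading
import random

# Toy objective: f(w) = mean_i (w - x_i)^2, minimized at mean(x_i).
# Several threads update the shared parameter w concurrently; each
# gradient may be computed from a slightly stale read of w.
random.seed(0)
data = [random.gauss(3.0, 0.1) for _ in range(200)]
w = [0.0]    # shared parameter, stored in a list so threads mutate it in place
lr = 0.05

def worker(steps):
    for _ in range(steps):
        x = random.choice(data)     # stochastic sample
        grad = 2.0 * (w[0] - x)     # read of w[0] may be stale
        w[0] -= lr * grad           # asynchronous, lock-free update

threads = [threading.Thread(target=worker, args=(500,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# w[0] ends up close to the data mean (about 3.0) despite the races.
```

The point of the sketch is that convergence can survive stale reads and unsynchronized writes; the recent theoretical work referenced above concerns how to schedule such asynchronous updates optimally with respect to wall-clock time.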
MBZUAI's computer science department, led by Xiaosong Ma, focuses on improving AI efficiency and sustainability by reducing wasted resources. Ma's background in high-performance computing informs her approach to optimizing AI workloads, and she aims to collaborate with experts across different AI domains at MBZUAI to address these challenges. Why it matters: Optimizing AI efficiency is crucial for reducing the environmental impact and computational costs associated with increasingly complex AI models in the GCC region and globally.
MBZUAI's Qirong Ho and colleagues are developing an Artificial Intelligence Operating System (AIOS) for decarbonization, aiming to reduce energy waste in AI development. The AIOS focuses on improving communication efficiency between machines during AI model training, as inefficient communication leads to prolonged tasks and increased energy consumption. This system addresses the high computing power demands of large language models like ChatGPT and LLaMA-2. Why it matters: By optimizing energy usage in AI development, the AIOS could significantly reduce the carbon footprint of AI technologies in the region and globally.
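One widely used technique for the communication bottleneck described above is gradient sparsification: each machine transmits only its largest-magnitude gradient entries instead of the full dense vector. The sketch below is a generic illustration of top-k sparsification, not the actual AIOS design; the function names and sizes are made up for the example.

```python
import random

def top_k_sparsify(grad, k):
    """Return the k largest-magnitude entries as (index, value) pairs."""
    idx = sorted(range(len(grad)), key=lambda i: abs(grad[i]), reverse=True)[:k]
    return [(i, grad[i]) for i in idx]

def densify(sparse, n):
    """Rebuild a dense length-n vector from (index, value) pairs."""
    out = [0.0] * n
    for i, v in sparse:
        out[i] = v
    return out

random.seed(0)
n = 1000
grad = [random.gauss(0.0, 1.0) for _ in range(n)]

sparse = top_k_sparsify(grad, k=50)   # 50 (index, value) pairs on the wire
recovered = densify(sparse, n)        # instead of 1000 dense floats
```

In practice the dropped residual (the difference between the full and sparsified gradient) is usually accumulated locally and folded into the next step's gradient, which preserves convergence while cutting per-step communication by an order of magnitude or more.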