MBZUAI is developing the AI Operating System (AIOS) to reduce the energy, time, and talent costs of AI computing. AIOS aims to make AI models smaller, faster, and more efficient, reducing reliance on expensive hardware and speeding up compute operations. It also enables cost-aware model tuning and standardizes AI modules for reliable operation. Why it matters: By addressing the environmental impact and resource demands of AI, AIOS could promote more sustainable and accessible AI development in the region and globally.
MBZUAI's Qirong Ho and colleagues are developing an Artificial Intelligence Operating System (AIOS) for decarbonization, aiming to reduce energy waste in AI development. The AIOS focuses on improving communication efficiency between machines during AI model training, since inefficient communication prolongs training runs and drives up energy consumption. This system addresses the high computing power demands of large language models like ChatGPT and LLaMA-2. Why it matters: By optimizing energy usage in AI development, the AIOS could significantly reduce the carbon footprint of AI technologies in the region and globally.
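To see why inter-machine communication dominates training efficiency, consider gradient averaging across workers. The sketch below simulates ring all-reduce, the bandwidth-efficient collective commonly used for this in distributed training. It is an illustrative toy (Python lists standing in for network sends), not the AIOS implementation.

```python
# Simulated ring all-reduce: n workers sum their gradient chunks so that
# every worker ends up with the full sums. Each worker sends only
# 2*(n-1)/n of the data size in total, versus n-1 copies for a naive
# all-to-all exchange -- this is why the collective's design matters
# so much for training time and energy.

def ring_allreduce(worker_chunks):
    """worker_chunks[i][j] = worker i's value for chunk j."""
    n = len(worker_chunks)
    chunks = [list(w) for w in worker_chunks]

    # Phase 1: reduce-scatter. After n-1 steps, worker i holds the
    # fully-summed chunk (i + 1) % n.
    for step in range(n - 1):
        # Snapshot all sends first to simulate simultaneous transfers.
        sent = [(i, (i - step) % n, chunks[i][(i - step) % n]) for i in range(n)]
        for i, c, val in sent:
            chunks[(i + 1) % n][c] += val

    # Phase 2: all-gather. Each worker forwards its freshest reduced chunk.
    for step in range(n - 1):
        sent = [(i, (i + 1 - step) % n, chunks[i][(i + 1 - step) % n]) for i in range(n)]
        for i, c, val in sent:
            chunks[(i + 1) % n][c] = val
    return chunks

grads = ring_allreduce([[1, 2, 3], [4, 5, 6], [7, 8, 9]])
# every worker now holds the column sums: [12, 15, 18]
```

In a real cluster each `sent` entry is a network transfer; when those transfers are poorly scheduled or the links are slow, GPUs sit idle while still drawing power, which is the waste the AIOS targets.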
AI's energy consumption is a growing concern, with AI, data centers, and cryptocurrency consuming nearly 2% of the world's energy in 2022, a share that could double by 2026. Training an LLM like GPT-3 consumes as much energy as roughly 130 homes use in a year, and an AI-based task can consume about 33 times more energy than task-specific software doing the same job. MBZUAI's computer science department, led by Xiaosong Ma, is researching energy efficiency in AI hardware to address this problem. Why it matters: As AI adoption accelerates in the GCC, energy-efficient AI hardware and algorithms are critical for sustainable development and reducing carbon emissions in the region.
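The "130 homes" figure can be sanity-checked with a back-of-the-envelope calculation. Both inputs below are published estimates, not exact measurements: GPT-3's training energy is widely estimated at roughly 1,287 MWh (Patterson et al., 2021), and a typical U.S. home uses on the order of 10,000 kWh per year.

```python
# Hedged sanity check of the "130 homes per year" comparison.
# Both constants are rough published estimates, not exact figures.

GPT3_TRAINING_MWH = 1287        # estimated total energy to train GPT-3
HOME_KWH_PER_YEAR = 10_000      # rough annual U.S. household consumption

homes_equivalent = (GPT3_TRAINING_MWH * 1000) / HOME_KWH_PER_YEAR
print(round(homes_equivalent))  # ~129 home-years, consistent with the ~130 cited
```

The arithmetic lands within rounding of the cited figure, which suggests the comparison rests on estimates of this order.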
MBZUAI President Eric Xing led a global collaboration to develop Vicuna, an LLM alternative to GPT-3 created in response to the unsustainable costs of training LLMs. OpenAI CEO Sam Altman acknowledged Abu Dhabi's role in the global AI conversation, building on achievements like Vicuna. Xing and colleagues are publishing research at MLSys 2023 on "cross-mesh resharding" to improve computer communication in deep learning, aiming for low-carbon, affordable, and miniaturized AI. Why it matters: This research signals a push towards sustainable AI development in the region, emphasizing efficiency and reduced environmental impact.
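Cross-mesh resharding arises when a tensor sharded one way across one group of devices must be handed to another group that expects a different sharding. The toy below shows the naive baseline (gather everything, then re-split) using plain nested lists as stand-in "device shards"; function and variable names are illustrative, and the published research is precisely about avoiding this full gather by transferring only the overlapping tiles between source and destination shards over uncongested links.

```python
# Naive cross-mesh resharding baseline: row-sharded on the source mesh,
# column-sharded on the destination mesh. The full gather below is the
# communication-heavy step that efficient resharding schemes avoid.

def reshard_rows_to_cols(row_shards, n_dest):
    """row_shards: list of row-blocks (each a list of rows) on the source mesh.
    Returns n_dest column-blocks for the destination mesh."""
    # Gather: reassemble the full matrix on one point (expensive).
    full = [row for block in row_shards for row in block]
    cols_per_shard = len(full[0]) // n_dest
    # Re-split along columns into the destination layout.
    return [
        [row[d * cols_per_shard:(d + 1) * cols_per_shard] for row in full]
        for d in range(n_dest)
    ]

x = [[0, 1, 2, 3], [4, 5, 6, 7], [8, 9, 10, 11], [12, 13, 14, 15]]
mesh_a = [x[:2], x[2:]]                         # 2 devices, row-sharded
mesh_b = reshard_rows_to_cols(mesh_a, n_dest=2) # 2 devices, column-sharded
```

In the efficient version, each destination device would receive only the sub-blocks of rows and columns it actually needs, directly from the source devices that hold them.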
MBZUAI's computer science department, led by Xiaosong Ma, focuses on improving AI efficiency and sustainability by reducing wasted resources. Ma's background in high-performance computing informs her approach to optimizing AI workloads. She aims to collaborate with experts across different AI domains at MBZUAI to address these challenges. Why it matters: Optimizing AI efficiency is crucial for reducing the environmental impact and computational costs associated with increasingly complex AI models in the GCC region and globally.
A recent talk at MBZUAI discussed "Green Learning" and Operational Neural Networks (ONNs) as efficient alternatives to CNNs. ONNs replace the fixed linear operations of CNNs with learnable "nodal" and "pool" operators, and "generative neurons" further expand each neuron's learning capacity. Moncef Gabbouj from Tampere University presented Self-Organized ONNs (Self-ONNs) and their signal processing applications. Why it matters: Exploring more efficient AI models is crucial for sustainable development of AI in the region, as it addresses computational resource constraints and promotes broader accessibility.
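The operators above can be sketched concretely. In the Self-ONN line of work, a generative neuron replaces a CNN kernel tap's single linear term w·x with a truncated MacLaurin series Σ_q w_q·x^q; the 1-D toy below is a minimal sketch of that idea (names and the simple summation pool are illustrative simplifications).

```python
# Minimal 1-D sketch of a Self-ONN generative neuron. Each kernel tap
# applies a learnable polynomial sum_q w[q] * x**q (the "nodal" operator)
# instead of a CNN's single linear term; the "pool" operator here is plain
# summation. With one coefficient per tap (Q = 1) this reduces exactly to
# ordinary convolution, so the CNN is the degenerate special case.

def generative_neuron_1d(x, weights, bias=0.0):
    """x: input signal; weights[k][q-1] = coefficient for tap k, power q."""
    k_size = len(weights)
    out = []
    for i in range(len(x) - k_size + 1):
        acc = bias
        for k in range(k_size):
            for q, w in enumerate(weights[k], start=1):
                acc += w * x[i + k] ** q  # nodal operator: polynomial in x
        out.append(acc)                   # pool operator: summation
    return out

# Q = 1 behaves like a plain convolution tap:
generative_neuron_1d([1, 2, 3], [[1], [1]])   # -> [3, 5]
# Q = 2 with weights (0, 1) squares its input:
generative_neuron_1d([2, 3], [[0, 1]])        # -> [4, 9]
```

The efficiency argument is that richer per-neuron operators can let a smaller network match the accuracy of a larger CNN, trading a little extra per-neuron arithmetic for far fewer neurons overall.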
Prof. Mérouane Debbah of the Technology Innovation Institute (TII) warns that current AI development relies on unsustainable, energy-intensive "brute-force computing." He argues that the field needs more energy-efficient algorithms instead of simply scaling up GPUs. Debbah suggests neuromorphic computing as a potential solution, drawing inspiration from the human brain's energy efficiency. Why it matters: This critique highlights a crucial sustainability challenge for AI in the GCC and globally, as the region invests heavily in compute-intensive AI models.