GCC AI Research

What are we doing to tackle AI’s energy problem?

MBZUAI · Notable

Summary

AI's energy consumption is a growing concern: AI, data centers, and cryptocurrency together consumed nearly 2% of the world's electricity in 2022, a share that could double by 2026. Training an LLM like GPT-3 uses roughly as much energy as 130 homes consume in a year, and an AI-powered task can consume 33 times more energy than task-specific software. MBZUAI's computer science department, led by Xiaosong Ma, is researching energy-efficient AI hardware to address this problem. Why it matters: As AI adoption accelerates in the GCC, energy-efficient AI hardware and algorithms are critical for sustainable development and reducing carbon emissions in the region.
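The "130 homes" comparison can be sanity-checked with rough arithmetic. The figures below are illustrative assumptions, not from the article: a widely cited published estimate of ~1,287 MWh for GPT-3's training run, and an average household electricity use of ~10 MWh per year.

```python
# Rough sanity check of the "130 homes" comparison.
# Assumed (not from the article): GPT-3 training ~1,287 MWh,
# average household electricity use ~10 MWh per year.
gpt3_training_mwh = 1287
household_mwh_per_year = 10

homes_equivalent = gpt3_training_mwh / household_mwh_per_year
print(f"GPT-3 training ≈ {homes_equivalent:.0f} homes' annual electricity")
# ≈ 129 homes, consistent with the article's figure of 130
```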


Related

Climate conscious computing

MBZUAI ·

MBZUAI's Qirong Ho and colleagues are developing an Artificial Intelligence Operating System (AIOS) for decarbonization, aiming to reduce energy waste in AI development. The AIOS focuses on improving communication efficiency between machines during AI model training, since inefficient communication prolongs training runs and drives up energy consumption. This system addresses the high computing-power demands of large language models like ChatGPT and LLaMA-2. Why it matters: By optimizing energy usage in AI development, the AIOS could significantly reduce the carbon footprint of AI technologies in the region and globally.
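The article doesn't detail how the AIOS cuts communication overhead, but one standard technique in distributed training is top-k gradient sparsification: each worker transmits only its largest-magnitude gradient entries rather than the full tensor. A minimal sketch of that general idea (not the AIOS implementation; all values are illustrative):

```python
def topk_sparsify(grad, k):
    """Keep only the k largest-magnitude entries of a gradient vector,
    so workers exchange a small (index, value) set instead of the full
    tensor — cutting communication volume by roughly len(grad) / k."""
    top = sorted(range(len(grad)), key=lambda i: abs(grad[i]), reverse=True)[:k]
    return {i: grad[i] for i in top}

def densify(sparse, n):
    """Rebuild a dense gradient on the receiving side; dropped entries
    are treated as zero."""
    dense = [0.0] * n
    for i, v in sparse.items():
        dense[i] = v
    return dense

grad = [0.05, -2.4, 0.01, 1.7, -0.3, 0.02]
sparse = topk_sparsify(grad, k=2)       # transmit only 2 of 6 entries
print(densify(sparse, len(grad)))       # [0.0, -2.4, 0.0, 1.7, 0.0, 0.0]
```

In real systems the dropped entries' error is usually accumulated locally and re-sent later, so the compression stays lossless over many steps.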

Bruteforce computing is the next “winter of AI”

MBZUAI ·

Prof. Mérouane Debbah of the Technology Innovation Institute (TII) warns that current AI development relies on unsustainable, energy-intensive "bruteforce computing." He argues that the field needs more energy-efficient algorithms instead of simply scaling up GPUs. Debbah suggests neuromorphic computing as a potential solution, drawing inspiration from the human brain's energy efficiency. Why it matters: This critique highlights a crucial sustainability challenge for AI in the GCC and globally, as the region invests heavily in compute-intensive AI models.

Emulating the energy efficiency of the brain

MBZUAI ·

MBZUAI researchers are developing spiking neural networks (SNNs) to emulate the energy efficiency of the human brain. Traditional deep learning models like those powering ChatGPT consume significant energy, with a single query using an estimated 3.96 watt-hours. SNNs aim to mimic biological neurons more closely to reduce energy consumption, as the human brain uses only a tiny fraction of the energy these models require. Why it matters: This research could lead to more sustainable and energy-efficient AI technologies, addressing a major challenge in deploying large-scale AI systems.
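The article doesn't specify the neuron model used, but the canonical building block of an SNN is the leaky integrate-and-fire (LIF) neuron, sketched below with illustrative parameters. Unlike dense matrix multiplies, computation is event-driven: a neuron only "costs" energy when it spikes.

```python
def lif_neuron(inputs, threshold=1.0, leak=0.9):
    """Leaky integrate-and-fire neuron: membrane potential accumulates
    input current, decays by a leak factor each step, and emits a
    spike (1) only when it crosses the threshold, then resets."""
    v = 0.0
    spikes = []
    for current in inputs:
        v = leak * v + current          # integrate with leak
        if v >= threshold:
            spikes.append(1)
            v = 0.0                     # reset after spiking
        else:
            spikes.append(0)
    return spikes

# A weak, steady input produces only occasional spikes;
# energy use in neuromorphic hardware tracks the spike count.
print(lif_neuron([0.3, 0.3, 0.3, 0.3, 0.3]))  # [0, 0, 0, 1, 0]
```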

Cooling more people with fewer emissions: intelligent, efficient cooling with AI and ice batteries

MBZUAI ·

MBZUAI researchers are developing an AI-driven energy management system that optimizes ice battery technology for cooling in hot climates. The system stores energy by freezing water during periods of surplus electricity and melts the ice to cool buildings when demand peaks. The AI model integrates multimodal data from weather forecasts, environmental sensors, and power grid signals to determine when to store or release thermal energy. Why it matters: This approach promises to reduce fossil fuel dependence and lower energy costs while improving cooling performance in regions like the UAE.
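The core decision the article describes — when to store versus release thermal energy — can be illustrated with a toy threshold controller. The function name, inputs, and thresholds below are hypothetical, not MBZUAI's model, which fuses forecast and sensor data rather than fixed rules:

```python
def ice_battery_action(grid_surplus_kw, cooling_demand_kw, ice_charge_pct):
    """Toy decision rule for an ice-battery controller (all thresholds
    are illustrative): melt stored ice when cooling demand peaks,
    freeze water when the grid has surplus power."""
    if cooling_demand_kw > 50 and ice_charge_pct > 10:
        return "discharge"   # melt ice to cool the building
    if grid_surplus_kw > 20 and ice_charge_pct < 100:
        return "charge"      # freeze water with cheap surplus power
    return "idle"

# Midday peak: high demand, battery charged -> use stored cooling.
print(ice_battery_action(grid_surplus_kw=0, cooling_demand_kw=80,
                         ice_charge_pct=60))   # discharge
# Night: surplus power, low demand -> store cooling as ice.
print(ice_battery_action(grid_surplus_kw=40, cooling_demand_kw=10,
                         ice_charge_pct=60))   # charge
```

A learned policy would replace these fixed thresholds with predictions from weather, sensor, and grid data, but the action space is the same.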