MBZUAI researchers are developing spiking neural networks (SNNs) to emulate the energy efficiency of the human brain. Traditional deep learning models like those powering ChatGPT are power-hungry, with a single query drawing an estimated 3.96 watts, while the entire human brain runs on roughly 20 watts. SNNs aim to mimic biological neurons more closely to cut that consumption. Why it matters: This research could lead to more sustainable and energy-efficient AI technologies, addressing a major challenge in deploying large-scale AI systems.
Prof. Mérouane Debbah of the Technology Innovation Institute (TII) warns that current AI development relies on unsustainable, energy-intensive "brute-force computing." He argues that the field needs more energy-efficient algorithms rather than ever-larger GPU clusters, and points to neuromorphic computing, inspired by the brain's energy efficiency, as a potential way forward. Why it matters: This critique highlights a crucial sustainability challenge for AI in the GCC and globally, as the region invests heavily in compute-intensive AI models.
KAUST researchers in the Sensors Lab are developing neuromorphic circuits for vision sensors, drawing inspiration from the human eye. They created flexible photoreceptors using hybrid perovskite materials, with capacitance tunable by light stimulation, mimicking the human retina. The team collaborates with experts in image characterization and brain pattern recognition to connect the 'eye' to the 'brain' for object identification. Why it matters: This biomimetic approach promises advancements in AI, machine learning, and smart city development within the region.
AI's energy consumption is a growing concern: AI, data centers, and cryptocurrency together consumed nearly 2% of the world's electricity in 2022, a share that could double by 2026. Training an LLM like GPT-3 consumes roughly as much electricity as 130 homes use in a year, and an AI-driven task can use 33 times more energy than task-specific software performing the same job. MBZUAI's computer science department, led by Xiaosong Ma, is researching energy efficiency in AI hardware to address this problem. Why it matters: As AI adoption accelerates in the GCC, energy-efficient AI hardware and algorithms are critical for sustainable development and reducing carbon emissions in the region.
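The "130 homes" comparison can be sanity-checked with back-of-envelope arithmetic. Both inputs below are assumptions, not figures from the article: a widely cited external estimate of about 1,287 MWh for GPT-3's training run, and roughly 10 MWh of electricity per household per year.

```python
# Back-of-envelope check of the "130 homes per year" comparison.
# Both inputs are assumed estimates, not figures from the article:
GPT3_TRAINING_MWH = 1287   # widely cited training-energy estimate for GPT-3
HOME_ANNUAL_MWH = 10       # rough annual electricity use of one household

# Number of household-years of electricity equivalent to one training run
homes_per_year = GPT3_TRAINING_MWH / HOME_ANNUAL_MWH
print(round(homes_per_year))  # ~129, in line with "about 130 homes"
```

Under these assumptions the arithmetic lands at about 129 household-years, consistent with the figure cited in the article.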
MBZUAI Assistant Professors Bin Gu and Huan Xiong are advancing spiking neural networks (SNNs) to improve computational power and energy efficiency. They will present their latest research on SNNs at the 38th Annual AAAI Conference on Artificial Intelligence in Vancouver. Unlike conventional neural networks, which compute dense activations at every layer, SNNs process information as discrete spike events, mimicking biological neurons and consuming energy mainly when a neuron actually fires. Why it matters: This research could enable running advanced AI applications like GPT-style models on mobile devices, where the energy efficiency of SNNs would unlock their full potential.
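The event-driven behavior that makes SNNs efficient can be illustrated with a leaky integrate-and-fire (LIF) neuron, the simplest spiking-neuron model. This is a minimal sketch for illustration only, not the models used in the MBZUAI research; the parameter names and values are assumptions chosen for demonstration.

```python
def lif_spikes(input_current, dt=1.0, tau=20.0, v_thresh=1.0, v_reset=0.0):
    """Simulate a leaky integrate-and-fire neuron.

    The membrane potential v leaks toward rest while integrating input;
    when v crosses v_thresh the neuron emits a spike (a discrete event)
    and resets. Between spikes nothing is emitted -- the sparse,
    event-driven behavior that underlies SNN energy efficiency.
    """
    v = 0.0
    spikes = []
    for t, i_in in enumerate(input_current):
        # Leaky integration: dv/dt = (-v + i_in) / tau
        v += dt * (-v + i_in) / tau
        if v >= v_thresh:
            spikes.append(t)  # discrete spike event at step t
            v = v_reset       # reset membrane potential after firing
    return spikes

# Constant drive for 200 steps: the neuron fires only a handful of times,
# rather than producing a dense activation at every step.
events = lif_spikes([1.5] * 200)
```

With this constant input the neuron charges up, fires, resets, and repeats, emitting only a few spikes across 200 steps; a conventional artificial neuron would instead produce an activation at every one of those steps.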