GCC AI Research

Results for "efficiency"

Going under the hood to improve AI efficiency

MBZUAI ·

MBZUAI's computer science department, led by Xiaosong Ma, focuses on improving AI efficiency and sustainability by reducing wasted computational resources. Ma's background in high-performance computing informs her approach to optimizing AI workloads, and she aims to collaborate with experts across MBZUAI's AI domains to address these challenges. Why it matters: Optimizing AI efficiency is crucial for reducing the environmental impact and computational costs of increasingly complex AI models, in the GCC region and globally.

Developing efficient algorithms to spread the benefits of AI

MBZUAI ·

MBZUAI PhD graduate William de Vazelhes is researching hard-thresholding algorithms that enable AI to work from smaller datasets. His work focuses on optimization algorithms that simplify data by enforcing sparsity, which saves energy and makes it possible to deploy AI models on low-memory devices. He demonstrated that his approach can match the results of convex algorithms in many common settings. Why it matters: This research could broaden AI accessibility by reducing computational costs, with potential applications in sectors such as finance, particularly portfolio management under budgetary constraints.

Biweekly research update

KAUST ·

KAUST researchers developed a tandem solar cell with 32.5% conversion efficiency by optimizing the silicon-perovskite connection. Another team combined spectroscopy and reactor technologies to reveal details of catalyst function and reaction mechanisms. A KAUST team also developed a mathematical framework that improves data rates by 30% and optimizes terrestrial network speeds. Why it matters: These advances highlight KAUST's contributions to sustainable energy, industrial processes, and network optimization, addressing key challenges in the region and globally.

Emulating the energy efficiency of the brain

MBZUAI ·

MBZUAI researchers are developing spiking neural networks (SNNs) to emulate the energy efficiency of the human brain. Traditional deep learning models such as those powering ChatGPT consume significant energy, with a single query estimated to use 3.96 watt-hours. SNNs aim to mimic biological neurons more closely to reduce energy consumption, since the human brain uses only a fraction of the energy these models require. Why it matters: This research could lead to more sustainable and energy-efficient AI technologies, addressing a major challenge in deploying large-scale AI systems.

Thin layer solution unlocks stability and efficiency in perovskite solar cells

KAUST ·

KAUST scientists developed a new perovskite solar cell design that places thin perovskite layers at the top and bottom of the interface. The design achieves a power conversion efficiency of 25.6%, comparable to silicon solar cells, with only a 5% efficiency loss after 1,000 hours of high heat exposure. The key innovation is a specific ligand that interacts effectively with the 3D perovskites to passivate defects while maintaining purity in the thin layers. Why it matters: This advancement improves both the stability and efficiency of perovskite solar cells, making them a more viable and cost-effective alternative to silicon, especially for countries like Saudi Arabia aiming to increase their reliance on renewable energy.

Climate conscious computing

MBZUAI ·

MBZUAI's Qirong Ho and colleagues are developing an Artificial Intelligence Operating System (AIOS) for decarbonization, aiming to reduce energy waste in AI development. The AIOS focuses on improving communication efficiency between machines during AI model training, since inefficient communication prolongs training runs and increases energy consumption. The system targets the high computing demands of large language models like ChatGPT and LLaMA-2. Why it matters: By optimizing energy usage in AI development, the AIOS could significantly reduce the carbon footprint of AI technologies in the region and globally.

Green Learning — New Generation Machine Learning and Applications

MBZUAI ·

In a recent talk at MBZUAI, Moncef Gabbouj of Tampere University presented "Green Learning" and Operational Neural Networks (ONNs) as efficient alternatives to CNNs. ONNs generalize the fixed multiply-and-sum of convolution with learnable "nodal" and "pool" operators, and "generative neurons" expand each neuron's learning capacity. Gabbouj also covered Self-Organized ONNs (Self-ONNs) and their signal processing applications. Why it matters: Exploring more efficient AI models is crucial for sustainable development of AI in the region, as it addresses computational resource constraints and promotes broader accessibility.
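In the Self-ONN literature, a generative neuron approximates an arbitrary nodal operator with a truncated Maclaurin series: each input gets one weight per power term rather than a single linear weight, so the neuron learns its own nonlinearity. A simplified illustrative sketch (real Self-ONNs apply this inside convolutional kernels):

```python
import numpy as np

def generative_neuron(x, weights, bias=0.0):
    """Illustrative generative neuron: weights has shape (n_inputs, Q),
    one weight per input per power term of a Q-th order Maclaurin
    series, i.e. output = sum_i sum_q w[i, q] * x[i]**(q+1) + bias."""
    n, Q = weights.shape
    powers = np.stack([x ** q for q in range(1, Q + 1)], axis=1)  # (n, Q)
    return float(np.sum(weights * powers) + bias)

# With series order Q=1 this collapses to an ordinary linear neuron,
# i.e. a plain dot product of inputs and weights.
x = np.array([0.5, -0.2, 0.1])
w_linear = np.array([[1.0], [2.0], [3.0]])
out = generative_neuron(x, w_linear)
```

Raising Q lets the same neuron represent polynomial input-output mappings that a CNN would need extra layers and activations to approximate, which is the efficiency argument behind Self-ONNs.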