GCC AI Research

Vicuna, Altman, and the importance of green AI

MBZUAI · Significant research

Summary

MBZUAI President Eric Xing led a global collaboration to develop Vicuna, an open LLM alternative to GPT-3 that addresses the unsustainable cost of training LLMs. OpenAI CEO Sam Altman acknowledged Abu Dhabi's role in the global AI conversation, building on achievements like Vicuna. Xing and colleagues are publishing research at MLSys 2023 on "cross-mesh resharding," a technique for improving communication between groups of machines during deep learning, with the aim of low-carbon, affordable, and miniaturized AI. Why it matters: This research signals a push toward sustainable AI development in the region, emphasizing efficiency and reduced environmental impact.
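To make the idea concrete, here is a toy sketch (not the authors' implementation, which appears in their MLSys 2023 paper) of what "resharding" means: a tensor partitioned one way across one group of devices must be repartitioned so a second group can consume it split another way. The device meshes are simulated here as plain Python lists.

```python
import numpy as np

def shard_rows(x, n):
    """Split a matrix row-wise across n (simulated) devices."""
    return np.split(x, n, axis=0)

def shard_cols(x, n):
    """Split a matrix column-wise across n (simulated) devices."""
    return np.split(x, n, axis=1)

def reshard_rows_to_cols(row_shards, n):
    """Naive reshard: gather every row shard, then re-split by columns.
    Real systems instead compute a minimal device-to-device transfer
    plan, which is exactly the communication cost this line of
    research tries to shrink."""
    full = np.concatenate(row_shards, axis=0)
    return shard_cols(full, n)

x = np.arange(16.0).reshape(4, 4)
row_shards = shard_rows(x, 2)                      # mesh A holds rows
col_shards = reshard_rows_to_cols(row_shards, 2)   # mesh B wants columns
assert np.allclose(np.concatenate(col_shards, axis=1), x)
```

The naive gather-then-split above moves the entire tensor; efficient cross-mesh resharding sends each destination device only the slices it actually needs.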

Keywords

Vicuna · MBZUAI · OpenAI · Sam Altman · Green AI

Related

Collaboration releases Vicuna – environmentally friendly, cost-effective rival to ChatGPT

MBZUAI ·

Researchers from MBZUAI, UC Berkeley, CMU, Stanford, and UC San Diego collaborated to create Vicuna, an open-source chatbot that costs about $300 to train, compared with the more than $4 million reportedly spent on ChatGPT. Vicuna achieves roughly 90% of ChatGPT's subjective language quality while being far more energy-efficient, and it can run on a single GPU. It was fine-tuned from Meta AI's LLaMA model on user-shared conversations and has gained significant traction on GitHub. Why it matters: This research demonstrates that high-quality chatbots can be developed at a fraction of the cost and environmental impact, opening up new possibilities for sustainable AI development in the region.

Climate conscious computing

MBZUAI ·

MBZUAI's Qirong Ho and colleagues are developing an Artificial Intelligence Operating System (AIOS) for decarbonization, aiming to reduce energy waste in AI development. The AIOS focuses on improving communication efficiency between machines during AI model training, since inefficient communication prolongs training jobs and increases energy consumption. The system addresses the high computing demands of large language models like ChatGPT and LLaMA-2. Why it matters: By optimizing energy usage in AI development, the AIOS could significantly reduce the carbon footprint of AI technologies in the region and globally.

Knowledge distillation and the greening of LLMs

MBZUAI ·

Researchers from MBZUAI, the University of British Columbia, and Monash University have created LaMini-LM, a collection of small language models distilled from ChatGPT. LaMini-LM is trained on a dataset of 2.58M instructions and can be deployed on consumer laptops and mobile devices. The smaller models perform nearly as well as their much larger counterparts. Why it matters: This work enables the deployment of LLMs in resource-constrained environments and enhances data security by reducing reliance on cloud-based LLMs.
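As a rough illustration of the underlying idea, here is a minimal sketch of the classic logit-matching form of knowledge distillation, where a small "student" is trained to match a large "teacher" model's output distribution over a toy vocabulary. (LaMini-LM itself distills at the sequence level, by training on teacher-generated instruction–response pairs, but the goal of transferring a large model's behavior into a small one is the same.)

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-softened softmax; higher T flattens the distribution."""
    z = np.exp((logits - logits.max()) / T)
    return z / z.sum()

def distillation_loss(teacher_logits, student_logits, T=2.0):
    """KL divergence between the tempered teacher and student
    distributions -- the Hinton-style distillation term the student
    minimizes during training."""
    p = softmax(teacher_logits, T)   # soft teacher targets
    q = softmax(student_logits, T)   # student predictions
    return float(np.sum(p * (np.log(p) - np.log(q))))

teacher = np.array([4.0, 1.0, 0.5])   # toy logits, 3-word vocabulary
student = np.array([3.5, 1.2, 0.6])
loss = distillation_loss(teacher, student)
assert loss >= 0.0   # KL divergence is non-negative
```

The loss is zero exactly when the student reproduces the teacher's distribution, which is why distillation can preserve most of a large model's quality in a far smaller one.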

What are we doing to tackle AI’s energy problem?

MBZUAI ·

AI's energy consumption is a growing concern: AI, data centers, and cryptocurrency together consumed nearly 2% of the world's electricity in 2022, a share that could double by 2026. Training an LLM like GPT-3 uses roughly as much energy as 130 US homes consume in a year, and AI tasks consume 33 times more energy than task-specific software. MBZUAI's computer science department, led by Xiaosong Ma, is researching energy efficiency in AI hardware to address this problem. Why it matters: As AI adoption accelerates in the GCC, energy-efficient AI hardware and algorithms are critical for sustainable development and reducing carbon emissions in the region.
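A back-of-the-envelope check of the "130 homes" comparison, using commonly cited (assumed) figures rather than numbers from the article: roughly 1,287 MWh to train GPT-3 and about 10 MWh per year for an average US household.

```python
# Assumed estimates, not figures from the article itself:
GPT3_TRAINING_MWH = 1287   # widely cited training-energy estimate for GPT-3
HOME_MWH_PER_YEAR = 10     # approximate average US household consumption

homes = GPT3_TRAINING_MWH / HOME_MWH_PER_YEAR
assert round(homes) == 129   # consistent with the "~130 homes" figure
```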