A CMU professor and MBZUAI-affiliated faculty member presented research on how LLMs store and use knowledge learned during pre-training. The study used a synthetic biography dataset to show that LLMs may fail to use memorized knowledge at inference time, even when training loss reaches zero. Data augmentation during pre-training can force the model to store knowledge in specific token embeddings, making it usable at inference rather than merely memorized. Why it matters: The research highlights limitations in how LLMs extract and manipulate stored knowledge, with implications for model architectures and training strategies that make knowledge more usable in Arabic LLMs.
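A minimal sketch of what such augmentation could look like, assuming the augmentation takes the form of rewriting each synthetic-biography fact in multiple surface forms (the templates, record fields, and names below are hypothetical illustrations, not details from the talk):

```python
import random

# Hypothetical paraphrase templates for one biographical fact. Rewriting
# the same fact in several surface forms is one plausible form of the
# augmentation described in the talk; these templates are illustrative.
TEMPLATES = [
    "{name} was born in {city}.",
    "{name}'s birthplace is {city}.",
    "The city {name} was born in is {city}.",
    "{city} is where {name} was born.",
]

def augment(record: dict, k: int = 3) -> list[str]:
    """Return k paraphrased pre-training sentences for one synthetic record."""
    templates = random.sample(TEMPLATES, k)
    return [t.format(**record) for t in templates]

if __name__ == "__main__":
    # Hypothetical synthetic-biography record, in the spirit of the study's dataset.
    person = {"name": "Lina Haddad", "city": "Abu Dhabi"}
    for sentence in augment(person):
        print(sentence)
```

Seeing the same fact in varied phrasings pushes the model to tie the knowledge to the entity's name tokens instead of to one memorized sentence, which is the mechanism the talk attributes to augmentation.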
A Caltech researcher presented at MBZUAI on memory representation and retrieval, contrasting AI and neuroscience approaches. Current AI systems retrieve knowledge through fine-tuning or embedding similarity (as in RAG), while the presenter argued for also exploring retrieval keyed on combinatorial object identity or spatial proximity. The research explores circuit-level retrieval via domain-fine-tuned LLMs and distributed memory for image retrieval using semantic similarity. Why it matters: The work suggests that structured databases and retrieval-focused training can allow smaller models to outperform larger general-purpose models, offering efficiency gains for AI development in the region.
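To make the contrast concrete, here is a minimal sketch of embedding-similarity retrieval, the RAG-style mechanism the presenter set against identity- or proximity-based lookup. The corpus, vectors, and dimensions are toy placeholders, not anything from the talk:

```python
import numpy as np

def cosine_sim(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def retrieve(query_vec, doc_vecs, docs, top_k=2):
    """Rank documents by embedding similarity to the query, RAG-style."""
    scores = [cosine_sim(query_vec, v) for v in doc_vecs]
    order = np.argsort(scores)[::-1][:top_k]
    return [(docs[i], round(scores[i], 3)) for i in order]

if __name__ == "__main__":
    rng = np.random.default_rng(seed=0)
    docs = ["doc A", "doc B", "doc C"]             # toy corpus
    doc_vecs = [rng.normal(size=8) for _ in docs]  # stand-in embeddings
    # A query vector deliberately close to doc B's embedding.
    query = doc_vecs[1] + 0.1 * rng.normal(size=8)
    print(retrieve(query, doc_vecs, docs))
```

Retrieval here depends entirely on geometric closeness in the embedding space; the alternatives the presenter raised would instead look items up by discrete identity or spatial relation, independent of any learned vector geometry.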
KAUST hosted the Frontiers in Energy Storage 2026 conference, which emphasized energy storage technologies for renewable energy. The conference highlighted electrochemical and chemical systems, including advanced batteries and hydrogen, as complementary layers for long-duration storage and industrial resilience. KAUST is developing energy-storage solutions relevant to the Kingdom and valuable to global partners, engineered to withstand extreme environmental temperatures. Why it matters: This positions Saudi Arabia as a potential global exporter of resilient energy hardware, aligning with Saudi Vision 2030 goals in renewable energy.