Margaret Livingstone, a neurobiology professor at Harvard Medical School, lectured at KAUST's Winter Enrichment Program 2018 on how art can reveal insights into the human brain. She discussed how artists have long understood the independent roles of color and luminance in visual perception. Livingstone highlighted examples from Picasso, Monet, and Warhol to illustrate how artists manipulate visual cues. Why it matters: This interdisciplinary approach can potentially lead to new understandings of how the brain processes visual information and inform advances in both neuroscience and art.
KAUST Discovery highlights the contributions of Pierre Magistretti to the field of neuroenergetics. His research explores the cellular and molecular basis of brain energy metabolism and its implications for brain imaging. Magistretti's group discovered mechanisms coupling neuronal activity to energy consumption, revealing the key role of astrocytes. Why it matters: Understanding brain energy metabolism and the role of glial cells can advance brain imaging techniques and our understanding of neuronal processes.
KAUST hosted the Nature Conferences: Brain Energy Metabolism in Health and Disease, convening experts to discuss brain energy use and its impact on function and disease. Researchers from KAUST and global institutions shared insights on metabolic interactions among brain cells and the brain's role in whole-body energy regulation. KAUST's President Sir Edward Byrne emphasized brain health as essential for the cognitive economy, aligning with Saudi Arabia’s Vision 2030. Why it matters: The conference highlights KAUST's growing role in global neuroscience research and its commitment to addressing critical health challenges through international collaboration.
Tom M. Mitchell from Carnegie Mellon University discussed machine learning approaches to studying how the brain processes natural language, recording brain activity with fMRI and MEG while subjects read text. The research explores neural encodings of word meaning, the flow of information during word comprehension, and how word meanings combine in sentences and stories. He also touched on how our understanding of the brain aligns with current AI approaches to NLP. Why it matters: This interdisciplinary research could bridge the gap between neuroscience and AI, potentially leading to more human-like NLP models.
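In published work along these lines, Mitchell's group trained models to predict brain activation patterns for candidate words and then identified which word a subject was reading by matching predicted against observed activity. A minimal sketch of that matching step, with toy vectors and hypothetical data standing in for real fMRI patterns (this is an illustration of the idea, not the actual pipeline):

```python
def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(y * y for y in b) ** 0.5
    return dot / (na * nb)

# Hypothetical predicted activation patterns for candidate words
# (in the real work, these would come from a model trained on fMRI data).
predicted = {
    "celery": [0.9, 0.1, 0.3],
    "airplane": [0.1, 0.8, 0.7],
}

def decode(observed, candidates):
    """Pick the candidate word whose predicted activation pattern
    best matches the observed brain activity."""
    return max(candidates, key=lambda w: cosine(observed, candidates[w]))

print(decode([0.85, 0.15, 0.25], predicted))  # → celery
```

The same matching scheme scales to large vocabularies: a model predicts one activation pattern per word, and decoding reduces to a nearest-neighbor search over those predictions.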
KAUST professor Pierre Magistretti has been elected to the Norwegian Academy of Science and Letters. His election recognizes his contributions to neuroscience, specifically his work on lactate's role in brain function. Magistretti's research focuses on the lactate shuttle system and how neurons and glial cells cooperate to meet energy demands. Why it matters: This honor highlights KAUST's contribution to international neuroscience and can foster further collaboration in the field.
Caltech graduate student Surya Narayanan Hari gave a talk at MBZUAI on replicating human-like memory in machines. He discussed how the thalamus, which filters sensory and motor signals in the brain, inspires the development of routed monolithic models in AI. Hari explained that memory retrieval in the human brain occurs at the object, embedding, and circuit levels. Why it matters: This talk highlights the potential of neuroscience-inspired AI architectures for improving memory and information processing in AI systems, which could accelerate the development of more efficient and context-aware AI models in the region.
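The thalamus-as-router idea can be sketched as a gate that scores an incoming signal against several specialized modules and forwards it only to the best match. The module names and keyword scoring below are illustrative assumptions, not Hari's architecture:

```python
# Illustrative "routed model": a gate scores the input against each
# expert's signature and dispatches to the highest-scoring expert,
# loosely analogous to thalamic filtering of sensory and motor signals.
experts = {
    "vision": lambda x: f"vision-expert handled: {x}",
    "motor": lambda x: f"motor-expert handled: {x}",
}
signatures = {
    "vision": {"see", "image", "color"},
    "motor": {"move", "grasp", "walk"},
}

def route(tokens):
    """Score each expert by keyword overlap and dispatch to the winner."""
    best = max(signatures, key=lambda name: len(signatures[name] & tokens))
    return experts[best](" ".join(sorted(tokens)))

print(route({"image", "color"}))  # dispatched to the vision expert
```

In a learned routed model the hand-written signatures would be replaced by a trained gating network, but the control flow is the same: filter first, then compute.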
KAUST Professor Pierre Magistretti received the 2016 Fondation IPSEN Neuronal Plasticity Prize for his work in neuroenergetics. The award recognizes Magistretti's contributions to understanding the relationship between neuronal activity and brain energy consumption. He shares the award with Dr. David Attwell and Dr. Marcus Raichle, and will be honored at the FENS Forum in Copenhagen. Why it matters: This award highlights KAUST's contribution to international neuroscience research and strengthens its reputation in biological and environmental science.
A Caltech researcher presented at MBZUAI on memory representation and retrieval, contrasting AI and neuroscience approaches. Current AI retrieval systems such as RAG rely on fine-tuning and embedding similarity, whereas the presenter argued for exploring retrieval via combinatorial object identity or spatial proximity. The research explores circuit-level retrieval via domain-fine-tuned LLMs and distributed memory for image retrieval using semantic similarity. Why it matters: The work suggests structured databases and retrieval-focused training can allow smaller models to outperform larger general-purpose models, offering efficiency gains for AI development in the region.
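The contrast between the two retrieval routes can be made concrete: embedding-similarity retrieval ranks every stored item against a query vector, while object-identity retrieval is an exact structured lookup by key. The toy corpus and 2-D vectors below are assumptions for illustration, not the presenter's system:

```python
import math

# Toy corpus: each item has an identity key and an embedding vector.
corpus = {
    "doc:cat": [1.0, 0.0],
    "doc:dog": [0.9, 0.1],
    "doc:car": [0.0, 1.0],
}

def by_similarity(query_vec, k=2):
    """Embedding-similarity retrieval (the standard RAG-style route):
    rank all items by cosine similarity to the query, return top k."""
    def cos(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        return dot / (math.hypot(*a) * math.hypot(*b))
    return sorted(corpus, key=lambda key: cos(query_vec, corpus[key]),
                  reverse=True)[:k]

def by_identity(key):
    """Object-identity retrieval: an exact lookup, no embeddings needed."""
    return corpus.get(key)

print(by_similarity([1.0, 0.05]))  # → ['doc:cat', 'doc:dog']
print(by_identity("doc:car"))      # → [0.0, 1.0]
```

Identity lookup is constant-time but requires knowing the key; similarity search handles fuzzy queries but scans (or indexes) the whole store, which is the efficiency trade-off the talk's argument turns on.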