KAUST researcher Corrado Calì won an award at the Brainstorming Research Assembly for Young Neuroscientists (BraYn) in Italy for his work on glycogen modulation and synapse stabilization. Calì presented research in collaboration with KAUST Professor Andrea Falqui and Dr. Elena Vezzoli from the University of Milan, investigating the lactate shuttle's involvement in synaptic plasticity. Calì and KAUST colleagues are also collaborating with the Blue Brain Project to produce a computer simulation of astrocyte-neuron coupling, using 3D virtual reality to investigate brain cell morphologies at the nanoscale. Why it matters: This award recognizes KAUST's contribution to neuroscience research and highlights the university's collaborative efforts in understanding brain plasticity and developing advanced tools for studying brain structures.
KAUST and EPFL Blue Brain Project researchers propose a new theory about a 'secret language' used by cells for internal communication regarding the external world. Using a computational model, they suggest that metabolic pathways can encode details about neuromodulators that stimulate energy consumption. The model focuses on astrocytes and their cooperation with neurons in fueling the brain. Why it matters: This suggests a new avenue for understanding information processing in the brain and why brains are so much more energy efficient than computers.
Dr. Yves Agid from the ICM Paris Institute of Translational Neuroscience lectured at KAUST's 2018 Winter Enrichment Program about the role of glial cells in brain function and behavior. He highlighted that glial cells, often overlooked in research, are crucial for neural synchronization and overall intelligence. Dysfunction of glial cells can induce pathologies like Alzheimer's and Parkinson's disease. Why it matters: The lecture underscored the importance of studying glial cells in addition to neurons for understanding and treating neurodegenerative disorders, which could influence future research directions at KAUST and in the region.
MBZUAI researchers are developing spiking neural networks (SNNs) to emulate the energy efficiency of the human brain. Traditional deep learning models like those powering ChatGPT consume significant energy, with a single query estimated to use 3.96 watt-hours. SNNs aim to mimic biological neurons more closely to reduce energy consumption, as the human brain runs on a small fraction of the energy these models require. Why it matters: This research could lead to more sustainable and energy-efficient AI technologies, addressing a major challenge in deploying large-scale AI systems.
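To illustrate the idea behind SNNs, below is a minimal sketch of a leaky integrate-and-fire (LIF) neuron, the simplest spiking model: the membrane potential integrates input, leaks toward rest, and emits a discrete spike only when a threshold is crossed. This sketch is illustrative only (the parameter values and function name are assumptions, not from the MBZUAI work); the energy argument is that computation is event-driven, so silent neurons cost nothing.

```python
import numpy as np

def lif_neuron(input_current, dt=1.0, tau=20.0, v_rest=0.0,
               v_thresh=1.0, v_reset=0.0):
    """Simulate a leaky integrate-and-fire neuron.

    Returns the membrane potential trace and the spike times.
    """
    v = v_rest
    trace, spikes = [], []
    for t, i_in in enumerate(input_current):
        # Potential leaks toward rest while integrating the input drive.
        v += (dt / tau) * (v_rest - v + i_in)
        if v >= v_thresh:      # threshold crossing -> emit a spike
            spikes.append(t)
            v = v_reset        # reset after spiking
        trace.append(v)
    return np.array(trace), spikes

# Sustained supra-threshold input produces periodic spikes;
# zero input produces none -- no events, no downstream computation.
trace, spikes = lif_neuron(np.full(200, 1.5))
_, silent = lif_neuron(np.zeros(200))
```

Here the steady-state potential equals the input drive, so only inputs above `v_thresh` ever trigger spikes; with drive 1.5 and threshold 1.0 the neuron fires roughly every 22 steps, while the zero-input run stays silent.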
KAUST Discovery highlights the contributions of Magistretti to the field of neuroenergetics. His research explores the cellular and molecular basis of brain energy metabolism and its applications to brain imaging. Magistretti's group discovered mechanisms underlying the coupling between neuronal activity and energy consumption, revealing the role of astrocytes. Why it matters: Understanding brain energy metabolism and the role of glial cells can advance brain imaging techniques and our understanding of neuronal processes.
KAUST professor Pierre Magistretti has been elected to the Norwegian Academy of Science and Letters. His election recognizes his contributions to neuroscience, specifically his work on lactate's role in brain function. Magistretti's research focuses on the lactate shuttle system and how neurons and glial cells cooperate to meet energy demands. Why it matters: This honor highlights KAUST's contribution to international neuroscience and can foster further collaboration in the field.
Caltech graduate student Surya Narayanan Hari presented his research on replicating human-like memory in machines at MBZUAI. He discussed how the thalamus, which filters sensory and motor signals in the brain, inspires the development of routed monolithic models in AI. Hari explained that memory retrieval occurs on object, embedding, and circuit levels in the human brain. Why it matters: This talk highlights the potential of neuroscience-inspired AI architectures for improving memory and information processing in AI systems, which could accelerate the development of more efficient and context-aware AI models in the region.
Paul Liang from CMU presented on machine learning foundations for multisensory AI, discussing a theoretical framework for modality interactions. The talk covered cross-modal attention, multimodal transformer architectures, and applications in mental health, pathology, and robotics. Liang's research aims to enable AI systems to integrate and learn from diverse real-world sensory modalities. Why it matters: This highlights the growing importance of multimodal AI research and its potential for advancements across various sectors in the region, including healthcare and robotics.
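The cross-modal attention mentioned above can be sketched in a few lines: queries come from one modality and keys/values from another, so each element of the first modality gathers information from the second. This is a generic scaled dot-product sketch, not Liang's specific framework; the function name, dimensions, and example modalities are assumptions for illustration.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_modal_attention(queries, keys, values):
    """Scaled dot-product attention across modalities: queries from one
    modality (e.g. text tokens), keys/values from another (e.g. audio)."""
    d = queries.shape[-1]
    scores = queries @ keys.T / np.sqrt(d)  # (n_text, n_audio) affinities
    weights = softmax(scores, axis=-1)      # each text token attends over audio
    return weights @ values                 # audio-informed text features

rng = np.random.default_rng(0)
text = rng.normal(size=(4, 8))    # 4 text tokens, 8-dim embeddings
audio = rng.normal(size=(6, 8))   # 6 audio frames, same embedding dim
fused = cross_modal_attention(text, audio, audio)  # shape (4, 8)
```

Stacking such layers in both directions (text attending to audio and vice versa) is the basic building block of the multimodal transformer architectures the talk described.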