Dr. Yves Agid from the ICM Paris Institute of Translational Neuroscience lectured at KAUST's 2018 Winter Enrichment Program about the role of glial cells in brain function and behavior. He highlighted that glial cells, often overlooked in research, are crucial for neural synchronization and overall intelligence. Dysfunction of glial cells can contribute to pathologies such as Alzheimer's and Parkinson's disease. Why it matters: The lecture underscored the importance of studying glial cells in addition to neurons for understanding and treating neurodegenerative disorders, which could influence future research directions at KAUST and in the region.
Margaret Livingstone, a neurobiology professor at Harvard Medical School, lectured at KAUST's Winter Enrichment Program 2018 on how art can reveal insights into the human brain. She discussed how artists have long understood the independent roles of color and luminance in visual perception. Livingstone highlighted examples from Picasso, Monet, and Warhol to illustrate how artists manipulate visual cues. Why it matters: This interdisciplinary approach can potentially lead to new understandings of how the brain processes visual information and inform advances in both neuroscience and art.
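Livingstone's point that color and luminance are separable signals can be illustrated numerically: two colors with completely different hues can carry the same luminance ("equiluminant" colors), which is why equiluminant figure and ground can appear to shimmer or lose depth. The sketch below uses the standard Rec. 709 luminance coefficients on linear-light RGB values and ignores gamma for simplicity; it is an illustration of the general idea, not taken from the lecture.

```python
def relative_luminance(r, g, b):
    """Relative luminance from linear-light RGB in [0, 1], using the
    standard Rec. 709 coefficients (gamma ignored for simplicity)."""
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

# Pure red and a particular mid-gray have identical luminance even though
# their hues differ completely: they are "equiluminant" colors.
red = (1.0, 0.0, 0.0)
gray = (0.2126, 0.2126, 0.2126)
print(relative_luminance(*red), relative_luminance(*gray))
```

Because the coefficients sum to 1.0, a gray whose channels all equal red's luminance value matches red exactly, so the luminance channel alone cannot distinguish the two.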
Maha Elgarf from NYU Abu Dhabi presented research on using social robots to stimulate creativity in children through subconscious mimicry, leveraging the 'chameleon effect'. The research involved a series of studies where children engaged in storytelling with a social robot, and their creativity was assessed. Elgarf also discussed using Large Language Models (LLMs) in education and challenges in the field. Why it matters: This explores innovative applications of social robotics and AI in education within the UAE, potentially enhancing children's learning and creativity.
MBZUAI researchers are developing spiking neural networks (SNNs) to emulate the energy efficiency of the human brain. Traditional deep learning models, such as those powering ChatGPT, consume significant energy, with a single query estimated to use about 3.96 watt-hours. SNNs aim to mimic biological neurons more closely to reduce energy consumption, since the human brain, which runs on roughly 20 watts, uses only a fraction of the energy these models require. Why it matters: This research could lead to more sustainable and energy-efficient AI technologies, addressing a major challenge in deploying large-scale AI systems.
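The efficiency idea behind SNNs can be sketched with a single leaky integrate-and-fire neuron, the standard textbook spiking model: rather than producing a dense floating-point activation at every step, the neuron accumulates input, leaks charge over time, and emits a sparse binary spike only when its membrane potential crosses a threshold. The parameter values below are illustrative and are not taken from the MBZUAI work.

```python
def lif_neuron(inputs, v_thresh=1.0, leak=0.9):
    """Leaky integrate-and-fire neuron: the membrane potential decays by
    `leak` each step, accumulates the input current, and emits a spike (1)
    only when it crosses `v_thresh`, resetting afterward."""
    v, spikes = 0.0, []
    for i in inputs:
        v = leak * v + i        # integrate input with leaky decay
        if v >= v_thresh:
            spikes.append(1)    # fire
            v = 0.0             # reset after the spike
        else:
            spikes.append(0)    # stay silent (no computation downstream)
    return spikes

print(lif_neuron([0.3, 0.3, 0.3, 0.6, 0.0, 0.9, 0.9]))
# → [0, 0, 0, 1, 0, 0, 1]
```

Most time steps produce no spike at all, which is the source of the hoped-for energy savings: downstream hardware only has to do work when a spike arrives.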
Caltech graduate student Surya Narayanan Hari presented his research on replicating human-like memory in machines at MBZUAI. He discussed how the thalamus, which filters sensory and motor signals in the brain, inspires the development of routed monolithic models in AI. Hari explained that memory retrieval occurs at the object, embedding, and circuit levels in the human brain. Why it matters: This talk highlights the potential of neuroscience-inspired AI architectures for improving memory and information processing in AI systems, which could accelerate the development of more efficient and context-aware AI models in the region.
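Of the three levels Hari mentioned, the embedding level is the easiest to sketch in code: memories are stored as vectors, and retrieval returns the stored item whose vector is most similar to a query vector. The store, the vectors, and the item names below are made up for illustration and are not from Hari's models.

```python
import math

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Toy memory store: items keyed by (made-up) embedding vectors.
memory = {
    "apple":  [0.9, 0.1, 0.0],
    "banana": [0.8, 0.2, 0.1],
    "car":    [0.0, 0.1, 0.9],
}

def retrieve(query):
    """Embedding-level retrieval: return the stored item whose
    embedding is most similar to the query vector."""
    return max(memory, key=lambda k: cosine(memory[k], query))

print(retrieve([0.85, 0.15, 0.05]))  # a fruit-like query
```

A noisy or partial cue still lands on the nearest stored memory, which is the content-addressable behavior that makes the embedding level a natural bridge between neuroscience accounts of recall and vector retrieval in AI systems.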
Tom M. Mitchell from Carnegie Mellon University discussed using machine learning to study how the brain processes natural language, using fMRI and MEG to record brain activity while reading text. The research explores neural encodings of word meaning, information flow during word comprehension, and how meanings of words combine in sentences and stories. He also touched on how this understanding of the brain aligns with current AI approaches to natural language processing (NLP). Why it matters: This interdisciplinary research could bridge the gap between neuroscience and AI, potentially leading to more human-like NLP models.
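The kind of encoding model this line of work is associated with can be sketched as a linear map from word features to brain responses, fit here with ridge regression on simulated data. Everything below is synthetic (random features, a hidden linear map, toy "voxels") and illustrates the general approach, not Mitchell's actual pipeline or data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: 20 "words", each described by 5 semantic features, and
# simulated activity for 3 "voxels" generated by a hidden linear map.
X = rng.normal(size=(20, 5))                       # word feature vectors
W_true = rng.normal(size=(5, 3))                   # hidden feature-to-voxel weights
Y = X @ W_true + 0.01 * rng.normal(size=(20, 3))   # noisy voxel responses

# Ridge-regression encoding model: closed-form solution
# W = (X^T X + lam * I)^{-1} X^T Y.
lam = 0.1
W_hat = np.linalg.solve(X.T @ X + lam * np.eye(5), X.T @ Y)

# Predict the brain response to a held-out word from its features alone.
x_new = rng.normal(size=5)
print(x_new @ W_hat)
```

The point of such a model is generalization: once the feature-to-voxel weights are learned, the response to a word never seen during training can be predicted from its semantic features.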
Olivier Oullier, Visiting Professor at MBZUAI, is working on brain-computer interfaces, founding Inclusive Brains to develop a Neural Foundation Model using neurophysiological and behavioral signals. This model integrates data from brainwaves, eye-tracking, and other modalities to allow machines to build a representation of the world closer to human cognition. Why it matters: Such advancements can transform human-computer interaction, with particular implications for people of determination in the region.
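A generic version of this kind of multimodal integration can be sketched as feature-level fusion: normalize each signal stream so none dominates by scale, then combine them into a single representation for a downstream model. The stream names and values below are hypothetical placeholders and do not describe Inclusive Brains' actual Neural Foundation Model.

```python
import numpy as np

def fuse(modalities):
    """Feature-level fusion sketch: z-score each modality's feature
    vector so differing units don't dominate, then concatenate all
    streams into one joint representation."""
    normed = []
    for x in modalities:
        x = np.asarray(x, dtype=float)
        normed.append((x - x.mean()) / (x.std() + 1e-8))
    return np.concatenate(normed)

# Hypothetical feature vectors from three signal streams.
eeg_band_power = [12.0, 8.5, 3.1, 1.2]   # e.g. delta/theta/alpha/beta power
gaze_features  = [0.4, 0.1]              # e.g. fixation rate, saccade rate
heart_features = [72.0, 0.05]            # e.g. heart rate, HRV

z = fuse([eeg_band_power, gaze_features, heart_features])
print(z.shape)  # one fused vector of length 8
```

Concatenation is the simplest fusion strategy; real systems typically learn a joint representation instead, but the principle of aligning heterogeneous neurophysiological and behavioral signals into one vector space is the same.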
KAUST researchers in the Sensors Lab are developing neuromorphic circuits for vision sensors, drawing inspiration from the human eye. They created flexible photoreceptors using hybrid perovskite materials, with capacitance tunable by light stimulation, mimicking the human retina. The team collaborates with experts in image characterization and brain pattern recognition to connect the 'eye' to the 'brain' for object identification. Why it matters: This biomimetic approach promises advancements in AI, machine learning, and smart city development within the region.