A Caltech researcher presented at MBZUAI on memory representation and retrieval, contrasting AI and neuroscience approaches. Current AI systems retrieve stored knowledge either implicitly through fine-tuning or, as in retrieval-augmented generation (RAG), through embedding similarity; the presenter argued for also exploring retrieval keyed on combinatorial object identity or spatial proximity. The research explores circuit-level retrieval via domain-fine-tuned LLMs and distributed memory for image retrieval using semantic similarity. Why it matters: The work suggests structured databases and retrieval-focused training can allow smaller models to outperform larger general-purpose models, offering efficiency gains for AI development in the region.
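To make the contrast concrete, here is a minimal sketch of the two retrieval modes: similarity-based lookup as in RAG versus exact lookup keyed on object identity. The store, keys, and random embeddings below are hypothetical stand-ins, not anything from the talk.

```python
import numpy as np

# Toy document store: each entry has an identity key and an embedding.
# Random vectors stand in for a real encoder's output.
rng = np.random.default_rng(0)
docs = {
    "doc_a": rng.normal(size=128),
    "doc_b": rng.normal(size=128),
    "doc_c": rng.normal(size=128),
}

def retrieve_by_similarity(query_vec, store, k=1):
    """RAG-style retrieval: rank entries by cosine similarity to the query."""
    def cosine(u, v):
        return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))
    ranked = sorted(store, key=lambda key: cosine(query_vec, store[key]), reverse=True)
    return ranked[:k]

def retrieve_by_identity(object_key, store):
    """Identity-based retrieval: an exact key lookup, no similarity search."""
    return store.get(object_key)

query = rng.normal(size=128)
print(retrieve_by_similarity(query, docs))                   # nearest neighbour by embedding
print(retrieve_by_identity("doc_b", docs) is docs["doc_b"])  # True: exact match, no ranking
```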
Caltech graduate student Surya Narayanan Hari presented his research on replicating human-like memory in machines at MBZUAI. He discussed how the thalamus, which filters sensory and motor signals in the brain, inspires the development of routed monolithic models in AI. Hari explained that memory retrieval in the human brain occurs at the object, embedding, and circuit levels. Why it matters: This talk highlights the potential of neuroscience-inspired AI architectures for improving memory and information processing in AI systems, which could accelerate the development of more efficient and context-aware AI models in the region.
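The talk's architectural details were not given, but a generic sketch of learned routing, the mechanism such "routed" models rely on, looks like the following: a gate scores expert sub-networks and forwards the input only to the top-scoring ones, loosely analogous to thalamic filtering. All weights and dimensions here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
d, n_experts = 16, 4

# Gate and expert weights; in a trained model these would be learned.
W_gate = rng.normal(size=(d, n_experts))
W_experts = rng.normal(size=(n_experts, d, d))

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def route(x, top_k=1):
    """Thalamus-like gating: score every expert, then forward the input
    only to the top-k experts and mix their outputs by gate weight."""
    scores = softmax(x @ W_gate)
    chosen = np.argsort(scores)[-top_k:]
    out = np.zeros_like(x)
    for i in chosen:
        out += scores[i] * np.tanh(W_experts[i] @ x)
    return out

x = rng.normal(size=d)
print(route(x).shape)  # (16,) -- only one expert did any work
```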
Prof. Chun Jason Xue of the City University of Hong Kong presented research on optimizing mobile memory and storage by characterizing mobile applications, which behave differently from server applications. The work examines system-software designs that mobile platforms inherit from the Linux kernel and identifies optimization opportunities in mobile memory and storage management, with a focus on non-volatile and flash memories, aiming to enhance the user experience on mobile devices. Why it matters: Tailoring system optimizations to the distinct characteristics of mobile applications can significantly improve device performance and user experience in the region.
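As a toy illustration of why characterization matters (not an example from Xue's work), the sketch below runs one LRU page cache, the policy family Linux-derived systems approximate, against a looping "server-like" access trace and a hypothetical "mobile-like" trace of cold app switches; the same policy yields very different hit rates.

```python
from collections import OrderedDict

class LRUCache:
    """Minimal LRU page cache."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.pages = OrderedDict()

    def access(self, page):
        hit = page in self.pages
        if hit:
            self.pages.move_to_end(page)          # refresh recency
        else:
            self.pages[page] = True
            if len(self.pages) > self.capacity:
                self.pages.popitem(last=False)    # evict least recently used
        return hit

def hit_rate(trace, capacity=64):
    cache = LRUCache(capacity)
    return sum(cache.access(p) for p in trace) / len(trace)

# Server-like: a hot set that fits in cache is revisited in a loop.
# Mobile-like (hypothetical): app switches keep touching cold working sets.
server_trace = [p % 48 for p in range(10_000)]
mobile_trace = [(p // 100) * 1000 + p % 100 for p in range(10_000)]
print(f"server-like hit rate: {hit_rate(server_trace):.2f}")  # near 1.0
print(f"mobile-like hit rate: {hit_rate(mobile_trace):.2f}")  # near 0.0
```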
A KAUST team developed piRNAi, a gene-silencing tool for nematode worms that uses synthetic RNA sequences interacting with the piRNA pathway. They silenced genes involved in sex determination and other functions, demonstrating multiplexed gene silencing. The silencing persisted for varying numbers of generations, in some cases up to six. Why it matters: This expands the molecular toolkit for gene manipulation and offers potential therapeutic applications in humans, who share the same gene-silencing pathway.
A Duke University professor presented a data-centric approach to optimizing AI systems that targets the memory capacity and bandwidth bottleneck. The presentation covered collaborative optimization across the algorithm, system, architecture, and circuit layers. It also explored compute-in-memory, which performs computation where data is stored rather than shuttling it to a separate processor. Why it matters: Optimizing AI systems through a data-centric approach can improve efficiency and performance, critical for advancing AI applications in the region.
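A back-of-the-envelope roofline check shows why memory bandwidth, not compute, often limits such systems; the hardware figures below are illustrative, not from the presentation.

```python
# Is a matrix-vector product compute-bound or memory-bound?
peak_flops = 100e12      # 100 TFLOP/s peak compute (illustrative)
peak_bw = 1e12           # 1 TB/s memory bandwidth (illustrative)

n = 4096                 # square weight matrix, fp16 (2 bytes/element)
flops = 2 * n * n        # one multiply-add per weight
bytes_moved = 2 * n * n  # each weight read from memory once

intensity = flops / bytes_moved  # FLOPs performed per byte moved
ridge = peak_flops / peak_bw     # machine balance point
print(f"arithmetic intensity: {intensity:.1f} FLOP/byte, ridge: {ridge:.0f}")
print("memory-bound" if intensity < ridge else "compute-bound")  # memory-bound
```

At 1 FLOP per byte against a balance point of 100, the hardware spends most of its time waiting on memory, which is exactly the gap compute-in-memory aims to close.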
Dr. Mikhail Burtsev of the London Institute presented research on GENA-LM, a suite of transformer-based DNA language models. The talk addressed the challenge of scaling transformers to genomic sequences, whose length far exceeds what standard self-attention handles economically, and proposed recurrent memory augmentation to process long inputs efficiently. This approach improves language-modeling performance and holds promise for memory-intensive applications in bioinformatics. Why it matters: This research can significantly advance AI's capabilities in genomics by enabling the processing of much larger DNA sequences, with potential breakthroughs in understanding and treating diseases.
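The core idea of recurrent memory augmentation is to carry a few memory tokens between fixed-size segments so that per-segment attention cost stays bounded instead of growing with the full sequence. The following is a schematic sketch of that recurrence under toy dimensions, not the actual GENA-LM model.

```python
import numpy as np

rng = np.random.default_rng(2)
d, seg_len, n_mem = 8, 32, 4
W = rng.normal(size=(d, d)) / np.sqrt(d)  # stand-in for trained layer weights

def transformer_block(tokens):
    """Stand-in for a transformer layer (a real model uses attention + MLP)."""
    return np.tanh(tokens @ W)

def process_long_sequence(tokens):
    """Recurrent memory: split the input into segments, prepend memory
    tokens, and carry their updated values forward as state."""
    memory = np.zeros((n_mem, d))
    outputs = []
    for start in range(0, len(tokens), seg_len):
        segment = tokens[start:start + seg_len]
        joint = np.concatenate([memory, segment])  # [memory; segment]
        out = transformer_block(joint)
        memory = out[:n_mem]                       # updated memory tokens
        outputs.append(out[n_mem:])
    return np.concatenate(outputs)

long_input = rng.normal(size=(4 * seg_len, d))     # e.g. embedded DNA tokens
print(process_long_sequence(long_input).shape)     # (128, 8)
```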
KAUST researchers have published a review paper in Science covering memristor technology, comparing its development to that of the original transistor. Dr. Mario Lanza is the lead author of the paper, which compiles data supporting the readiness of memristor technology across materials and applications. The paper statistically assesses how memristors in various configurations meet key technical criteria. Why it matters: Memristors could become the new standard switching technology, surpassing transistors in speed and operational efficiency, especially as transistor miniaturization approaches its quantum limits.
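For readers unfamiliar with the device, the behavior that distinguishes a memristor can be reproduced with the classic linear dopant-drift model (Strukov et al., 2008); this is textbook background rather than material from the review, and the parameter values are illustrative.

```python
import numpy as np

# Linear dopant-drift memristor model (Strukov et al., 2008).
R_on, R_off = 100.0, 16e3      # low/high resistance states (ohm)
D, mu_v = 10e-9, 1e-14         # device thickness (m), dopant mobility (m^2/(V*s))

dt = 1e-4
t = np.arange(0, 2.0, dt)      # two periods of a 1 Hz drive
v = np.sin(2 * np.pi * t)      # sinusoidal drive voltage (V)

w = 0.1 * D                    # initial doped-region width
M_hist = []
for vk in v:
    M = R_on * (w / D) + R_off * (1 - w / D)  # state-dependent resistance
    i = vk / M
    w += mu_v * (R_on / D) * i * dt           # dopant drift: memory of past charge
    w = min(max(w, 0.0), D)                   # clamp to physical bounds
    M_hist.append(M)

# Plotting i against v would trace the pinched hysteresis loop that
# identifies a memristor: its resistance depends on charge history.
print(f"resistance swung between {min(M_hist):.0f} and {max(M_hist):.0f} ohm")
```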
KAUST researchers collaborated with the Blue Brain Project to study astrocytes, brain cells crucial for memory and learning. Dr. Corrado Calì produced 3D models of astrocytes using serial block-face electron microscopy to understand their structure. The study, published in Progress in Neurobiology, reveals how lactate transfer from astrocytes to neurons contributes to brain energy usage. Why it matters: Understanding astrocyte function could lead to new drugs for treating conditions like stroke and Alzheimer's disease by improving brain cell function.