MBZUAI researchers presented a new causal discovery method at NeurIPS that identifies relationships among both deterministic and non-deterministic variables. The method builds directed graphs that visualize these relationships, incorporating both probabilistic and deterministic principles. The lead author, Longkang Li, aims to apply causal discovery to healthcare and biology to better understand disease. Why it matters: This research advances the field of causal inference, potentially improving applications in areas like healthcare where understanding complex relationships is critical.
Saber Salehkaleybar from EPFL presented a talk on causal discovery, focusing on learning causal relationships both from observational data and through interventions. He discussed an approximation algorithm for designing experiments under budget constraints, with applications to gene-regulatory networks, and covered improvements that reduce the computational complexity of such experiment-design algorithms. Why it matters: Causal AI systems can lead to more intelligent decision-making in various fields.
The paper proposes a method for causal inference from satellite image time series to determine the impact of interventions on climate change, focusing on quantifying human-caused deforestation. The method uses computer vision and deep learning to detect forest tree coverage over time, and Bayesian structural causal models to estimate counterfactuals. The framework is applied to analyze deforestation levels in the Amazon rainforest before and after Brazil's hyperinflation event.
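The paper's Bayesian structural causal model is not reproduced here, but the counterfactual logic it relies on can be illustrated with a much simpler (non-Bayesian) interrupted-time-series sketch: fit a model of forest coverage on pre-event data only, project it forward as the counterfactual, and read the effect as observed minus projected. The simulated data and all variable names below are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated monthly forest-coverage series (%): slow decline plus seasonality,
# with an extra human-caused decline starting at a simulated event month.
t = np.arange(100)
event = 60
season = 1.5 * np.sin(2 * np.pi * t / 12)
coverage = 90 - 0.05 * t + season + rng.normal(0, 0.5, t.size)
coverage[t >= event] -= 0.3 * (t[t >= event] - event)  # post-event extra loss

def design(ts):
    # Trend + annual seasonality features.
    return np.column_stack([np.ones_like(ts, dtype=float), ts,
                            np.sin(2 * np.pi * ts / 12),
                            np.cos(2 * np.pi * ts / 12)])

# Fit only on the pre-event period.
pre = t < event
beta, *_ = np.linalg.lstsq(design(t[pre]), coverage[pre], rcond=None)

# Counterfactual: projected coverage had the event not occurred.
counterfactual = design(t[~pre]) @ beta
avg_effect = (coverage[~pre] - counterfactual).mean()  # negative => extra loss
print(f"average post-event effect: {avg_effect:.2f} percentage points")
```

With the simulated extra loss of 0.3 points per month, the recovered average effect is strongly negative, matching the injected deforestation signal.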
MBZUAI's Kun Zhang is applying causal machine learning to improve drug development and precision medicine, focusing on answering 'why' questions. Traditional drug development is costly (est. $2B) due to extensive studies needed to determine drug toxicity and efficacy. Zhang is combining causal ML with organs-on-chips technology to improve pre-clinical drug testing, aiming to reduce the failure rate of drugs in human trials. Why it matters: By improving the accuracy of pre-clinical drug testing, this research could significantly reduce the cost and time required to bring new medicines to market in the region and worldwide.
Researchers are exploring methods for evaluating the outcomes of actions from off-policy observations in which the context is noisy or anonymized. They employ proxy causal learning, using two noisy views of the context to recover the average causal effect of an action without explicitly modeling the hidden context. The implementation uses learned neural-network representations for both action and context, and outperforms an autoencoder-based alternative. Why it matters: This research addresses a key challenge in applying AI in real-world scenarios where data privacy or bandwidth limitations necessitate working with noisy or anonymized data.
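The summary does not include the authors' neural architecture, but the core idea, two noisy views of a hidden context standing in for the context itself, can be sketched in a linear special case where proxy causal learning reduces to two-stage least squares: one proxy is predicted from the action and the other proxy, then used in place of the hidden confounder. The data-generating process and variable names below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20_000

# Hidden context U confounds action A and outcome Y; W and Z are two
# independent noisy views (proxies) of U. True causal effect of A is 2.0.
U = rng.normal(size=n)
A = 0.8 * U + 0.6 * rng.normal(size=n)
W = U + 0.5 * rng.normal(size=n)
Z = U + 0.5 * rng.normal(size=n)
Y = 2.0 * A + 1.5 * U + 0.5 * rng.normal(size=n)

def ols(X, y):
    return np.linalg.lstsq(X, y, rcond=None)[0]

ones = np.ones(n)

# Naive regression of Y on A alone is confounded by U (biased upward).
naive = ols(np.column_stack([ones, A]), Y)[1]

# Stage 1: predict proxy W from the action A and the second proxy Z.
XZ = np.column_stack([ones, A, Z])
W_hat = XZ @ ols(XZ, W)
# Stage 2: regress Y on A and the predicted proxy; the coefficient on A
# recovers the average causal effect despite the hidden confounder.
ace = ols(np.column_stack([ones, A, W_hat]), Y)[1]

print(f"naive estimate: {naive:.2f}, proxy-adjusted estimate: {ace:.2f}")
```

The naive estimate lands well above the true effect of 2.0, while the proxy-adjusted estimate recovers it; the nonlinear neural-representation version described in the summary generalizes this two-stage idea beyond the linear setting.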
MBZUAI hosted a talk on causal AI, featuring Professor Jin Tian from Iowa State University. The talk covered enriching AI systems with causal reasoning capabilities, moving AI beyond prediction to understanding. Professor Tian shared research on causal inference and estimating causal effects from data, using a novel estimator with double/debiased machine learning (DML) properties. Why it matters: Causal AI can improve the explainability, robustness, and adaptability of AI systems, addressing limitations of purely statistical models.
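As a rough illustration of the double/debiased machine learning idea mentioned in the talk (not Professor Tian's estimator), the sketch below estimates a treatment effect in a partially linear model: nuisance models for the outcome and the treatment are fit with cross-fitting, and the effect comes from a residual-on-residual regression. The polynomial nuisance models and simulated data are simple stand-ins for the flexible ML models DML normally pairs with.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 10_000
theta = 1.5  # true treatment effect

# Partially linear model with nonlinear confounding through X.
X = rng.normal(size=n)
A = 0.7 * X**2 + rng.normal(size=n)          # treatment
Y = theta * A + X**2 + rng.normal(size=n)    # outcome

def poly(x):
    # Simple polynomial features standing in for an ML nuisance model.
    return np.column_stack([np.ones_like(x), x, x**2, x**3])

def fit_predict(x_tr, y_tr, x_te):
    beta, *_ = np.linalg.lstsq(poly(x_tr), y_tr, rcond=None)
    return poly(x_te) @ beta

# Cross-fitting with two folds: each half's nuisances are predicted by a
# model trained on the other half, avoiding overfitting bias.
half = n // 2
ry = np.empty(n)  # outcome residuals   Y - E[Y|X]
ra = np.empty(n)  # treatment residuals A - E[A|X]
for tr, te in [(slice(0, half), slice(half, n)), (slice(half, n), slice(0, half))]:
    ry[te] = Y[te] - fit_predict(X[tr], Y[tr], X[te])
    ra[te] = A[te] - fit_predict(X[tr], A[tr], X[te])

# Debiased effect estimate via residual-on-residual regression.
theta_hat = (ra @ ry) / (ra @ ra)
print(f"DML estimate of theta: {theta_hat:.3f}")
```

The residual-on-residual step is what gives DML its debiasing property: errors in the nuisance models enter the estimate only as a product of two small terms.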
MBZUAI researchers introduced CausalVerse, a new benchmark for causal representation learning (CRL) presented at NeurIPS 2025. CausalVerse combines high-fidelity visual complexity with access to underlying causal variables and graphs, featuring 200,000 images and 300 million video frames across 24 sub-scenes in four domains. It aims to provide a realistic and precise testbed to evaluate whether CRL methods can truly learn the right causes. Why it matters: By bridging the gap between toy datasets and real-world data, CausalVerse can drive advances in AI systems capable of understanding causality in complex scenarios.