GCC AI Research


Results for "observational data"

The Human Phenotype Project

MBZUAI ·

Professor Eran Segal presented The Human Phenotype Project, a longitudinal cohort study with over 10,000 participants. The project aims to identify molecular markers and develop disease prediction models through deep profiling of medical history, lifestyle, blood tests, and the microbiome. The study provides insights into drivers of obesity, diabetes, and heart disease, identifying novel markers at the microbiome, metabolite, and immune-system levels. Why it matters: Such large-scale phenotyping initiatives could inform personalized medicine approaches relevant to the Middle East's specific health challenges.

The complexities of identifying causality in the real world: A new study presented at ICML

MBZUAI ·

MBZUAI researchers presented a study at ICML 2024 examining how data aggregation distorts causal discovery. The study argues that current methods are misled because real-world interactions happen at a micro level while observations are aggregated over time. Using the example of ice cream sales and temperature, they show how aggregation introduces apparent "instantaneous causality" where time-lagged relationships actually exist. Why it matters: The research identifies a fundamental limitation in current causal discovery methods, potentially impacting any discipline that relies on accurate causal inference from observational data.
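The aggregation effect described above is easy to reproduce. The sketch below is a hypothetical simulation (not the paper's data or method): a variable drives another with a one-step lag at a fine time scale, and summing observations over coarse windows makes the relationship look contemporaneous.

```python
import numpy as np

rng = np.random.default_rng(0)

# Micro-level process: x (e.g., temperature) drives y (e.g., ice cream
# sales) with a one-step lag at a fine time resolution.
n = 10_000
x = rng.normal(size=n)
y = np.zeros(n)
y[1:] = 0.9 * x[:-1] + 0.1 * rng.normal(size=n - 1)

# At the micro scale the contemporaneous correlation is near zero and
# the lag-1 correlation is strong, so the time order is recoverable.
micro_same = np.corrcoef(x, y)[0, 1]
micro_lag = np.corrcoef(x[:-1], y[1:])[0, 1]

# Aggregate over windows of 100 steps. The lag vanishes inside each
# window, producing a strong "instantaneous" correlation.
k = 100
xa = x.reshape(-1, k).sum(axis=1)
ya = y.reshape(-1, k).sum(axis=1)
agg_same = np.corrcoef(xa, ya)[0, 1]

print(f"micro contemporaneous corr: {micro_same:.2f}")
print(f"micro lag-1 corr:           {micro_lag:.2f}")
print(f"aggregated contemporaneous: {agg_same:.2f}")
```

A causal discovery method applied to the aggregated series would see only the strong contemporaneous dependence, exactly the failure mode the study analyzes.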

Unlocking coronavirus' secrets through cellphone data and social media

KAUST ·

A KAUST research team is using cellphone mobility data, Google searches, and social media to model and predict COVID-19 spread. The models aim to forecast cases in the coming weeks and inform resource allocation, including hospital beds and medical staff. The team is using aggregated and anonymized data from cellphone companies to respect people's privacy. Why it matters: Integrating real-time digital data with epidemiological modeling can improve the speed and effectiveness of public health responses in the region and globally.

Award-winning algorithm aids observation

KAUST ·

KAUST researchers developed a machine learning algorithm to control a deformable mirror within the Subaru Telescope's exoplanet imaging camera, compensating for atmospheric turbulence. The algorithm, which computes a partial singular value decomposition (SVD), outperforms a standard SVD by a factor of four. The KAUST team received a best paper award at the PASC Conference for this work, which has already been deployed at the Subaru Telescope. Why it matters: This advancement enables sharper images of exoplanets, facilitating their identification and study, and showcases the impact of optimizing core linear algebra algorithms.
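The article does not detail the KAUST algorithm itself, but the advantage of a partial SVD is illustrated below on a random stand-in matrix (the matrix size and the use of SciPy's ARPACK-based `svds` are assumptions, not the deployed implementation): computing only the leading singular triplets a control loop needs is far cheaper than the full decomposition.

```python
import numpy as np
from scipy.sparse.linalg import svds

rng = np.random.default_rng(1)

# Stand-in for a mirror response matrix (hypothetical size); the real
# Subaru system and the KAUST algorithm are not reproduced here.
A = rng.normal(size=(1000, 300))

# Full SVD computes all min(m, n) singular triplets.
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Partial SVD computes only the k leading triplets, which is typically
# all a control loop needs and is much cheaper for large matrices.
kk = 10
Uk, sk, Vtk = svds(A, k=kk)

# svds returns singular values in ascending order; the k it finds
# should match the top of the full spectrum.
top_match = np.allclose(np.sort(sk)[::-1], s[:kk])
print(top_match)
```

For a mirror controller only the dominant turbulence modes matter, which is why truncating the decomposition can buy a large speedup without losing control fidelity.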

Exploring the night sky

KAUST ·

The KAUST Amateur Astronomy Association (AAA), led by Ph.D. student Daniel Corzo, uses telescopes to observe the night sky. The group organizes events to view celestial objects like Saturn and the Milky Way from locations with low light pollution. Corzo's interest in astronomy was sparked by visits to NASA's Johnson Space Center and science fiction literature. Why it matters: Such initiatives promote scientific curiosity and engagement within the KAUST community, potentially inspiring further interest in STEM fields in Saudi Arabia.

Scalable Community Detection in Massive Networks Using Aggregated Relational Data

MBZUAI ·

A new mini-batch strategy using aggregated relational data is proposed to fit the mixed membership stochastic blockmodel (MMSB) to large networks. The method uses nodal information and stochastic gradients over bipartite graphs for scalable inference. The approach was applied to a citation network with over two million nodes and 25 million edges, recovering interpretable community structure. Why it matters: This research enables more efficient community detection in massive networks, which is crucial for analyzing complex relationships in many domains, though the contribution here is methodological rather than region-specific.
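The core idea that makes mini-batch fitting scale is unbiased subsampling: a rescaled sum over a sampled batch of edges estimates the full-graph quantity, so gradient steps never need to touch all 25 million edges. The sketch below uses a toy random graph and a placeholder per-edge term, not the paper's MMSB likelihood.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy stand-in: random edges and placeholder per-edge objective terms,
# not the citation network or the MMSB likelihood from the paper.
n_nodes, n_edges = 5_000, 50_000
edges = rng.integers(0, n_nodes, size=(n_edges, 2))
weights = rng.random(n_edges)

def full_objective():
    # The exact full-graph sum -- too expensive at real-world scale.
    return weights.sum()

def minibatch_estimate(batch_size):
    # Sample a mini-batch of edges and rescale by the total edge count,
    # giving an unbiased estimate of the full sum. The same rescaling
    # makes stochastic gradients over edge mini-batches unbiased.
    idx = rng.integers(0, n_edges, size=batch_size)
    return weights[idx].mean() * n_edges

full = full_objective()
est = np.mean([minibatch_estimate(512) for _ in range(200)])
print(f"full: {full:.1f}, mini-batch avg: {est:.1f}")
```

Averaged over many batches the estimate concentrates around the exact value, which is why mini-batch inference converges to the same fit while processing only a sliver of the graph per step.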

Causal inference for climate change events from satellite image time series using computer vision and deep learning

arXiv ·

The paper proposes a method for causal inference from satellite image time series to determine the impact of interventions on climate change, focusing on quantifying deforestation due to human causes. The method uses computer vision and deep learning to detect forest tree coverage over time, and Bayesian structural causal models to estimate counterfactuals. The framework is applied to analyze deforestation in the Amazon rainforest before and after Brazil's hyperinflation event.
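The counterfactual logic can be sketched on synthetic data. The paper uses Bayesian structural causal models; the simplified stand-in below (with invented forest-cover values) fits only a linear trend on the pre-intervention period and projects it forward as the "no intervention" series, then reads the effect off the post-intervention gap.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic forest-cover series (hypothetical values): slow decline
# before the intervention, sharper decline after it.
t = np.arange(40)
t0 = 20  # intervention time
cover = 100 - 0.3 * t + rng.normal(0, 0.2, size=t.size)
cover[t0:] -= 1.5 * (t[t0:] - t0)  # extra post-intervention loss

# Fit a trend on pre-intervention data only, then project it forward
# as the counterfactual. This is a simplified sketch of the pre/post
# counterfactual logic, not the paper's Bayesian structural model.
coef = np.polyfit(t[:t0], cover[:t0], deg=1)
counterfactual = np.polyval(coef, t[t0:])

# Estimated causal effect: observed minus counterfactual, averaged
# over the post-intervention period.
effect = (cover[t0:] - counterfactual).mean()
print(f"estimated mean effect on cover: {effect:.2f}")
```

The estimated effect is strongly negative here because the post-intervention series falls well below the projected trend, mirroring how the paper attributes excess deforestation to the event.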

Building Planetary-Scale Collaborative Intelligence

MBZUAI ·

Sai Praneeth Karimireddy from UC Berkeley presented a talk on building planetary-scale collaborative intelligence, highlighting the challenges of using distributed data in machine learning due to data silos and ethical and legal restrictions. He proposed collaborative systems such as federated learning as a way to bring together distributed data while respecting privacy. The talk addressed the need for efficiency, reliability, and the management of divergent goals in these systems, suggesting tools from optimization, statistics, and economics. Why it matters: Collaborative AI systems can unlock valuable distributed data in the region, especially in sensitive sectors like healthcare, while ensuring privacy and addressing ethical concerns.
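Federated learning, the approach named in the talk, can be illustrated with a minimal federated averaging (FedAvg) loop on a toy regression task. The clients, data, and model below are illustrative assumptions, not material from the talk; the key property shown is that only model parameters cross client boundaries, never raw data.

```python
import numpy as np

rng = np.random.default_rng(4)

def local_update(w, X, y, lr=0.1, steps=10):
    # Each client runs a few gradient steps on its own private data.
    for _ in range(steps):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

# Five clients, each holding a private shard drawn from the same
# underlying linear model (toy setup).
w_true = np.array([1.5, -2.0])
clients = []
for _ in range(5):
    X = rng.normal(size=(100, 2))
    y = X @ w_true + 0.1 * rng.normal(size=100)
    clients.append((X, y))

w_global = np.zeros(2)
for _ in range(20):
    # Clients train locally in parallel; only parameters are shared.
    local_ws = [local_update(w_global, X, y) for X, y in clients]
    # The server aggregates by (here, uniform) averaging.
    w_global = np.mean(local_ws, axis=0)

print(f"recovered weights: {w_global.round(2)}")
```

The global model converges close to the true weights even though no client ever reveals its data, which is the privacy property that makes this family of systems attractive for sectors like healthcare.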