GCC AI Research


Results for "Tabular Data"

Guided Deep List: Automating the Generation of Epidemiological Line Lists from Open Sources

arXiv ·

The paper introduces Guided Deep List, a tool that automates the generation of epidemiological line lists from open-source outbreak reports. The tool uses distributed vector representations and dependency parsing to extract tabular data on disease outbreaks from unstructured text. Evaluated on MERS outbreak data from Saudi Arabia, it demonstrated improved accuracy over baseline methods and enabled epidemiological inferences.
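The core idea of guiding extraction with distributed vector representations can be sketched in a few lines: score candidate report tokens by embedding similarity to an indicator word for the line-list field being filled. The 3-dimensional vectors and vocabulary below are toy stand-ins, not the paper's actual embeddings.

```python
import math

# Toy distributed representations (hypothetical 3-d vectors; a real system
# would use pretrained embeddings such as word2vec).
VECTORS = {
    "hospitalized": [0.9, 0.1, 0.0],
    "admitted":     [0.85, 0.15, 0.05],
    "camel":        [0.0, 0.9, 0.1],
    "died":         [0.2, 0.1, 0.9],
}

def cosine(u, v):
    """Cosine similarity between two dense word vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def best_match(indicator, candidates):
    """Pick the report token whose embedding is closest to the indicator word."""
    return max(candidates, key=lambda w: cosine(VECTORS[indicator], VECTORS[w]))

# Guide extraction of the hospitalization field with the indicator "hospitalized".
print(best_match("hospitalized", ["admitted", "camel", "died"]))  # → admitted
```

In the actual tool, dependency parsing would then locate the phrase around the matched token (dates, counts) to populate the line-list cell.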

A graduate’s view on revealing invisible data

MBZUAI ·

MBZUAI graduate Svetlana Maslenkova worked with Assistant Professor Mohammad Yaqub on a project focused on the earlier detection of kidney failure using tabular data. Maslenkova's master's thesis involved predicting Acute Kidney Injury (AKI) using Electronic Health Records (EHR), specifically the MIMIC-IV v2.0 database. She found that patient weight distribution was a factor in the severity of kidney failure. Why it matters: This research highlights the potential of AI and machine learning to improve healthcare outcomes through the analysis of often-overlooked tabular data in electronic health records.
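Predicting AKI from tabular EHR features is, at its simplest, a supervised classification problem. The sketch below trains a plain logistic regression on made-up rows (weight and baseline creatinine); the values and labels are illustrative only and are not drawn from MIMIC-IV.

```python
import math

# Toy tabular EHR rows: (weight_kg / 100, baseline creatinine mg/dL) -> AKI label.
# Entirely fabricated values for illustration; not MIMIC-IV data.
rows = [
    ((0.62, 0.8), 0), ((0.95, 1.9), 1), ((0.70, 1.0), 0),
    ((1.10, 2.3), 1), ((0.58, 0.9), 0), ((1.02, 2.0), 1),
]

def sigmoid(z):
    """Numerically stable logistic function."""
    if z >= 0:
        return 1.0 / (1.0 + math.exp(-z))
    ez = math.exp(z)
    return ez / (1.0 + ez)

# Logistic regression fitted by stochastic gradient descent -- a minimal
# stand-in for the tabular models such a thesis would benchmark.
w = [0.0, 0.0]
b = 0.0
lr = 0.5
for _ in range(2000):
    for (x1, x2), y in rows:
        p = sigmoid(w[0] * x1 + w[1] * x2 + b)
        g = p - y                      # gradient of log-loss w.r.t. the logit
        w[0] -= lr * g * x1
        w[1] -= lr * g * x2
        b -= lr * g

def predict(weight_scaled, creatinine):
    """Return True if the model flags the patient as at risk of AKI."""
    return sigmoid(w[0] * weight_scaled + w[1] * creatinine + b) > 0.5

print(predict(1.05, 2.1), predict(0.60, 0.85))  # → True False
```

Scaling weight by 100 keeps both features on a similar range, which lets plain gradient descent converge without per-feature learning rates.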

Fact checking with ChatGPT

MBZUAI ·

A new paper from MBZUAI researchers explores using ChatGPT to combat the spread of fake news. The researchers, including Preslav Nakov and Liangming Pan, demonstrate that ChatGPT can be used to fact-check published information. Their paper, "Fact-Checking Complex Claims with Program-Guided Reasoning," was accepted at ACL 2023. Why it matters: This research highlights the potential of large language models to address the growing challenge of misinformation, with implications for maintaining information integrity in the digital age.

A Unified Deep Model of Learning from both Data and Queries for Cardinality Estimation

arXiv ·

This paper introduces a unified deep autoregressive model (UAE) for cardinality estimation that learns joint data distributions from both data and query workloads. It uses differentiable progressive sampling with the Gumbel-Softmax trick to incorporate supervised query information into the deep autoregressive model. Experiments show UAE achieves better accuracy and efficiency compared to state-of-the-art methods.
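The Gumbel-Softmax trick mentioned above replaces a hard categorical sample with a differentiable, temperature-controlled relaxation. A minimal sketch of the sampling step (values only, no autograd) looks like this:

```python
import math
import random

def gumbel_softmax(logits, temperature=0.5, rng=random):
    """Draw a relaxed one-hot sample from a categorical distribution.

    Gumbel noise makes the sample a deterministic function of the logits
    plus independent noise (the reparameterization), and the tempered
    softmax makes that function differentiable. As temperature -> 0 the
    output approaches a hard one-hot sample.
    """
    gumbels = [-math.log(-math.log(rng.random())) for _ in logits]
    z = [(l + g) / temperature for l, g in zip(logits, gumbels)]
    m = max(z)  # subtract the max for numerical stability
    exps = [math.exp(v - m) for v in z]
    s = sum(exps)
    return [e / s for e in exps]

random.seed(0)
sample = gumbel_softmax([2.0, 0.5, -1.0])
print(sample)  # a point on the probability simplex; entries sum to 1
```

In UAE this differentiability is what allows query-driven supervision to backpropagate through the progressive sampling step into the autoregressive model.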

Machine learning 101

MBZUAI ·

Machine learning (ML) algorithms use data to make decisions or predictions, improving over time as more data is provided. ML is a subset of AI, focused on models that learn from data, in contrast with rule-based systems. ML excels in scenarios where rules cannot be exhaustively specified, such as interpreting medical scans, though rule-based systems and ML often complement each other. Why it matters: This overview clarifies the role of machine learning within the broader field of AI, highlighting its data-driven approach and its advantages over traditional rule-based systems in complex decision-making scenarios.
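The rule-based vs. learned distinction fits in a few lines: a rule-based system hard-codes its threshold, while a minimal learner picks the threshold from labeled examples. The readings and labels below are made up for illustration.

```python
# Rule-based system: the cut-off is hand-written by an expert.
def rule_based(x):
    return x >= 10.0

# ML alternative: the cut-off is *learned* from labeled examples.
# Fabricated (reading, is_abnormal) training pairs.
data = [(3.0, False), (5.5, False), (6.9, False),
        (7.2, True), (9.0, True), (12.0, True)]

def learn_threshold(examples):
    """Pick the cut-off that misclassifies the fewest training examples."""
    candidates = sorted(x for x, _ in examples)
    def errors(t):
        return sum((x >= t) != y for x, y in examples)
    return min(candidates, key=errors)

t = learn_threshold(data)
print(t, rule_based(9.0), 9.0 >= t)  # the learned model flags a case the rule misses
```

This is the essence of the contrast in the article: the learned threshold adapts as more labeled data arrives, while the hand-written rule stays fixed until an expert revises it.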

Duet: efficient and scalable hybriD neUral rElation undersTanding

arXiv ·

The paper introduces Duet, a hybrid neural relation understanding method for cardinality estimation. Duet addresses limitations of existing learned methods, such as high costs and scalability issues, by incorporating predicate information into an autoregressive model. Experiments demonstrate Duet's efficiency, accuracy, and scalability, even outperforming GPU-based methods on CPU.

Explainable Fact Checking for Statistical and Property Claims

MBZUAI ·

EURECOM researchers developed data-driven verification methods that use structured datasets to assess statistical and property claims. For statistical claims, the approach translates the claim text into SQL queries over relational databases; for property claims, it uses knowledge graphs to verify the claim and generate explanations. Why it matters: These methods aim to support fact-checkers by efficiently labeling claims with interpretable explanations, potentially combating misinformation in the region and beyond.
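The claim-to-SQL idea can be sketched with Python's built-in sqlite3: a parsed statistical claim becomes a parameterized query, and the verdict comes with an explanation grounded in the retrieved row. The table, country names, and figures below are invented for illustration.

```python
import sqlite3

# Illustrative relational data; not a real dataset.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE countries (name TEXT, population INTEGER)")
conn.executemany("INSERT INTO countries VALUES (?, ?)",
                 [("Freedonia", 34_000_000), ("Sylvania", 8_000_000)])

def check_population_claim(country, threshold):
    """Verify 'country has population over threshold' via a SQL lookup."""
    row = conn.execute("SELECT population FROM countries WHERE name = ?",
                       (country,)).fetchone()
    if row is None:
        return "not verifiable"          # no supporting row -> cannot label
    return "true" if row[0] > threshold else "false"

print(check_population_claim("Freedonia", 30_000_000))  # → true
print(check_population_claim("Sylvania", 30_000_000))   # → false
```

The retrieved value (here `row[0]`) doubles as the interpretable evidence a fact-checker can cite alongside the label.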

The role of data-driven models in quantifying uncertainty

KAUST ·

KAUST Professor Raul Tempone, an expert in Uncertainty Quantification (UQ), has been appointed as an Alexander von Humboldt Professor at RWTH Aachen University in Germany. This professorship will enable him to further his research on mathematics for uncertainty quantification with new collaborators. Tempone believes the KAUST Strategic Initiative for Uncertainty Quantification (SRI-UQ) contributed to this award. Why it matters: This appointment enhances KAUST's visibility and facilitates cross-fertilization between European and KAUST research groups, benefiting both institutions and attracting talent.