GCC AI Research


Results for "generalized networks"

Temporally Evolving Generalised Networks

MBZUAI ·

Emilio Porcu from Khalifa University presented on temporally evolving generalized networks, where graphs evolve over time with changing topologies. The presentation addressed the challenges of constructing semi-metrics and isometric embeddings for these networks. The research combines kernel specification with network-based metrics and is illustrated on a traffic accident dataset. Why it matters: This work advances the application of kernel methods to dynamic graph structures, relevant for modeling evolving relationships across many domains.
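The generic pipeline the summary alludes to — derive a distance on the network, then build a kernel from it — can be sketched in a few lines. This uses a common textbook construction (shortest-path distance plus an exponential kernel), not the specific semi-metrics from the talk; note that positive definiteness is not guaranteed for kernels built from arbitrary graph distances.

```python
import numpy as np

def shortest_path_distances(A):
    """All-pairs shortest paths (Floyd-Warshall) on a weighted adjacency matrix."""
    n = A.shape[0]
    D = np.where(A > 0, A.astype(float), np.inf)  # missing edges -> infinity
    np.fill_diagonal(D, 0.0)
    for k in range(n):
        # relax every pair (i, j) through intermediate node k
        D = np.minimum(D, D[:, [k]] + D[[k], :])
    return D

# a 4-node path graph: 0 - 1 - 2 - 3
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]])
D = shortest_path_distances(A)
K = np.exp(-D / 1.0)  # exponential kernel with scale parameter theta = 1
```

For a temporally evolving network the same construction would be repeated per snapshot (or extended with a space-time distance), which is where the semi-metric and embedding questions from the talk arise.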

Provable Unrestricted Adversarial Training without Compromise with Generalizability

arXiv ·

This paper introduces Provable Unrestricted Adversarial Training (PUAT), a novel adversarial training approach. PUAT enhances robustness against both unrestricted and restricted adversarial examples while improving standard generalizability by aligning the distributions of adversarial examples, natural data, and the classifier's learned distribution. The approach uses partially labeled data and an augmented triple-GAN to generate effective unrestricted adversarial examples, demonstrating superior performance on benchmarks.

Programmable Networks for Distributed Deep Learning: Advances and Perspectives

MBZUAI ·

A presentation discussed using programmable network devices to reduce communication bottlenecks in distributed deep learning. It explored in-network aggregation and data processing to lower memory requirements and improve bandwidth utilization. The talk also covered gradient compression and the potential role of programmable NICs. Why it matters: Optimizing distributed deep learning infrastructure is critical for scaling AI model training in resource-constrained environments.
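The two ideas above can be illustrated with a toy sketch: an aggregator sums worker gradients (as an in-network switch would, so only one reduced tensor travels onward), and top-k sparsification sends only the largest-magnitude entries. The function names and the top-k scheme are illustrative choices, not the specific designs from the talk.

```python
import numpy as np

def aggregate(worker_grads):
    """Sum gradients across workers, as an in-network aggregator would."""
    return np.sum(worker_grads, axis=0)

def top_k_compress(grad, k):
    """Keep the k largest-magnitude entries, zeroing the rest."""
    idx = np.argsort(np.abs(grad))[-k:]
    sparse = np.zeros_like(grad)
    sparse[idx] = grad[idx]
    return sparse

# four workers, each holding an 8-dimensional gradient
grads = np.stack([np.arange(8, dtype=float) + w for w in range(4)])
summed = aggregate(grads)
compressed = top_k_compress(summed, k=2)
print(summed)  # [ 6. 10. 14. 18. 22. 26. 30. 34.]
```

Without in-network aggregation, each worker's full gradient crosses the network; with it, the switch emits a single summed tensor, and compression shrinks that tensor further at the cost of approximation error.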

Problems in network archaeology: root finding and broadcasting

MBZUAI ·

This article discusses a talk by Gábor Lugosi on "network archaeology," specifically the problems of root finding and broadcasting in large networks. The talk addresses discovering the past of dynamically growing networks when only a present-day snapshot is observed. Lugosi's research interests include machine learning theory, nonparametric statistics, and random structures. Why it matters: Understanding the evolution and origins of networks is crucial for various applications, including analyzing social networks, biological systems, and the spread of information.

Understanding Machine Learning on Graphs: From Node Classification to Algorithmic Reasoning

MBZUAI ·

Kimon Fountoulakis from the University of Waterloo presented a talk on machine learning on graphs, covering node classification and algorithmic reasoning. The talk discussed the limitations and strengths of graph neural networks (GNNs). It also covered novel optimal architectures for node classification and the ability of looped GNNs to execute classical algorithms. Why it matters: Understanding GNN capabilities is crucial for advancing AI applications in areas like recommendation systems and drug discovery that rely on relational data.
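A minimal message-passing layer illustrates the GNN computation the talk examined: each node averages its neighbors' features (plus its own), then applies a linear map and a nonlinearity. The mean-aggregation scheme and random weights below are generic illustrations, not the optimal architectures from the talk; iterating such a layer is the basic mechanism behind the "looped" GNNs that execute algorithm steps.

```python
import numpy as np

def gnn_layer(A, H, W):
    """One message-passing layer: mean aggregation over the neighborhood
    (with self-loops), followed by a linear map and ReLU."""
    A_hat = A + np.eye(A.shape[0])          # add self-loops
    deg = A_hat.sum(axis=1, keepdims=True)
    H_agg = (A_hat / deg) @ H               # average over neighborhood
    return np.maximum(H_agg @ W, 0.0)       # ReLU nonlinearity

rng = np.random.default_rng(0)
A = np.array([[0, 1, 1],
              [1, 0, 0],
              [1, 0, 0]], dtype=float)      # 3-node star graph
H = rng.normal(size=(3, 4))                 # node features
W = rng.normal(size=(4, 2))                 # layer weights (illustrative)
out = gnn_layer(A, H, W)
print(out.shape)  # (3, 2)
```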

Understanding networked systems

KAUST ·

Munther Dahleh, director of the MIT Institute for Data, Systems, and Society (IDSS), discussed his group's research on network systems at the KAUST 2018 Winter Enrichment Program. The research focuses on the fragility of large networked systems, like highway systems, in response to disruptions that may lead to catastrophic failures. Dahleh's team studies transportation networks, electrical grids, and financial markets to understand how interconnection between systems gives rise to systemic risk. Why it matters: Understanding networked systems is crucial for building resilient infrastructure and mitigating risks in critical sectors across the GCC region.

Beyond Attention: Orchid’s Adaptive Convolutions for Next-Level Sequence Modeling

MBZUAI ·

Orchid, a new neural network architecture, uses adaptive convolutions to achieve quasilinear computational complexity O(N log N) for sequence modeling. Orchid adapts its convolution kernel dynamically based on the input sequence. Evaluations across language modeling and image classification show that Orchid outperforms attention-based architectures like BERT and Vision Transformers, often with smaller model sizes. Why it matters: Orchid extends the feasible sequence length beyond the practical limits of dense attention layers, representing progress toward more efficient and scalable deep learning models.
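The core trick — applying a long, input-dependent convolution through the Fourier domain so the cost is O(N log N) rather than the O(N²) of dense attention — can be sketched in a few lines. This toy example derives the kernel from a single input statistic; Orchid's actual conditioning network is far more sophisticated, so treat this purely as an illustration of the mechanism.

```python
import numpy as np

def adaptive_fft_conv(x):
    """Toy sketch: derive a convolution kernel from the input itself,
    then apply it as a circular convolution via FFT in O(N log N).
    Illustrative only -- not the actual Orchid architecture."""
    n = len(x)
    width = 1.0 + np.abs(x).mean()        # input-dependent kernel parameter
    kernel = np.exp(-np.arange(n) / width)  # decaying kernel of length N
    # circular convolution = pointwise product in the Fourier domain
    return np.fft.ifft(np.fft.fft(x) * np.fft.fft(kernel)).real

x = np.random.default_rng(0).normal(size=1024)
y = adaptive_fft_conv(x)
print(y.shape)  # (1024,)
```

The FFT makes the kernel length effectively free: a global (sequence-length) convolution costs the same O(N log N) as a short one, which is what lets such models scale past attention's practical limits.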

Scalable Community Detection in Massive Networks Using Aggregated Relational Data

MBZUAI ·

A new mini-batch strategy using aggregated relational data is proposed to fit the mixed membership stochastic blockmodel (MMSB) to large networks. The method uses nodal information and stochastic gradients of bipartite graphs for scalable inference. The approach was applied to a citation network with over two million nodes and 25 million edges, capturing explainable structure. Why it matters: This research enables more efficient community detection in massive networks, which is crucial for analyzing complex relationships in various domains. (Note: this article has no clear connection to the Middle East.)
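The mini-batch idea — having each stochastic-gradient step touch only a small sample of node pairs rather than all O(n²) of them — can be sketched generically. The sampling scheme below (uniform edges plus random pairs treated as non-edges) is a standard illustration, not the authors' aggregated-relational-data construction.

```python
import numpy as np

rng = np.random.default_rng(0)

# toy graph as an edge list: 5,000 edges over 1,000 nodes (illustrative sizes)
n_nodes = 1000
edges = rng.integers(0, n_nodes, size=(5000, 2))

def sample_minibatch(edges, n_nodes, batch_edges=256, batch_nonedges=256):
    """Sample a mini-batch of observed edges plus random node pairs treated
    as non-edges, so each gradient step costs O(batch), not O(n^2)."""
    edge_idx = rng.choice(len(edges), size=batch_edges, replace=False)
    pos = edges[edge_idx]                                   # observed edges
    neg = rng.integers(0, n_nodes, size=(batch_nonedges, 2))  # candidate non-edges
    return pos, neg

pos, neg = sample_minibatch(edges, n_nodes)
print(pos.shape, neg.shape)  # (256, 2) (256, 2)
```

A noisy likelihood gradient computed on such a batch is then reweighted to be unbiased for the full-graph objective, which is what makes fitting models like the MMSB feasible on networks with millions of nodes.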