GCC AI Research

Results for "resource optimization"

Developing efficient algorithms to spread the benefits of AI

MBZUAI ·

MBZUAI PhD graduate William de Vazelhes is researching hard-thresholding algorithms that enable AI to learn from smaller datasets. His work focuses on optimization algorithms that enforce sparsity, keeping only the most informative components of the data, which is useful for saving energy and for deploying AI models on low-memory devices. He demonstrated that his approach can match the results of convex algorithms in many common settings. Why it matters: This research could broaden AI accessibility by reducing computational costs, and has potential applications in sectors like finance, particularly for portfolio management under budget constraints.
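Hard thresholding keeps only the k largest-magnitude coefficients of a vector, zeroing out the rest. A minimal sketch of iterative hard thresholding (IHT) for sparse least squares, shown only to illustrate the general technique, not de Vazelhes's specific algorithms:

```python
import numpy as np

def hard_threshold(x, k):
    """Keep the k largest-magnitude entries of x, zeroing the rest."""
    out = np.zeros_like(x)
    keep = np.argsort(np.abs(x))[-k:]
    out[keep] = x[keep]
    return out

def iht(A, y, k, iters=500):
    """IHT for min 0.5*||Ax - y||^2 subject to x having at most k nonzeros."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2   # safe step: 1 / (largest singular value)^2
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - y)             # gradient of the least-squares loss
        x = hard_threshold(x - step * grad, k)
    return x

# Recover a 3-sparse signal from 80 random linear measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((80, 100)) / np.sqrt(80)
x_true = np.zeros(100)
x_true[[5, 40, 77]] = [1.0, -2.0, 1.5]
y = A @ x_true
x_hat = iht(A, y, k=3)
```

The appeal for low-memory deployment is that the iterate never has more than k nonzero entries, so the model stays small throughout.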

Better Optimization Algorithms for Machine Learning

MBZUAI ·

Francesco Orabona of Boston University, who holds a PhD from the University of Genova, researches online learning, optimization, and statistical learning theory. He previously worked at Yahoo Labs and the Toyota Technological Institute at Chicago, and joined a panel discussion hosted by MBZUAI. Why it matters: Optimization algorithms are central to advancing machine learning and AI, and researchers like Orabona are helping shape the field.

Going under the hood to improve AI efficiency

MBZUAI ·

MBZUAI's computer science department, led by Xiaosong Ma, focuses on improving AI efficiency and sustainability by reducing wasted resources. Ma's background in high-performance computing informs her approach to optimizing AI workloads, and she aims to collaborate with experts across different AI domains at MBZUAI to address these challenges. Why it matters: Optimizing AI efficiency is crucial for reducing the environmental impact and computational costs of increasingly complex AI models, in the GCC region and globally.

A new strategy for complex optimization problems in machine learning presented at ICLR

MBZUAI ·

MBZUAI researchers presented a new strategy for handling complex optimization problems in machine learning at ICLR 2024. The study, a collaboration with ISAM, combines zeroth-order (derivative-free) methods with hard-thresholding to address specific settings in machine learning. The approach aims to improve convergence, ensuring that algorithms reach high-quality solutions efficiently. Why it matters: Improving optimization techniques is crucial for advancing machine learning models used in many applications, potentially accelerating development and enhancing performance.
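Zeroth-order methods estimate a gradient from function evaluations alone, which matters when gradients are unavailable or too expensive; combining the estimate with hard thresholding keeps the iterate sparse. A toy sketch under simple assumptions (a known quadratic objective, two-point random-direction estimates; this is illustrative, not the ICLR paper's algorithm):

```python
import numpy as np

def zo_gradient(f, x, rng, mu=1e-4, n_samples=20):
    """Two-point zeroth-order gradient estimate along random Gaussian directions."""
    g = np.zeros_like(x)
    for _ in range(n_samples):
        u = rng.standard_normal(len(x))
        g += (f(x + mu * u) - f(x - mu * u)) / (2 * mu) * u
    return g / n_samples

def hard_threshold(x, k):
    """Keep the k largest-magnitude entries of x, zeroing the rest."""
    out = np.zeros_like(x)
    keep = np.argsort(np.abs(x))[-k:]
    out[keep] = x[keep]
    return out

def zo_iht(f, d, k, step=0.05, iters=300, seed=1):
    """Zeroth-order hard-thresholding: query only f, never its gradient."""
    rng = np.random.default_rng(seed)
    x = np.zeros(d)
    for _ in range(iters):
        x = hard_threshold(x - step * zo_gradient(f, x, rng), k)
    return x

# Minimize f(x) = ||x - t||^2 where t is a 2-sparse target (hypothetical example).
t = np.zeros(20)
t[[3, 11]] = [2.0, -1.0]
f = lambda x: float(np.sum((x - t) ** 2))
x_hat = zo_iht(f, d=20, k=2)
```

Each iteration costs only function queries, so the same loop applies when the objective is a black box.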

New approaches for machine learning optimization presented at ICML

MBZUAI ·

MBZUAI and KAUST researchers collaborated to present new optimization methods at ICML 2024 for composite and distributed machine learning settings. The study addresses the challenge of training large models given growing data sizes and computational demands. Their work focuses on minimizing the loss function by adjusting a model's trainable parameters, using techniques such as gradient clipping. Why it matters: This research contributes to the ongoing advancement of machine learning optimization, crucial for improving the performance and efficiency of AI models in the region and globally.
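Gradient clipping rescales any gradient whose norm exceeds a threshold, which caps the update size and stabilizes training when gradients spike. A minimal sketch of global-norm clipping (a common variant; the paper's exact method may differ):

```python
import numpy as np

def clip_by_global_norm(grad, max_norm):
    """Rescale grad so its L2 norm is at most max_norm; small gradients pass through."""
    norm = np.linalg.norm(grad)
    return grad * (max_norm / norm) if norm > max_norm else grad

# One SGD step on f(x) = x.x: far from the optimum the raw gradient is huge,
# so clipping bounds the step length regardless of where training starts.
x = np.array([100.0, -100.0])
raw = 2 * x                                   # raw gradient, L2 norm ~ 283
clipped = clip_by_global_norm(raw, max_norm=1.0)
x_next = x - 0.1 * clipped                    # update length capped at 0.1
```

Because the clipped direction is unchanged and only the magnitude shrinks, descent still makes progress while avoiding the instability of an oversized step.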

KAUST and the Big Data age

KAUST ·

KAUST held a research workshop on Optimization and Big Data, gathering researchers to discuss challenges and opportunities in the field. Speakers presented novel optimization algorithms and distributed systems for handling large datasets. The workshop featured 20 speakers from KAUST, global universities, and Microsoft Research. Why it matters: The event highlights KAUST's role as a regional hub for advancing research and development in big data and optimization, crucial for AI and various computational fields.

Optimizing AI Systems through Cross-Layer Design: A Data-Centric Approach

MBZUAI ·

A Duke University professor presented a data-centric approach to optimizing AI systems by addressing the memory capacity and bandwidth bottleneck. The presentation covered collaborative optimization across algorithms, systems, architecture, and circuit layers. It also explored compute-in-memory as a solution for integrating computation and memory. Why it matters: Optimizing AI systems through a data-centric approach can improve efficiency and performance, critical for advancing AI applications in the region.