Technology Innovation Institute's (TII) Directed Energy Research Center (DERC) is integrating machine learning (ML) techniques into signal processing to accelerate research. One project used convolutional neural networks to predict COVID-19 pneumonia from chest X-rays with 97.5% accuracy. DERC researchers also demonstrated that ML-based signal and image processing can retrieve up to 68% of text information from electromagnetic emanations. Why it matters: This adoption of ML for signal processing at TII highlights the potential for advanced AI techniques to enhance research and security applications in the UAE.
In a recent talk at MBZUAI, Moncef Gabbouj of Tampere University discussed "Green Learning" and Operational Neural Networks (ONNs) as efficient alternatives to CNNs. ONNs replace the fixed multiply-and-sum of conventional neurons with learnable "nodal" and "pool" operators, and "generative neurons" further expand each neuron's learning capacity. Gabbouj also presented Self-Organized ONNs (Self-ONNs) and their signal processing applications. Why it matters: Exploring more efficient AI models is crucial for sustainable development of AI in the region, as it addresses computational resource constraints and promotes broader accessibility.
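The nodal/pool structure can be illustrated with a minimal sketch (an illustration of the general idea, not code from the talk): a conventional neuron is the special case where the nodal operator is multiplication and the pool operator is summation, and swapping in other operators yields a non-linear operational neuron.

```python
import numpy as np

def operational_neuron(x, w, nodal, pool):
    """Generalized neuron: pool over nodal(w_i, x_i) instead of sum(w_i * x_i)."""
    return pool(nodal(w, x))

x = np.array([0.5, -1.0, 2.0])
w = np.array([1.0, 0.3, -0.5])

# A conventional (linear) neuron is recovered with multiply + sum:
linear = operational_neuron(x, w, nodal=np.multiply, pool=np.sum)  # = w . x

# A non-linear operational neuron, e.g. a sinusoidal nodal operator
# with a median pool (operator choices here are arbitrary examples):
onn = operational_neuron(x, w, nodal=lambda w, x: np.sin(w * x), pool=np.median)
```

In Self-ONNs the nodal functions are not picked from a fixed library as above but learned during training, which is what the "self-organized" refers to.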
Researchers from MBZUAI have released MobiLlama, a fully transparent open-source 0.5 billion parameter Small Language Model (SLM). MobiLlama is designed for resource-constrained devices, emphasizing enhanced performance with reduced resource demands. The full training data pipeline, code, model weights, and checkpoints are available on GitHub. Why it matters: Fully transparent SLMs lower the barrier to on-device AI, letting researchers run and audit capable models on resource-constrained hardware.
Xiaolin Huang from Shanghai Jiao Tong University presented a talk at MBZUAI on training deep neural networks in tiny subspaces. The talk covered the low-dimension hypothesis in neural networks and methods to find subspaces for efficient training. It suggests that training in smaller subspaces can improve training efficiency, generalization, and robustness. Why it matters: Investigating efficient training methods is crucial for resource-constrained environments and can enable broader access to advanced AI.
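The core idea behind subspace training can be sketched as follows (a toy illustration of the general "train in a low-dimensional subspace" recipe, not the specific method from the talk): freeze a random initialization, pick a small random basis, and optimize only the low-dimensional coordinates.

```python
import numpy as np

rng = np.random.default_rng(0)

# Full parameter space: D dimensions; training subspace: d << D.
D, d = 100, 5
theta0 = rng.normal(size=D)                      # frozen initialization
P, _ = np.linalg.qr(rng.normal(size=(D, d)))     # orthonormal basis of a random subspace

# Toy objective: least squares, loss(theta) = ||A @ theta - b||^2
A = rng.normal(size=(50, D))
b = rng.normal(size=50)

def loss(theta):
    r = A @ theta - b
    return r @ r

# Optimize only z in R^d; theta = theta0 + P @ z stays on a 5-dim affine subspace.
z = np.zeros(d)
lr = 1e-3
for _ in range(500):
    theta = theta0 + P @ z
    grad_theta = 2 * A.T @ (A @ theta - b)       # gradient in the full space
    grad_z = P.T @ grad_theta                    # chain rule: project onto subspace
    z -= lr * grad_z
```

Only `d = 5` numbers are updated instead of `D = 100`, yet the loss still drops, which is the low-dimension hypothesis in miniature.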
The Communications and Computing Systems Lab (CCSL) at KAUST received two awards in the International Telecommunication Union AI for Good Machine Learning Challenge and tinyML Hackathon Challenge 2023: Pedestrian Detection. The KAUST team's solution achieved high accuracy in pedestrian identification using event-based cameras, while consuming less power and achieving lower latency. They also received an award for innovative use of "Edge Impulse" for building datasets and training models. Why it matters: This recognition highlights KAUST's growing influence in AI research, particularly in edge computing and computer vision applications for public safety.
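Event-based cameras output a sparse stream of per-pixel brightness-change events rather than full frames. A common preprocessing step for feeding such data to a detector, sketched below under the assumption of a simple (x, y, polarity) event format (the KAUST team's actual pipeline is not detailed in the source), is to accumulate events into a 2-D frame:

```python
import numpy as np

def events_to_frame(events, height, width):
    """Accumulate an event stream into a 2-D frame.

    events: array of (x, y, polarity) rows, polarity in {-1, +1}.
    Each event adds its polarity at its pixel; the result is a sparse,
    motion-sensitive image that a detector can consume. Static background
    produces no events, which is why the representation is cheap to process.
    """
    frame = np.zeros((height, width), dtype=np.float32)
    for x, y, p in events:
        frame[int(y), int(x)] += p
    return frame

events = np.array([
    [2, 1, +1],   # brightness increase at pixel (x=2, y=1)
    [2, 1, +1],
    [0, 3, -1],   # brightness decrease at pixel (x=0, y=3)
])
frame = events_to_frame(events, height=4, width=4)
```

Because only changing pixels generate events, downstream inference touches far less data than with a frame-based camera, which is one route to the low power and latency figures cited.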
Researchers at National Taiwan University are developing low-complexity neural network technologies using quantization to reduce model size while maintaining accuracy. Their work includes binary-weighted CNNs and transformers, along with a neural architecture search scheme (TPC-NAS) applied to image recognition, object detection, and NLP tasks. They have also built a processing-element (PE) based CNN/transformer hardware accelerator on a Xilinx FPGA SoC with a PyTorch-based software framework. Why it matters: This research provides practical methods for deploying efficient deep learning models on resource-constrained hardware, potentially enabling broader adoption of AI in embedded systems and edge devices.
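Binary-weight quantization in the XNOR-Net style gives a flavor of how such compression works (a generic sketch, not NTU's specific scheme): each real-valued weight tensor is approximated by its sign pattern times a single scale, shrinking storage from 32 bits to roughly 1 bit per weight.

```python
import numpy as np

def binarize_weights(W):
    """Approximate W ≈ alpha * sign(W), with alpha = mean(|W|).

    (XNOR-Net uses one alpha per output filter; a single global scale
    is used here for simplicity.) Weights shrink to 1 bit each plus one
    scale, and multiply-accumulate reduces to sign flips and additions.
    """
    alpha = np.mean(np.abs(W))
    return alpha * np.sign(W), alpha

W = np.array([[0.7, -0.2],
              [-1.1, 0.4]])
W_bin, alpha = binarize_weights(W)
```

Accuracy is retained by keeping full-precision weights during training and binarizing only in the forward pass, so the network learns to be robust to the quantization.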
MBZUAI researchers will present 20 papers at the 40th International Conference on Machine Learning (ICML) in Honolulu. Visiting Associate Professor Tongliang Liu leads with seven publications, followed by Kun Zhang with six. One paper investigates semi-supervised learning vs. model-based methods for noisy data annotation in deep neural networks. Why it matters: The research addresses the critical issue of data quality and accessibility in machine learning, particularly for organizations with limited resources for data annotation.
Qirong Ho, co-founder and CTO of Petuum Inc., will be contributing to the "ML Systems for Many" initiative. Petuum is recognized for creating standardized building blocks for AI assembly. Ho holds a Ph.D. from Carnegie Mellon University and is part of the CASL open-source consortium. Why it matters: This showcases ongoing efforts to democratize AI development and deployment, making it more accessible and sustainable, although the specific initiative is not further detailed.