A KAUST team led by Hossein Fariborzi won second place in the MEMS Design Contest for their "MEMS Resonator for Oscillator, Tunable Filter and Re-Programmable Logic Applications." The device is runtime-reprogrammable, allowing the function of each device in the circuit to be changed during operation. The KAUST team demonstrated that two MEMS resonators could replace more than 20 transistors in applications such as digital adders, substantially reducing circuit complexity. Why it matters: This innovation could significantly reduce power consumption, chip area, and manufacturing costs in microprocessors, advancing the development of energy-efficient microcomputers in the region.
Researchers from MBZUAI, the University of British Columbia, and Monash University have created LaMini-LM, a collection of small language models distilled from ChatGPT. LaMini-LM is trained on a dataset of 2.58M instructions, and its models are small enough to run on consumer laptops and mobile devices. The smaller models perform nearly as well as their larger counterparts while keeping data on-device rather than in the cloud. Why it matters: This work enables the deployment of LLMs in resource-constrained environments and enhances data security by reducing reliance on cloud-based LLMs.
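LaMini-LM itself is built by fine-tuning small models on instruction-response pairs generated by the ChatGPT teacher; the related classic variant of knowledge distillation, in which a student matches the teacher's temperature-softened output distribution, can be sketched as follows (a minimal illustration, not the LaMini-LM training code — all values are invented toy logits):

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax: higher T flattens the distribution,
    # exposing more of the teacher's "dark knowledge" about wrong classes.
    scaled = [z / T for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, T=2.0):
    # KL divergence between softened teacher and student distributions;
    # the T*T factor keeps gradient magnitudes comparable across temperatures.
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return T * T * sum(pi * (math.log(pi) - math.log(qi))
                       for pi, qi in zip(p, q))

teacher = [4.0, 1.0, -2.0]
print(distillation_loss(teacher, teacher))            # → 0.0 (perfect imitation)
print(distillation_loss([0.0, 0.0, 0.0], teacher) > 0)  # → True (uniform student is penalized)
```

Minimizing this loss over a training set pulls the small student's predictions toward the large teacher's, which is how compact models can recover much of a larger model's behavior.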
KAUST Professor Muhammad Mustafa Hussain is working to democratize electronics and make advanced technology accessible. His research focuses on creating flexible, stretchable, and reconfigurable electronics that are cost-effective and easy to use. Hussain also teaches a course at KAUST where students develop electronics solutions to everyday problems. Why it matters: This initiative could empower individuals globally by providing access to affordable and user-friendly electronic devices for various applications.
Dr. Laurent A. Lantieri delivered a keynote address at KAUST on April 17, 2017, discussing microsurgical procedures. The address included a brief history of microsurgery. The event took place in the University Auditorium. Why it matters: Such events expose the KAUST community to advances in specialized medical fields and potential research applications.
KAUST researchers presented their work on stabilizing nanoparticle catalysts at the 252nd American Chemical Society Meeting & Exposition. The team devised a "molecular Scotch tape": a silica gel support coated with a single-molecule-thick layer of soft, sulfur-containing material. Nanoparticles stick to one side of this layer while the other side remains free for catalysis, preventing aggregation without deactivating the catalyst. Why it matters: This innovation in catalyst stabilization could lead to more efficient and sustainable chemical processes, impacting various industries.
KAUST's Sciencetown podcast episode 23 features researcher Dana Al-Sulaiman discussing portable biosensing technologies for cancer detection. These devices aim to enable liquid biopsies for early screening and personalized treatment. The biosensors extract diagnostic information from biological samples to inform clinical decisions. Why it matters: This research can advance non-invasive diagnostics and personalized medicine in the region.
KAUST researchers are exploring novel chemical reactors and separation processes through mathematical design, focusing on time and shape variables to enhance transport phenomena such as heat and mass transfer. By combining design, modeling, and 3D printing, they create highly complex customized shapes using less material. This approach allows bespoke reactors and separation processes to be tailored to specific applications, improving efficiency and reducing energy consumption. Why it matters: This research demonstrates the potential of advanced manufacturing techniques to revolutionize industrial design in the Middle East's chemical and pharmaceutical sectors.
Xiaolin Huang from Shanghai Jiao Tong University presented a talk at MBZUAI on training deep neural networks in tiny subspaces. The talk covered the hypothesis that neural network training is effectively low-dimensional, along with methods for finding subspaces that support efficient training. Huang argued that training in such small subspaces can improve training efficiency, generalization, and robustness. Why it matters: Investigating efficient training methods is crucial for resource-constrained environments and can enable broader access to advanced AI.