Core42 (a G42 company) and MBZUAI have released Jais and Jais-chat, two open generative large language models (LLMs) with a focus on Arabic. The 13-billion-parameter models are based on the GPT-3 architecture and were pretrained on Arabic, English, and code, using a dataset of 126B tokens that includes 43B Arabic tokens. According to the developers, the models achieve state-of-the-art results on Arabic knowledge and reasoning benchmarks and competitive performance on English benchmarks. Why it matters: Jais represents a significant step forward for Arabic NLP, giving researchers and developers in the region a powerful open model.
This paper introduces PECon (Pulmonary Embolism Detection using Contrastive Learning), a supervised contrastive pretraining strategy that uses both CT scans and electronic health record (EHR) data to improve feature alignment between the two modalities for better pulmonary embolism (PE) diagnosis. PECon pulls features of same-class samples together while pushing apart features of other classes. The approach achieves state-of-the-art results on the RadFusion dataset, with an F1-score of 0.913 and an AUROC of 0.943.