GCC AI Research

BiMediX: Bilingual Medical Mixture of Experts LLM

arXiv · Significant research

Summary

MBZUAI researchers introduce BiMediX, a bilingual (English and Arabic) mixture-of-experts LLM for medical applications. The model is trained on BiMed1.3M, a new bilingual instruction-tuning dataset of 1.3 million samples, and outperforms existing models such as Med42 and Jais-30B on medical benchmarks. Code and models are available on GitHub.
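The summary names a mixture-of-experts architecture. As background, a sketch of the standard top-k expert routing such models typically use; this is illustrative only, with assumed expert counts and logits, not BiMediX's actual implementation:

```python
# Illustrative sketch of top-k expert routing in a mixture-of-experts layer.
# All names and values here are assumptions for demonstration, not drawn
# from the BiMediX paper or code.
import math

def softmax(xs):
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def route(token_logits, k=2):
    """Select the top-k experts for one token and renormalize their weights.

    Each token is processed by only k experts, so the model has many
    parameters but a much smaller per-token compute cost.
    """
    probs = softmax(token_logits)
    top = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)[:k]
    total = sum(probs[i] for i in top)
    return [(i, probs[i] / total) for i in top]

# One token's router logits over 8 hypothetical experts:
logits = [0.1, 2.0, -1.0, 0.5, 1.5, -0.3, 0.0, 0.2]
print(route(logits))  # two (expert_index, weight) pairs, weights summing to 1
```

The selected experts' outputs are then combined using these renormalized weights; the sparsity (k experts out of many) is what makes MoE models efficient at inference time.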


Related

BiMediX2: Bio-Medical EXpert LMM for Diverse Medical Modalities

arXiv

MBZUAI releases BiMediX2, a bilingual (Arabic-English) biomedical large multimodal model, along with the BiMed-V dataset (1.6M samples) and the BiMed-MBench evaluation benchmark. BiMediX2 supports multi-turn conversations in Arabic and English and handles diverse medical imaging modalities. The model achieves state-of-the-art results on medical LLM and LMM benchmarks, outperforming existing methods and, in some evaluations, GPT-4.