A Benchmark Study of Contrastive Learning for Arabic Social Meaning
arXiv
This paper presents a benchmark study of contrastive learning (CL) methods applied to Arabic social meaning tasks such as sentiment analysis and dialect identification. The study compares state-of-the-art supervised CL techniques against vanilla fine-tuning across a range of tasks. Results indicate that CL methods outperform vanilla fine-tuning in most cases and are more data-efficient. Why it matters: this work highlights the potential of contrastive learning for improving performance in Arabic NLP, especially in low-resource settings.
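To make "supervised contrastive learning" concrete, here is a minimal NumPy sketch of a supervised contrastive (SupCon-style) loss, in which all same-label samples in a batch serve as positives for an anchor. This is an illustrative implementation of the general technique, not the paper's actual code; the function name, temperature value, and toy data are assumptions for demonstration.

```python
import numpy as np

def supcon_loss(embeddings, labels, temperature=0.1):
    """Supervised contrastive loss over a batch of embeddings.

    For each anchor, positives are all *other* samples sharing its label;
    the loss pulls positives together and pushes other samples apart.
    """
    # L2-normalize so dot products are cosine similarities.
    z = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sim = z @ z.T / temperature                 # pairwise scaled similarities
    n = len(labels)
    logits_mask = ~np.eye(n, dtype=bool)        # exclude self-similarity

    # Numerically stable log-softmax over each anchor's non-self candidates.
    sim_max = np.max(np.where(logits_mask, sim, -np.inf), axis=1, keepdims=True)
    exp_sim = np.exp(sim - sim_max) * logits_mask
    log_prob = sim - sim_max - np.log(exp_sim.sum(axis=1, keepdims=True))

    # Positive pairs: same label, excluding the anchor itself.
    pos_mask = (labels[:, None] == labels[None, :]) & logits_mask
    pos_counts = pos_mask.sum(axis=1)

    # Mean log-probability of positives per anchor, averaged over anchors
    # that have at least one positive.
    per_anchor = -(log_prob * pos_mask).sum(axis=1) / np.maximum(pos_counts, 1)
    return per_anchor[pos_counts > 0].mean()

# Toy usage: well-separated classes yield a lower loss than shuffled ones.
labels = np.array([0, 0, 1, 1])
clustered = np.array([[1.0, 0.0], [1.0, 0.1], [-1.0, 0.0], [-1.0, 0.1]])
mixed = np.array([[1.0, 0.0], [-1.0, 0.0], [1.0, 0.1], [-1.0, 0.1]])
print(supcon_loss(clustered, labels) < supcon_loss(mixed, labels))
```

In vanilla fine-tuning, only a cross-entropy objective on class logits is used; a supervised CL objective like the one above additionally shapes the embedding space directly, which is one intuition for its data efficiency in low-resource settings.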