Introduction: This tutorial will introduce participants to the use of contrastive learning in developing robust word embeddings for NLP applications. While traditional methods, such as Word2Vec or GloVe, rely on static representations of words, contrastive learning offers a dynamic approach to capture nuanced, context-aware word meanings. The session will cover the fundamentals of contrastive learning, demonstrate its integration with word embeddings, and showcase its practical advantages in tasks such as information retrieval, semantic search, and text classification.
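To make the core idea concrete, here is a minimal sketch of the kind of contrastive objective (InfoNCE-style) the tutorial builds on: an anchor embedding is pulled toward a positive example and pushed away from negatives. The function names and the temperature value are illustrative choices, not part of the tutorial materials.

```python
import math

def cosine(u, v):
    # Cosine similarity between two embedding vectors.
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def info_nce_loss(anchor, positive, negatives, temperature=0.1):
    # Temperature-scaled similarities: positive pair first, then negatives.
    logits = [cosine(anchor, positive) / temperature]
    logits += [cosine(anchor, neg) / temperature for neg in negatives]
    # Softmax cross-entropy with the positive at index 0:
    # the loss is low when the anchor is most similar to its positive.
    max_logit = max(logits)                      # for numerical stability
    exps = [math.exp(l - max_logit) for l in logits]
    return -math.log(exps[0] / sum(exps))

# An anchor aligned with its positive and orthogonal to a negative
# yields a much lower loss than the reversed pairing.
low = info_nce_loss([1.0, 0.0], [1.0, 0.0], [[0.0, 1.0]])
high = info_nce_loss([1.0, 0.0], [0.0, 1.0], [[1.0, 0.0]])
```

In practice the session uses learned, context-aware embeddings and in-batch negatives rather than fixed toy vectors, but the objective has this same shape.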

Target Audience: Students, faculty, researchers, and practitioners in NLP and ML who are familiar with word embeddings and basic deep learning. The session is suitable for participants with intermediate experience in Python and a foundational understanding of representation learning.

Learning Outcomes: Understanding the fundamentals of contrastive learning in the context of word embeddings. Ability to implement and evaluate contrastive word embedding models. Familiarity with advanced topics in contrastive learning, enabling future research and application in NLP. A handout and code samples will be provided.