Adapting BERT for Word Sense Disambiguation with Gloss Selection Objective and Example Sentences

Authors: B. P. Yap, A. Koh, E. S. Chng

Published in: Findings of the Association for Computational Linguistics: EMNLP 2020

ACL Anthology: 2020.findings-emnlp.4

Abstract

This paper formulates word sense disambiguation (WSD) as a relevance ranking task: BERT is fine-tuned on context-gloss sequence pairs to select the most probable sense definition for a target word from among its candidate glosses. Training is further augmented with example sentences drawn from WordNet, and the resulting models achieve state-of-the-art results on English all-words WSD benchmarks.
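
The core idea can be illustrated with a short sketch. The snippet below is a minimal illustration of gloss selection, not the authors' released code: it pairs a context sentence with each candidate gloss, scores every pair with a BERT cross-encoder, and applies a softmax over the candidate set. The off-the-shelf bert-base-uncased checkpoint and the example sentence are assumptions for demonstration, so without the paper's fine-tuning the scores only show the mechanics.

```python
# Minimal sketch of the gloss selection setup, not the authors' released code.
# Assumes the Hugging Face transformers library and an off-the-shelf
# bert-base-uncased checkpoint; without the paper's fine-tuning, the scores
# below are meaningless and only illustrate the pairing-and-ranking mechanics.
import torch
from transformers import BertTokenizer, BertForSequenceClassification

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
# num_labels=1: the model emits one relevance score per context-gloss pair.
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=1
)
model.eval()

context = "He sat on the bank of the river and watched the water."
candidate_glosses = [
    "sloping land beside a body of water",            # riverbank sense
    "a financial institution that accepts deposits",  # financial sense
]

# Encode each (context, gloss) pair as a BERT sequence pair:
# [CLS] context [SEP] gloss [SEP]
batch = tokenizer(
    [context] * len(candidate_glosses),
    candidate_glosses,
    padding=True,
    truncation=True,
    return_tensors="pt",
)

with torch.no_grad():
    scores = model(**batch).logits.squeeze(-1)  # one score per candidate gloss

# Gloss selection: softmax over the candidate set, pick the top-scoring gloss.
probs = torch.softmax(scores, dim=0)
best = int(torch.argmax(probs))
print(f"predicted sense: {candidate_glosses[best]!r} (p={float(probs[best]):.2f})")
```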

Key Contributions

  • Novel adaptation of BERT for WSD, framed as relevance ranking over candidate glosses
  • Gloss selection objective for improved sense discrimination
  • Integration of WordNet example sentences to enhance context coverage (see the sketch after this list)
  • State-of-the-art results on English all-words WSD benchmarks
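
The example-sentence augmentation can likewise be sketched with NLTK's WordNet interface. The helper below, augment_with_examples, is a hypothetical name and not the authors' pipeline: it treats each WordNet example sentence as an extra context whose correct gloss is the owning synset's definition, with the word's other senses serving as negatives.

```python
# Hedged sketch of WordNet example-sentence augmentation; the function name
# augment_with_examples is hypothetical and this is not the authors' pipeline.
import nltk
from nltk.corpus import wordnet as wn

nltk.download("wordnet", quiet=True)

def augment_with_examples(lemma: str):
    """Yield (context, gloss, is_correct) triples built from the WordNet
    example sentences attached to each sense of `lemma`."""
    synsets = wn.synsets(lemma)
    for synset in synsets:
        for example in synset.examples():
            # The example sentence is a context whose correct gloss is this
            # synset's definition; every other sense provides a negative pair.
            for candidate in synsets:
                yield example, candidate.definition(), candidate is synset

for context, gloss, label in list(augment_with_examples("bank"))[:4]:
    print(label, "|", context, "|", gloss)
```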

Technologies & Methods

  • BERT (Bidirectional Encoder Representations from Transformers)
  • Natural language processing
  • Word sense disambiguation techniques
  • Transfer learning from pre-trained language models
  • Gloss-based learning objectives (a loss sketch follows this list)
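
As a rough sketch of such an objective, the gloss selection loss can be written as a cross-entropy over the relevance scores of one target word's candidate glosses. The function below assumes one score per (context, gloss) pair, matching the ranking formulation above, but is illustrative rather than the paper's exact implementation.

```python
# Illustrative gloss selection loss: cross-entropy over candidate scores.
# Assumes one relevance score per (context, gloss) pair; not the exact
# implementation from the paper.
import torch
import torch.nn.functional as F

def gloss_selection_loss(scores: torch.Tensor, gold_index: int) -> torch.Tensor:
    """Cross-entropy over the candidate gloss set of one target word.

    scores: shape (num_candidate_glosses,), one relevance score per
            (context, gloss) pair from the cross-encoder.
    gold_index: position of the correct gloss among the candidates.
    """
    return F.cross_entropy(scores.unsqueeze(0), torch.tensor([gold_index]))

# Toy example: three candidate senses, the second one is correct.
scores = torch.tensor([0.2, 1.5, -0.3], requires_grad=True)
loss = gloss_selection_loss(scores, gold_index=1)
loss.backward()  # gradients would flow into the scoring model during fine-tuning
print(float(loss))
```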

Research Impact

This work demonstrates how a pre-trained language model like BERT can be adapted effectively to word sense disambiguation through careful training-objective design and the use of lexical resources, namely WordNet glosses and example sentences.

Citation

B. P. Yap, A. Koh, and E. S. Chng, "Adapting BERT for word sense disambiguation with gloss selection objective and example sentences," in Findings of the Association for Computational Linguistics: EMNLP 2020, 2020. [Online]. Available: https://aclanthology.org/2020.findings-emnlp.4