Paper ID: 2112.07089

Building on Huang et al.'s GlossBERT for Word Sense Disambiguation

Nikhil Patel, James Hale, Kanika Jindal, Apoorva Sharma, Yichun Yu

We propose to take on the problem of Word Sense Disambiguation (WSD). In language, words of the same form can take on different meanings depending on context. While humans easily infer the meaning, or gloss, of such words from their context, machines stumble on this task. As such, we intend to replicate and expand upon the results of Huang et al.'s GlossBERT, a model they designed to disambiguate such words (Huang et al., 2019). Specifically, we propose the following augmentations: dataset tweaking (the alpha hyper-parameter), ensemble methods, and replacement of BERT with BART and ALBERT. The following GitHub repository contains all code used in this report, which extends the code made available by Huang et al.
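To make the GlossBERT formulation and the proposed encoder swap concrete, the sketch below (not the authors' code) shows how context-gloss pairs can be scored as a binary "does this gloss match?" classification task, with ALBERT substituted for BERT via the Hugging Face transformers library. The model name, example sentence, and glosses are illustrative assumptions; an untrained classification head is used, so real use would require fine-tuning on SemCor-style data as in GlossBERT.

```python
# A minimal sketch (assumptions noted above) of GlossBERT-style
# context-gloss pair scoring, with ALBERT swapped in for BERT.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL_NAME = "albert-base-v2"  # illustrative choice; any BERT-like encoder works here
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=2)

context = "He sat on the bank of the river."
candidate_glosses = [
    "bank: sloping land beside a body of water",
    "bank: a financial institution that accepts deposits",
]

# Each (context, gloss) pair is one binary classification instance:
# label 1 means the gloss matches the target word in this context.
inputs = tokenizer(
    [context] * len(candidate_glosses),
    candidate_glosses,
    padding=True,
    truncation=True,
    return_tensors="pt",
)
with torch.no_grad():
    logits = model(**inputs).logits  # shape: (num_glosses, 2)

# Pick the gloss with the highest "match" probability.
scores = torch.softmax(logits, dim=-1)[:, 1]
best = int(torch.argmax(scores))
print("Predicted gloss:", candidate_glosses[best])
```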

Submitted: Dec 14, 2021