Paper ID: 2210.14815

On the Curious Case of $\ell_2$ norm of Sense Embeddings

Yi Zhou, Danushka Bollegala

We show that the $\ell_2$ norm of a static sense embedding encodes information related to the frequency of that sense in the training corpus used to learn the sense embeddings. This finding extends to sense embeddings a relationship previously established for word embeddings. Our experimental results show that, despite its simplicity, the $\ell_2$ norm of a sense embedding is a surprisingly effective feature for several word-sense-related tasks such as (a) most frequent sense prediction, (b) Word-in-Context (WiC), and (c) Word Sense Disambiguation (WSD). In particular, we show that simply including the $\ell_2$ norm of a sense embedding as a feature in a classifier improves WiC and WSD methods that use static sense embeddings.

Submitted: Oct 26, 2022
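
The abstract's central idea is that the $\ell_2$ norm of a sense embedding carries frequency information and can be appended as an extra classifier feature. The sketch below is illustrative only and not the authors' implementation: it assumes a hypothetical dictionary `sense_emb` mapping sense IDs to static sense vectors, uses random placeholder data, and trains a logistic-regression WiC-style classifier on a cosine-similarity feature augmented with the two $\ell_2$ norms.

```python
# Minimal sketch (not the paper's code): augment a WiC-style classifier's
# features with the l2 norms of the two sense embeddings being compared.
import numpy as np
from sklearn.linear_model import LogisticRegression

def wic_features(sense_emb, sense_a, sense_b):
    """Feature vector for one instance: cosine similarity plus both l2 norms."""
    va, vb = sense_emb[sense_a], sense_emb[sense_b]
    cos = float(va @ vb / (np.linalg.norm(va) * np.linalg.norm(vb)))
    # Key observation from the abstract: the l2 norm itself reflects sense
    # frequency, so it is appended as an additional feature.
    return np.array([cos, np.linalg.norm(va), np.linalg.norm(vb)])

# Hypothetical toy data for illustration: random "sense embeddings" and labels.
rng = np.random.default_rng(0)
sense_emb = {f"s{i}": rng.normal(size=300) * rng.uniform(0.5, 2.0)
             for i in range(100)}
pairs = [(f"s{rng.integers(100)}", f"s{rng.integers(100)}") for _ in range(200)]
y = rng.integers(0, 2, size=200)  # placeholder binary WiC labels

X = np.stack([wic_features(sense_emb, a, b) for a, b in pairs])
clf = LogisticRegression().fit(X, y)  # classifier with norm features included
```

In practice the same three-feature construction would be applied to real static sense embeddings (e.g., from an existing sense-embedding resource) and gold WiC labels; the norm features are what distinguish this setup from a purely similarity-based classifier.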