Paper ID: 2308.01210

Global Hierarchical Neural Networks using Hierarchical Softmax

Jetze Schuurmans, Flavius Frasincar

This paper presents a framework in which hierarchical softmax is used to create a global hierarchical classifier. The approach is applicable to any classification task in which there is a natural hierarchy among the classes. We show empirical results on four text classification datasets. On all four datasets, hierarchical softmax improved on the regular softmax used in a flat classifier in terms of macro-F1 and macro-recall. On three out of the four datasets, hierarchical softmax also achieved higher micro-accuracy and macro-precision.
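To make the idea concrete, the sketch below shows one common way to implement a two-level hierarchical softmax as a global classifier: the leaf-class probability is factorized as P(leaf | x) = P(parent | x) * P(leaf | parent, x). This is a minimal illustration, not the authors' implementation; the class names, the two-level tree, and the PyTorch layer choices are assumptions for the example.

```python
import torch
import torch.nn as nn

class HierarchicalSoftmax(nn.Module):
    """Two-level hierarchical softmax over a fixed class hierarchy.

    Factorizes P(leaf | x) = P(parent | x) * P(leaf | parent, x),
    so the output is a proper distribution over all leaf classes.
    (Illustrative sketch only; not the paper's code.)
    """

    def __init__(self, hidden_dim: int, children_per_parent: list[int]):
        super().__init__()
        # One logit per parent node in the hierarchy.
        self.parent_logits = nn.Linear(hidden_dim, len(children_per_parent))
        # One small softmax head per parent, over that parent's leaf classes.
        self.child_logits = nn.ModuleList(
            [nn.Linear(hidden_dim, n) for n in children_per_parent]
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Log-probabilities over parents: shape (batch, n_parents).
        log_p_parent = torch.log_softmax(self.parent_logits(x), dim=-1)
        leaf_log_probs = []
        for p, child_head in enumerate(self.child_logits):
            # Log-probabilities over this parent's children, conditioned on x.
            log_p_child = torch.log_softmax(child_head(x), dim=-1)
            # log P(leaf) = log P(parent) + log P(leaf | parent).
            leaf_log_probs.append(log_p_parent[:, p:p + 1] + log_p_child)
        # Concatenate per-parent blocks into one distribution over all leaves.
        return torch.cat(leaf_log_probs, dim=-1)

# Example usage (hypothetical sizes): a text encoder producing 128-d features,
# with 3 parent categories containing 4, 2, and 5 leaf classes respectively.
if __name__ == "__main__":
    head = HierarchicalSoftmax(hidden_dim=128, children_per_parent=[4, 2, 5])
    features = torch.randn(8, 128)           # batch of encoded documents
    leaf_labels = torch.randint(0, 11, (8,)) # leaf index in concatenation order
    loss = nn.NLLLoss()(head(features), leaf_labels)
    loss.backward()
```

Because every leaf probability carries its parent's probability as a factor, the whole hierarchy is trained jointly from leaf-level labels with a single loss, which is what makes the resulting classifier "global" rather than a cascade of independently trained local classifiers.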

Submitted: Aug 2, 2023