Parameterized Hypercomplex Neural Networks
Parameterized hypercomplex neural networks (PHNNs) extend traditional neural networks by structuring their parameters according to hypercomplex number systems (such as quaternions), with the multiplication rules learned from data, enabling them to model relationships within multidimensional data more efficiently. Current research focuses on applying PHNNs to tasks including multimodal emotion recognition, image classification (especially medical imaging), and solving large-scale linear equations, often using architectures such as PHResNets and incorporating techniques like attention maps and parameterized hypercomplex multiplications. This approach shows promise for improving model accuracy while reducing parameter counts and computational cost compared to real-valued counterparts, with significant implications for applications that require efficient processing of high-dimensional data.
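The building block behind these models is the parameterized hypercomplex multiplication (PHM) layer, in which the weight matrix is assembled as a sum of Kronecker products between small learnable "algebra" matrices and weight blocks, so the layer stores roughly 1/n of the parameters of an equivalent real-valued layer for hypercomplex dimension n. Below is a minimal sketch of such a layer in PyTorch; the class name PHMLinear, the initialization scale, and the layer sizes are illustrative assumptions, not the exact implementation used in the papers listed here.

```python
import torch
import torch.nn as nn

class PHMLinear(nn.Module):
    """Sketch of a parameterized hypercomplex multiplication (PHM) linear layer.

    The weight is built as W = sum_i A_i (Kronecker) F_i, where the A_i are
    small n x n matrices that learn the algebra rules (n=4 can recover
    quaternion-like interactions) and the F_i hold the actual weight blocks.
    """
    def __init__(self, n, in_features, out_features):
        super().__init__()
        assert in_features % n == 0 and out_features % n == 0
        self.n = n
        # n learnable (n x n) matrices defining the multiplication rules.
        self.A = nn.Parameter(torch.randn(n, n, n))
        # n weight blocks of shape (out/n, in/n).
        self.F = nn.Parameter(
            torch.randn(n, out_features // n, in_features // n) * 0.02
        )
        self.bias = nn.Parameter(torch.zeros(out_features))

    def forward(self, x):
        # Assemble the full (out_features x in_features) weight as a
        # sum of Kronecker products, then apply it as a standard linear map.
        W = torch.stack(
            [torch.kron(self.A[i], self.F[i]) for i in range(self.n)]
        ).sum(dim=0)
        return x @ W.T + self.bias

# Example: a 4-dimensional (quaternion-like) PHM layer.
layer = PHMLinear(n=4, in_features=64, out_features=128)
y = layer(torch.randn(8, 64))  # -> shape (8, 128)
```

With n=4, the weight blocks contain about a quarter of the parameters of a real-valued nn.Linear of the same shape (plus a small overhead for the algebra matrices), which is where the efficiency gains mentioned above come from.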
Papers
Hypercomplex Multimodal Emotion Recognition from EEG and Peripheral Physiological Signals
Eleonora Lopez, Eleonora Chiarantano, Eleonora Grassucci, Danilo Comminiello
Attention-Map Augmentation for Hypercomplex Breast Cancer Classification
Eleonora Lopez, Filippo Betello, Federico Carmignani, Eleonora Grassucci, Danilo Comminiello
PHYDI: Initializing Parameterized Hypercomplex Neural Networks as Identity Functions
Matteo Mancanelli, Eleonora Grassucci, Aurelio Uncini, Danilo Comminiello