Parameterized Hypercomplex Neural Networks

Parameterized hypercomplex neural networks (PHNNs) extend traditional neural networks by using hypercomplex numbers (such as quaternions) as parameters, allowing them to model relationships within multidimensional data more efficiently. Current research applies PHNNs to tasks including multimodal emotion recognition, image classification (especially medical imaging), and the solution of large-scale linear equations, often using architectures such as PHResNets and incorporating techniques such as attention maps and parameterized hypercomplex multiplications. This approach shows promise in improving model accuracy and reducing computational cost relative to real-valued counterparts, with significant implications for applications that require efficient processing of high-dimensional data.
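The parameterized hypercomplex multiplication mentioned above is typically realized as a layer whose weight matrix is built as a sum of Kronecker products, W = Σᵢ Aᵢ ⊗ Fᵢ, where the small Aᵢ matrices learn the "multiplication rules" of an n-dimensional algebra (n = 4 recovers quaternion-like structure). A minimal NumPy sketch, with illustrative dimensions and variable names (not from any specific paper's code):

```python
import numpy as np

def phm_weight(A, F):
    # W = sum_i kron(A_i, F_i): assemble a (k x d) weight matrix from
    # n learnable "algebra rule" matrices A_i (n x n) and
    # n learnable blocks F_i ((k/n) x (d/n)).
    return sum(np.kron(Ai, Fi) for Ai, Fi in zip(A, F))

rng = np.random.default_rng(0)
n, k, d = 4, 64, 64                            # n = 4 mimics quaternion structure
A = rng.standard_normal((n, n, n))             # n^3 parameters for the algebra rules
F = rng.standard_normal((n, k // n, d // n))   # k*d/n parameters for the blocks
W = phm_weight(A, F)                           # shape (64, 64), like a dense layer

# Parameter comparison against a dense k x d layer:
phm_params = A.size + F.size   # n^3 + k*d/n = 64 + 1024 = 1088
dense_params = k * d           # 4096, i.e. roughly an n-fold reduction
```

The roughly 1/n parameter count (when k·d dominates n³) is the source of the computational savings the summary refers to; because the Aᵢ are learned rather than fixed, the network is not restricted to a predefined algebra such as the quaternions.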

Papers