Tensor Factorization
Tensor factorization is a family of techniques for decomposing multi-dimensional data (tensors) into lower-dimensional components, revealing underlying patterns and structure. Current research emphasizes efficient algorithms, such as PARAFAC2 and its variants, that handle large-scale, incomplete, and irregularly structured data, often incorporating constraints such as non-negativity and sparsity to improve interpretability. These methods are applied broadly, including in natural language processing, recommender systems, and scientific data analysis, where they provide data compression, imputation of missing values, and more interpretable models.
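To make the decomposition concrete, here is a minimal sketch of a rank-R CP (PARAFAC) factorization of a 3-way tensor via alternating least squares, written in plain NumPy. The helper names (`unfold`, `khatri_rao`, `cp_als`) are illustrative, not from any library; a production setting would typically use a dedicated package such as TensorLy, and PARAFAC2 itself adds further machinery for irregular modes that this sketch omits.

```python
import numpy as np

def unfold(T, mode):
    """Matricize tensor T along the given mode (C-order columns)."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def khatri_rao(A, B):
    """Column-wise Khatri-Rao product of A (I x R) and B (J x R) -> (I*J x R)."""
    I, R = A.shape
    J, _ = B.shape
    return (A[:, None, :] * B[None, :, :]).reshape(I * J, R)

def cp_als(T, rank, n_iter=200, seed=0):
    """Fit factor matrices A, B, C so that T[i,j,k] ~ sum_r A[i,r]B[j,r]C[k,r]."""
    rng = np.random.default_rng(seed)
    A, B, C = (rng.standard_normal((dim, rank)) for dim in T.shape)
    for _ in range(n_iter):
        # Solve each factor in turn, holding the other two fixed.
        A = unfold(T, 0) @ np.linalg.pinv(khatri_rao(B, C)).T
        B = unfold(T, 1) @ np.linalg.pinv(khatri_rao(A, C)).T
        C = unfold(T, 2) @ np.linalg.pinv(khatri_rao(A, B)).T
    return A, B, C

def reconstruct(A, B, C):
    """Rebuild the full tensor from its CP factors."""
    return np.einsum('ir,jr,kr->ijk', A, B, C)

if __name__ == "__main__":
    # Build an exactly rank-2 tensor, then recover its factors.
    rng = np.random.default_rng(1)
    A0, B0, C0 = (rng.standard_normal((d, 2)) for d in (4, 5, 6))
    T = reconstruct(A0, B0, C0)
    A, B, C = cp_als(T, rank=2)
    err = np.linalg.norm(reconstruct(A, B, C) - T) / np.linalg.norm(T)
    print(f"relative reconstruction error: {err:.2e}")  # should be near zero
```

Constrained variants mentioned above (non-negativity, sparsity) replace the unconstrained least-squares step with a projected or penalized update, trading some fit for interpretability.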
Papers
Geometry is All You Need: A Unified Taxonomy of Matrix and Tensor Factorization for Compression of Generative Language Models
Mingxue Xu, Sadia Sharmin, Danilo P. Mandic
Domain-Specific Retrieval-Augmented Generation Using Vector Stores, Knowledge Graphs, and Tensor Factorization
Ryan C. Barron, Ves Grantcharov, Selma Wanna, Maksim E. Eren, Manish Bhattarai, Nicholas Solovyev, George Tompkins, Charles Nicholas, Kim Ø. Rasmussen, Cynthia Matuszek, Boian S. Alexandrov