Matrix-Vector Product

Matrix-vector products are fundamental computations in numerous scientific and engineering fields, where the primary objective is efficient and accurate evaluation, especially for large-scale matrices. Current research emphasizes fast algorithms, often leveraging techniques such as the non-equispaced fast Fourier transform (NFFT), ANOVA decomposition, and Krylov subspace methods, to accelerate computations within various models, including physics-informed neural networks and support vector machines. These advances are crucial for improving the scalability and performance of applications ranging from large language model inference to solving complex integral equations and machine learning problems in high-dimensional spaces.
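To make the speedup concrete, here is a minimal, self-contained sketch (not drawn from any specific paper above) of the classic idea behind FFT-based fast matrix-vector products: a circulant matrix is diagonalized by the discrete Fourier transform, so its product with a vector can be computed in O(n log n) instead of the O(n^2) cost of a dense multiply. The matrix size and random data below are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 512
c = rng.standard_normal(n)  # first column of a circulant matrix C
x = rng.standard_normal(n)

# Dense O(n^2) product: build C explicitly, column j is c shifted down by j.
C = np.column_stack([np.roll(c, j) for j in range(n)])
y_dense = C @ x

# Fast O(n log n) product: C @ x is the circular convolution of c and x,
# which the DFT turns into an elementwise product in frequency space.
y_fft = np.fft.ifft(np.fft.fft(c) * np.fft.fft(x)).real

# Both routes give the same result up to floating-point round-off.
assert np.allclose(y_dense, y_fft)
```

The same diagonalize-then-multiply pattern underlies NFFT-accelerated products with kernel matrices, where the matrix is not stored explicitly at all.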

Papers