Sparsity Structure
Sparsity structure research focuses on identifying and leveraging the inherent sparsity—the presence of many zero or near-zero values—within data and models to improve efficiency and performance. Current research explores various sparsity patterns and their impact on model architectures like transformers and graph neural networks, employing techniques such as L1 regularization and structured pruning algorithms to achieve optimal sparsity levels. This work is significant because it leads to more efficient algorithms and models, reducing computational costs and improving scalability across diverse applications, including image processing, time series forecasting, and solving large-scale optimization problems.
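The two techniques named above can be illustrated concretely. Below is a minimal numpy sketch (function names and parameter values are illustrative, not from any cited paper): `soft_threshold` applies the proximal operator of the L1 penalty, which sets small weights exactly to zero (unstructured sparsity), while `prune_rows` zeroes entire rows of a weight matrix by smallest L2 norm, a simple form of structured pruning whose regular pattern is easier for hardware to exploit.

```python
import numpy as np

def soft_threshold(w, lam):
    # Proximal operator of the L1 penalty: shrinks each weight toward
    # zero by lam and sets any weight with |w| <= lam exactly to zero.
    return np.sign(w) * np.maximum(np.abs(w) - lam, 0.0)

def prune_rows(w, keep_fraction):
    # Structured pruning: keep only the rows (e.g. neurons/channels)
    # with the largest L2 norms and zero out the rest.
    norms = np.linalg.norm(w, axis=1)
    k = int(round(keep_fraction * w.shape[0]))
    keep = np.argsort(norms)[-k:]
    mask = np.zeros(w.shape[0], dtype=bool)
    mask[keep] = True
    pruned = w.copy()
    pruned[~mask] = 0.0
    return pruned

rng = np.random.default_rng(0)
w = rng.normal(size=(8, 4))
w_l1 = soft_threshold(w, 0.8)    # unstructured (element-wise) sparsity
w_struct = prune_rows(w, 0.5)    # structured (whole-row) sparsity
```

In practice the L1 shrinkage step would run inside a training loop (as in lasso-style optimization), and the pruned mask would be fixed and fine-tuned afterward; this sketch only shows the sparsity patterns each technique produces.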
Papers
Forecasting Irregularly Sampled Time Series using Graphs
Vijaya Krishna Yalavarthi, Kiran Madhusudhanan, Randolf Scholz, Nourhan Ahmed, Johannes Burchert, Shayan Jawed, Stefan Born, Lars Schmidt-Thieme
HighLight: Efficient and Flexible DNN Acceleration with Hierarchical Structured Sparsity
Yannan Nellie Wu, Po-An Tsai, Saurav Muralidharan, Angshuman Parashar, Vivienne Sze, Joel S. Emer