Multi-Scale Sparse
Multi-scale sparse methods aim to represent and process data efficiently by concentrating on the most informative features across multiple scales of resolution. Current research emphasizes algorithms and model architectures, such as sparse convolutional neural networks and transformer-based approaches, that exploit this sparsity to improve both efficiency and accuracy. These techniques are proving valuable across diverse fields, including image processing (e.g., inpainting, segmentation), point cloud compression and analysis, and the numerical solution of partial differential equations, where they reduce computational cost and improve robustness to noise. The resulting gains in efficiency and performance have significant implications for resource-constrained applications and large-scale data analysis.
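To make the core idea concrete, the sketch below builds a simple Laplacian-pyramid-style multi-scale decomposition and keeps only the largest-magnitude coefficients at each scale. It is a minimal illustration of the general "important features across scales" principle, not an implementation of any method from the papers listed below; the function names (downsample, sparsify, multiscale_sparse) and the keep_frac parameter are assumptions introduced here for clarity.

```python
import numpy as np

def downsample(x):
    """One coarser scale via 2x2 average pooling (assumes even height/width)."""
    h, w = x.shape
    return x.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def sparsify(x, keep_frac):
    """Zero all but the largest-magnitude fraction of entries."""
    k = max(1, int(keep_frac * x.size))
    thresh = np.sort(np.abs(x), axis=None)[-k]
    return np.where(np.abs(x) >= thresh, x, 0.0)

def multiscale_sparse(img, n_scales=3, keep_frac=0.05):
    """Return sparsified detail layers (fine to coarse) plus a dense coarsest layer.

    Assumes the image dimensions are divisible by 2**n_scales.
    """
    layers = []
    current = img
    for _ in range(n_scales):
        coarse = downsample(current)
        # Detail = what nearest-neighbour upsampling of the coarse scale misses.
        detail = current - np.kron(coarse, np.ones((2, 2)))
        layers.append(sparsify(detail, keep_frac))
        current = coarse
    layers.append(current)  # keep the coarsest approximation dense
    return layers

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    img = rng.standard_normal((64, 64))
    layers = multiscale_sparse(img)
    kept = sum(int(np.count_nonzero(layer)) for layer in layers)
    print(f"kept {kept} of {img.size} coefficients")
```

In this toy setting, most of the signal is summarized by a small fraction of coefficients spread over the detail scales plus a small dense coarsest approximation; sparse convolutional and transformer-based methods apply the same keep-only-the-important-coefficients idea to learned feature maps rather than fixed pyramid details.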
Papers
Generalizable multi-task, multi-domain deep segmentation of sparse pediatric imaging datasets via multi-scale contrastive regularization and multi-joint anatomical priors
Arnaud Boutillon, Pierre-Henri Conze, Christelle Pons, Valérie Burdin, Bhushan Borotikar
Sparse Deep Neural Network for Nonlinear Partial Differential Equations
Yuesheng Xu, Taishan Zeng