Relative Sparsity
Relative sparsity focuses on creating models whose parameters differ from a pre-existing reference model (e.g., a standard-of-care policy in healthcare or a baseline neural network) in only a minimal number of places. Current research emphasizes algorithms that achieve this "relative" sparsity while maintaining or even improving performance, often using techniques such as iterative momentum-based pruning or knowledge distillation within structured pruning frameworks applied to architectures like ResNet-50. The approach is particularly valuable in applications such as healthcare, where the changes a new model makes relative to an established policy must be interpretable and explainable, and in resource-constrained settings where model compression is essential for efficient deployment.
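One common way to encourage this kind of sparsity is to penalize the L1 norm of the deviation between the new parameters and the baseline's, so that most parameters stay exactly at their baseline values. The sketch below is illustrative rather than a specific published method: `soft_threshold_toward_baseline` is a hypothetical helper implementing the proximal (soft-thresholding) step for a penalty `lam * ||theta - theta_base||_1`, and the numbers are made up for demonstration.

```python
import numpy as np

def soft_threshold_toward_baseline(theta, theta_base, lam):
    """Proximal step for lam * ||theta - theta_base||_1.

    Shrinks each parameter toward the baseline value, and snaps the
    difference exactly to zero whenever it is at most lam, so only a
    few parameters end up differing from the baseline model.
    """
    diff = theta - theta_base
    # Standard soft-thresholding applied to the deviation, not to theta itself
    shrunk = np.sign(diff) * np.maximum(np.abs(diff) - lam, 0.0)
    return theta_base + shrunk

# Baseline (reference) parameters and an unconstrained update of them
theta_base = np.array([0.5, -1.0, 2.0, 0.0])
theta      = np.array([0.55, -1.8, 2.02, 0.4])

theta_new = soft_threshold_toward_baseline(theta, theta_base, lam=0.1)
# Deviations smaller than lam (0.05 and 0.02) snap back to the baseline;
# only the two large deviations survive, shrunk by lam.
n_changed = int(np.count_nonzero(theta_new - theta_base))
# theta_new -> [0.5, -1.7, 2.0, 0.3], n_changed -> 2
```

In practice this step would be interleaved with gradient updates on the task loss (proximal gradient descent), and `lam` directly trades off performance against how many parameters are allowed to differ from the reference model.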