Fixed Point
Fixed-point computation, the iterative process of finding a value that remains unchanged under a given transformation, is a fundamental concept with applications across diverse fields. Current research focuses on making fixed-point algorithms in neural networks more efficient and robust, particularly for quantization on low-power devices and for numerically stable methods in applications such as Gaussian smoothing and causal generative modeling. These advances are significant because they enable the deployment of complex models on resource-constrained platforms, improving the scalability of machine learning and other computational tasks.
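To make the core idea concrete, here is a minimal sketch of fixed-point iteration: repeatedly applying a map f until the value stops changing, i.e. until x satisfies f(x) = x. The function name `fixed_point` and the tolerance/iteration parameters are illustrative choices, not drawn from any of the papers below; the example uses the classic contraction map f(x) = cos(x).

```python
import math

def fixed_point(f, x0, tol=1e-10, max_iter=1000):
    """Iterate x_{n+1} = f(x_n) until successive values agree within tol."""
    x = x0
    for _ in range(max_iter):
        x_next = f(x)
        if abs(x_next - x) < tol:
            return x_next
        x = x_next
    raise RuntimeError("iteration did not converge within max_iter steps")

# cos is a contraction near its fixed point, so by the Banach
# fixed-point theorem the iteration converges to x* with cos(x*) = x*.
root = fixed_point(math.cos, 1.0)
```

Convergence of such iterations hinges on the map being a contraction near the solution; much of the research summarized above concerns making this kind of iteration stable and cheap enough for constrained hardware.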
Papers
Fixed Points in Cyber Space: Rethinking Optimal Evasion Attacks in the Age of AI-NIDS
Christian Schroeder de Witt, Yongchao Huang, Philip H. S. Torr, Martin Strohmeier
HERO: Hessian-Enhanced Robust Optimization for Unifying and Improving Generalization and Quantization Performance
Huanrui Yang, Xiaoxuan Yang, Neil Zhenqiang Gong, Yiran Chen