Binary Activation
Binary activation, the use of binary (0 or 1) values in place of continuous values for neural-network activations, is a key technique for improving energy efficiency and reducing computational cost. Current research focuses on optimizing binary activation functions, exploring novel thresholding methods and incorporating techniques such as dithering and quantization-aware training to mitigate the information lost in binarization. The approach is particularly relevant to resource-constrained settings such as mobile speech processing and edge computing, where it can substantially reduce the memory footprint and power consumption of deep learning models while preserving predictive accuracy.
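To make these ideas concrete, the sketch below shows one common way to train with a binary activation: a hard threshold in the forward pass combined with a straight-through estimator (STE) for the backward pass, plus optional uniform dithering before the threshold during training. This is a minimal PyTorch sketch of the general technique under stated assumptions, not the method of any particular paper; the class names, the learnable threshold, and the `dither_scale` amplitude are illustrative choices.

```python
import torch
import torch.nn as nn


class BinarizeSTE(torch.autograd.Function):
    """Hard threshold to {0, 1} in the forward pass; straight-through
    estimator (identity gradient, clipped far from the threshold) backward."""

    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        return (x > 0).to(x.dtype)

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        # The hard threshold has zero gradient almost everywhere, so the STE
        # passes gradients through only where the pre-activation is near the
        # threshold and clips them to zero elsewhere.
        return grad_output * (x.abs() <= 1).to(x.dtype)


class BinaryActivation(nn.Module):
    """Binary (0/1) activation with a learnable threshold and optional
    training-time dithering to reduce quantization-induced information loss."""

    def __init__(self, dither_scale: float = 0.1):  # amplitude is an assumption
        super().__init__()
        self.threshold = nn.Parameter(torch.zeros(1))
        self.dither_scale = dither_scale

    def forward(self, x):
        z = x - self.threshold
        if self.training and self.dither_scale > 0:
            # Uniform dither decorrelates the quantization error from the
            # input, one of the mitigation techniques mentioned above.
            z = z + self.dither_scale * (torch.rand_like(z) - 0.5)
        return BinarizeSTE.apply(z)


# Usage: outputs are exactly 0.0 or 1.0, so at inference each activation
# needs only a single bit and the layer reduces to comparisons.
act = BinaryActivation()
y = act(torch.randn(4, 8))
```

Because each activation occupies one bit rather than a 32-bit float and the forward pass reduces to comparisons, this style of training is what yields the memory-footprint and power savings noted above, while the STE and dithering are the levers for retaining predictive accuracy.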