Binary Activation

Binary activation, the use of binary (0 or 1) values instead of continuous values for neural network activations, is a key technique for improving energy efficiency and reducing computational cost. Current research focuses on optimizing binary activation functions, including novel thresholding methods and techniques such as dithering and quantization-aware training that mitigate the information loss introduced by binarization. The approach is particularly relevant for resource-constrained settings such as mobile speech processing and edge computing, where it can substantially reduce the memory footprint and power consumption of deep learning models while maintaining predictive accuracy.
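To make the quantization-aware training idea concrete, below is a minimal sketch of a binary (0/1) activation trained with a straight-through estimator in PyTorch. The zero threshold, the |x| ≤ 1 gradient window, and all names (`BinaryActivation`, `BinaryActLayer`) are illustrative assumptions, not the method of any particular paper summarized here.

```python
import torch
import torch.nn as nn


class BinaryActivation(torch.autograd.Function):
    """Hard 0/1 thresholding with a straight-through estimator (STE).

    Forward: y = 1 if x >= 0 else 0 (threshold at zero is an assumption).
    Backward: the non-differentiable step is replaced by an identity
    gradient, masked to |x| <= 1 to limit the forward/backward mismatch.
    """

    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        return (x >= 0).to(x.dtype)

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        # STE: pass the gradient through only near the threshold.
        return grad_output * (x.abs() <= 1).to(grad_output.dtype)


class BinaryActLayer(nn.Module):
    """Drop-in activation layer producing 0/1 outputs."""

    def forward(self, x):
        return BinaryActivation.apply(x)


if __name__ == "__main__":
    x = torch.randn(4, 8, requires_grad=True)
    y = BinaryActLayer()(x)
    y.sum().backward()
    print(y)       # tensor containing only 0s and 1s
    print(x.grad)  # nonzero only where |x| <= 1
```

During training the weights and gradients stay in full precision; only the activations are binarized, so at inference the layer outputs can be stored as single bits, which is where the memory and energy savings come from.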

Papers