BNN Layer
Binary Neural Networks (BNNs) drastically reduce the computational cost and memory footprint of deep learning models by representing weights and activations with a single bit each, replacing most multiply-accumulate operations with cheap bitwise XNOR and popcount operations. Because the sign function used for binarization has zero gradient almost everywhere, current research focuses on improving BNN accuracy through optimized gradient approximations (such as variants of the straight-through estimator), novel activation functions, and adaptive quantization thresholds, often building on architectures such as ResNet and MobileNet. These advances matter for deploying deep learning on resource-constrained devices such as embedded systems and mobile platforms, improving the efficiency of existing applications and enabling new possibilities in edge computing.
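The core mechanics can be sketched in a few lines: weights are binarized to {-1, +1} with a per-tensor scaling factor, and gradients are passed through the non-differentiable sign function via a clipped straight-through estimator. The sketch below is illustrative only; the function names and the XNOR-Net-style scaling `alpha = mean(|W|)` are assumptions, not the method of any specific paper listed here.

```python
import numpy as np

def binarize(w):
    """Binarize a real-valued tensor to {-1, +1}, scaled by
    alpha = mean(|w|) (an XNOR-Net-style choice; illustrative)."""
    alpha = np.abs(w).mean()
    # Map exact zeros to +1 so every entry is strictly +/-1.
    signs = np.sign(np.where(w == 0, 1.0, w))
    return alpha * signs, alpha

def ste_grad(w, upstream):
    """Clipped straight-through estimator: pass the upstream gradient
    through sign() unchanged, but zero it where |w| > 1."""
    return upstream * (np.abs(w) <= 1.0)

rng = np.random.default_rng(0)
W = rng.normal(size=(4, 3))          # real-valued "latent" weights
Wb, alpha = binarize(W)              # binary weights used in the forward pass
x = rng.normal(size=3)
y = Wb @ x                           # every entry of Wb is +alpha or -alpha
```

In training, the latent real-valued `W` is what the optimizer updates (using `ste_grad`), while the binarized `Wb` is what the forward pass actually uses; at deployment time only the 1-bit signs and the scalar `alpha` need to be stored.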
Papers
(19 papers listed, dated March 11, 2022 through October 15, 2024; paper titles and links were not preserved in this extract)