Discrete Activation
Discrete activation in neural networks replaces continuous neuron activations, and sometimes weights, with discrete values, aiming to improve efficiency, interpretability, and potentially robustness. Current research explores a range of architectures, including rule-based networks, spiking neural networks, and models trained with discrete backpropagation or the local reparameterization trick. This approach offers advantages in resource-constrained environments and supports more explainable AI models, with applications ranging from improved classification to spatio-temporal system modeling and graph neural networks.
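As a rough illustration of how such networks can be trained despite non-differentiable activations, the sketch below uses a straight-through estimator for a binary (sign) activation, one common form of discrete backpropagation. It assumes PyTorch; the class names BinaryActivation and BinaryMLP are illustrative and not drawn from any specific paper listed here.

```python
import torch
import torch.nn as nn


class BinaryActivation(torch.autograd.Function):
    """Sign activation with a straight-through gradient estimator (illustrative sketch)."""

    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        return torch.sign(x)  # discrete outputs in {-1, 0, +1}

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        # Straight-through estimator: pass the gradient where |x| <= 1, zero it elsewhere.
        return grad_output * (x.abs() <= 1).float()


class BinaryMLP(nn.Module):
    """Tiny MLP whose hidden layer produces a discrete code in the forward pass."""

    def __init__(self, in_dim=16, hidden=32, out_dim=2):
        super().__init__()
        self.fc1 = nn.Linear(in_dim, hidden)
        self.fc2 = nn.Linear(hidden, out_dim)

    def forward(self, x):
        h = BinaryActivation.apply(self.fc1(x))  # discrete hidden activations
        return self.fc2(h)


if __name__ == "__main__":
    model = BinaryMLP()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    x = torch.randn(8, 16)
    y = torch.randint(0, 2, (8,))
    loss = nn.functional.cross_entropy(model(x), y)
    loss.backward()  # gradients flow through the straight-through estimator
    opt.step()
```

Alternatives such as the local reparameterization trick instead treat the discrete activations as samples from a learned distribution and backpropagate through its continuous parameters.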