Discrete Activation

Discrete activation in neural networks replaces continuous neuron activations and/or weights with discrete values, aiming to improve efficiency, interpretability, and potentially robustness. Current research spans architectures such as rule-based networks and spiking neural networks, as well as training techniques such as discrete backpropagation and the local reparameterization trick. This approach offers advantages in resource-constrained environments and supports the development of more explainable AI models, with applications ranging from improved classification to spatio-temporal system modeling and graph neural networks.
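
To make the idea concrete, the sketch below shows one common form of discrete backpropagation: hidden activations are binarized to {-1, +1} in the forward pass, while gradients are passed through a straight-through estimator in the backward pass. This is an illustrative minimal example, not the method of any specific paper; the names `SignSTE` and `DiscreteMLP` and all dimensions are assumptions chosen for the sketch.

```python
import torch
import torch.nn as nn


class SignSTE(torch.autograd.Function):
    """Sign activation trained with a straight-through estimator (STE).

    Forward: map inputs to {-1, +1}.
    Backward: pass the gradient through unchanged where |x| <= 1 and
    zero it elsewhere (the usual "clipped" STE), since sign() itself
    has zero gradient almost everywhere.
    """

    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        return torch.where(x >= 0, torch.ones_like(x), -torch.ones_like(x))

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        return grad_output * (x.abs() <= 1).to(grad_output.dtype)


class DiscreteMLP(nn.Module):
    """Small MLP whose hidden activations are discrete (+1/-1)."""

    def __init__(self, d_in, d_hidden, d_out):
        super().__init__()
        self.fc1 = nn.Linear(d_in, d_hidden)
        self.fc2 = nn.Linear(d_hidden, d_out)

    def forward(self, x):
        h = SignSTE.apply(self.fc1(x))  # discrete hidden activations
        return self.fc2(h)


if __name__ == "__main__":
    model = DiscreteMLP(16, 32, 4)
    x = torch.randn(8, 16)
    loss = model(x).pow(2).mean()
    loss.backward()  # gradients reach fc1 despite the non-differentiable sign
    print(model.fc1.weight.grad.abs().mean())
```

The gradient clipping to [-1, 1] in the backward pass is a standard stabilization choice for straight-through estimators; other schemes (e.g., stochastic binarization or the local reparameterization trick mentioned above) differ mainly in how this surrogate gradient is defined.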

Papers