Paper ID: 2409.05905
Towards Narrowing the Generalization Gap in Deep Boolean Networks
Youngsung Kim
The rapid growth in the size and complexity of deep neural networks has sharply increased computational demands, challenging their efficient deployment in real-world scenarios. Boolean networks, constructed from logic gates, offer a hardware-friendly alternative that could enable more efficient implementation. However, their ability to match the performance of traditional networks has remained uncertain. This paper explores strategies to enhance deep Boolean networks with the aim of surpassing their traditional counterparts. We propose novel methods, including logical skip connections and spatiality-preserving sampling, and validate them on vision tasks using widely adopted datasets, demonstrating significant improvement over existing approaches. Our analysis shows how deep Boolean networks can maintain high performance while minimizing computational costs through 1-bit logic operations. These findings suggest that Boolean networks are a promising direction for efficient, high-performance deep learning models, with significant potential for advancing hardware-accelerated AI applications.
Submitted: Sep 6, 2024
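
The abstract names logic-gate networks, 1-bit operations, and logical skip connections without defining them, so the following minimal Python sketch is only an illustration of the general idea: a layer of 2-input logic gates over bit vectors, plus an assumed OR-based skip path. The gate set, wiring scheme, and skip operator here are assumptions for intuition, not the paper's actual method.

    # Illustrative sketch only: gate set, wiring, and the OR-based skip
    # connection below are assumptions, not the paper's definitions.
    import numpy as np

    rng = np.random.default_rng(0)

    def boolean_layer(x, gate_ids, wiring):
        """Apply 2-input logic gates to selected pairs of input bits.

        x        : (n_in,) array of 0/1 values
        gate_ids : (n_out,) gate index per output (0=AND, 1=OR, 2=XOR, 3=NAND)
        wiring   : (n_out, 2) indices of the two input bits feeding each gate
        """
        a, b = x[wiring[:, 0]], x[wiring[:, 1]]
        gates = np.stack([a & b, a | b, a ^ b, 1 - (a & b)])
        return gates[gate_ids, np.arange(len(gate_ids))]

    def boolean_block(x, layers):
        """Stack Boolean layers and combine the result with the block input
        via a bitwise OR, a hypothetical stand-in for a logical skip connection."""
        h = x
        for gate_ids, wiring in layers:
            h = boolean_layer(h, gate_ids, wiring)
        return h | x[: len(h)]

    # Example usage with random gates and wiring over 16-bit inputs.
    n = 16
    x = rng.integers(0, 2, size=n)
    layers = [
        (rng.integers(0, 4, size=n), rng.integers(0, n, size=(n, 2)))
        for _ in range(2)
    ]
    print(boolean_block(x, layers))

Because every operation is a 1-bit logic gate, a forward pass of such a block maps directly onto hardware logic; the learning and sampling strategies the paper proposes would determine how the gates and wiring are chosen.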