Paper ID: 2411.05861
Rethinking Deep Learning: Non-backpropagation and Non-optimization Machine Learning Approach Using Hebbian Neural Networks
Kei Itoh
Developing strong AI could provide a powerful tool for addressing social and scientific challenges. Neural networks (NNs), inspired by biological systems, have the potential to achieve this. However, the weight-optimization techniques based on error backpropagation are not observed in biological systems, raising doubts about current NN approaches. In this context, Itoh (2024) solved the MNIST classification problem without using objective functions or backpropagation; however, because no weight updates were performed, that approach does not qualify as machine learning. In this study, I develop a machine learning method that mimics biological neural systems by implementing Hebbian learning in NNs without backpropagation or optimization methods, use it to solve the MNIST classification problem, and analyze its outputs. Development proceeded in three stages. In the first stage, I applied the Hebbian learning rule to the MNIST character recognition algorithm of Itoh (2024), which yielded lower accuracy than non-Hebbian NNs and highlighted the limitations of conventional training procedures for Hebbian learning. In the second stage, I examined the properties of individually trained NNs using norm-based cognition, showing that an NN trained on a specific label responds strongly to that label. In the third stage, I created an MNIST character recognition program that uses vector norm magnitude as the classification criterion, achieving an accuracy of approximately 75%. This demonstrates that Hebbian learning NNs can recognize handwritten characters without objective functions, backpropagation, or optimization processes. Based on these results, developing a mechanism that uses norm-based cognition as a fundamental unit and then increasing its complexity to achieve indirect similarity cognition should help mimic biological neural systems and contribute to realizing strong AI.
Submitted: Nov 7, 2024
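
The abstract's combination of local Hebbian updates (no objective function, no backpropagation, no optimizer) with norm-based recognition across individually trained, per-label networks can be illustrated in outline. The sketch below is a minimal NumPy interpretation under stated assumptions, not the paper's implementation: the layer size HID_DIM, learning rate ETA, tanh activation, weight initialization, and the exact per-label training loop are all hypothetical choices made for illustration; only the ideas of a local Hebbian update and of picking the label whose network produces the largest output norm come from the abstract.

```python
import numpy as np

ETA = 1e-3      # assumed learning rate
IN_DIM = 784    # flattened 28x28 MNIST image
HID_DIM = 64    # assumed hidden-layer size

def init_weights(rng):
    # Small random initialization (assumed)
    return rng.normal(0.0, 0.01, size=(HID_DIM, IN_DIM))

def hebbian_update(W, x):
    """Local Hebbian step: strengthen weights in proportion to the
    co-activation of input x (pre) and output y (post).
    Forward pass only -- no error signal is propagated back."""
    y = np.tanh(W @ x)
    W += ETA * np.outer(y, x)
    return W

def train_label_network(images, rng):
    """Train one network exclusively on images of a single label."""
    W = init_weights(rng)
    for x in images:
        W = hebbian_update(W, x / 255.0)
    return W

def classify(x, label_networks):
    """Norm-based recognition: the per-label network whose output
    vector has the largest norm for input x gives the prediction."""
    x = x / 255.0
    norms = [np.linalg.norm(np.tanh(W @ x)) for W in label_networks]
    return int(np.argmax(norms))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy stand-in data; replace with per-label MNIST images.
    fake_images = {d: rng.random((100, IN_DIM)) * 255 for d in range(10)}
    nets = [train_label_network(fake_images[d], rng) for d in range(10)]
    print("predicted label:", classify(fake_images[3][0], nets))
```

Note that plain Hebbian updates can grow weights without bound; stabilizing variants such as Oja's rule or explicit weight normalization are common remedies, but the sketch omits them since the abstract does not specify how (or whether) the paper addresses this.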