Paper ID: 2302.03506

Dynamic Training of Liquid State Machines

Pavithra Koralalage, Ireoluwa Fakeye, Pedro Machado, Jason Smith, Isibor Kennedy Ihianle, Salisu Wada Yahaya, Andreas Oikonomou, Ahmad Lotfi

Spiking Neural Networks (SNNs) have emerged as a promising direction within Artificial Neural Networks (ANNs), attracting researchers' attention for their ability to mimic the human brain and process complex information with remarkable speed and accuracy. This research aimed to optimise the training of Liquid State Machines (LSMs), a recurrent SNN architecture, by identifying the most effective range of weights to assign within the SNN so as to minimise the difference between the desired and actual output. The experimental results showed that, using spike metrics and a range of candidate weights, the gap between the desired and actual output of the spiking neurons could be effectively reduced, improving the performance of the SNN. The findings were tested and confirmed with three different weight initialisation approaches, with the best results obtained using the Barabási-Albert random graph method.
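As a rough illustration of the best-performing initialisation mentioned above, the sketch below builds a reservoir weight matrix from a Barabási-Albert random graph. It is not the authors' implementation; the library choice (networkx/numpy), reservoir size, attachment parameter, and weight range are illustrative assumptions.

```python
# Minimal sketch (assumed, not from the paper): initialise an LSM reservoir's
# weight matrix with Barabasi-Albert connectivity and weights drawn from a
# chosen range. Parameters below are placeholders for illustration only.
import numpy as np
import networkx as nx


def barabasi_albert_weights(n_neurons=100, m=3, w_min=0.1, w_max=0.5, seed=0):
    """Return an (n_neurons x n_neurons) weight matrix whose connectivity
    follows a Barabasi-Albert preferential-attachment graph and whose
    non-zero weights are sampled uniformly from [w_min, w_max]."""
    rng = np.random.default_rng(seed)
    graph = nx.barabasi_albert_graph(n_neurons, m, seed=seed)
    weights = np.zeros((n_neurons, n_neurons))
    for u, v in graph.edges():
        w = rng.uniform(w_min, w_max)
        weights[u, v] = w
        weights[v, u] = w  # BA graphs are undirected, so mirror the weight
    return weights


if __name__ == "__main__":
    W = barabasi_albert_weights()
    print("non-zero synapses:", np.count_nonzero(W))
```

Sweeping `w_min`/`w_max` over several candidate ranges and comparing the resulting spike trains against the target output is one simple way to search for the kind of "most effective weight range" the abstract describes.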

Submitted: Feb 6, 2023