This study investigates columnar-organized spiking neural networks (SNNs) for continual learning and catastrophic forgetting. Using CoLaNET (Columnar Layered Network), we show that microcolumns adapt most efficiently to new tasks when they lack shared structure with prior learning. We demonstrate how CoLaNET hyperparameters govern the trade-off between retaining old knowledge (stability) and acquiring new information (plasticity). Our optimal configuration learns ten sequential MNIST tasks effectively, maintaining 92% accuracy on each. It shows low forgetting, with only 4% performance degradation on the first task after training on nine subsequent tasks.
The architecture incorporates concepts from biology, such as local learning and neuronal competition, to enhance its continual learning capabilities.
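To make the columnar idea concrete, below is a minimal Python sketch of one plausible stability/plasticity scheme: each new task receives a fresh, trainable microcolumn while columns trained on earlier tasks are frozen. The class names (`Microcolumn`, `ColumnarNet`), the one-column-per-task gating policy, and the delta-rule update are illustrative assumptions, not CoLaNET's actual mechanism.

```python
import numpy as np

class Microcolumn:
    """Toy microcolumn: a small weight block dedicated to one task."""
    def __init__(self, n_in, n_out, rng):
        self.w = rng.normal(0.0, 0.1, size=(n_out, n_in))
        self.frozen = False   # frozen columns preserve old knowledge

class ColumnarNet:
    """Illustrative columnar network: one microcolumn gated per task."""
    def __init__(self, n_in, n_out, seed=0):
        self.rng = np.random.default_rng(seed)
        self.n_in, self.n_out = n_in, n_out
        self.columns = {}     # task id -> Microcolumn

    def begin_task(self, task_id):
        # Plasticity: allocate a fresh, trainable column for the new task.
        # Stability: freeze all earlier columns so old tasks stay untouched.
        for col in self.columns.values():
            col.frozen = True
        self.columns[task_id] = Microcolumn(self.n_in, self.n_out, self.rng)

    def forward(self, x, task_id):
        # Gating: only the column assigned to this task responds.
        return self.columns[task_id].w @ x

    def local_update(self, x, target, task_id, eta=0.01):
        col = self.columns[task_id]
        if col.frozen:
            return            # no writes to frozen (old-task) columns
        y = col.w @ x
        col.w += eta * np.outer(target - y, x)   # simple delta rule

net = ColumnarNet(n_in=784, n_out=10)
net.begin_task("task_0")      # task_0's column is plastic
net.begin_task("task_1")      # task_0 is now frozen; task_1 is plastic
```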
This paper employs the following methods (a toy sketch combining them follows the list):
- Spiking Neural Networks
- Local Learning
- Neuronal Competition
- Modulated Plasticity
- Gating Mechanisms
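The sketch below combines these ingredients in miniature: leaky integrate-and-fire dynamics, a purely local Hebbian-style weight update, hard winner-take-all competition, and a modulation factor that gates whether plasticity is applied. All constants (`tau`, `v_th`, `eta`) and the specific update rule are illustrative assumptions, not the paper's equations.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_out = 100, 10
w = rng.uniform(0.0, 0.5, size=(n_out, n_in))   # illustrative init
v = np.zeros(n_out)                              # membrane potentials

tau, v_th = 20.0, 1.0                            # illustrative constants
eta = 0.01                                       # local learning rate

for step in range(200):
    x = (rng.random(n_in) < 0.1).astype(float)   # Poisson-like input spikes

    # Leaky integrate-and-fire dynamics (one Euler step).
    v += (-v + w @ x) / tau

    # Neuronal competition: hard winner-take-all -- at most one neuron,
    # the one with the highest supra-threshold potential, may fire.
    winner = np.argmax(v)
    fired = v[winner] >= v_th

    # Modulated plasticity with gating: a third factor m (e.g. a label-
    # or reward-derived signal) gates whether the local update is applied.
    m = 1.0  # placeholder modulation; would come from a task/label signal
    if fired:
        # Local Hebbian-style update: uses only pre- and postsynaptic
        # activity at this synapse, scaled by the modulation factor.
        w[winner] += eta * m * (x - w[winner])
        w[winner] = np.clip(w[winner], 0.0, 1.0)
        v[winner] = 0.0                          # reset after the spike
```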
The following datasets were used in this research:
- MNIST (presented as ten sequential tasks, including Permuted MNIST)

The following evaluation metrics were used (standard definitions are sketched after this list):
- Accuracy
- Forgetting Measure (FM)
- Average Accuracy (AA)
- Average Incremental Accuracy (AIA)
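For reference, here is a sketch of the standard continual-learning definitions of these metrics, computed from a matrix of per-task accuracies; the paper's exact formulas may differ in detail.

```python
import numpy as np

def cl_metrics(acc):
    """Continual-learning metrics from an accuracy matrix.
    acc[t, i] = accuracy on task i after training on tasks 0..t
    (entries with i > t are unused). Definitions follow common
    usage in the continual-learning literature."""
    T = acc.shape[0]

    # Average Accuracy (AA): mean accuracy over all tasks after
    # training on the final task.
    aa_final = acc[T - 1, :T].mean()

    # Average Incremental Accuracy (AIA): mean of AA after each task.
    aa_per_step = [acc[t, : t + 1].mean() for t in range(T)]
    aia = float(np.mean(aa_per_step))

    # Forgetting Measure (FM): for each earlier task, the drop from its
    # best past accuracy to its final accuracy, averaged over tasks.
    fm = float(np.mean([acc[i:T - 1, i].max() - acc[T - 1, i]
                        for i in range(T - 1)]))
    return aa_final, aia, fm

# Toy 3-task example: task 0 is learned to 0.95, then dips slightly.
acc = np.array([[0.95, 0.00, 0.00],
                [0.93, 0.94, 0.00],
                [0.91, 0.92, 0.95]])
aa, aia, fm = cl_metrics(acc)   # aa≈0.927, aia≈0.937, fm=0.03
```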
The paper reports the following results:
- Maintained 92% accuracy across ten sequential MNIST tasks
- Only 4% performance degradation on the first task after learning nine subsequent tasks
The experiments used the following compute resources:
- Number of GPUs: 2
- GPU Type: Quadro RTX 6000
- Compute Requirements: Training on Permuted MNIST tasks takes 1h 20m for 15 microcolumns and 3h for 45 microcolumns.