Catastrophic Forgetting
Catastrophic forgetting occurs when a neural network trained on new data abruptly loses performance on tasks it learned earlier. Common causes include:
- updating all weights in the model during re-training
- sequential learning where old data is no longer available
- overfitting to the new data, which overwrites previously learned representations
There are many continual learning strategies that counteract catastrophic forgetting. These include selective weight freezing to preserve critical knowledge, experience replay techniques that mix historical data with new training data, and architecture-based approaches such as progressive neural networks, which add new layers (or columns of layers) for each new task while keeping earlier ones fixed. These methods can be applied individually or in combination for more robust learning over time; a sketch combining the first two appears below.
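As a minimal sketch of weight freezing plus experience replay, assuming a PyTorch setup: the early layers are frozen so their weights cannot drift, and a small buffer of old examples is replayed alongside the new training data. The model architecture, data shapes, and hyperparameters here are illustrative placeholders, not taken from the cited source.

```python
import torch
import torch.nn as nn
from torch.utils.data import ConcatDataset, DataLoader, TensorDataset

# Hypothetical model: early feature layers plus a task head (shapes are illustrative).
model = nn.Sequential(
    nn.Linear(20, 64), nn.ReLU(),  # early layers: freeze to preserve old knowledge
    nn.Linear(64, 10),             # task head: left trainable for the new data
)

# Selective weight freezing: stop gradient updates for the early layer.
for param in model[0].parameters():
    param.requires_grad = False

# Experience replay: mix a stored subset of old examples with the new data.
replay_buffer = TensorDataset(torch.randn(200, 20), torch.randint(0, 10, (200,)))
new_data = TensorDataset(torch.randn(1000, 20), torch.randint(0, 10, (1000,)))
loader = DataLoader(ConcatDataset([replay_buffer, new_data]), batch_size=32, shuffle=True)

# Optimize only the parameters that remain trainable.
optimizer = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad), lr=1e-3
)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(3):
    for x, y in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        optimizer.step()
```

In practice the replay buffer would hold real examples sampled from earlier training data, and the choice of which layers to freeze depends on where the model stores task-general versus task-specific knowledge.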
Source: https://link.springer.com/article/10.1007/s11063-024-11709-7