How can I incrementally train a YOLO model without catastrophic forgetting?


I have successfully trained a YOLO model to recognize $k$ classes. Now I want to add a $(k+1)$-th class on top of the pre-trained weights ($k$ classes) without forgetting the previous $k$ classes. Ideally, I want to keep adding classes and training over the previous weights, i.e., training only on each new class. If I have to retrain on all $k+1$ classes every time a class is added, it becomes too time-consuming: retraining $k$ classes would take $k \times 20000$ iterations, versus $20000$ iterations per new class if I can add classes incrementally.

The dataset is balanced (5000 training images per class).

I would appreciate any methods or techniques for doing this kind of continual training with YOLO.

Troy

Posted 2019-07-16T13:30:52.173

Reputation: 83

A general rule of thumb to avoid forgetting is to use a low learning rate – 0x5050 – 2019-07-16T17:19:03.170

@PradipPramanick, but a lower learning rate can hurt prediction accuracy on the new classes added later, right? – Troy – 2019-07-18T16:19:05.257
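As an illustration of the comment's suggestion, here is a minimal PyTorch-style sketch of conservative fine-tuning: freeze the backbone and train the remaining layers with a low learning rate so the weights stay close to the old solution. The toy model, the layer names, and the learning rate are assumptions for illustration; a real YOLO network is structured differently:

```python
import torch
import torch.nn as nn

# Toy stand-in for a detector: a "backbone" plus a detection "head".
# (Illustrative only; a real YOLO model is far more complex.)
model = nn.Sequential()
model.add_module("backbone", nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU()))
model.add_module("head", nn.Conv2d(16, 5, 1))

# Freeze the backbone so only the head adapts to the new class.
for name, param in model.named_parameters():
    if name.startswith("backbone"):
        param.requires_grad = False

# A learning rate one or two orders of magnitude below the original
# training rate limits how far the remaining weights drift.
optimizer = torch.optim.SGD(
    (p for p in model.parameters() if p.requires_grad),
    lr=1e-4, momentum=0.9,
)
```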

Answers


There's something called Elastic Weight Consolidation (EWC) that prevents neural networks from forgetting previous tasks as they train on new ones. It might be helpful in your case too.

The main idea is to quantify how important each parameter is for task $t$, and to penalize the model in proportion to that importance when it changes those parameters while training on task $t+1$. This incentivizes the model to change only the parameters that are less important for task $t$, which keeps it from forgetting that task.
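For reference, the EWC objective from Kirkpatrick et al. (2017) adds a quadratic anchor to the new task's loss:

$$\mathcal{L}(\theta) = \mathcal{L}_{t+1}(\theta) + \sum_i \frac{\lambda}{2} F_i \left(\theta_i - \theta^{*}_{t,i}\right)^2$$

where $F_i$ is the diagonal Fisher information of parameter $i$ estimated on task $t$, $\theta^{*}_{t,i}$ is that parameter's value after training on task $t$, and $\lambda$ sets how strongly old knowledge is protected.

Below is a minimal PyTorch sketch of this idea, assuming your detector can be treated as a standard module with a differentiable loss; the function names and the default $\lambda$ are illustrative, not part of any YOLO codebase:

```python
import torch

def fisher_diagonal(model, data_loader, loss_fn, device="cpu"):
    """Estimate the diagonal Fisher information for each parameter by
    averaging squared gradients of the loss over the old task's data."""
    fisher = {n: torch.zeros_like(p) for n, p in model.named_parameters()
              if p.requires_grad}
    model.eval()
    n_batches = 0
    for inputs, targets in data_loader:
        inputs, targets = inputs.to(device), targets.to(device)
        model.zero_grad()
        loss_fn(model(inputs), targets).backward()
        for n, p in model.named_parameters():
            if p.grad is not None:
                fisher[n] += p.grad.detach() ** 2
        n_batches += 1
    return {n: f / max(n_batches, 1) for n, f in fisher.items()}

def ewc_penalty(model, fisher, old_params, lam=1000.0):
    """Quadratic penalty anchoring important parameters near the values
    they had after training on the previous task."""
    penalty = 0.0
    for n, p in model.named_parameters():
        if n in fisher:
            penalty = penalty + (fisher[n] * (p - old_params[n]) ** 2).sum()
    return (lam / 2.0) * penalty

# When training on the new class, add the penalty to the detection loss:
#   old_params = {n: p.detach().clone() for n, p in model.named_parameters()}
#   total_loss = detection_loss + ewc_penalty(model, fisher, old_params)
```

The $\lambda$ knob trades off stability against plasticity: too high and the model struggles to learn the new class, too low and it forgets the old ones.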

SpiderRico

Posted 2019-07-16T13:30:52.173

Reputation: 444