Does training of neural networks follow the same order in each epoch?


Each epoch starts from the weights at the end of the previous epoch (correct me if I am wrong). Is the updating of parameters after each batch always done in the same order? To rephrase: are the batches always presented in the same order? Could this bias the learning, and are there any adaptations that deal with this?

Borut Flis

Posted 2020-07-24T08:47:02.603

Reputation: 145



All of the parameters are updated together after each batch, so there is no notion of an update order among the parameters themselves.

The batches may or may not come in the same order each epoch, depending on the implementation. Keeping a fixed order can bias learning, in the sense that the network can adapt to (effectively memorize) the ordering of the dataset. An easy workaround is to shuffle the data between epochs, so that each epoch groups the samples into different batches.
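To illustrate, here is a minimal sketch (using a toy list of sample indices, not a real dataset) of shuffling the sample order at the start of every epoch so that the batches differ from epoch to epoch:

```python
import random

# Toy dataset: 8 sample indices, batch size 2
samples = list(range(8))
batch_size = 2

for epoch in range(2):
    # Reshuffle the sample order at the start of every epoch,
    # so the grouping into batches changes between epochs
    order = samples[:]
    random.shuffle(order)
    batches = [order[i:i + batch_size]
               for i in range(0, len(order), batch_size)]
    print(f"epoch {epoch}: {batches}")
```

In practice most frameworks do this for you: for example, PyTorch's `DataLoader` reshuffles every epoch when constructed with `shuffle=True`, and Keras's `model.fit` shuffles by default via its `shuffle` argument.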


Posted 2020-07-24T08:47:02.603

Reputation: 312