Furthermore, we observe that mode collapses traditionally plaguing GANs tend to happen very quickly, over the course of a dozen minibatches. Commonly they start when the discriminator overshoots, leading to exaggerated gradients, and an unhealthy competition follows where the signal magnitudes escalate in both networks. We propose a mechanism to stop the generator from participating in such escalation, overcoming the issue (Section 4.2).
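For context on the mechanism the quote points at: Section 4.2 of the paper proposes pixelwise feature vector normalization in the generator, which rescales each feature vector (across channels) to roughly unit length after every convolution, so activation magnitudes cannot escalate no matter what the discriminator's gradients do. A minimal NumPy sketch of that normalization (shapes and the epsilon value are my assumptions about a typical setup):

```python
import numpy as np

def pixel_norm(a, eps=1e-8):
    """Pixelwise feature vector normalization.

    a: generator activations of shape (batch, channels, height, width).
    Divides each per-pixel feature vector by its RMS across channels,
    so the output magnitude is ~1 regardless of the input scale.
    """
    return a / np.sqrt(np.mean(a ** 2, axis=1, keepdims=True) + eps)

# Exaggerated magnitudes, as in the escalation the paper describes:
x = np.random.randn(2, 8, 4, 4) * 100.0
y = pixel_norm(x)
rms = np.sqrt(np.mean(y ** 2, axis=1))  # per-pixel RMS across channels
```

The point is that the generator can no longer "participate in the escalation": even if its weights grow, the normalized activations feeding the next layer keep a fixed scale.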
What do they mean by the discriminator "overshooting"? That the discriminator gets too good too quickly? And what does it mean for signal magnitudes to escalate in both networks?
My current intuition is that the discriminator gets too good too soon, which causes the generator's gradients to spike as it tries to play catch-up; that would be the unhealthy competition they are talking about. Mode collapse is then the side effect: the generator has trouble catching up and decides to play it safe by generating only slightly varying images that reliably fool the discriminator. Is this way of interpreting the above paragraph correct?