Can someone help me understand this paragraph from NVIDIA's Progressive GAN paper?

Furthermore, we observe that mode collapses traditionally plaguing GANs tend to happen very quickly, over the course of a dozen minibatches. Commonly they start when the discriminator overshoots, leading to exaggerated gradients, and an unhealthy competition follows where the signal magnitudes escalate in both networks. We propose a mechanism to stop the generator from participating in such escalation, overcoming the issue (Section 4.2).
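(For reference: the mechanism the paper refers to in Section 4.2 is pixelwise feature vector normalization, which normalizes each pixel's feature vector across channels to roughly unit length after every convolutional layer in the generator, so signal magnitudes cannot escalate. Below is a minimal PyTorch sketch of that formula; the module name and everything beyond the paper's equation and its epsilon value of 1e-8 are illustrative, not code from the paper.)

    import torch
    import torch.nn as nn

    class PixelNorm(nn.Module):
        """Pixelwise feature vector normalization (ProGAN paper, Section 4.2).

        Normalizes each pixel's feature vector across channels:
        b = a / sqrt(mean(a^2 over channels) + eps)
        """
        def __init__(self, eps: float = 1e-8):  # eps value taken from the paper
            super().__init__()
            self.eps = eps

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # x has shape (batch, channels, height, width);
            # the mean of the squared activations is taken over the channel dim
            return x / torch.sqrt(torch.mean(x ** 2, dim=1, keepdim=True) + self.eps)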

What do they mean when the discriminator overshoots? Do they mean the discriminator gets too good too quickly? And what does it mean for signal magnitudes to escalate in both networks?

My current intuition is that the discriminator gets too good too soon, which causes the generator's gradients to spike as it tries to play catch-up; that would be the unhealthy competition they are talking about. Mode collapse is the side effect where the generator, having trouble catching up, decides to play it safe by generating only slightly varied images. Is this way of interpreting the above paragraph correct?

Inkplay_

Posted 2018-06-06T23:27:24.313

Reputation: 283

Answers


Yes, your intuition is correct. The effect of this problem is that the generator can no longer make even marginal improvements to its output that fool the discriminator - the discriminator isn't buying any of the generated output. In this case, the generator gets stuck in a local minimum and typically produces nonsense results.
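One cheap way to see this happening in practice is to track the diversity of a generated batch: it drops toward zero as the generator collapses onto a few near-identical outputs. This is a hypothetical monitoring snippet, not something from the paper (the paper's own counter-measure for low variation is the minibatch standard deviation layer of its Section 3):

    import torch

    def batch_diversity(fake_images: torch.Tensor) -> float:
        """Average per-pixel standard deviation across a generated batch.

        fake_images: tensor of shape (batch, channels, height, width).
        A value near zero means the generator is producing nearly
        identical images, i.e. it has likely mode-collapsed.
        """
        return fake_images.std(dim=0).mean().item()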

DrMcCleod

Posted 2018-06-06T23:27:24.313

Reputation: 605


I don't think your intuition is completely correct. At the beginning of training, the discriminator can overshoot, and that overshoot is what causes the backpropagated signal magnitudes to escalate in both networks.
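Here is a minimal sketch (assuming PyTorch; the helper name is illustrative, not from the paper) of how you could watch those magnitudes escalate in both networks during training:

    import torch

    def grad_norm(model: torch.nn.Module) -> float:
        """Total L2 norm of all parameter gradients, after loss.backward()."""
        total = 0.0
        for p in model.parameters():
            if p.grad is not None:
                total += (p.grad.detach().norm() ** 2).item()
        return total ** 0.5

    # Inside the training loop, after the backward passes:
    #   d_norm = grad_norm(discriminator)
    #   g_norm = grad_norm(generator)
    # If both norms climb together over the course of a dozen minibatches,
    # that is the escalation the paper describes.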

verdery

Posted 2018-06-06T23:27:24.313

Reputation: 577


A discriminator overshooting may also point to a dataset that has not been thoroughly cleaned and probably contains too many near-identical features; as a result, the discriminator converges early because there is little variation in the data. The drawback is that the model will not be able to generalize well.

Simbarashe Timothy Motsi

Posted 2018-06-06T23:27:24.313

Reputation: 370

In other words, the generator can no longer effectively learn how to fool the discriminator because everything it tries fails. – David Hoelzer – 2020-03-05T14:40:24.340