How does InfoGAN learn latent categorical codes on MNIST?


While reading the InfoGAN paper and implementing it with help from an existing implementation, I'm having some difficulty understanding how it learns the discrete categorical code when trained on MNIST.

The implementation we tried to follow uses, as the target, a randomly generated integer from 0 to 9. My doubt is this: how can the network learn categorical information if, from the start, it is trained with a loss whose targets are random values? Roughly, the step in question looks like the sketch below.
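To make the question concrete, here is a minimal sketch (PyTorch, with small placeholder networks standing in for the real convolutional generator and Q head) of the step that confuses me: the categorical code `c` is sampled at random, concatenated with the noise `z`, passed through the generator, and then reused as the target for the categorical cross-entropy on Q's logits.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

batch_size, noise_dim, n_classes = 64, 62, 10

# Placeholder generator and Q-network (the real ones are conv nets).
generator = nn.Sequential(nn.Linear(noise_dim + n_classes, 128), nn.ReLU(),
                          nn.Linear(128, 28 * 28), nn.Tanh())
q_network = nn.Sequential(nn.Linear(28 * 28, 128), nn.ReLU(),
                          nn.Linear(128, n_classes))  # outputs logits

# Sample noise z and a random categorical code c (an integer from 0 to 9).
z = torch.randn(batch_size, noise_dim)
c = torch.randint(0, n_classes, (batch_size,))
c_onehot = F.one_hot(c, n_classes).float()

fake = generator(torch.cat([z, c_onehot], dim=1))   # G(z, c)
q_logits = q_network(fake)                          # Q(G(z, c))

# The mutual-information term: categorical cross-entropy between Q's logits
# and the *same* randomly sampled c that was fed into the generator.
info_loss = F.cross_entropy(q_logits, c)
print(info_loss.item())
```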

If this implementation is wrong, what should the target be when training the Q network with a categorical cross-entropy loss on its output logits?

Satvik Golechha

Posted 2019-08-26T17:00:41.613

Reputation: 21

No answers