I am currently trying to understand the architecture of a CNN. I understand the convolution, ReLU, pooling, and fully connected layers. However, I am still confused about the weights.
In a normal neural network, each neuron has its own weights. In the fully connected layer, each neuron would also have its own weights. But what I don't know is whether each filter has its own weights. Do I only have to update the weights in the fully connected layer during backpropagation? Or does each filter have separate weights that I need to update as well?
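To make the question concrete, here is my own small sketch (the layer sizes are hypothetical, not from any specific network): if each filter does carry its own trainable weights, then a conv layer's parameter count would come from its kernels plus one bias per filter, analogous to how a fully connected layer's count comes from its weight matrix plus biases.

```python
def conv_param_count(n_filters, k, in_channels):
    # assuming each filter is a k x k x in_channels kernel of trainable
    # weights, plus one bias per filter
    return n_filters * (k * k * in_channels) + n_filters

def fc_param_count(n_in, n_out):
    # each output neuron has n_in weights plus one bias
    return n_in * n_out + n_out

# hypothetical example layers:
# 8 filters of size 3x3 on a 1-channel input -> 8*(3*3*1) + 8 = 80
print(conv_param_count(8, 3, 1))   # 80
# fully connected 100 -> 10        -> 100*10 + 10 = 1010
print(fc_param_count(100, 10))     # 1010
```

Is this the right picture, i.e. do those 80 filter weights also get gradients and updates during backpropagation, or only the 1010 fully connected ones?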