## Updating the weights of the filters in a CNN


I am currently trying to understand the architecture of a CNN. I understand the convolution, the ReLU layer, pooling layer, and fully connected layer. However, I am still confused about the weights.

In a normal neural network, each neuron has its own weight. In the fully connected layer, each neuron would also have its own weight. But what I don't know is whether each filter has its own weights. Do I only have to update the weights in the fully connected layer during backpropagation, or do the filters all have separate weights that I need to update as well?


> In a normal neural network, each neuron has its own weight.

This is not correct. Every connection between neurons has its own weight. In a fully connected network, each neuron is therefore associated with many different weights. If there are n0 inputs (i.e. n0 neurons in the previous layer) to a layer with n1 neurons in a fully connected network, that layer will have n0*n1 weights, not counting the bias terms.
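This per-connection count can be sketched in a few lines of Python; the layer sizes below are made-up values for illustration only:

```python
# A fully connected layer has one trainable weight per connection
# between the previous layer and this one (hypothetical sizes).
n0 = 4  # neurons in the previous layer (inputs to this layer)
n1 = 3  # neurons in this layer

num_weights = n0 * n1          # one weight per edge: 12
num_params = num_weights + n1  # plus one bias per neuron: 15

print(num_weights)  # 12
print(num_params)   # 15
```

Note how the count grows with the input size n0, which is exactly what convolutional layers avoid.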

You should be able to see this clearly in this diagram of a fully connected network from CS231n. Every edge you see represents a different trainable weight:

Convolutional layers are different in that they have a fixed number of weights governed by the choice of filter size and number of filters, but independent of the input size.

Each filter has a separate weight at each position of its shape. So if you use two 3x3x3 filters, then you will have 2 x 3 x 3 x 3 = 54 weights, again not counting bias. This is illustrated in a second diagram from CS231n:

The filter weights absolutely must be updated in backpropagation, since this is how they learn to recognize features of the input. If you read the section titled "Visualizing Neural Networks" here, you will see how the layers of a CNN learn more and more complex features of the input image as you go deeper into the network. These are all learned by adjusting the filter weights through backpropagation.
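A minimal toy sketch of this (my own example, not from the answer above): a single 1-D filter trained by gradient descent on a synthetic target. CNN libraries typically implement "convolution" as a sliding dot product (cross-correlation), which is done here by hand, and the gradient of the loss with respect to each filter weight is itself a correlation of the error with the input:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=16)               # toy input signal
true_w = np.array([1.0, -1.0, 0.5])   # filter that generated the target
n_out = len(x) - len(true_w) + 1      # output length, stride 1, no padding

def correlate(signal, w):
    # y[i] = sum_k signal[i + k] * w[k]  (sliding dot product)
    return np.array([np.dot(signal[i:i + len(w)], w) for i in range(n_out)])

target = correlate(x, true_w)
w = np.zeros(3)   # the filter weights backpropagation must update
lr = 0.01

for _ in range(2000):
    err = correlate(x, w) - target
    # dLoss/dw[k] for Loss = 0.5 * sum(err**2): chain rule gives
    # grad[k] = sum_i err[i] * x[i + k], a correlation with the input.
    grad = np.array([np.dot(err, x[k:k + n_out]) for k in range(len(w))])
    w -= lr * grad
```

After training, the learned filter weights closely track `true_w`; freezing them (i.e. skipping the update) would leave the error at its initial value, which is why the filters cannot be excluded from backpropagation.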

mmm the number of weights in each layer also depends upon number of strides the filters take... right ? – Arnav Das – 2020-04-17T15:30:18.890


During backpropagation, both dense layers and convolutional layers are updated, but max-pooling layers have no weights to update. Dense layers are updated to help the network classify; convolutional layers are updated to let the network learn the features itself. Since you did not ask about it in the question, I will just add a link in case you want to know more: there is a very good explanation of backpropagation here which may be useful for you.