I was looking at code and found this:

```
model.add(Dense(13, input_dim=13, kernel_initializer='normal', activation='relu'))
```

I was keen to know more about `kernel_initializer`, but I wasn't able to understand its significance.

A neural network needs to start with some weights and then iteratively update them to better values. `kernel_initializer` simply specifies which statistical distribution or function to use for initialising those starting weights. When it names a statistical distribution, the library draws numbers from that distribution and uses them as the initial weights.

For example, in the code above the weights are initialised by sampling from a normal distribution. You can also use constant initialisers (all 0s or all 1s) or other distributions such as the uniform distribution. All possible options are documented here.
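To make this concrete, here is a minimal NumPy sketch (not Keras itself) of what a few common initializers do when producing the starting weight matrix for a `Dense(13)` layer with `input_dim=13`. The stddev of 0.05 and the ±0.05 range match Keras's documented defaults for `RandomNormal` and `RandomUniform`; the seed is just for reproducibility.

```python
import numpy as np

rng = np.random.default_rng(seed=0)
fan_in, units = 13, 13  # input_dim=13 feeding a Dense layer with 13 units

# 'normal': draw starting weights from a Gaussian
# (Keras RandomNormal defaults: mean=0.0, stddev=0.05)
w_normal = rng.normal(loc=0.0, scale=0.05, size=(fan_in, units))

# 'uniform': draw from a uniform range
# (Keras RandomUniform defaults: minval=-0.05, maxval=0.05)
w_uniform = rng.uniform(low=-0.05, high=0.05, size=(fan_in, units))

# 'zeros' / 'ones': constant initializers
w_zeros = np.zeros((fan_in, units))
w_ones = np.ones((fan_in, units))

print(w_normal.shape)  # (13, 13) -- one weight per (input, unit) pair
```

Whichever option you pick, the result is just the initial value of the layer's weight matrix; training then updates it from there.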

Additional explanation: the term *kernel* is a carryover from classical methods such as SVMs, where data in one input space is transformed into another space using kernel functions. Neural-network layers can be thought of as non-linear maps performing similar transformations, hence the term kernel.