-1

What is the effect of regularization on the value of parameters/weights?

How does adding a regularization term to the cost function (J) and gradients help? Doesn't adding something increase the cost function's value?

0

I would recommend that you read this blog to gain a better understanding of what regularization is.

Here is my take on it. The goal of regularization is to constrain the weights so they don't grow too large. Adding a penalty term should indeed increase the value of your cost function, but all that matters is that your generalization performance (i.e. accuracy, F1 score, etc.) gets better.
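To make that concrete, here's a minimal sketch of L2 (weight decay) regularization for linear regression; the function name and data are just illustrative, not from any particular library:

```python
import numpy as np

def cost_and_grad(w, X, y, lam):
    """L2-regularized least-squares cost J and its gradient.

    lam is the regularization strength; lam = 0 recovers the plain cost.
    """
    n = X.shape[0]
    residual = X @ w - y
    # The penalty (lam/2)*||w||^2 raises J, but it also adds lam*w to the
    # gradient, so gradient descent is pushed toward smaller weights.
    J = (residual @ residual) / (2 * n) + (lam / 2) * (w @ w)
    grad = X.T @ residual / n + lam * w
    return J, grad
```

Note that with `lam > 0` the cost J is strictly larger for any nonzero `w`, yet the extra `lam * w` term in the gradient is exactly what shrinks the weights during training.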

Why do we want to constrain the weights? If we don't, the network might put a lot of weight on certain input values, which, as a rule of thumb, makes it more prone to overfitting. Obviously, if you regularize too much, you'll be too conservative and won't learn much, but that is part of the art of training a neural network.
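You can see the shrinkage effect directly on a toy problem. This sketch (assumed setup: random data, closed-form ridge solution) compares the weights learned with and without an L2 penalty:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 5))
true_w = np.array([3.0, -2.0, 1.5, 0.0, 0.5])
y = X @ true_w + rng.normal(scale=0.1, size=50)

def fit(X, y, lam):
    # Closed-form ridge regression: w = (X^T X + lam*I)^{-1} X^T y
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

w_plain = fit(X, y, lam=0.0)   # ordinary least squares
w_reg = fit(X, y, lam=10.0)    # L2-regularized

# The regularized weights always have a smaller norm.
print(np.linalg.norm(w_reg) < np.linalg.norm(w_plain))
```

The regularized fit trades a bit of training error for smaller weights; whether that helps is exactly the bias-variance trade-off described above.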