Do non-parametric models always overfit without regularization?


Let's scope this to just classification.

It's clear that if you fully grow out a decision tree with no regularization (e.g. no max depth, no pruning), it will overfit: it reaches 100% accuracy on the training data, even though the best achievable generalization accuracy is limited by the Bayes error*.

Is this universally true for all non-parametric methods?

*Assuming the model has access to the "right" features.
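To make the premise concrete, here is a minimal sketch (plain Python, no libraries) of another non-parametric method with the same behavior: a 1-nearest-neighbor classifier memorizes the training set and therefore always scores 100% training accuracy, just like an unregularized fully grown tree. The dataset here is made up for illustration.

```python
# Hypothetical sketch: 1-nearest-neighbor, a non-parametric method,
# attains 100% training accuracy because each training point's nearest
# neighbor is itself -- it memorizes the training set.

def predict_1nn(X_train, y_train, x):
    """Return the label of the training point closest to x (squared Euclidean)."""
    dists = [sum((a - b) ** 2 for a, b in zip(xt, x)) for xt in X_train]
    return y_train[dists.index(min(dists))]

# Toy training set (XOR-like labels), purely illustrative
X_train = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
y_train = [0, 1, 1, 0]

train_acc = sum(predict_1nn(X_train, y_train, x) == t
                for x, t in zip(X_train, y_train)) / len(y_train)
print(train_acc)  # 1.0 -- perfect fit on the training data
```

Of course, perfect training accuracy says nothing about test accuracy, which is the sense in which such a model "overfits".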


Posted 2020-07-16T01:23:52.287

Reputation: 111

There is no known mathematical result that applies to "non-parametric models" in general. However, it does seem to be the case.

– Nikos M. – 2020-07-16T12:10:52.137



No. "Non-parametric" only means that the method does not assume a fixed functional form for the data. There are non-parametric methods, such as Random Forest, that do not always overfit. In fact, a non-parametric method can even underfit, i.e. lack the capacity to fit the training data. An example of this is a decision stump (a tree of depth one).
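The stump example can be sketched in a few lines of plain Python (no libraries; the XOR-style dataset is made up for illustration). A stump makes one threshold split on one feature, and no such split can separate XOR labels, so even its *training* accuracy is capped at 50%: a non-parametric model that underfits.

```python
# Hypothetical sketch: the best decision stump (one threshold on one feature)
# cannot fit XOR-labeled training data, so it underfits despite being
# non-parametric.

X = [(0, 0), (0, 1), (1, 0), (1, 1)]
y = [0, 1, 1, 0]  # XOR labels

def best_stump_accuracy(X, y):
    """Exhaustively search every feature/threshold/leaf-label combination
    and return the best achievable TRAINING accuracy for a depth-1 tree."""
    best = 0.0
    for feature in range(len(X[0])):
        for threshold in (-0.5, 0.5, 1.5):   # all splits that matter for 0/1 features
            for left_label in (0, 1):
                preds = [left_label if x[feature] <= threshold else 1 - left_label
                         for x in X]
                acc = sum(p == t for p, t in zip(preds, y)) / len(y)
                best = max(best, acc)
    return best

print(best_stump_accuracy(X, y))  # 0.5 -- the stump cannot even fit the training set
```

A deeper tree (two splits) fits this data perfectly, which is exactly the capacity-vs-regularization trade-off the question is about.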

Brian Spiering

Posted 2020-07-16T01:23:52.287

Reputation: 10 864