How to calculate the irreducible error from the Bias and Variance of a given machine learning model?


I am trying to calculate the Bias and Variance for an ML model.

$$ Err(x) = E\big[(Y - \hat f(x))^2\big] $$

$$ Err(x) = Bias^2 + Variance + Irreducible\ Error $$

$\hat f(x)$ is our model.

$Y$ is the variable we are trying to predict.

$Err(x)$ is the overall error (MSE).
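
For reference, this is how I understand the decomposition term by term, assuming $Y = f(x) + \epsilon$ with noise variance $\sigma^2$ (as in my data-generation steps below):

$$ Err(x) = \underbrace{\big(E[\hat f(x)] - f(x)\big)^2}_{Bias^2} + \underbrace{E\Big[\big(\hat f(x) - E[\hat f(x)]\big)^2\Big]}_{Variance} + \underbrace{\sigma^2}_{Irreducible\ Error} $$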

I am using the mlxtend library for bias variance decomposition.

Steps I followed:

  1. Generate training data set using the function $Y = f(x) + \epsilon$

    $f(x) = a + bx + cx^2$

    $\epsilon \sim N(0, \sigma^2)$ is the normally distributed noise with mean $0$ and variance $\sigma^2$.

  2. Generate a test data set using $f(x) = a + bx + cx^2$. Here I create `X_test` and `y_test`; `y_test` contains the true values (without noise), since the Bias is calculated with respect to the true function.

  3. Use the mlxtend library's `bias_variance_decomp` function to calculate the bias and variance. Here I am passing the Linear Regression estimator to the function (a sketch of these steps is shown below).
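
Below is a minimal sketch of these three steps. The coefficients `a`, `b`, `c`, the noise level `sigma`, the sample sizes, and `num_rounds` are illustrative placeholders, not necessarily the exact values I used:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from mlxtend.evaluate import bias_variance_decomp

# True function f(x) = a + b*x + c*x^2 (coefficients are placeholders)
a, b, c = 1.0, 2.0, 3.0
sigma = 2.0  # noise standard deviation (placeholder)
rng = np.random.RandomState(0)

def f(x):
    return a + b * x + c * x ** 2

# Step 1: training data Y = f(x) + eps, with eps ~ N(0, sigma^2)
X_train = rng.uniform(-3, 3, size=(500, 1))
y_train = f(X_train).ravel() + rng.normal(0, sigma, size=500)

# Step 2: test data uses the noise-free true function f(x)
X_test = rng.uniform(-3, 3, size=(200, 1))
y_test = f(X_test).ravel()

# Step 3: mlxtend's decomposition with a Linear Regression estimator
avg_loss, avg_bias, avg_var = bias_variance_decomp(
    LinearRegression(),
    X_train, y_train, X_test, y_test,
    loss='mse', num_rounds=200, random_seed=1
)

print("MSE (avg expected loss):", avg_loss)
print("Bias^2:", avg_bias)
print("Variance:", avg_var)
```

`bias_variance_decomp` returns the average expected loss, the (squared) bias, and the variance for the given estimator; these are the three numbers I plug into the equation above.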

My problem is this: the formula for the MSE is $Err(x) = Bias^2 + Variance + Irreducible\ Error$, and I have also read that if our model is trained on data that contains noise, it is impossible to eliminate that noise from the estimator. Still, upon decomposition I get $Irreducible\ Error = 0$. Even though I am using the true function $f(x)$ to calculate the $Bias$, the $Irreducible\ Error$ is still $0$.

What am I doing wrong?

According to my understanding, if I calculate $Err(x)$, $Bias^2$, and $Variance$, I should be able to get the $Irreducible\ Error$ from the above equation.
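
Concretely, this is the rearrangement I have in mind, using the three values returned by `bias_variance_decomp` in the sketch above:

```python
# Rearranging Err(x) = Bias^2 + Variance + Irreducible Error:
irreducible_error = avg_loss - avg_bias - avg_var

# Since the training data were generated as Y = f(x) + eps with eps ~ N(0, sigma^2),
# I would expect this estimate to be close to sigma**2, not 0.
print("Irreducible error estimate:", irreducible_error)
print("Noise variance sigma^2:", sigma ** 2)
```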

Himanshu Dabas

Posted 2020-05-04T19:37:40.650


No answers