What is the difference between the Standard Normal Distribution and Mean Normalization approaches to feature scaling?



The feature-scaling tag seems to convey that one of the scaling methods is the Standard Normal Distribution. Further, I read an answer on this site saying that Mean Normalization is a form of feature scaling.

What is the difference between the two approaches to scaling?

Note: I think the statistics and the mathematics of normalization differ.

Subhash C. Davar

Posted 2020-05-26T10:30:50.083

Reputation: 408

Can you link to the answer you're referring to? – Itamar Mushkin – 2020-08-03T13:52:28.030



The terms standardization and normalization are often used interchangeably. However, strictly speaking, they refer to distinct feature transformations.


Normalization, also called feature scaling, usually means scaling the data to the range between 0 and 1. There are many approaches that can be used to achieve this. One common way is

$x' = \frac{x - x_{min}}{x_{max} - x_{min}}$
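As a minimal sketch of this min-max formula, assuming NumPy and a made-up feature vector:

```python
import numpy as np

# Hypothetical feature values (for illustration only).
x = np.array([10.0, 20.0, 30.0, 40.0, 50.0])

# Min-max normalization: rescale so the smallest value maps to 0
# and the largest maps to 1.
x_norm = (x - x.min()) / (x.max() - x.min())
# x_norm is now [0.0, 0.25, 0.5, 0.75, 1.0]
```

Note that this form is sensitive to outliers: a single extreme value stretches the denominator and compresses the rest of the data toward one end of the range.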


Standardization transforms the feature to have a mean of 0 and a standard deviation of 1. This is also called z-scoring and can be achieved by

$x_i' = \frac{x_i - \bar{x}}{s}$

where $\bar{x}$ is the mean of the feature and $s$ is the standard deviation of the feature.
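The z-scoring formula can be sketched the same way, again with NumPy and an illustrative feature vector (`ddof=1` gives the sample standard deviation $s$; whether to use the sample or population estimate is a modeling choice):

```python
import numpy as np

# Hypothetical feature values (for illustration only).
x = np.array([10.0, 20.0, 30.0, 40.0, 50.0])

# Standardization (z-scoring): subtract the mean, divide by the
# sample standard deviation.
x_std = (x - x.mean()) / x.std(ddof=1)
# x_std now has mean 0 and sample standard deviation 1.
```

Unlike min-max normalization, the result is not bounded to a fixed interval, but it is far less distorted by outliers.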


Posted 2020-05-26T10:30:50.083

Reputation: 7 863

normal means conforming to a standard; usual, typical, or expected. The definition varies with the discipline of study. Scaled data can be normalized by computing z-scores (standard scores), which allows the application of inferential statistics such as the z-statistic. Normalization produces individual data for each question item or each individual; it generates weights/values for each scale point or category. Normalization is processing of data and has nothing to do with the standard normal distribution. The latter is based on statistics. – Subhash C. Davar – 2020-07-22T05:22:06.570