This question is in reference to solutions of the Titanic survival prediction competition on Kaggle. Many of them do a similar kind of feature extraction:

They convert some of the numerical features (Age, Fare) to categorical types with nominal labels (1, 2, 3) using bands.
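To be concrete, here is a minimal sketch of the banding I mean, using pandas `pd.cut` on made-up Age values (the column values and number of bands are hypothetical, just for illustration):

```python
import pandas as pd

# Toy stand-in for the Titanic "Age" column (values are hypothetical).
df = pd.DataFrame({"Age": [4, 22, 35, 58, 71]})

# Cut the continuous ages into 4 equal-width bands and
# replace each age with its integer band label 0..3.
df["AgeBand"] = pd.cut(df["Age"], bins=4, labels=False)
print(df["AgeBand"].tolist())
```

After this step the feature holds nominal integer labels instead of raw ages.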

Does it help models fit more efficiently if we have all features in categorical form?

I have always understood that if you have categorical values, you should first convert them to numerical codes and then to dummy variables. But in those solutions this is not done. Is dummying not needed?
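By dummy variables I mean one-hot encoding, for example with `pd.get_dummies` (a sketch with a hypothetical banded column, not code from any specific kernel):

```python
import pandas as pd

# A banded feature stored as nominal integer labels (hypothetical values).
df = pd.DataFrame({"AgeBand": [0, 1, 1, 3]})

# One-hot (dummy) encode, so a model does not treat label 3
# as "three times" label 1 on an ordinal scale.
dummies = pd.get_dummies(df["AgeBand"], prefix="AgeBand")
print(dummies.columns.tolist())
```

My understanding is that this step matters for models that read the labels as magnitudes (e.g. linear models), which is why I am confused that the kernels skip it.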