I saw on the CatBoost site that it is supposed to outperform other boosted tree models, so I decided to try it myself on Kaggle's https://www.kaggle.com/c/house-prices-advanced-regression-techniques competition.
I created a basic kernel without any complex preprocessing, feature selection, grid search, stacking, etc., just to compare XGBRegressor (XGBR) and CatBoost performance. But as far as I can see, the XGBR always outperforms CatBoost.
The CatBoost parameters: iterations=100, depth=3, learning_rate=0.1
The XGBR parameters: subsample=0.7, colsample_bytree=0.7, n_estimators=500, learning_rate=0.03, max_depth=5, min_child_weight=3
For example, on the Kaggle leaderboard the XGBR received a score of 0.134, while CatBoost only reached 0.197 (I tried both one_hot_max_size and cat_features). I would be thankful if someone could point out what is wrong with my CatBoost model; maybe some optimization is missing.