"help" decision tree by tying 2 features together

3

Assuming my dataset has 2 (or more) features that are definitely linked (for example: feature B indicates how relevant feature A is), is there a way to design a decision tree that ties both features together? (The obvious split would be: if feature B is large enough, split by A; otherwise ignore A.)
In other words: I'm looking for a way to "help" my decision tree not test feature A before testing feature B, and to evaluate the improvement of "B and afterwards A" jointly rather than the improvement of each feature individually. Any help will be appreciated!

Binyamin Even

Posted 2018-10-31T08:31:04.947

Reputation: 225

Answers

2

One way is to create a new feature by combining A and B with a function, and then eliminating A and B from the dataset. The less correlated your features are, the better: that way you don't carry redundant features.
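A minimal sketch of that idea, assuming B is a relevance score in [0, 1] and that a simple gating function (A counts only when B exceeds a chosen threshold) is an appropriate combination — the threshold value here is an arbitrary illustration, not something derived from your data:

```python
import numpy as np

# Toy data: feature A (a value) and feature B (how relevant A is).
A = np.array([5.0, 3.0, 8.0, 2.0])
B = np.array([0.9, 0.1, 0.8, 0.2])

# Hypothetical combination: keep A only where B is large enough,
# then drop the original A and B columns from the dataset.
THRESHOLD = 0.5  # assumption; tune for your data
AB = np.where(B > THRESHOLD, A, 0.0)

print(AB)  # A survives only in rows where B is relevant
```

Any single tree split on the combined feature `AB` now implicitly tests B before using A, which is exactly the ordering the question asks for.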

But actually, it may be enough to use a random forest: with enough trees, the model can figure out on its own that the best split is first on B and then on A.
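A quick illustration of that claim on synthetic data, where the label depends on A only when B is large (the interaction from the question) — the data-generating rule and all parameter values here are assumptions chosen for the demo:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n = 500
A = rng.normal(size=n)      # the value feature
B = rng.uniform(size=n)     # the relevance feature

# Label depends on A only when B is large enough.
y = ((B > 0.5) & (A > 0)).astype(int)
X = np.column_stack([A, B])

# No feature engineering: the forest must discover the
# B-then-A interaction from the raw columns.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X, y)
print(clf.score(X, y))
```

Individual trees get deep enough to split on B near 0.5 and then on A near 0, so the ensemble fits this interaction without any manual feature combination.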

DmytroSytro

Posted 2018-10-31T08:31:04.947

Reputation: 437