- How does a Decision Tree work?
- Types of Decision Tree
- Additional Features
- Types of Boosting Algorithms
- Next Steps
July 24, 2019
Gini Index: a measure of the impurity of the class distribution at a node
\[Gini = 1 - \sum_{i} p_{i}^{2}\]
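As a quick sketch, the Gini index can be computed directly from the class proportions (the function name `gini` is my own):

```python
from collections import Counter

def gini(labels):
    """Gini = 1 - sum(p_i^2) over the class proportions p_i."""
    n = len(labels)
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

# A pure node scores 0; a 50/50 split of two classes scores 0.5.
print(gini(["red"] * 4))                     # 0.0
print(gini(["red", "red", "blue", "blue"]))  # 0.5
```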
Entropy: another measure of impurity, from information theory
\[Entropy = -\sum_{i} p_{i}\log_{2}p_{i}\]
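The same class counts give entropy as well; a sketch (the helper name `entropy` is my own):

```python
import math
from collections import Counter

def entropy(labels):
    """Entropy = -sum(p_i * log2(p_i)) over the class proportions p_i."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

# A pure node has entropy 0; a 50/50 two-class split has entropy 1 bit.
print(entropy(["red"] * 4))                     # 0.0
print(entropy(["red", "red", "blue", "blue"]))  # 1.0
```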
Information Gain: the reduction in impurity achieved by splitting on a particular attribute
\[IG(D_{p}) = I(D_{p}) - \frac{N_{left}}{N_{p}}I(D_{left}) - \frac{N_{right}}{N_{p}}I(D_{right})\]
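This formula translates directly into code: subtract the size-weighted child impurities from the parent impurity. A minimal sketch using the Gini index as the impurity measure (function names are my own):

```python
from collections import Counter

def gini(labels):
    """Gini = 1 - sum(p_i^2) over the class proportions p_i."""
    n = len(labels)
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

def information_gain(parent, left, right, impurity=gini):
    """IG(D_p) = I(D_p) - (N_left/N_p) I(D_left) - (N_right/N_p) I(D_right)."""
    n = len(parent)
    return (impurity(parent)
            - len(left) / n * impurity(left)
            - len(right) / n * impurity(right))

# A split that perfectly separates the classes removes all impurity.
parent = ["red", "red", "blue", "blue"]
print(information_gain(parent, ["red", "red"], ["blue", "blue"]))  # 0.5
```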
RSS: for regression trees, the residual sum of squares over the \(M\) leaf regions \(S_{i}\), where \(\bar{y}_{S_{i}}\) is the mean response in region \(S_{i}\)
\[RSS = \sum_{i=1}^{M} \sum_{j \in S_{i}} (y_{j} - \bar{y}_{S_{i}})^2\]
The predicted outcome is the average of the target values in each leaf
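Given a candidate partition of the training points into leaf regions, the RSS above is just the squared deviation from each region's mean; a sketch (the function name `rss` is my own):

```python
def rss(regions):
    """RSS = sum over regions S_i of (y_j - mean(S_i))^2 for each y_j in S_i."""
    total = 0.0
    for region in regions:
        mean = sum(region) / len(region)  # the leaf's prediction
        total += sum((y - mean) ** 2 for y in region)
    return total

# Tight regions give a small RSS; the tree greedily picks splits that lower it.
print(rss([[1.0, 3.0], [10.0, 10.0]]))  # 2.0
```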
Example 1: separate red circles and blue circles
Example 2: find the average of the values