DECISION TREE: A Supervised Algorithm

In this tutorial, you will learn the Decision Tree classification approach: attribute selection to prevent overfitting of the model, how to build a tree model, how to optimize it, and lastly, its plotting (which is rarely taught), in a precise manner.

Click on this ----> CODE

Decision trees are a non-parametric supervised learning method that can solve both regression and classification problems. Today, the classification approach is going to be solved on the data (download) with code (code).

So, the most important attribute in our approach becomes the root node. The other attributes are arranged like nested if-else statements: if a condition is TRUE, the tree continues growing down that branch; otherwise, the FALSE branch terminates in a leaf node.

In order to minimize the impurity of our model, we use the concept of Entropy. When deciding which drug (A or B) a patient should be prescribed, calculate Entropy as:

Entropy = -p(A) log2 p(A) - p(B) log2 p(B)
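The formula above can be sketched in a few lines of Python (a minimal illustration for the two-class drug A / drug B case):

```python
import math

def entropy(p_a, p_b):
    # Entropy of a node with class probabilities p_a and p_b.
    # A zero-probability term contributes nothing (lim p->0 of p*log p = 0),
    # so we skip it to avoid math.log2(0).
    return -sum(p * math.log2(p) for p in (p_a, p_b) if p > 0)

# A 50/50 split between Drug A and Drug B is maximally impure:
print(entropy(0.5, 0.5))  # 1.0
# A pure node (all patients get one drug) has zero entropy:
print(entropy(1.0, 0.0))  # 0.0
```

Base-2 logarithms measure entropy in bits; scikit-learn's `'entropy'` criterion uses the same quantity.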

Hence, the tree is considered a logarithmic approach. The lower the Entropy, the less uniform the class distribution, and the purer the node. To be more specific about attribute selection, calculate the Information Gain:

Information Gain = Entropy before split - Entropy after split

The higher the Information Gain, the more suitable the attribute is!

Let's know more about the library functions for a better understanding:

DecisionTreeClassifier() -> criterion = 'gini' (default) or 'entropy'
for the classifier's mathematics at the backend, click here
SelectFromModel() -> to select the best features. click here
plot_tree() -> for a rough plot. click here
graphviz -> to understand the final cool plot. click here
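Here is a minimal sketch of how the first three of these fit together. The post's own drug dataset is behind the CODE link, so this uses scikit-learn's built-in iris data as a stand-in:

```python
import matplotlib
matplotlib.use("Agg")  # non-interactive backend so the script runs headless
import matplotlib.pyplot as plt
from sklearn.datasets import load_iris
from sklearn.feature_selection import SelectFromModel
from sklearn.tree import DecisionTreeClassifier, plot_tree

X, y = load_iris(return_X_y=True)

# Fit a tree using entropy (information gain) as the split criterion.
clf = DecisionTreeClassifier(criterion="entropy", random_state=0).fit(X, y)

# Keep only the features whose importance clears the default threshold
# (the mean of all feature importances).
selector = SelectFromModel(clf, prefit=True)
X_selected = selector.transform(X)
print(X_selected.shape)  # fewer columns than the original (150, 4)

# Rough built-in plot of the fitted tree.
fig, ax = plt.subplots(figsize=(10, 6))
plot_tree(clf, filled=True, ax=ax)
fig.savefig("tree.png")
```

For the polished graphviz rendering, `sklearn.tree.export_graphviz` produces a DOT file that the graphviz package turns into the final plot.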

Tip - Always be aware of your model's accuracy. Don't be happy if your score is extraordinarily good on the training set, because the model may be overfitted. And if your score is poor on both the training and testing sets, the model may be underfitted. The code here will let you meet the overfitting problem!
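A quick sketch of spotting overfitting in practice: compare training vs. testing accuracy, then rein the tree in with `max_depth` (again using iris as a stand-in dataset; the gap sizes will differ on the post's own data):

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, random_state=42
)

# An unconstrained tree memorizes the training set (train accuracy 1.0);
# a large train/test gap is the overfitting warning sign.
for depth in (None, 3):
    clf = DecisionTreeClassifier(max_depth=depth, random_state=0)
    clf.fit(X_tr, y_tr)
    print(
        f"max_depth={depth}: "
        f"train={clf.score(X_tr, y_tr):.3f}, test={clf.score(X_te, y_te):.3f}"
    )
```

Pruning knobs like `max_depth`, `min_samples_leaf`, and `ccp_alpha` all trade a little training accuracy for better generalization.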

Click on this ----> CODE

#artificialintelligence #machinelearning #decisiontree #supervisedlearning #models #ainews

#algorithms #technology #technews #advancedai #classification #entropy #informationgain

#classifier #regression #optimization #overfitting #underfitting #plottree #logarithmic #aiml

#attributeselection #decisionnode #subtree #leafnode #computerscience #programming

#python #datascience #mathematics #libraries #crazzylearners #2020blogs