Thursday, January 10, 2019

Decision Tree Regression using sklearn


A decision tree is a decision-making approach that uses a flowchart-like tree structure to lay out decisions and all of their possible results, including outcomes, input costs and utility.
The decision-tree algorithm falls under the category of supervised learning algorithms. It works for both continuous and categorical output variables.
The nodes of the tree act as either:
  1. Conditions [Decision Nodes]
  2. Results [End Nodes]
The branches/edges represent the truth or falsity of the condition tested at a decision node, and the tree follows the matching branch to reach a decision. The example below shows a decision tree that evaluates the smallest of three numbers:
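The original post shows this flowchart as an image. As a rough stand-in (not the original figure), here is a minimal sketch in Python of the same smallest-of-three logic written as nested conditions, where each comparison plays the role of a decision node and each return plays the role of an end node:

# Hypothetical illustration only: each 'if' is a decision node,
# each 'return' is an end node (leaf) of the tree.
def smallest_of_three(a, b, c):
    if a < b:          # is a smaller than b?
        if a < c:      # is a smaller than c?
            return a   # end node: a is the smallest
        return c       # end node: c is the smallest
    if b < c:          # is b smaller than c?
        return b       # end node: b is the smallest
    return c           # end node: c is the smallest

print(smallest_of_three(7, 2, 5))  # prints 2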

Step-by-Step implementation –
Step 1: Import the required libraries.
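A minimal sketch of the imports the later steps rely on: NumPy for the data, matplotlib for plotting, and scikit-learn for the regressor and the graphviz export.

import numpy as np
import matplotlib.pyplot as plt
from sklearn.tree import DecisionTreeRegressor, export_graphviz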
Step 2: Initialize and print the Dataset.
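The post does not reproduce the dataset itself, so the array below is a hypothetical stand-in with the shape the later steps assume: a label in column 0, a numeric feature in column 1 and a numeric target in column 2.

# Hypothetical dataset: column 0 = label, column 1 = feature, column 2 = target.
dataset = np.array([
    ['Project A',   100,  1000],
    ['Project B',   500,  3000],
    ['Project C',  1500,  5000],
    ['Project D',  3500,  8000],
    ['Project E',  5000,  6500],
    ['Project F',  6000,  7000],
    ['Project G',  8000, 15000],
    ['Project H',  9500, 20000],
    ['Project I', 12000, 21000],
    ['Project J', 14000, 25000],
])
print(dataset)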
Step 3: Select all of the rows and column 1 from the dataset as the feature matrix “X”.
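Continuing with the hypothetical array above (a mixed NumPy array stores every entry as a string, hence the cast):

# All rows, column 1 only -> 2-D feature matrix of shape (n_samples, 1).
X = dataset[:, 1:2].astype(int)
print(X)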
Step 4: Select all of the rows and column 2 from the dataset as the target vector “y”.
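And the target column:

# All rows, column 2 only -> 1-D target vector.
y = dataset[:, 2].astype(int)
print(y)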
Step 5: Fit a decision tree regressor to the dataset.
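One way to fit the regressor to the X and y built above:

# Create the regressor and fit it to the full (tiny) dataset.
regressor = DecisionTreeRegressor(random_state=0)
regressor.fit(X, y)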
Step 6: Predict a new value.
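A sketch of a prediction for an arbitrary, unseen feature value (3750 here is just an example):

# predict() expects a 2-D array, even for a single sample.
y_pred = regressor.predict([[3750]])
print("Predicted value:", y_pred)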
Step 7: Visualise the result.
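A possible visualisation: the original points plus the regressor's piecewise-constant predictions over a fine grid of feature values.

# Dense grid over the feature range to show the step-shaped fit.
X_grid = np.arange(X.min(), X.max(), 10).reshape(-1, 1)
plt.scatter(X, y, color='red', label='data')
plt.plot(X_grid, regressor.predict(X_grid), color='blue', label='prediction')
plt.title('Decision Tree Regression')
plt.xlabel('Feature')
plt.ylabel('Target')
plt.legend()
plt.show()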
Step 8: Export the tree to a ‘tree.dot’ file and view the TREE STRUCTURE by pasting the file's contents into http://www.webgraphviz.com/.
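A sketch of the export step, assuming the fitted regressor from Step 5:

# Write the fitted tree to 'tree.dot'; paste its contents into
# http://www.webgraphviz.com/ to render the tree structure.
export_graphviz(regressor, out_file='tree.dot', feature_names=['Feature'])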
