1. Learning Decision Tree
Prepared By: Tamboli Tahseen
Roll No. : 2140683
Branch: BEIT-SEM (VII)
M.H.Saboo Siddik College of Engineering
shaikhtahseen6783@gmail.com
2. Introduction
Decision tree induction is one of the simplest, and yet most successful, forms of learning algorithm. It serves as a good introduction to the area of inductive learning and is easy to implement.
A decision tree takes as input an object or situation described by a set of attributes and returns a "decision": the predicted output value for the input.
Decision trees represent rules that can be easily understood by humans.
11/26/2016 Intelligent System 2
3. Characteristics
The root node is the starting node.
Each internal node specifies a test to be carried out on an attribute's value.
Leaf nodes stand for the possible final decisions.
Each branch is labeled with a possible value of the tested attribute.
For Boolean classification, every leaf returns a yes/no decision.
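The characteristics above can be illustrated with a tiny tree, here represented as a nested dict. The attribute names (`raining`, `hungry`) and decisions are illustrative, not from the slides:

```python
# A decision tree as a nested dict: internal nodes test an attribute,
# branches are labeled with attribute values, leaves hold the decision.
tree = {'raining': {          # root node: test the `raining` attribute
    'yes': 'stay-home',       # leaf: final decision
    'no':  {'hungry': {       # internal node: another attribute test
        'yes': 'go-out',
        'no':  'stay-home',
    }},
}}

def classify(tree, example):
    """Walk from the root, following the branch matching each
    attribute value, until a leaf (a plain decision) is reached."""
    while isinstance(tree, dict):
        attr, branches = next(iter(tree.items()))
        tree = branches[example[attr]]
    return tree

print(classify(tree, {'raining': 'no', 'hungry': 'yes'}))  # go-out
```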
8. Algorithm
Function DECISION-TREE-LEARNING(examples, attribs, default) returns a decision tree
Inputs: examples, set of examples
attribs, set of attributes
default, default value for the goal predicate
If examples is empty then return default
Else if all examples have the same classification then return the classification
Else if attribs is empty then return MAJORITY-VALUE(examples)
Else
best ← CHOOSE-ATTRIBUTE(attribs, examples)
tree ← a new decision tree with root test best
m ← MAJORITY-VALUE(examples)
For each value vi of best do
examplesi ← {elements of examples with best = vi}
subtree ← DECISION-TREE-LEARNING(examplesi, attribs − best, m)
Add a branch to tree with label vi and subtree subtree
Return tree
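The pseudocode above can be sketched directly in Python. This is a minimal illustration, not a production implementation: it assumes examples are (attribute-dict, label) pairs, and it uses entropy-based information gain as CHOOSE-ATTRIBUTE, one common choice:

```python
import math
from collections import Counter

def majority_value(examples):
    # Most common classification among the examples.
    return Counter(label for _, label in examples).most_common(1)[0][0]

def entropy(examples):
    counts = Counter(label for _, label in examples).values()
    total = len(examples)
    return -sum(c / total * math.log2(c / total) for c in counts)

def choose_attribute(attribs, examples):
    # Pick the attribute with the highest information gain.
    def gain(a):
        remainder = 0.0
        for v in {ex[a] for ex, _ in examples}:
            subset = [(ex, y) for ex, y in examples if ex[a] == v]
            remainder += len(subset) / len(examples) * entropy(subset)
        return entropy(examples) - remainder
    return max(attribs, key=gain)

def decision_tree_learning(examples, attribs, default):
    """examples: list of (attribute-dict, label) pairs;
    attribs: set of attribute names; default: fallback label."""
    if not examples:
        return default
    if len({label for _, label in examples}) == 1:
        return examples[0][1]
    if not attribs:
        return majority_value(examples)
    best = choose_attribute(attribs, examples)
    tree = {best: {}}                      # root test on `best`
    m = majority_value(examples)
    for v in {ex[best] for ex, _ in examples}:
        examples_i = [(ex, y) for ex, y in examples if ex[best] == v]
        tree[best][v] = decision_tree_learning(examples_i, attribs - {best}, m)
    return tree
```

The returned tree is a nested dict mapping each tested attribute to its value-labeled branches, with plain labels at the leaves.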
9. Applicability of Decision trees
Missing data
Multivalued attributes
Continuous and integer-valued input attributes
Continuous-valued output attributes
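For continuous or integer-valued input attributes, a standard approach is to turn the attribute into a binary test against a threshold. A small self-contained sketch, assuming the threshold is chosen among midpoints of sorted values to maximize information gain:

```python
import math
from collections import Counter

def entropy(labels):
    counts = Counter(labels).values()
    n = len(labels)
    return -sum(c / n * math.log2(c / n) for c in counts)

def best_threshold(values, labels):
    """Choose the split point on a continuous attribute that maximizes
    information gain; candidates are midpoints between sorted values."""
    pairs = sorted(zip(values, labels))
    base = entropy(labels)
    best_gain, best_t = -1.0, None
    for i in range(1, len(pairs)):
        if pairs[i - 1][0] == pairs[i][0]:
            continue                      # no boundary between equal values
        t = (pairs[i - 1][0] + pairs[i][0]) / 2
        left = [y for x, y in pairs if x <= t]
        right = [y for x, y in pairs if x > t]
        remainder = (len(left) * entropy(left)
                     + len(right) * entropy(right)) / len(pairs)
        if base - remainder > best_gain:
            best_gain, best_t = base - remainder, t
    return best_t, best_gain
```

The chosen threshold then defines a Boolean test (value ≤ t) that can be used like any other attribute in the tree.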
10. Pros & Cons
Pros
1. Are simple to understand and interpret. People are able to
understand decision tree models after a brief explanation.
2. Have value even with little hard data. Important insights can be
generated based on experts describing a situation (its alternatives,
probabilities, and costs) and their preferences for outcomes.
3. Help determine worst, best, and expected values for different
scenarios.
4. Can be combined with other decision techniques.
Cons
1. For data including categorical variables with differing numbers of
levels, information gain in decision trees is biased in favor of
those attributes with more levels.
2. Calculations can get very complex particularly if many values are
uncertain and/or if many outcomes are linked.
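The first con can be demonstrated directly: an attribute with a unique value per example (such as an ID column) achieves maximal information gain even though it generalizes nothing. A small self-contained sketch using entropy-based information gain (the data here is invented for illustration):

```python
import math
from collections import Counter

def entropy(labels):
    counts = Counter(labels).values()
    n = len(labels)
    return -sum(c / n * math.log2(c / n) for c in counts)

def information_gain(examples, attr):
    """examples: list of (attribute-dict, label) pairs."""
    labels = [y for _, y in examples]
    remainder = 0.0
    for v in {ex[attr] for ex, _ in examples}:
        subset = [y for ex, y in examples if ex[attr] == v]
        remainder += len(subset) / len(examples) * entropy(subset)
    return entropy(labels) - remainder

# Four examples: `id` is unique per row, `size` has only two levels.
data = [
    ({'id': 1, 'size': 'big'},   'yes'),
    ({'id': 2, 'size': 'big'},   'yes'),
    ({'id': 3, 'size': 'small'}, 'no'),
    ({'id': 4, 'size': 'small'}, 'yes'),
]
# `id` splits every example into its own pure branch, so its gain
# equals the full label entropy -- the bias described above.
print(information_gain(data, 'id'))    # maximal
print(information_gain(data, 'size'))  # smaller
```

Remedies such as C4.5's gain ratio penalize many-valued attributes to counter exactly this bias.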