A New Type Of Node Split Rule For Decision Tree Learning


C. Sudarsana Reddy
Dr. V. Vasu, B. Kumara Swamy Achari


We propose a new node split rule for decision tree learning, named the Sudarsana Reddy Node Split Rule (SRNSR). SRNSR is very easy to compute: for each attribute, it sums the logarithms of the non-zero class counts over the attribute's values, and the attribute with the highest logarithmic sum is selected as the best split attribute. Multi-way splits are applied for categorical attributes and binary splits for numerical attributes. Finding the best splitting attribute is an important task in decision tree learning, and it is well known that no single splitting rule gives the best performance across all problem domains. We compare SRNSR with the most important and popular node split measures and observe that it performs better than the best of them. We show that decision trees constructed with SRNSR are more efficient and robust: they are balanced, simpler, smaller, more stable, and generalize better. SRNSR also improves the efficiency of decision tree classifier construction.
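As a minimal sketch of the rule as described in the abstract: for each candidate attribute, sum the logarithms of the non-zero class counts of the attribute's values, then select the attribute with the highest sum. The function names, data layout, and use of natural logarithms below are illustrative assumptions, not the paper's reference implementation.

```python
# Illustrative sketch of the SRNSR score, assuming a dataset of tuples where
# one column holds the class label. Names and details are hypothetical.
import math
from collections import Counter

def srnsr_score(rows, attr_index, class_index=-1):
    """Sum of log(count) over the non-zero class counts of each attribute value."""
    counts = Counter()  # (attribute value, class label) -> count
    for row in rows:
        counts[(row[attr_index], row[class_index])] += 1
    # Only non-zero class counts contribute, per the rule's description.
    return sum(math.log(c) for c in counts.values() if c > 0)

def best_split_attribute(rows, attr_indices):
    """Attribute with the highest SRNSR score is chosen as the split attribute."""
    return max(attr_indices, key=lambda a: srnsr_score(rows, a))
```

For example, on rows `[("sunny", "hot", "no"), ("sunny", "mild", "no"), ("rain", "mild", "yes")]` with the class label in the last column, attribute 0 scores `log(2) + log(1)` while attribute 1 scores `0`, so attribute 0 would be selected.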

Keywords: Decision trees, split attribute, Sudarsana Reddy Node Split Rule (SRNSR), node split rules, classification, data mining, machine learning.


