Title :
A new heuristic of the decision tree induction
Author :
Li, Ning ; Zhao, Li ; Chen, Ai-xia ; Meng, Qing-wu ; Zhang, Guo-fang
Author_Institution :
Key Lab. of Machine Learning & Comput. Intell., Hebei Univ., Baoding, China
Abstract :
Decision tree induction is a useful approach for extracting classification knowledge from a set of feature-based instances. The most popular heuristic used in decision tree generation is minimum entropy, which has a serious disadvantage: poor generalization capability [3]. The support vector machine (SVM) is a machine learning classification technique grounded in statistical learning theory, and it generalizes well. Given the relationship between the SVM's classification margin and its generalization capability, the large margin of the SVM can be used as the heuristic for decision tree induction in order to improve the tree's generalization. This paper proposes a decision tree induction algorithm based on the large-margin heuristic. Compared with a binary decision tree that uses minimum entropy as its heuristic, the experiments show that generalization capability is improved by the new heuristic.
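The two heuristics contrasted in the abstract can be made concrete. The sketch below (a minimal illustration, not the authors' algorithm; the function names and the fixed-hyperplane margin computation are assumptions for exposition) shows the baseline minimum-entropy criterion for a binary split, and the geometric margin of a separating hyperplane, which the paper's heuristic seeks to maximize instead. A full implementation would obtain the hyperplane by training an SVM (e.g. via SMO, as the keywords suggest) at each candidate node.

```python
import math

def entropy(labels):
    """Shannon entropy of a label list (the classic impurity measure)."""
    n = len(labels)
    return -sum((labels.count(c) / n) * math.log2(labels.count(c) / n)
                for c in set(labels))

def split_entropy(points, labels, feature, threshold):
    """Weighted entropy after a binary split on one feature.
    The minimum-entropy heuristic picks the (feature, threshold) minimizing this."""
    left = [l for x, l in zip(points, labels) if x[feature] <= threshold]
    right = [l for x, l in zip(points, labels) if x[feature] > threshold]
    n = len(labels)
    return (len(left) / n) * entropy(left) + (len(right) / n) * entropy(right)

def margin(w, b, points):
    """Geometric margin of the hyperplane w.x + b = 0 over a point set:
    min |w.x + b| / ||w||.  The large-margin heuristic prefers the split whose
    separating hyperplane makes this as large as possible."""
    norm = math.sqrt(sum(wi * wi for wi in w))
    return min(abs(sum(wi * xi for wi, xi in zip(w, x)) + b)
               for x in points) / norm
```

For example, on the 1-D data `points = [(0.,), (1.,), (2.,), (3.,)]` with labels `[0, 0, 1, 1]`, splitting at threshold 1.0 yields zero weighted entropy (a pure split), and the hyperplane `w = (1.0,), b = -1.5` separates the classes with geometric margin 0.5.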
Keywords :
decision trees; entropy; knowledge acquisition; learning (artificial intelligence); support vector machines; binary decision tree; decision tree induction algorithm; feature-based instances; knowledge classification extraction; machine learning; minimum entropy; statistical learning; support vector machine; Classification tree analysis; Cybernetics; Decision trees; Entropy; Induction generators; Inverse problems; Machine learning; Machine learning algorithms; Support vector machine classification; Support vector machines; Clustering; Decision tree; Generalization; SMO; large margin;
Conference_Titel :
2009 International Conference on Machine Learning and Cybernetics
Conference_Location :
Baoding
Print_ISBN :
978-1-4244-3702-3
Electronic_ISBN :
978-1-4244-3703-0
DOI :
10.1109/ICMLC.2009.5212227