DocumentCode :
1239056
Title :
Handling continuous attributes in an evolutionary inductive learner
Author :
Divina, Federico ; Marchiori, Elena
Author_Institution :
Dept. of Comput. Sci., Vrije Universiteit Amsterdam, Netherlands
Volume :
9
Issue :
1
fYear :
2005
Firstpage :
31
Lastpage :
43
Abstract :
This work experimentally analyzes discretization algorithms for handling continuous attributes in evolutionary learning. We consider a learning system that induces a set of rules in a fragment of first-order logic (evolutionary inductive logic programming) and introduce a method in which a given discretization algorithm generates initial inequalities describing subranges of attributes' values. During the learning process, mutation operators that exploit the class labels of the examples (supervised discretization) refine these inequalities. The evolutionary learning system is used as a platform for experimentally testing four algorithms: two variants of the proposed method, a popular supervised discretization algorithm applied prior to induction, and a discretization method that does not use the class labels of the examples (unsupervised discretization). Results of experiments on artificial and real-life datasets suggest that the proposed method provides an effective and robust technique for handling continuous attributes by means of inequalities.
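Example :
The sketch below is only an illustration of the general idea described in the abstract, not the authors' implementation: a supervised, entropy-based cut point seeds an inequality of the form lo <= attr < hi on a continuous attribute, and a mutation operator refines it by moving a boundary to a neighbouring cut point. All function names (best_cut, initial_inequality, mutate_boundary) and the single-split criterion are hypothetical simplifications.

import random
from math import log2

def entropy(labels):
    # Class entropy of a list of labels.
    counts = {}
    for y in labels:
        counts[y] = counts.get(y, 0) + 1
    n = len(labels)
    return -sum(c / n * log2(c / n) for c in counts.values())

def best_cut(values, labels):
    # Supervised discretization (one level): choose the boundary that
    # minimizes the weighted class entropy of the two resulting subranges.
    pairs = sorted(zip(values, labels))
    best, best_score = None, float("inf")
    for i in range(1, len(pairs)):
        if pairs[i - 1][0] == pairs[i][0]:
            continue
        cut = (pairs[i - 1][0] + pairs[i][0]) / 2
        left = [y for v, y in pairs if v <= cut]
        right = [y for v, y in pairs if v > cut]
        score = (len(left) * entropy(left) + len(right) * entropy(right)) / len(pairs)
        if score < best_score:
            best, best_score = cut, score
    return best

def initial_inequality(values, labels):
    # Seed an inequality  lo <= attr < hi  around a supervised cut point.
    cut = best_cut(values, labels)
    return (min(values), cut)

def mutate_boundary(ineq, cut_points):
    # Mutation operator: shift one boundary to a randomly chosen cut point
    # on the appropriate side, keeping the subrange non-empty.
    lo, hi = ineq
    if random.random() < 0.5:
        lo = random.choice([c for c in cut_points if c < hi] or [lo])
    else:
        hi = random.choice([c for c in cut_points if c > lo] or [hi])
    return (lo, hi)

In an evolutionary learner of this kind, inequalities produced this way would appear as conditions in candidate rules, and the mutation operator would be applied during evolution to adjust their boundaries.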
Keywords :
evolutionary computation; inductive logic programming; learning by example; learning systems; continuous attribute handling; discretization algorithm; evolutionary inductive logic programming; evolutionary learning system; Algorithm design and analysis; Entropy; Evolutionary computation; Genetic mutations; Iterative algorithms; Learning systems; Logic programming; Machine learning algorithms; Robustness; System testing;
fLanguage :
English
Journal_Title :
IEEE Transactions on Evolutionary Computation
Publisher :
IEEE
ISSN :
1089-778X
Type :
jour
DOI :
10.1109/TEVC.2004.837752
Filename :
1395849