id: work_xxbtc7zagfg4tgpxdors5gbbfi
author: Aynur Dayanik
title: Learning feature-projection based classifiers
date: 2012
pages: 13
extension: .pdf
mime: application/pdf
words: 9295
sentences: 1011
flesch: 68
summary: The CFP, VFI5, and FIL.IF algorithms represent concept descriptions as feature intervals, which are learned separately for each feature. The optimal feature partitioning algorithm (OFP.MC) learns feature intervals separately for each feature by partitioning its values, via dynamic programming, so as to minimize the per-feature misclassification rate. Section 3 presents the optimal feature partitioning learning algorithm for classification, and Section 5 presents the greedy feature partitioning learning algorithm for classification. Over arbitrary divisions of the labeled examples into training and validation sets, average misclassification rates and their standard errors are reported: for each maximum number of label switchings k, optimal feature intervals are learned from the training data and the misclassification rate is computed on the validation set. Using the entire set of labeled examples and the dynamic programming algorithm, the features are then partitioned optimally into intervals (Duda, Hart, & Stork, 2000; Fukunaga, 1990).
cache: ./cache/work_xxbtc7zagfg4tgpxdors5gbbfi.pdf
txt: ./txt/work_xxbtc7zagfg4tgpxdors5gbbfi.txt
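The summary describes learning feature intervals by optimally partitioning one feature's values with dynamic programming so that each interval's majority class minimizes training misclassifications. Below is a minimal Python sketch of that general idea; the function name, the use of an interval-count cap k in place of the paper's label-switching bound, and all implementation details are assumptions for illustration, not the paper's actual OFP.MC procedure.

from collections import Counter

def partition_feature(values, labels, k):
    """Split one feature's (value, label) pairs into at most k intervals
    so that, with each interval predicting its majority class, the number
    of misclassified training points is minimized.
    Returns (best cost, sorted list of interval boundary values)."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    y = [labels[i] for i in order]
    n = len(y)

    def segment_cost(i, j):
        # cost of covering sorted points i..j-1 with one majority-class interval
        counts = Counter(y[i:j])
        return (j - i) - max(counts.values())

    INF = float("inf")
    # dp[m][j]: best cost of splitting the first j sorted points into m intervals
    dp = [[INF] * (n + 1) for _ in range(k + 1)]
    cut = [[0] * (n + 1) for _ in range(k + 1)]
    dp[0][0] = 0
    for m in range(1, k + 1):
        for j in range(1, n + 1):
            for i in range(m - 1, j):          # last interval covers points i..j-1
                if dp[m - 1][i] == INF:
                    continue
                c = dp[m - 1][i] + segment_cost(i, j)
                if c < dp[m][j]:
                    dp[m][j], cut[m][j] = c, i

    best_m = min(range(1, k + 1), key=lambda m: dp[m][n])
    bounds, j = [], n
    for m in range(best_m, 0, -1):             # recover boundaries from the cuts
        i = cut[m][j]
        if i > 0:
            bounds.append(values[order[i]])
        j = i
    return dp[best_m][n], sorted(bounds)

For example, partition_feature([1.0, 2.0, 3.0, 10.0, 11.0], ["a", "a", "b", "b", "b"], 2) returns (0, [3.0]): two intervals separate the classes with no training errors. This naive version recomputes segment costs inside the loops and so runs in roughly O(k * n^3) time; it is meant only to make the dynamic-programming structure concrete.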