Peter Peer, Boštjan Čargo, and Igor Kononenko (1997). Extension of ReliefF. Elektrotehniški vestnik / Electrotechnical Review, 64(5), pp. 277-283.
When building a decision tree, we must select the most important attribute among all attributes of the instances that form the training set. ReliefF [4,5] is an advanced feature-selection algorithm: it is not "nearsighted" and can be used in real-world domains (it improves the certainty of estimates, deals with noisy and missing data, and handles multi-class problems). We extended ReliefF so that feature qualities for different numbers of near neighbours are calculated simultaneously [6]. Testing on over 20 standard domains (Table 1, Figures 2, 3) showed that decision trees built with the ReliefF algorithm and those built with the ReliefFS algorithm are approximately equally successful when only classification accuracy on training instances is compared. In another respect, however, ReliefFS is superior: in many cases the number of leaves in the decision trees is substantially reduced, especially in medical domains. Our tests covered unpruned trees, trees pruned with the MDL1+ method, and trees pruned with the m-probability estimate. Decision trees were built with Assistant-R and Assistant-RS, respectively. The extended algorithm ReliefFS is better than the ReliefF algorithm, a conclusion also confirmed by test results on an artificially composed (synthetic) domain, PCMS.
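The core idea of the extension described above can be sketched as follows. This is a minimal, hypothetical illustration (not the paper's actual implementation): a simplified Relief-style estimator for numeric attributes in which one sort of the neighbours per sampled instance, combined with prefix sums of the per-attribute differences, yields feature-quality estimates for several neighbour counts k at once. The function name `relieff_multi_k` and all parameters are my own; details such as distance measure, sampling, and multi-class weighting are simplified.

```python
import numpy as np

def relieff_multi_k(X, y, ks, n_samples=None, rng=None):
    """Sketch of simultaneous Relief-style estimates for several k.

    X : (n, d) array of numeric attributes (ideally scaled to [0, 1]),
    y : (n,) array of class labels,
    ks : list of neighbour counts to evaluate in one pass.
    Returns a dict mapping each k to a (d,) vector of feature qualities.
    """
    rng = rng or np.random.default_rng(0)
    n, d = X.shape
    n_samples = n_samples or n
    kmax = max(ks)
    W = {k: np.zeros(d) for k in ks}
    for idx in rng.choice(n, size=n_samples, replace=False):
        r, cls = X[idx], y[idx]
        dist = np.abs(X - r).sum(axis=1)   # Manhattan distance to all instances
        dist[idx] = np.inf                 # exclude the sampled instance itself
        order = np.argsort(dist)           # one sort shared by every k
        hits = [j for j in order if y[j] == cls][:kmax]
        misses = [j for j in order if y[j] != cls][:kmax]
        # Prefix sums over the kmax nearest hits/misses: the first k rows
        # give the totals needed for neighbour count k with no extra work.
        hit_diff = np.cumsum(np.abs(X[hits] - r), axis=0)
        miss_diff = np.cumsum(np.abs(X[misses] - r), axis=0)
        for k in ks:
            W[k] += miss_diff[k - 1] / k - hit_diff[k - 1] / k
    return {k: W[k] / n_samples for k in ks}
```

An attribute that separates the classes accumulates large miss differences and small hit differences, so its estimate comes out positive, while an irrelevant attribute stays near zero for every k.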
Item Type: Article
Keywords: machine learning, artificial intelligence, decision tree, feature selection, ReliefF, classification accuracy
Language of Content: Slovenian and English
Institution: University of Ljubljana
Department: Faculty of Computer and Information Science
Divisions: Faculty of Computer and Information Science > Computer Vision Laboratory
Item ID: 66
Date Deposited: 28 Mar 2003
Last Modified: 13 Aug 2011 00:31