A Rough Hypercuboid Approach for Feature Selection in Approximation Spaces
Jan. 2014 (vol. 26 no. 1)
pp. 16-29
Pradipta Maji, Indian Statistical Institute, Kolkata
The selection of relevant and significant features is an important problem, particularly for data sets with a large number of features. In this regard, a new feature selection algorithm based on a rough hypercuboid approach is presented. It selects a set of features from a data set by maximizing the relevance, dependency, and significance of the selected features. By introducing the concept of the hypercuboid equivalence partition matrix, a novel representation of the degree of dependency of sample categories on features is proposed to measure the relevance, dependency, and significance of features in approximation spaces. The equivalence partition matrix also offers an efficient way to compute many quantitative measures describing the inexactness of approximate classification. Several quantitative indices based on the rough hypercuboid approach are introduced for evaluating the performance of the proposed method. The superiority of the proposed method over other feature selection methods, in terms of computational complexity and classification accuracy, is established extensively on various real-life data sets of different sizes and dimensions.
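The dependency idea sketched in the abstract can be illustrated with a small example. The sketch below assumes numeric features and treats each class's axis-aligned min/max value range on the selected features as its hypercuboid; the function names `dependency` and `greedy_select` are hypothetical illustrations, not the paper's exact equivalence-partition-matrix formulation. A sample falling inside more than one class hypercuboid is counted as ambiguous, and forward selection greedily adds the feature whose inclusion most increases the joint dependency:

```python
import numpy as np

def dependency(X, y, feats):
    """Degree of dependency of class labels on a feature subset.

    Each class spans a hypercuboid: the axis-aligned box bounded by the
    class's min and max on every selected feature. A sample lying inside
    more than one class hypercuboid is ambiguous; dependency is the
    fraction of unambiguous samples. Illustrative sketch only.
    """
    classes = np.unique(y)
    n = X.shape[0]
    member = np.ones((len(classes), n), dtype=bool)
    for i, c in enumerate(classes):
        for f in feats:
            col = X[y == c, f]
            member[i] &= (X[:, f] >= col.min()) & (X[:, f] <= col.max())
    # Confusion vector: 1 if a sample lies in more than one class hypercuboid.
    v = np.minimum(member.sum(axis=0) - 1, 1)
    return 1.0 - v.sum() / n

def greedy_select(X, y, k):
    """Forward selection: add the feature that maximizes joint dependency."""
    selected, remaining = [], list(range(X.shape[1]))
    while remaining and len(selected) < k:
        best = max(remaining, key=lambda f: dependency(X, y, selected + [f]))
        selected.append(best)
        remaining.remove(best)
    return selected
```

On a toy data set where one feature cleanly separates the classes (disjoint class value ranges), its dependency is 1.0 and it is chosen first, while a feature whose class ranges fully overlap scores 0.0.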
Index Terms:
Approximation methods, Rough sets, Rough hypercuboid approach, Feature selection, Pattern recognition, Data mining, Data analysis, Uncertainty, Redundancy
Citation:
Pradipta Maji, "A Rough Hypercuboid Approach for Feature Selection in Approximation Spaces," IEEE Transactions on Knowledge and Data Engineering, vol. 26, no. 1, pp. 16-29, Jan. 2014, doi:10.1109/TKDE.2012.242