IJSDR
INTERNATIONAL JOURNAL OF SCIENTIFIC DEVELOPMENT AND RESEARCH
International Peer Reviewed & Refereed Journals, Open Access Journal
ISSN Approved Journal No: 2455-2631 | Impact factor: 8.15 | ESTD Year: 2016

Published Paper Details
Paper Title: A Fast Clustering-Based Feature Subset Selection Algorithm
Authors Name: Varsha Sonwane
Unique Id: IJSDR1705085
Published In: Volume 2 Issue 5, May-2017
Abstract: Feature selection involves identifying a subset of the most useful features that produces results comparable to those obtained with the original, entire set of features. A feature selection algorithm may be evaluated from both the efficiency and the effectiveness points of view: efficiency concerns the time required to find a subset of features, while effectiveness relates to the quality of that subset. Based on these criteria, a fast clustering-based feature selection algorithm, FAST, is proposed and experimentally evaluated in this paper. The FAST algorithm works in two steps. In the first step, features are divided into clusters using graph-theoretic clustering methods. In the second step, the most representative feature that is strongly related to the target classes is selected from each cluster to form the subset of features. Because features in different clusters are relatively independent, the clustering-based strategy of FAST has a high probability of producing a subset of useful and independent features; the two-step procedure is sketched in code after the publication details below. To ensure the efficiency of FAST, we adopt the efficient minimum spanning tree (MST) clustering method. The efficiency and effectiveness of the FAST algorithm are evaluated through an empirical study. Extensive experiments are carried out to compare FAST with several representative feature selection algorithms, namely FCBF, ReliefF, CFS, Consist, and FOCUS-SF, with respect to four types of well-known classifiers, namely the probability-based Naive Bayes, the tree-based C4.5, the instance-based IB1, and the rule-based RIPPER, before and after feature selection. The results, on 35 publicly available real-world high-dimensional image, microarray, and text datasets, demonstrate that FAST not only produces smaller subsets of features but also improves the performance of the four types of classifiers.
Keywords: Feature subset selection, filter method, feature clustering, graph-based clustering
Cite Article: "A Fast Clustering-Based Feature Subset Selection Algorithm", International Journal of Science & Engineering Development Research (www.ijsdr.org), ISSN: 2455-2631, Vol. 2, Issue 5, pp. 487-490, May 2017. Available: http://www.ijsdr.org/papers/IJSDR1705085.pdf
Downloads: 337067
Publication Details: Published Paper ID: IJSDR1705085
Registration ID:170392
Published In: Volume 2 Issue 5, May-2017
DOI (Digital Object Identifier):
Page No: 487 - 490
Publisher: IJSDR | www.ijsdr.org
ISSN Number: 2455-2631
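
The two-step procedure described in the abstract can be made concrete with a short sketch. The code below is a minimal, illustrative implementation, assuming discrete (or pre-discretized) features and symmetric uncertainty (SU) as the correlation measure; the function names (symmetric_uncertainty, fast_select), the relevance threshold, and the use of SciPy's minimum_spanning_tree are assumptions made for illustration, not the authors' published code.

import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree, connected_components
from sklearn.metrics import mutual_info_score

def entropy(x):
    # Shannon entropy (in bits) of a discrete variable.
    _, counts = np.unique(x, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def symmetric_uncertainty(x, y):
    # SU(X, Y) = 2 * I(X; Y) / (H(X) + H(Y)), normalized to [0, 1].
    h = entropy(x) + entropy(y)
    if h == 0.0:
        return 0.0
    mi_bits = mutual_info_score(x, y) / np.log(2)  # sklearn returns nats
    return 2.0 * mi_bits / h

def fast_select(X, y, relevance_threshold=0.0):
    n_features = X.shape[1]
    # Drop features that are irrelevant to the target class.
    su_class = np.array([symmetric_uncertainty(X[:, i], y) for i in range(n_features)])
    kept = np.where(su_class > relevance_threshold)[0]

    # Step 1: build a complete feature graph weighted by pairwise SU and
    # take its minimum spanning tree (store 1 - SU because SciPy minimizes).
    k = len(kept)
    weights = np.zeros((k, k))
    for a in range(k):
        for b in range(a + 1, k):
            su = symmetric_uncertainty(X[:, kept[a]], X[:, kept[b]])
            weights[a, b] = max(1.0 - su, 1e-12)  # a zero entry would mean "no edge"
    mst = minimum_spanning_tree(weights).toarray()

    # Partition the tree: cut an edge when the two features are less
    # correlated with each other than each is with the class.
    for a, b in zip(*np.nonzero(mst)):
        su_ab = 1.0 - mst[a, b]
        if su_ab < su_class[kept[a]] and su_ab < su_class[kept[b]]:
            mst[a, b] = 0.0
    n_clusters, labels = connected_components(mst, directed=False)

    # Step 2: keep the most class-relevant feature from each cluster.
    selected = []
    for c in range(n_clusters):
        members = kept[labels == c]
        selected.append(int(members[np.argmax(su_class[members])]))
    return sorted(selected)

Cutting an MST edge whenever two features are less correlated with each other than each is with the class splits the tree into clusters of mutually related features, so keeping one representative per cluster yields the small, relatively independent subset the abstract describes.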


Track Paper
Important Links
Conference Proposal
ISSN
DOI (A digital object identifier)


Providing A digital object identifier by DOI
How to GET DOI and Hard Copy Related
Open Access License Policy
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License