Wrappers for feature subset selection

7.2k Citations · 2.8k Mendeley Readers

This article is free to access.
Abstract

In the feature subset selection problem, a learning algorithm is faced with the problem of selecting a relevant subset of features upon which to focus its attention, while ignoring the rest. To achieve the best possible performance with a particular learning algorithm on a particular training set, a feature subset selection method should consider how the algorithm and the training set interact. We explore the relation between optimal feature subset selection and relevance. Our wrapper method searches for an optimal feature subset tailored to a particular algorithm and a domain. We study the strengths and weaknesses of the wrapper approach and show a series of improved designs. We compare the wrapper approach to induction without feature subset selection and to Relief, a filter approach to feature subset selection. Significant improvement in accuracy is achieved for some datasets for the two families of induction algorithms used: decision trees and Naive-Bayes. © 1997 Elsevier Science B.V.
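
To make the wrapper idea concrete, the sketch below illustrates it with a greedy forward search: features are added one at a time, and each candidate subset is scored by the cross-validated accuracy of the induction algorithm itself (here a Naive-Bayes classifier). This is not the search procedure developed in the paper, only a minimal illustration; scikit-learn's GaussianNB, cross_val_score, and the wine dataset are assumptions chosen purely for the example.

import numpy as np
from sklearn.datasets import load_wine
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB

def wrapper_forward_selection(X, y, estimator, cv=5):
    """Greedy forward selection: repeatedly add the single feature whose
    inclusion most improves the estimator's cross-validated accuracy,
    stopping when no remaining feature yields an improvement."""
    n_features = X.shape[1]
    selected, best_score = [], -np.inf
    while True:
        candidate, candidate_score = None, best_score
        for f in range(n_features):
            if f in selected:
                continue
            cols = selected + [f]
            score = cross_val_score(estimator, X[:, cols], y, cv=cv).mean()
            if score > candidate_score:
                candidate, candidate_score = f, score
        if candidate is None:  # no feature improved the accuracy estimate
            return selected, best_score
        selected.append(candidate)
        best_score = candidate_score

if __name__ == "__main__":
    X, y = load_wine(return_X_y=True)
    subset, acc = wrapper_forward_selection(X, y, GaussianNB())
    print(f"selected features: {subset}, estimated accuracy: {acc:.3f}")

The same evaluation loop works unchanged with backward elimination or other search strategies, and with a decision-tree learner substituted for Naive-Bayes, which is what makes the wrapper approach specific to both the algorithm and the training set by construction.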

Cite

Citation style: APA

Kohavi, R., & John, G. H. (1997). Wrappers for feature subset selection. Artificial Intelligence, 97(1–2), 273–324. https://doi.org/10.1016/s0004-3702(97)00043-x


Readers' Seniority

PhD / Post grad / Masters / Doc: 1396 (70%)
Researcher: 319 (16%)
Professor / Associate Prof.: 193 (10%)
Lecturer / Post doc: 93 (5%)

Readers' Discipline

Computer Science: 1018 (62%)
Engineering: 465 (28%)
Agricultural and Biological Sciences: 90 (5%)
Mathematics: 69 (4%)

Article Metrics

Mentions
News Mentions: 2
References: 1

Social Media
Shares, Likes & Comments: 25
