Soft-max boosting

Citations: 7 · Mendeley readers: 26

This article is free to access.
Abstract

The standard multi-class classification risk, based on the binary (0-1) loss, is rarely minimized directly, for two reasons: (1) it is not convex and (2) it is not smooth (or even continuous). The classic approach is to minimize a convex surrogate instead. In this paper, we propose to replace the usual deterministic decision rule with a stochastic one, which yields a smooth risk that generalizes the expected binary loss and, more generally, the cost-sensitive loss. In practice, this (empirical) risk is minimized by gradient descent in the function space linearly spanned by a base learner (i.e., boosting). We provide a convergence analysis of the resulting algorithm and evaluate it on a range of synthetic and real-world data sets (noiseless and noisy domains, compared against convex and non-convex boosters).
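To make the idea concrete, below is a minimal, hypothetical sketch of this kind of procedure, not the author's reference implementation. It assumes the stochastic decision rule is the soft-max over per-class scores, the smooth risk is the expected cost under that rule (0-1 cost here, but any cost matrix works), and each boosting round fits a regression tree (scikit-learn's DecisionTreeRegressor) to the negative functional gradient. The function names, learning rate, and tree depth are illustrative choices.

```python
# Hypothetical sketch of soft-max boosting (illustrative, not the paper's code).
# Smooth risk: R(F) = sum_i sum_k pi(k|x_i) * cost_ik, with
# pi(k|x) = exp(F_k(x)) / sum_j exp(F_j(x)).
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def softmax(scores):
    z = scores - scores.max(axis=1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def softmax_boost(X, y, n_classes, n_rounds=100, lr=0.1, max_depth=3):
    n = X.shape[0]
    F = np.zeros((n, n_classes))       # additive score functions, one per class
    cost = np.ones((n, n_classes))     # cost-sensitive loss; here the 0-1 cost
    cost[np.arange(n), y] = 0.0
    learners = []
    for _ in range(n_rounds):
        p = softmax(F)
        # Gradient of the smooth risk w.r.t. the scores:
        # dR/dF_ik = p_ik * (cost_ik - sum_j p_ij * cost_ij)
        expected_cost = (p * cost).sum(axis=1, keepdims=True)
        grad = p * (cost - expected_cost)
        round_trees = []
        for k in range(n_classes):
            tree = DecisionTreeRegressor(max_depth=max_depth)
            tree.fit(X, -grad[:, k])   # base learner fit to the negative gradient
            F[:, k] += lr * tree.predict(X)
            round_trees.append(tree)
        learners.append(round_trees)
    return learners

def predict(learners, X, n_classes, lr=0.1):
    F = np.zeros((X.shape[0], n_classes))
    for round_trees in learners:
        for k, tree in enumerate(round_trees):
            F[:, k] += lr * tree.predict(X)
    return softmax(F).argmax(axis=1)   # deterministic decision at test time
```

The stochastic rule is only needed during training to make the risk differentiable; at test time one can simply pick the class with the highest score, as in the prediction helper above.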

Citation (APA)

Geist, M. (2015). Soft-max boosting. Machine Learning, 100(2–3), 305–332. https://doi.org/10.1007/s10994-015-5491-2

Readers over time

[Chart: Mendeley readers per year, 2015–2023]

Readers' Seniority

PhD / Post grad / Masters / Doc: 13 (65%)
Researcher: 3 (15%)
Professor / Associate Prof.: 2 (10%)
Lecturer / Post doc: 2 (10%)

Readers' Discipline

Computer Science: 18 (86%)
Physics and Astronomy: 1 (5%)
Nursing and Health Professions: 1 (5%)
Economics, Econometrics and Finance: 1 (5%)
