Boosting is a machine learning meta-algorithm for supervised learning. It builds the learned function in stages, incrementally adding to the current function. At every stage, a weak learner (one whose accuracy need only be slightly better than chance) is trained on the data. The weak learner's output is then added to the learned function with some strength, proportional to the weak learner's accuracy. The data is then reweighted: examples that the current learned function gets wrong are "boosted" in importance, so that future weak learners attempt to fix those errors.
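A minimal sketch of this staged loop, using AdaBoost-style updates with one-level decision stumps as the weak learners; labels are assumed to be in {-1, +1}, and all function names here are illustrative rather than from any particular library:

```python
import numpy as np

def fit_stump(X, y, w):
    """Find the threshold stump (feature, threshold, sign) minimizing weighted error."""
    best = (0, 0.0, 1, np.inf)  # (feature, threshold, sign, weighted error)
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            for sign in (1, -1):
                pred = np.where(X[:, j] <= t, sign, -sign)
                err = np.sum(w[pred != y])
                if err < best[3]:
                    best = (j, t, sign, err)
    return best

def boost(X, y, n_stages=20):
    n = len(y)
    w = np.full(n, 1.0 / n)            # uniform initial example weights
    ensemble = []                       # (strength, stump) pairs
    for _ in range(n_stages):
        j, t, sign, err = fit_stump(X, y, w)
        err = max(err, 1e-10)           # guard against a perfect stump
        alpha = 0.5 * np.log((1 - err) / err)   # strength grows as error shrinks
        pred = np.where(X[:, j] <= t, sign, -sign)
        w *= np.exp(-alpha * y * pred)  # "boost" the weight of misclassified examples
        w /= w.sum()                    # renormalize to a distribution
        ensemble.append((alpha, (j, t, sign)))
    return ensemble

def predict(ensemble, X):
    score = np.zeros(len(X))
    for alpha, (j, t, sign) in ensemble:
        score += alpha * np.where(X[:, j] <= t, sign, -sign)
    return np.sign(score)
```

On {-1, +1}-labeled data, predict(boost(X, y), X) returns the weighted vote of all stages; the weight update is exactly the reweighting step described above.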

There are several different boosting algorithms, which differ in the exact mathematical form of the strength and of the reweighting. One of the most common boosting algorithms is AdaBoost. Most boosting algorithms fit into the AnyBoost framework, which shows that boosting performs gradient descent in function space.
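AdaBoost is available off the shelf; a short usage sketch with scikit-learn, assuming it is installed (make_classification is used only to produce toy data for illustration):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# The default weak learner is a depth-1 decision tree (a "stump").
clf = AdaBoostClassifier(n_estimators=50, random_state=0).fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))
```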

Boosting is based on probably approximately correct learning (PAC learning), which is a branch of computational learning theory.

Schapire was the first to show that if a concept is weakly PAC learnable then it is also strongly PAC learnable using boosting.

Algorithmically, boosting is related to gradient descent: under the AnyBoost view, each stage takes an approximate gradient-descent step in function space, with the weak learner's output serving as the descent direction.

