Talk:Boosting (machine learning)

Bias vs. Variance

The first sentence of the article defines boosting as a method for reducing bias. Isn't this incorrect? If boosting provides generalization, and variance refers to the variance of the model across different training sets (i.e. high variance means overfitting), then boosting should reduce variance and thereby increase bias. I'm confused about this; could someone please comment?

--EmanueleLM (talk) 07:43, 1 June 2016 (UTC) No, that's basically right: with respect to the number of weak learners, you can end up with bias (too few of them) or overfitting (too many of them). This is the best paper you can read about boosting: http://rob.schapire.net/papers/explaining-adaboost.pdf

Anyway, boosting almost always reduces bias, and in practice it does not increase variance significantly unless you use a lot of learners.
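
To make the trade-off concrete, here is a minimal sketch of varying the number of weak learners; it assumes scikit-learn, and the synthetic dataset and parameter values are illustrative rather than taken from any of the cited papers:

from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import cross_val_score

# Illustrative two-class problem.
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)

# scikit-learn's AdaBoostClassifier uses a depth-1 decision tree (a "stump")
# as its default weak learner. Too few stumps underfit (high bias); adding
# more reduces bias, and only very large ensembles risk overfitting.
for n in (5, 50, 500):
    boosted = AdaBoostClassifier(n_estimators=n, random_state=0)
    print(n, "weak learners:", cross_val_score(boosted, X, y).mean())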

Strong vs. Weak

The explanation of strong vs. weak learner is a bit confusing. Unfortunately, I am not the right person to explain it better. —Preceding unsigned comment added by 193.171.142.61 (talk) 08:42, 7 December 2010 (UTC)
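
For what it's worth, "weak" has a precise meaning here: a learner whose accuracy is only required to be slightly better than random guessing, while a "strong" learner is one that can be made arbitrarily accurate. A minimal sketch of the distinction, assuming scikit-learn (the dataset is illustrative):

from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# An illustrative problem that a single stump cannot fit well.
X, y = make_classification(n_samples=2000, n_features=10, n_informative=10,
                           n_redundant=0, random_state=1)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=1)

# Weak learner: a depth-1 tree, only expected to beat 50% (random guessing).
stump = DecisionTreeClassifier(max_depth=1).fit(X_tr, y_tr)
print("single stump:    ", stump.score(X_te, y_te))

# Boosting combines many such weak learners into a strong one.
strong = AdaBoostClassifier(n_estimators=200, random_state=1).fit(X_tr, y_tr)
print("boosted ensemble:", strong.score(X_te, y_te))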

Boosting

Boosting is also a method for increasing the yield of a fission bomb (Boosted fission weapon). Is that something that should be linked from this article? Or maybe put on the disambig. page for boost? --81.233.75.23 12:53, 1 June 2006 (UTC)

It should be on the disambig. page. Grokmenow 16:27, 10 July 2007 (UTC)

Oh, didn't see the date. Sorry about that. Grokmenow 16:27, 10 July 2007 (UTC)

Computer vision category

I removed this article from the computer vision category. Boosting is probably used by some people to solve CV problems but

  1. It is not a methodology developed within CV or specific to CV.
  2. Boosting is already listed under the ensemble learning category, which is linked to the CV category via machine learning.

--KYN 22:36, 27 July 2007 (UTC)

Recent Articles

I removed two articles from the references section. Perhaps another references section should be started to include some of the additional research on boosting.

-- AaronArvey —Preceding unsigned comment added by AaronArvey (talk • contribs) 01:15, 3 September 2007 (UTC)

"branching program based boosters"

The paper cited in reference to "convex potential boosters [not being able to] withstand random classification noise" states that "branching program based boosters" can withstand noise.

It would be really swell if someone knowledgeable could explain what "branching program based boosters" are. (Sorry that I can't) —Preceding unsigned comment added by 194.103.189.41 (talk) 14:14, 23 March 2011 (UTC)

Agreed! --149.148.237.120 (talk) 09:30, 27 August 2014 (UTC)

Merging article

I think that this article: http://en.wikipedia.org/wiki/Boosting_methods_for_object_categorization

should be merged with this one. Does anyone agree? — Preceding unsigned comment added by 207.139.190.179 (talk) 20:21, 4 December 2012 (UTC)

Yes (even if rather belatedly). Klbrain (talk) 14:23, 26 July 2016 (UTC)

Boosting for multi-class categorization

Boosting for multi-class categorization states in its second paragraph that "The main flow of the algorithm is similar to the binary case." Perhaps the author intended the word flaw? In any case, there is no mention in the binary case of a main flow or flaw, so this needs to be clarified and possibly rewritten.--Gciriani (talk) 01:57, 3 June 2017 (UTC)


I think that by flow he meant algorithm, because there is no mention of a flaw. — Preceding unsigned comment added by 2402:4000:2080:1FDC:15CB:7F3F:641:48B5 (talk) 05:28, 23 July 2020 (UTC)
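
The "flow" reading is also consistent with how multi-class boosting behaves in practice: the procedure is invoked exactly as in the binary case. A minimal sketch, assuming scikit-learn, whose AdaBoostClassifier handles multiple classes via SAMME, a multi-class generalization of AdaBoost (the iris dataset is just a convenient three-class example):

from sklearn.datasets import load_iris
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import cross_val_score

X, y = load_iris(return_X_y=True)   # three classes, not two
clf = AdaBoostClassifier(n_estimators=100, random_state=0)  # same call as binary
print(cross_val_score(clf, X, y).mean())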

first sentence is contradicted by its own citation

The first sentence is:

"Boosting is a machine learning ensemble meta-algorithm for primarily reducing bias, and also variance[1]"

and that citation leads to:

https://web.archive.org/web/20150119081741/http://oz.berkeley.edu/~breiman/arcall96.pdf

I can't find anything in that paper that suggests boosting is "primarily for reducing bias". In fact it seems to be the opposite:

"Although both bagging and arcing[=boosting] reduce bias a bit, their major contribution to accuracy is in the large reduction of variance. Arcing does better than bagging because it does better at variance reduction."

31.220.221.120 (talk) 14:04, 14 December 2017 (UTC)
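
One way to probe which effect dominates for a given setup is a rough, Breiman-style resampling experiment. The sketch below is not Breiman's exact protocol: it uses a common 0-1-loss simplification in which the majority-vote prediction over many resampled training sets plays the role of the "central" model, its test error is read as the bias term, and the average disagreement with it as the variance term. It assumes scikit-learn and NumPy; all settings are illustrative:

import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=3000, random_state=0)
X_tr, y_tr, X_te, y_te = X[:2000], y[:2000], X[2000:], y[2000:]
rng = np.random.default_rng(0)

def bias_variance(make_model, n_runs=20):
    preds = []
    for _ in range(n_runs):
        idx = rng.integers(0, len(X_tr), len(X_tr))  # bootstrap resample
        preds.append(make_model().fit(X_tr[idx], y_tr[idx]).predict(X_te))
    preds = np.array(preds)
    main = (preds.mean(axis=0) > 0.5).astype(int)    # majority-vote prediction
    bias = (main != y_te).mean()                     # error of the central model
    var = (preds != main).mean()                     # spread around it
    return bias, var

print("stump:  ", bias_variance(lambda: DecisionTreeClassifier(max_depth=1)))
print("boosted:", bias_variance(lambda: AdaBoostClassifier(n_estimators=100)))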