Bagging


Bagging (short for bootstrap aggregating) is an ensemble meta-algorithm. Basically, we take some number of estimators (usually dozens-ish) and train each one on a random sample of the training data, typically a bootstrap sample drawn with replacement. Then we average the predictions of the individual estimators to produce the final prediction. While this reduces the variance of your predictions (indeed, that is the core purpose of bagging), it may come at the cost of a little extra bias.

For a more academic basis, see slide #13 of this lecture by Joëlle Pineau at McGill University.
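To make the bootstrap-and-average step concrete, here is a minimal sketch in Python. It assumes scikit-learn decision trees as the base estimators and a toy regression problem; neither the library choice nor the data comes from the episode.

```python
# A minimal bagging sketch: train several estimators on bootstrap samples
# and average their predictions. Assumes scikit-learn decision trees as
# the base estimators; any regressor with fit/predict would do.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)

# Toy regression data: a noisy sine wave (purely illustrative).
X = np.sort(rng.uniform(0, 6, size=(200, 1)), axis=0)
y = np.sin(X).ravel() + rng.normal(scale=0.3, size=200)

n_estimators = 25  # "usually dozens-ish"
estimators = []
for _ in range(n_estimators):
    # Bootstrap sample: draw n rows with replacement from the training set.
    idx = rng.integers(0, len(X), size=len(X))
    tree = DecisionTreeRegressor()
    tree.fit(X[idx], y[idx])
    estimators.append(tree)

# The bagged prediction is the average of the individual predictions.
X_test = np.linspace(0, 6, 50).reshape(-1, 1)
bagged_pred = np.mean([t.predict(X_test) for t in estimators], axis=0)
```

In practice you would likely reach for a ready-made wrapper such as scikit-learn's BaggingRegressor or BaggingClassifier, which implement the same pattern.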
