@datamlistic

If you want to learn more about the bias-variance trade-off (an important aspect of both bagging and boosting methods), make sure to check out this video: https://youtu.be/5mbX6ITznHk
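To make that trade-off concrete, here is a minimal sketch, assuming scikit-learn (the dataset and hyperparameters are illustrative, not from the linked video), showing the variance-reduction side of bagging:

```python
# Rough sketch: bagging tames the variance of a deep decision tree.
from sklearn.datasets import make_regression
from sklearn.ensemble import BaggingRegressor
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeRegressor

# Noisy synthetic regression task (illustrative stand-in for real data).
X, y = make_regression(n_samples=500, n_features=10, noise=20.0, random_state=0)

# A single unpruned tree: low bias, high variance.
tree = DecisionTreeRegressor(random_state=0)

# Bagging averages many trees fit on bootstrap resamples, which reduces
# variance while leaving bias roughly unchanged.
bagged = BaggingRegressor(DecisionTreeRegressor(random_state=0),
                          n_estimators=100, random_state=0)

for name, model in [("single tree", tree), ("bagged trees", bagged)]:
    scores = cross_val_score(model, X, y, cv=5, scoring="r2")
    print(f"{name}: mean R^2 = {scores.mean():.3f} (std {scores.std():.3f})")
```

The bagged ensemble should typically score higher and more consistently across folds than the single tree, which is the variance-reduction effect the trade-off is about.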

@narasimhann2814

Neat explanation in 4 mins. Keep making small and informative videos like these :)

@neithmccorkle

Concise explanation, especially helpful in visualizing the differences! Thanks for breaking down these concepts so clearly.

@ansariamaan620

Watching this one day before exams! Neatly explained.

@prathmeshlonkar2387

Crisp visualization & explanation. Loved it!

@coolmanabc1231

Just stumbled across your video. I am looking forward to watching all your videos šŸ˜„

@kimsungho4114

Your video was very helpful for my Datamining class. Thanks!

@zbady4595

I’m learning ML and these videos are fantastic

@mateusfigueiredo9961

Thank you for the clear explanation!

@LauraArdon-v2b

It was very instructive

@sabrinesabrina7255

perfect explanation

@houstonfirefox

Great video. My understanding is that you would almost always use bagging, evaluate the results, and, if they're good enough, stop there. However, you COULD go on to try various boosting methods to see if the model improves even more, but at what cost? If the best boosted model (AdaBoost, XGBoost, etc.) performed 1% better but took 3x longer to compute, then boosting the already-bagged models might not be worth it, right? Still trying to cement in my mind the process flow from a developer standpoint šŸ˜‰
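One way to cement that cost/benefit step is to time both options side by side. A minimal sketch, assuming scikit-learn and a synthetic dataset (models, sizes, and hyperparameters below are illustrative only, not from the video); note it compares bagging and boosting as alternative ensembles rather than boosting a bagged model:

```python
# Hypothetical workflow sketch: fit a bagged model first, then check whether
# a boosted model's accuracy gain justifies its extra training time.
import time

from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a real dataset.
X, y = make_classification(n_samples=5000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

candidates = {
    "bagging (random forest)": RandomForestClassifier(n_estimators=200, random_state=0),
    "boosting (gradient boosting)": GradientBoostingClassifier(n_estimators=200, random_state=0),
}

for name, model in candidates.items():
    start = time.perf_counter()
    model.fit(X_train, y_train)  # training cost is what we want to measure
    elapsed = time.perf_counter() - start
    acc = accuracy_score(y_test, model.predict(X_test))
    print(f"{name}: accuracy={acc:.3f}, fit time={elapsed:.2f}s")
```

If the boosted model's accuracy edge is small relative to the extra fit time, stopping at the bagged model is the pragmatic call, which is exactly the trade-off described above.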

@jasonwang-wg8wu

this was dope, the color-coding kinda threw me off, but the overall explanation was nice and concise.

N/A

Let me "boost" this video by making a comment

@PythonArms

great video, loved the pictures. a caveman like myself loves the pictures. thank you lol

@ritshpatidar

It was clear. Thanks

@kimchijam

simple and easy to understand, nice

@RumayzaNorova

Very good explanation :)

@iacobsorina6924

Awesome video! Keep going!

@AnnaRushi

thanks for the easy explanation, brother