: the random forest method builds a forest of randomly different trees in parallel and combines their predictions
Gradient boosting
The idea of gradient boosting is to build a series of trees where each tree is trained to correct the mistakes of the previous tree in the series.
Gradient boosting ensembles use many weak learners, built in a non-random way, to create a model that makes fewer and fewer mistakes as more trees are added.
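As a minimal sketch of the idea above, here is gradient boosting with scikit-learn's `GradientBoostingClassifier`; the dataset, split, and parameter values are illustrative choices, not part of the original text. Each of the `n_estimators` shallow trees is fitted to correct the errors of the ensemble built so far, and `learning_rate` scales how much each new tree contributes.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

# A synthetic binary classification dataset for illustration
X, y = make_classification(n_samples=500, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Many shallow trees (weak learners) built sequentially, each one
# trained on the residual errors of the trees before it
clf = GradientBoostingClassifier(
    n_estimators=100,   # number of trees in the series
    learning_rate=0.1,  # contribution of each tree
    max_depth=3,        # shallow trees = weak learners
    random_state=0,
)
clf.fit(X_train, y_train)
accuracy = clf.score(X_test, y_test)
```

Lowering `learning_rate` typically requires more trees to reach the same accuracy, which is the usual trade-off when tuning these two parameters together.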