Gradient Boosted Trees: Two Types of Numbers, Two Types of Failure
Quantising a GBT model is not the same as quantising a neural network. Rounding a neural-network weight slightly perturbs a continuous computation, so the output error stays small. Rounding a GBT threshold slightly can flip a comparison and send an input down the wrong branch, so the output jumps to a different leaf value entirely; on real benchmark data this discontinuous error can exceed 100%. The correct approach is therefore to quantise the input data before training, so the learned thresholds land exactly on representable values, and to compute leaf bit-widths analytically from the trained ensemble.
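A minimal sketch of both ideas, with made-up numbers (the threshold, leaf values, and per-tree maxima below are hypothetical, not from any benchmark in the article): rounding a single split threshold to 8-bit fixed point flips the branch for an input between the old and new thresholds, and the leaf bit-width can be bounded from the worst-case sum of each tree's largest-magnitude leaf.

```python
import math

def tree_predict(x, threshold, left_leaf=1.0, right_leaf=100.0):
    # One decision-tree split: a hard comparison, not a continuous function.
    return left_leaf if x < threshold else right_leaf

# --- Failure mode: quantising the threshold after training ---
threshold = 0.501
scale = 2 ** 8                                   # 8-bit fixed point
q_threshold = round(threshold * scale) / scale   # rounds 0.501 -> 0.5

x = 0.5005          # lies between q_threshold (0.5) and threshold (0.501)
exact = tree_predict(x, threshold)      # left branch  -> 1.0
quant = tree_predict(x, q_threshold)    # right branch -> 100.0
rel_error = abs(quant - exact) / abs(exact)   # 99x, not a small rounding error

# --- Analytic leaf bit-width (assumed worst-case bound, for illustration) ---
# A GBT output is a sum of one leaf per tree, so the integer part never
# exceeds the sum of each tree's largest-magnitude leaf.
max_abs_leaf_per_tree = [3.2, 1.7, 0.9]   # hypothetical trained ensemble
worst_case = sum(max_abs_leaf_per_tree)
int_bits = math.ceil(math.log2(worst_case)) + 1   # +1 sign bit
```

The branch flip is the key point: the quantised model does not return a slightly wrong value, it returns the other leaf. The bit-width bound is computed once from the trained ensemble, with no calibration data needed.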