TY - JOUR
T1 - End-to-End Learning of Decision Trees and Forests
AU - Hehn, Thomas M.
AU - Kooij, Julian F.P.
AU - Hamprecht, Fred A.
PY - 2019
N2 - Conventional decision trees have a number of favorable properties, including a small computational footprint, interpretability, and the ability to learn from little training data. However, they lack a key quality that has helped fuel the deep learning revolution: that of being end-to-end trainable. Kontschieder et al. (ICCV, 2015) have addressed this deficit, but at the cost of losing a main attractive trait of decision trees: the fact that each sample is routed along a small subset of tree nodes only. We here present an end-to-end learning scheme for deterministic decision trees and decision forests. Thanks to a new model and expectation–maximization training scheme, the trees are fully probabilistic at train time, but after an annealing process become deterministic at test time. In experiments we explore the effect of annealing visually and quantitatively, and find that our method performs on par with, or superior to, standard learning algorithms for oblique decision trees and forests. We further demonstrate on image datasets that our approach can learn more complex split functions than common oblique ones, and facilitates interpretability through spatial regularization.
KW - Decision forests
KW - Efficient inference
KW - End-to-end learning
KW - Interpretability
UR - http://www.scopus.com/inward/record.url?scp=85074457689&partnerID=8YFLogxK
DO - 10.1007/s11263-019-01237-6
M3 - Article
SN - 0920-5691
VL - 128
SP - 997
EP - 1011
JO - International Journal of Computer Vision
JF - International Journal of Computer Vision
ER -