One-Stage Tree: end-to-end tree builder and pruner

Authors: Zhuoer Xu, Guanghui Zhu, Chunfeng Yuan, Yihua Huang

Abstract

Decision trees have favorable properties, including interpretability, high computational efficiency, and the ability to learn from little training data. Learning an optimal decision tree is known to be NP-complete, so researchers have proposed many greedy algorithms, such as CART, to learn approximate solutions. Inspired by the popularity of neural networks, soft trees that support end-to-end training with back-propagation have attracted increasing attention. However, existing soft trees either lose interpretability due to continuous relaxation or rely on a two-stage procedure of end-to-end building followed by pruning. In this paper, we propose One-Stage Tree, which builds and prunes the decision tree jointly through a bilevel optimization problem. Moreover, we leverage the reparameterization trick and proximal iterations to keep the tree discrete during end-to-end training. As a result, One-Stage Tree reduces the performance gap between training and testing while retaining the advantage of interpretability. Extensive experiments demonstrate that the proposed One-Stage Tree outperforms CART and existing soft trees on classification and regression tasks.
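The abstract names two ingredients for keeping the tree discrete during end-to-end training: a reparameterization trick for the routing decisions and proximal iterations that fold pruning into training. The following is a minimal illustrative sketch, not the authors' implementation: it assumes PyTorch, the names `SoftNode`, `keep_logit`, and `proximal_step` are hypothetical, straight-through Gumbel-softmax stands in for the reparameterization trick, and soft-thresholding of a per-node "keep" gate stands in for the proximal step.

```python
# Hedged sketch of discrete routing + in-training pruning (not the paper's code).
import torch
import torch.nn as nn
import torch.nn.functional as F


class SoftNode(nn.Module):
    """One internal node: a linear split plus a gate deciding whether to keep it."""

    def __init__(self, in_dim: int):
        super().__init__()
        self.split = nn.Linear(in_dim, 2)               # logits for routing left / right
        self.keep_logit = nn.Parameter(torch.zeros(1))  # pruning gate parameter

    def route(self, x: torch.Tensor, tau: float = 1.0) -> torch.Tensor:
        # Straight-through Gumbel-softmax: the forward pass emits a hard one-hot
        # routing decision, while gradients flow through the soft relaxation.
        return F.gumbel_softmax(self.split(x), tau=tau, hard=True)


def proximal_step(node: SoftNode, lam: float, lr: float) -> None:
    """Soft-threshold the keep gate so weak nodes are driven to exactly 0
    (i.e., pruned) during training rather than in a separate post-hoc stage."""
    with torch.no_grad():
        g = torch.sigmoid(node.keep_logit)
        g = torch.clamp(g - lr * lam, min=0.0)  # shrink toward zero
        # Map back to logit space, guarding against log(0).
        node.keep_logit.copy_(torch.logit(g.clamp(1e-6, 1 - 1e-6)))


if __name__ == "__main__":
    node = SoftNode(in_dim=4)
    x = torch.randn(8, 4)
    probs = node.route(x)        # hard 0/1 routing, differentiable via the ST estimator
    print(probs.sum(dim=1))      # each row sums to 1
    proximal_step(node, lam=0.1, lr=0.01)
```

The joint building-and-pruning itself is formulated in the paper as a bilevel optimization problem; the sketch above only illustrates how discrete decisions can remain trainable with back-propagation.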

Keywords: Decision tree, Soft tree, End-to-end learning, Explainable AI


Paper URL: https://doi.org/10.1007/s10994-021-06094-4