Fast greedy \(\mathcal {C}\)-bound minimization with guarantees
Authors: Baptiste Bauvin, Cécile Capponi, Jean-Francis Roy, François Laviolette
Abstract
The \(\mathcal {C}\)-bound is a tight bound on the true risk of a majority vote classifier that relies on the individual quality and pairwise disagreement of the voters and provides PAC-Bayesian generalization guarantees. Based on this bound, MinCq is a classification algorithm that returns a dense distribution on a finite set of voters by minimizing it. Introduced later and inspired by boosting, CqBoost uses a column generation approach to build a sparse \(\mathcal {C}\)-bound optimal distribution on a possibly infinite set of voters. However, both approaches have a high computational learning time because they minimize the \(\mathcal {C}\)-bound by solving a quadratic program. One advantage of CqBoost, however, is that it empirically yields sparse solutions. In this work, we address the problem of accelerating the \(\mathcal {C}\)-bound minimization process while keeping the sparsity of the solution and without losing accuracy. We present CB-Boost, a computationally efficient classification algorithm relying on a greedy, boosting-based \(\mathcal {C}\)-bound optimization. An in-depth analysis proves the optimality of the greedy minimization process and quantifies the decrease of the \(\mathcal {C}\)-bound achieved by the algorithm. Generalization guarantees are then drawn from existing PAC-Bayesian theorems. In addition, we experimentally evaluate the relevance of CB-Boost in terms of the three main properties we expect of it: accuracy, sparsity, and computational efficiency, compared to MinCq, CqBoost, Adaboost, and other ensemble methods. As observed in these experiments, CB-Boost not only achieves results comparable to the state of the art, but also provides \(\mathcal {C}\)-bound sub-optimal weights with very little computational demand while keeping the sparsity property of CqBoost.
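To make the quantity being minimized concrete: in the PAC-Bayesian literature, the \(\mathcal {C}\)-bound of a \(Q\)-weighted majority vote is \(1 - \mu_1^2 / \mu_2\), where \(\mu_1\) and \(\mu_2\) are the first and second moments of the margin \(y \sum_j Q_j h_j(x)\) (the bound is meaningful when \(\mu_1 > 0\)). The function below is an illustrative sketch of its empirical version, not code from the paper; the names `votes`, `weights`, and `labels` are assumptions for this example.

```python
import numpy as np

def empirical_c_bound(votes, weights, labels):
    """Empirical C-bound of a weighted majority vote (illustrative sketch).

    votes:   (n_samples, n_voters) array of voter outputs in {-1, +1}
    weights: distribution Q over the voters (non-negative, sums to 1)
    labels:  true labels in {-1, +1}

    Returns 1 - mu1**2 / mu2, where mu1 and mu2 are the first and
    second empirical moments of the margin y * sum_j Q_j h_j(x).
    """
    margins = labels * (votes @ weights)   # per-example margin of the vote
    mu1 = margins.mean()                   # first moment
    mu2 = (margins ** 2).mean()            # second moment (always > 0 here)
    return 1.0 - mu1 ** 2 / mu2
```

When every voter is correct on every example, all margins equal 1, so \(\mu_1 = \mu_2 = 1\) and the bound is 0; disagreement among the voters inflates \(\mu_2\) relative to \(\mu_1^2\) and drives the bound toward 1, which is exactly the quantity MinCq, CqBoost, and CB-Boost aim to shrink.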
Keywords: PAC-Bayes, Boosting, Ensemble methods, \(\mathcal {C}\)-bound, Greedy optimization
Paper URL: https://doi.org/10.1007/s10994-020-05902-7