Averaged tree-augmented one-dependence estimators

Authors: He Kong, Xiaohu Shi, Limin Wang, Yang Liu, Musa Mammadov, Gaojie Wang

Abstract

Ever since naive Bayes (NB) demonstrated excellent classification performance at minimal computational cost, researchers have increasingly focused their attention on Bayesian network classifiers (BNCs). Among the numerous approaches to refining NB, averaged one-dependence estimators (AODE) achieves excellent classification performance, although the independence assumption made by each of its members rarely holds in practice. With ever-increasing data quantities, a robust AODE with high expressivity and low bias is urgently needed. In this paper, the log likelihood function \(LL({\mathscr{B}}|D)\) is introduced to measure the number of bits required by the network topology \({\mathscr{B}}\) to describe the training data D. An efficient heuristic search strategy is applied to maximize \(LL({\mathscr{B}}|D)\) and relax the independence assumption of AODE by exploring higher-order conditional dependencies between attributes. The proposed approach, averaged tree-augmented one-dependence estimators (ATODE), inherits the effectiveness of AODE and gains more flexibility for modelling higher-order dependencies. Extensive experimental comparisons on 36 datasets demonstrate that, compared to state-of-the-art learners including single-model BNCs (e.g., CFWNB and SKDB) and variants of AODE (e.g., TAODE), the proposed out-of-core learner achieves competitive or better classification performance.
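The AODE scheme that ATODE extends averages an ensemble of super-parent one-dependence estimators (SPODEs): the i-th SPODE conditions every attribute on both the class and attribute i. As a minimal illustrative sketch only (this is not the authors' implementation; the function names, smoothing choices, and toy data are assumptions introduced for this example), the following Python code averages Laplace-smoothed SPODE joint-probability estimates to classify a discrete instance:

```python
from collections import defaultdict
import math

def train_counts(data):
    """Collect the counts an AODE needs from discrete training data.
    data: list of (x, y) pairs, x a tuple of discrete attribute values."""
    n_attrs = len(data[0][0])
    cy = defaultdict(int)    # N(y)
    cxy = defaultdict(int)   # N(y, i, x_i)
    cxxy = defaultdict(int)  # N(y, i, x_i, j, x_j)
    for x, y in data:
        cy[y] += 1
        for i, xi in enumerate(x):
            cxy[(y, i, xi)] += 1
            for j, xj in enumerate(x):
                if i != j:
                    cxxy[(y, i, xi, j, xj)] += 1
    return cy, cxy, cxxy, n_attrs, len(data)

def aode_predict(x, model, classes, values, alpha=1.0):
    """Average over SPODEs: each attribute i in turn acts as super-parent,
    estimating P(y, x) as mean_i P(y, x_i) * prod_{j != i} P(x_j | y, x_i),
    with Laplace smoothing (alpha). Returns the class maximizing the average."""
    cy, cxy, cxxy, n_attrs, n = model
    best, best_score = None, -math.inf
    for y in classes:
        total = 0.0
        for i in range(n_attrs):
            # smoothed joint estimate P(y, x_i)
            p = (cxy[(y, i, x[i])] + alpha) / (
                n + alpha * len(classes) * len(values[i]))
            for j in range(n_attrs):
                if j != i:
                    # smoothed conditional estimate P(x_j | y, x_i)
                    p *= (cxxy[(y, i, x[i], j, x[j])] + alpha) / (
                        cxy[(y, i, x[i])] + alpha * len(values[j]))
            total += p
        if total > best_score:
            best, best_score = y, total
    return best

# Toy two-attribute example (invented data, for illustration only)
data = [((0, 0), 0), ((0, 0), 0), ((0, 1), 0),
        ((1, 1), 1), ((1, 1), 1), ((1, 0), 1)]
values = {0: [0, 1], 1: [0, 1]}  # domain of each attribute
model = train_counts(data)
print(aode_predict((0, 0), model, [0, 1], values))  # → 0
print(aode_predict((1, 1), model, [0, 1], values))  # → 1
```

Averaging over all SPODEs avoids model selection and reduces variance; ATODE's contribution, per the abstract, is to replace each SPODE's restrictive one-dependence structure with a tree-augmented topology chosen by heuristically maximizing \(LL({\mathscr{B}}|D)\).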

Keywords: Bayesian network classifier, Log likelihood, Averaged one-dependence estimators, Structure extension

Paper link: https://doi.org/10.1007/s10489-020-02064-w