A penalized likelihood based pattern classification algorithm
Abstract
Penalized likelihood is a general approach whereby an objective function is defined, consisting of the log likelihood of the data minus a term that penalizes non-smooth solutions. This objective function is then maximized, yielding a solution that achieves a trade-off between the faithfulness and the smoothness of the fit. Most work on this topic has focused on the regression problem, and there has been little work on the classification problem. In this paper we propose a new classification method, for the two-class case, based on the concept of penalized likelihood. By introducing a novel penalty term based on the K-nearest neighbors, simple analytical derivations lead to an algorithm that provably converges to the global optimum. Moreover, the algorithm is very simple to implement and typically converges in two or three iterations. We also introduce two variants of the method: one that distance-weights the K-nearest-neighbor contributions, and one that handles the unbalanced class patterns situation. We perform extensive experiments comparing the proposed method to several well-known classification methods. These simulations reveal that the proposed method achieves one of the top ranks in classification performance, with fairly small computation time.
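As a rough illustration of the kind of objective the abstract describes (not the paper's exact formulation or algorithm), the Python sketch below writes a penalized log-likelihood for two-class posterior estimates: a Bernoulli log likelihood of the labels minus a K-nearest-neighbor smoothness penalty that discourages neighboring patterns from receiving very different posteriors. The function name, the penalty weight `lam`, the squared-difference form of the penalty, and the use of scikit-learn's NearestNeighbors are all assumptions made for illustration.

```python
# A minimal sketch (assumed formulation, not the paper's exact penalty or updates):
# p[i] is an estimated posterior P(class 1 | x_i), y[i] in {0, 1}, X holds the
# training patterns, and the penalty couples each pattern to its K nearest neighbors.
import numpy as np
from sklearn.neighbors import NearestNeighbors

def penalized_log_likelihood(p, y, X, K=5, lam=1.0):
    """Log likelihood of the labels minus a KNN-based smoothness penalty (illustrative)."""
    eps = 1e-12
    p = np.clip(p, eps, 1 - eps)
    # Bernoulli log likelihood of the observed labels under the posteriors p.
    log_lik = np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))
    # K nearest neighbors of each training pattern (column 0 is the pattern itself).
    _, idx = NearestNeighbors(n_neighbors=K + 1).fit(X).kneighbors(X)
    neighbor_idx = idx[:, 1:]
    # Smoothness penalty: squared differences between each posterior and its neighbors'.
    penalty = np.sum((p[:, None] - p[neighbor_idx]) ** 2)
    return log_lik - lam * penalty
```

Maximizing such an objective over the posteriors p trades off fidelity to the observed labels against smoothness across neighboring patterns; the paper's actual penalty term, update equations, distance-weighted and class-balanced variants, and convergence proof are given in the full text.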
Keywords: K-nearest neighbor, Penalized likelihood, Pattern classification, Posterior probability, Class balancing, Weighted KNN
Article history: Received 23 June 2008, Revised 11 February 2009, Accepted 22 April 2009, Available online 4 May 2009.
DOI: https://doi.org/10.1016/j.patcog.2009.04.016