Efficient Minimisation of the KL Distance for the Approximation of Posterior Conditional Probabilities
Authors: M. Battisti, P. Burrascano, D. Pirollo
Abstract
The minimisation of a least mean squares cost function produces poor results in the ranges of the input variable where the quantity to be approximated takes on relatively low values. This can be a problem if an accurate approximation is required in a wide dynamic range. The present paper approaches this problem in the case of multilayer perceptrons trained to approximate the posterior conditional probabilities in a multicategory classification problem. The use of a cost function derived from the Kullback–Leibler information distance measure is proposed and a computationally light algorithm is derived for its minimisation. The effectiveness of the procedure is experimentally verified.
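The contrast described above can be illustrated numerically. The sketch below (an illustration of the general idea, not the paper's training algorithm) compares a least-mean-squares cost with a Kullback–Leibler distance on probability vectors: an absolute error on a small posterior probability costs the same as one on a large probability under MSE, but is penalised far more heavily by the KL distance, which measures relative rather than absolute discrepancy. All array values here are hypothetical examples.

```python
import numpy as np

def mse_loss(p, q):
    # Least-mean-squares cost between target probabilities p and outputs q.
    return np.mean((p - q) ** 2)

def kl_loss(p, q, eps=1e-12):
    # Kullback–Leibler distance D(p || q) = sum_i p_i * log(p_i / q_i).
    # Clipping avoids log(0) for numerical safety.
    p = np.clip(p, eps, 1.0)
    q = np.clip(q, eps, 1.0)
    return np.sum(p * np.log(p / q))

# A target posterior with small-probability components, and two
# approximations that each make the same-size absolute error (0.005),
# once on the dominant component and once on a small one.
p = np.array([0.98, 0.01, 0.01])
q_big = np.array([0.975, 0.015, 0.01])    # error on the large component
q_small = np.array([0.98, 0.015, 0.005])  # same-size error on a small one

# MSE cannot tell the two cases apart...
print(mse_loss(p, q_big), mse_loss(p, q_small))
# ...while the KL distance penalises the error on the small
# probability considerably more.
print(kl_loss(p, q_big), kl_loss(p, q_small))
```

This is the property the abstract points to: under an MSE cost, the network has little incentive to fit the low-probability regions accurately, whereas a KL-derived cost keeps the relative accuracy high across the whole dynamic range.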
Keywords: conditional probabilities estimate, Kullback–Leibler distance, MLP classifier training
Paper URL: https://doi.org/10.1023/A:1009605310499