Behaviour in 0 of the Neural Networks Training Cost

Author: Cyril Goutte

Abstract

We study the behaviour at zero of the derivatives of the cost function used when training non-linear neural networks. We show that a fair number of first-, second- and higher-order derivatives vanish at zero, supporting the belief that 0 is a peculiar and potentially harmful location in weight space. These calculations are related to practical and theoretical aspects of neural network training.
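The vanishing of derivatives at the origin can be illustrated numerically. The sketch below (an assumption for illustration, not the exact setting of the paper) uses a one-hidden-layer tanh network with a squared-error cost and checks, by central finite differences, that the gradient components with respect to all weights and hidden biases vanish when every parameter is zero, while the output bias keeps a non-zero gradient.

```python
import numpy as np

# Hypothetical setup: 20 samples, 3 inputs, 4 tanh hidden units, 1 linear output.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))   # inputs
y = rng.normal(size=(20, 1))   # targets

def cost(p):
    """Squared-error cost of a 3-4-1 tanh network, parameters flattened in p."""
    W1 = p[:12].reshape(3, 4)   # input-to-hidden weights
    b1 = p[12:16]               # hidden biases
    W2 = p[16:20].reshape(4, 1) # hidden-to-output weights
    b2 = p[20:]                 # output bias
    h = np.tanh(X @ W1 + b1)
    return 0.5 * np.mean((h @ W2 + b2 - y) ** 2)

def num_grad(p, eps=1e-6):
    """Central finite-difference gradient of the cost at p."""
    g = np.zeros_like(p)
    for i in range(p.size):
        e = np.zeros_like(p)
        e[i] = eps
        g[i] = (cost(p + e) - cost(p - e)) / (2 * eps)
    return g

g0 = num_grad(np.zeros(21))    # gradient at the origin of weight space
# Components for W1, b1 and W2 vanish at zero: the hidden activations are
# tanh(0) = 0 and the output weights are 0, killing both backprop paths.
print(np.abs(g0[:20]).max())   # ~0
# Only the output bias sees a non-zero gradient, equal to -mean(y) here.
print(g0[20])
```

This matches the abstract's claim at first order: at 0 the gradient is degenerate in most directions, so gradient-based optimisation started exactly at the origin makes little progress except through the output bias.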

Keywords: training cost derivatives, neural networks training, numerical optimisation, regularisation


Paper URL: https://doi.org/10.1023/A:1009684310458