On the step size selection in variance-reduced algorithm for nonconvex optimization

Authors:

Highlights:

• We obtain an online step size for nonconvex variance-reduced algorithms.

• We propose a novel SARAH-type method and a novel SVRG-type method (a background sketch of the standard SARAH estimator follows this list).

• We theoretically analyze the convergence of the proposed methods.

• The proposed methods are robust to the choice of parameters.
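As background for the second highlight, the following is a minimal sketch of the standard SARAH recursive gradient estimator, v_t = ∇f_i(w_t) − ∇f_i(w_{t−1}) + v_{t−1}, applied to a toy least-squares problem with a fixed step size. It is not the paper's proposed method or its online step-size rule; the problem data and helper names (grad_i, full_grad, sarah_inner_loop) are illustrative assumptions.

import numpy as np

# Toy finite-sum least-squares problem: f(w) = (1/n) * sum_i 0.5 * (a_i^T w - b_i)^2
rng = np.random.default_rng(0)
n, d = 100, 5
A, b = rng.normal(size=(n, d)), rng.normal(size=n)

def grad_i(w, i):
    """Gradient of the i-th component function f_i."""
    return (A[i] @ w - b[i]) * A[i]

def full_grad(w):
    """Full (batch) gradient of f."""
    return A.T @ (A @ w - b) / n

def sarah_inner_loop(w0, step, m):
    """One SARAH outer iteration: start from the full gradient, then update
    the estimator recursively: v_t = grad_i(w_t) - grad_i(w_{t-1}) + v_{t-1}."""
    w_prev, v = w0.copy(), full_grad(w0)
    w = w_prev - step * v
    for _ in range(m):
        i = rng.integers(n)
        v = grad_i(w, i) - grad_i(w_prev, i) + v   # SARAH recursion
        w_prev, w = w, w - step * v                # fixed step; the paper instead derives an online step size
    return w

w = np.zeros(d)
for _ in range(20):
    w = sarah_inner_loop(w, step=0.05, m=n)
print("final objective:", 0.5 * np.mean((A @ w - b) ** 2))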


Keywords: Nonconvex optimization, Variance reduction, Online step size, Stochastic optimization

Article history: Received 4 January 2020, Revised 1 November 2020, Accepted 16 November 2020, Available online 18 November 2020, Version of Record 10 February 2021.

DOI: https://doi.org/10.1016/j.eswa.2020.114336