Haijie Pan, Lirong Zheng*
CMES-Computer Modeling in Engineering & Sciences, Vol.131, No.1, pp. 493-512, 2022, DOI:10.32604/cmes.2022.019069
24 January 2022
Abstract Machine learning models trained with SGD converge slowly and train unstably because the gradient estimated from a randomly drawn sample has large variance. To this end, we propose a noise reduction method for the Stochastic Variance Reduced Gradient (SVRG) algorithm, called N-SVRG, which uses a small batch of samples instead of all samples to compute the average gradient, while updating that average gradient incrementally. In each round of iteration, a small batch of samples is randomly selected to compute the average gradient, while the average gradient is updated by rounding of the past model gradients during internal…
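To make the idea concrete, the following is a minimal NumPy sketch of the mini-batch variant of SVRG that the abstract describes: the anchor (average) gradient at each snapshot is estimated from a small random batch rather than the full data set. This is an illustrative simplification; the problem setup, function names, and hyperparameters are assumptions, and the paper's incremental update of the average gradient across rounds is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical synthetic least-squares problem:
# minimize (1/n) * sum_i (x_i @ w - y_i)^2
n, d = 200, 5
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true

def grad_i(w, i):
    """Gradient of the i-th sample's squared loss."""
    return 2.0 * (X[i] @ w - y[i]) * X[i]

def minibatch_svrg(w0, lr=0.005, epochs=50, inner=n, batch=32):
    """SVRG sketch where the anchor gradient mu is estimated from a
    small random batch (the core idea of N-SVRG) instead of a full
    pass over all n samples."""
    w = w0.copy()
    for _ in range(epochs):
        w_snap = w.copy()
        # Batch estimate of the average gradient at the snapshot.
        idx = rng.choice(n, size=batch, replace=False)
        mu = np.mean([grad_i(w_snap, i) for i in idx], axis=0)
        for _ in range(inner):
            i = rng.integers(n)
            # Variance-reduced stochastic gradient.
            g = grad_i(w, i) - grad_i(w_snap, i) + mu
            w -= lr * g
    return w

w_hat = minibatch_svrg(np.zeros(d))
print("final error:", np.linalg.norm(w_hat - w_true))
```

Because each inner step subtracts the snapshot gradient of the same sample before adding `mu`, the update has much lower variance than a plain SGD step, which is what permits the faster, more stable convergence the abstract claims.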