Aryan Mokhtari, Mert Gürbüzbalaban, Alejandro Ribeiro
A. Mokhtari, M. Gürbüzbalaban and A. Ribeiro, "A double incremental aggregated gradient method with linear convergence rate for large-scale optimization," 2017 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), New Orleans, LA, 2017, pp. 4696-4700.
Publication year: 2017

This paper considers the problem of minimizing the average of a finite set of strongly convex
functions. We introduce a double incremental aggregated gradient method (DIAG) that
computes the gradient of only one function at each iteration, which is chosen based on a
cyclic scheme, and uses the aggregated average gradient of all the functions to approximate
the full gradient. We prove that not only the proposed DIAG method converges linearly to the
optimal solution, but also its linear convergence factor justifies the advantage of incremental
methods on full batch gradient descent. In particular, we show theoretically and empirically
that one pass of DIAG is more efficient than one iteration of gradient descent.
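
To make the description above concrete, the following is a minimal sketch of an incremental aggregated gradient update with a cyclic index choice, based only on the abstract's description. The quadratic component functions, step size, problem sizes, and the name diag_sketch are illustrative assumptions, not the paper's exact algorithm or notation; consistent with the "double" aggregation the method's name suggests, the sketch averages both the stored iterates and the stored gradients while evaluating only one new gradient per iteration.

import numpy as np

def diag_sketch(grads, x0, alpha, num_passes):
    """Hedged sketch of a double incremental aggregated gradient iteration.

    grads: list of callables; grads[i](x) returns the gradient of f_i at x.
    One gradient is evaluated per iteration; indices cycle through 0..n-1.
    """
    n = len(grads)
    y = [x0.copy() for _ in range(n)]        # stored iterate at which each f_i was last evaluated
    g = [grads[i](x0) for i in range(n)]     # stored gradients (one-time full pass to initialize)
    x = x0.copy()
    for k in range(num_passes * n):
        i = k % n                            # cyclic choice of the function to refresh
        y[i] = x.copy()
        g[i] = grads[i](x)                   # the only new gradient computed this iteration
        # aggregated averages of iterates and gradients approximate a full gradient step
        x = np.mean(y, axis=0) - alpha * np.mean(g, axis=0)
    return x

if __name__ == "__main__":
    # Illustrative strongly convex quadratics f_i(x) = 0.5 x^T A_i x + b_i^T x.
    rng = np.random.default_rng(0)
    n, d = 20, 5
    A = [np.eye(d) + 0.1 * rng.standard_normal((d, d)) for _ in range(n)]
    A = [0.5 * (M + M.T) + d * np.eye(d) for M in A]   # symmetrize and shift to ensure strong convexity
    b = [rng.standard_normal(d) for _ in range(n)]
    grads = [lambda x, A=A[i], b=b[i]: A @ x + b for i in range(n)]
    x_hat = diag_sketch(grads, np.zeros(d), alpha=0.05, num_passes=200)
    # Norm of the average gradient at the returned point should be near zero.
    print(np.linalg.norm(np.mean([grad(x_hat) for grad in grads], axis=0)))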