This paper considers the problem of minimizing the average of a finite set of strongly convex
functions. We introduce a double incremental aggregated gradient method (DIAG) that
computes the gradient of only one function at each iteration, chosen according to a
cyclic scheme, and uses the aggregated average gradient of all the functions to approximate
the full gradient. We prove that not only the proposed DIAG method converges linearly to the
optimal solution, but also its linear convergence factor justifies the advantage of incremental
methods on full batch gradient descent. In particular, we show theoretically and empirically
that one pass of DIAG is more efficient than one iteration of gradient descent.
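To make the cyclic-aggregation idea concrete, the following is a minimal sketch of an incremental aggregated gradient loop in the spirit of the description above: one component gradient is refreshed per iteration in cyclic order, and the running average of all stored gradients serves as a proxy for the full gradient. The quadratic objective, step size, iteration budget, and all identifiers (`grad_fi`, `table`, `avg`) are illustrative assumptions, not the paper's exact specification of DIAG.

```python
import numpy as np

# Sketch: minimize (1/n) * sum_i f_i(x) with strongly convex quadratics
# f_i(x) = 0.5*(a_i^T x - b_i)^2 + (mu/2)*||x||^2  (assumed for illustration).
np.random.seed(0)
n, d = 10, 5
A = np.random.randn(n, d)
b = np.random.randn(n)
mu = 0.1  # strong-convexity parameter (assumed)

def grad_fi(x, i):
    """Gradient of the i-th component function."""
    return (A[i] @ x - b[i]) * A[i] + mu * x

x = np.zeros(d)
table = np.stack([grad_fi(x, i) for i in range(n)])  # one stored gradient per f_i
avg = table.mean(axis=0)                             # aggregated average gradient
alpha = 0.05                                         # step size (assumed)

for k in range(200 * n):
    i = k % n                      # cyclic choice of one component function
    g_new = grad_fi(x, i)          # the only gradient evaluated this iteration
    avg += (g_new - table[i]) / n  # refresh the average in O(d) time
    table[i] = g_new
    x = x - alpha * avg            # step along the approximate full gradient

print("final iterate:", x)
```

One pass over the data (n iterations) costs n component-gradient evaluations, the same as a single full-batch gradient step; this is the comparison behind the abstract's last claim.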