Rates of Convergence
The basic result we use is due to Perron and Ostrowski \cite{ostr}.
Theorem:
- If the iterative algorithm $x^{(k+1)} = \mathcal{A}(x^{(k)})$ converges to $x_\infty$,
- and $\mathcal{A}$ is differentiable at $x_\infty$,
- and $\kappa := \rho(\mathcal{D}\mathcal{A}(x_\infty)) < 1$, where $\rho(\cdot)$ denotes the spectral radius,

then the algorithm is linearly convergent with rate $\kappa$.
Proof:
Because $\mathcal{A}$ is differentiable at $x_\infty$ we have
$$x^{(k+1)} - x_\infty = \mathcal{A}(x^{(k)}) - \mathcal{A}(x_\infty) = \mathcal{D}\mathcal{A}(x_\infty)(x^{(k)} - x_\infty) + o(\|x^{(k)} - x_\infty\|).$$
Since $\rho(\mathcal{D}\mathcal{A}(x_\infty)) = \kappa < 1$, for any $\epsilon > 0$ with $\kappa + \epsilon < 1$ there is a norm in which $\|\mathcal{D}\mathcal{A}(x_\infty)\| \leq \kappa + \epsilon$, and thus $\|x^{(k+1)} - x_\infty\| \leq (\kappa + \epsilon)\|x^{(k)} - x_\infty\|$ for $k$ large enough, which is linear convergence with rate $\kappa$.
QED
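As a small numerical sketch of the theorem (the quadratic and the value $\rho = 0.9$ are chosen here for illustration, not taken from the text), consider block relaxation on $f(x, y) = x^2 - 2\rho xy + y^2$: one cycle minimizes over $x$ with $y$ fixed and then over $y$ with $x$ fixed, the derivative of the cycle map at the fixed point $(0, 0)$ has spectral radius $\rho^2$, and the observed error ratios should approach that predicted rate.

```python
# Minimal sketch (example chosen for illustration, not from the text):
# block relaxation on f(x, y) = x^2 - 2*rho*x*y + y^2.  One full cycle maps
# (x, y) -> (rho*y, rho^2*y); its derivative at the fixed point (0, 0) has
# spectral radius rho^2, which the theorem predicts as the linear rate.
import numpy as np

rho = 0.9                      # illustrative parameter, assumed here

def cycle(x, y):
    """One block-relaxation cycle: minimize over x with y fixed, then over y."""
    x = rho * y                # argmin_x f(x, y)
    y = rho * x                # argmin_y f(x, y)
    return x, y

# Jacobian of the cycle map at the fixed point (0, 0).
M = np.array([[0.0, rho],
              [0.0, rho ** 2]])
kappa = max(abs(np.linalg.eigvals(M)))   # spectral radius

x, y = 1.0, 1.0
prev = np.hypot(x, y)
for k in range(25):
    x, y = cycle(x, y)
    err = np.hypot(x, y)
    print(f"iter {k:2d}  error {err:.3e}  ratio {err / prev:.6f}")
    prev = err

print(f"predicted rate kappa = rho^2 = {kappa:.6f}")
```

The printed ratios should settle near $\rho^2 = 0.81$, matching the rate given by the theorem.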
The rate in the theorem is the spectral radius of the derivative, i.e. the modulus of its largest eigenvalue. Let us call $\mathcal{D}\mathcal{A}(x_\infty)$ the iteration matrix and write it as $\mathcal{M}(x_\infty)$.

In general block relaxation methods have linear convergence, and the linear convergence can be quite slow. In cases where the accumulation points form a continuum we have sublinear rates. The same can be true if the local minimum is not strict, or if we are converging to a saddle point.
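To see the sublinear case concretely, here is a minimal sketch (the map is a standard textbook example, chosen here for illustration): the iteration $x^{(k+1)} = \sin(x^{(k)})$ converges to $0$, but the derivative of the map at the fixed point is $\cos(0) = 1$, so the theorem gives no rate $\kappa < 1$ and the error ratios creep toward one.

```python
# Minimal sketch (illustrative example, not from the text): the map
# x <- sin(x) has fixed point 0 with derivative cos(0) = 1, so the theorem
# does not apply.  Convergence still occurs, but sublinearly: the error
# ratios |x_{k+1}| / |x_k| tend to 1 instead of settling at a rate kappa < 1.
import math

x = 1.0
for k in range(1, 20001):
    x_new = math.sin(x)
    if k % 4000 == 0:
        print(f"iter {k:6d}  error {x_new:.6e}  ratio {x_new / x:.8f}")
    x = x_new
```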
Generalization to non-differentiable maps.
Points of attraction and repulsion.
Superlinear convergence.
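For the superlinear case, a minimal sketch (Newton's method for $x^2 = 2$, chosen here for illustration): the derivative of the Newton iteration map vanishes at a simple root, so $\kappa = 0$ and the error decreases faster than any linear rate.

```python
# Minimal sketch (illustrative example, not from the text): Newton's method
# for f(x) = x^2 - 2 is the iteration map A(x) = x - f(x)/f'(x), whose
# derivative vanishes at the root sqrt(2).  With DA(x_inf) = 0 the linear
# rate kappa is 0, and the observed convergence is superlinear (quadratic):
# the number of correct digits roughly doubles per iteration.
import math

root = math.sqrt(2.0)
x = 2.0
for k in range(6):
    x = x - (x * x - 2.0) / (2.0 * x)     # Newton step for f(x) = x^2 - 2
    print(f"iter {k}  error {abs(x - root):.3e}")
```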