I.5.2.1: ALSOS
ALSOS algorithms are ALS algorithms in which one or more of the blocks defines transformations of variables.
Suppose we have observations on two sets of variables $x_i$ and $y_i$. We want to fit a model of the form
\[
F(x_i;\theta)\approx G(y_i;\xi),
\]
where the unknowns are the structural parameters $\theta$ and $\xi$ and the transformations $F$ and $G$. In ALS we measure loss-of-fit by
\[
\sigma(\theta,\xi;F,G)=\sum_{i=1}^n\left(F(x_i;\theta)-G(y_i;\xi)\right)^2.
\]
This loss function is minimized by starting with initial estimates of the transformations, minimizing over the structural parameters with the transformations fixed at their current values, and then minimizing over the transformations with the structural parameters fixed at their new values. These two minimizations are alternated, which produces a nonincreasing sequence of loss function values, bounded below by zero, and thus convergent. This is a version of the trivial convergence theorem.
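The alternation scheme can be sketched on a minimal concrete instance. As an illustration (a hypothetical example, not the general model above), take the rank-one approximation problem $\min_{a,b}\|X-ab'\|^2$, with the two blocks being the vectors $a$ and $b$; each block update is an exact least squares solution, so the loss sequence is nonincreasing:

```python
import numpy as np

# Two-block ALS on a hypothetical rank-one approximation problem:
# minimize ||X - a b'||^2 by alternating exact minimizations over a and b.
rng = np.random.default_rng(0)
X = rng.standard_normal((6, 4))

a = rng.standard_normal(6)
b = rng.standard_normal(4)

losses = []
for _ in range(50):
    # Minimize over a with b fixed: a = X b / (b'b).
    a = X @ b / (b @ b)
    # Minimize over b with a fixed: b = X'a / (a'a).
    b = X.T @ a / (a @ a)
    losses.append(np.sum((X - np.outer(a, b)) ** 2))

# Each half-step solves its subproblem exactly, so the loss values are
# nonincreasing and bounded below by zero, hence convergent (the trivial
# convergence theorem).
```

Inspecting `losses` after the run shows the monotone decrease directly.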
The first ALS example is due to Kruskal \cite{krus}. We have a factorial ANOVA with, say, two factors, and we minimize
\[
\sigma(\mu,\alpha,\beta;\phi)=\sum_{i=1}^n\sum_{j=1}^m\left(\phi(y_{ij})-\mu-\alpha_i-\beta_j\right)^2,
\]
where Kruskal required $\phi$ to be monotone. Minimizing the loss over $(\mu,\alpha,\beta)$ for fixed $\phi$ is just doing an analysis of variance; minimizing the loss over $\phi$ for fixed $(\mu,\alpha,\beta)$ is doing a monotone regression. Obviously some normalization requirement on $\phi$ is also needed to exclude the trivial zero solution.
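A minimal sketch of this alternation, on hypothetical balanced two-way data: the ANOVA step reduces to marginal means (exact least squares for a balanced design), and the monotone regression step uses a hand-rolled pool-adjacent-violators routine with a standardization to rule out the trivial zero solution. The data-generating choices here are purely illustrative:

```python
import numpy as np

def pava(y):
    """Least-squares monotone (nondecreasing) fit by pool-adjacent-violators."""
    means, counts = [], []
    for v in y:
        means.append(float(v)); counts.append(1)
        # Pool adjacent blocks while they violate monotonicity.
        while len(means) > 1 and means[-2] > means[-1]:
            m2, c2 = means.pop(), counts.pop()
            m1, c1 = means.pop(), counts.pop()
            means.append((c1 * m1 + c2 * m2) / (c1 + c2))
            counts.append(c1 + c2)
    out = []
    for m, c in zip(means, counts):
        out.extend([m] * c)
    return np.array(out)

# Hypothetical responses on a 4 x 3 balanced factorial design, with a
# nonlinear (exponential) distortion that phi should undo.
rng = np.random.default_rng(1)
n, m = 4, 3
row, col = rng.standard_normal(n), rng.standard_normal(m)
y = np.exp(row[:, None] + col[None, :] + 0.1 * rng.standard_normal((n, m)))

order = np.argsort(y, axis=None)           # rank order of the raw data
z = (y - y.mean()) / y.std()               # initial transformation phi(y)
for _ in range(100):
    # ANOVA step: minimize over (mu, alpha, beta) with z fixed.
    mu = z.mean()
    a = z.mean(axis=1) - mu
    b = z.mean(axis=0) - mu
    fit = mu + a[:, None] + b[None, :]
    # Optimal scaling step: monotone regression of the fitted values,
    # taken in the order of the raw data y.
    znew = np.empty(y.size)
    znew[order] = pava(fit.ravel()[order])
    z = znew.reshape(y.shape)
    # Normalize to exclude the trivial zero solution.
    s = z.std()
    z = (z - z.mean()) / (s if s > 0 else 1.0)

# Final ANOVA fit and loss for the current transformation.
mu = z.mean()
a = z.mean(axis=1) - mu
b = z.mean(axis=0) - mu
loss = np.sum((z - (mu + a[:, None] + b[None, :])) ** 2)
```

By construction the final `z` is a monotone function of `y`, and the transformed data are much closer to additive than the raw responses.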
This general idea was extended by De Leeuw, Young, and Takane around 1975 to regression and additive models in which several variables are transformed simultaneously. This ALSOS work, in the period 1975--1980, is summarized in \cite{forrest}. Subsequent work, culminating in the book by Gifi \cite{gifi}, generalized this to ALSOS versions of principal component analysis, path analysis, canonical analysis, discriminant analysis, MANOVA, and so on. The classes of transformations over which loss was minimized were usually step functions, splines, monotone functions, or low-degree polynomials. To illustrate the use of more than two sets in ALS, consider
\[
\sigma(\phi_1,\dots,\phi_m;A,B)=\sum_{i=1}^n\sum_{j=1}^m\left(\phi_j(x_{ij})-a_i'b_j\right)^2.
\]
This is principal component analysis (or partial singular value decomposition) with optimal scaling. We can now cycle over three sets: the transformations $\phi_j$, the component scores $a_i$, and the component loadings $b_j$. In the case of monotone transformations this alternates monotone regression with two linear least squares problems.
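The three-set cycle can be sketched as follows on hypothetical data: the loadings and scores blocks are ordinary linear least squares problems, and the transformation block is one monotone regression per variable, followed by a per-column standardization to exclude the zero solution. All names and data choices here are illustrative assumptions:

```python
import numpy as np

def pava(y):
    """Least-squares monotone (nondecreasing) fit by pool-adjacent-violators."""
    means, counts = [], []
    for v in y:
        means.append(float(v)); counts.append(1)
        while len(means) > 1 and means[-2] > means[-1]:
            m2, c2 = means.pop(), counts.pop()
            m1, c1 = means.pop(), counts.pop()
            means.append((c1 * m1 + c2 * m2) / (c1 + c2))
            counts.append(c1 + c2)
    out = []
    for m, c in zip(means, counts):
        out.extend([m] * c)
    return np.array(out)

rng = np.random.default_rng(2)
X = rng.standard_normal((20, 5))          # raw data, n = 20, m = 5
p = 2                                     # number of components
Z = (X - X.mean(axis=0)) / X.std(axis=0)  # initial transformed data

# Initialize scores A and loadings B from the SVD of the initial Z.
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
A = U[:, :p] * s[:p]
B = Vt[:p].T

for _ in range(50):
    # Block 1: loadings B, linear least squares given A and Z.
    B = np.linalg.lstsq(A, Z, rcond=None)[0].T
    # Block 2: scores A, linear least squares given B and Z.
    A = np.linalg.lstsq(B, Z.T, rcond=None)[0].T
    # Block 3: transformations, one monotone regression per column.
    F = A @ B.T
    for j in range(Z.shape[1]):
        o = np.argsort(X[:, j])
        zj = np.empty(Z.shape[0])
        zj[o] = pava(F[o, j])
        sd = zj.std()
        Z[:, j] = (zj - zj.mean()) / (sd if sd > 0 else 1.0)

loss = np.sum((Z - A @ B.T) ** 2)
```

Each transformed column remains a monotone function of the corresponding raw variable, and the two linear blocks are exactly the alternating least squares steps of a partial singular value decomposition.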