## Iterative generalised least squares

This method estimates the variance parameters by setting the full matrix of residual products equal to the variance matrix, V, which is specified in terms of the variance parameters, and solving the resulting equations. This leads to a set of n × n simultaneous equations that can be solved iteratively for the variance parameters (n = number of observations). In ordinary IGLS, which gives variance parameter estimates that are biased downwards, the equations are

$$(y - \mu)(y - \mu)' = V,$$

where $y$ is the vector of observations and $\mu$ the vector of fitted values from the current fixed effects estimates.

To illustrate the structure of the equations more clearly we will show their form for a small hypothetical dataset. We assume that the first two patients in a repeated measures trial attended the following visits:

| Patient | Visits  |
| ------- | ------- |
| 1       | 1, 2, 3 |
| 2       | 1, 2    |

The equations in a model using a separate covariance term for each pair of visits (i.e. with a 'general' covariance structure) are given by

$$
\begin{pmatrix}
\sigma_1^2 & \theta_{12} & \theta_{13} & 0 & 0 \\
\theta_{12} & \sigma_2^2 & \theta_{23} & 0 & 0 \\
\theta_{13} & \theta_{23} & \sigma_3^2 & 0 & 0 \\
0 & 0 & 0 & \sigma_1^2 & \theta_{12} \\
0 & 0 & 0 & \theta_{12} & \sigma_2^2
\end{pmatrix}
=
\begin{pmatrix}
(y_1-\mu_1)^2 & (y_1-\mu_1)(y_2-\mu_2) & (y_1-\mu_1)(y_3-\mu_3) & (y_1-\mu_1)(y_4-\mu_4) & (y_1-\mu_1)(y_5-\mu_5) \\
(y_2-\mu_2)(y_1-\mu_1) & (y_2-\mu_2)^2 & (y_2-\mu_2)(y_3-\mu_3) & (y_2-\mu_2)(y_4-\mu_4) & (y_2-\mu_2)(y_5-\mu_5) \\
(y_3-\mu_3)(y_1-\mu_1) & (y_3-\mu_3)(y_2-\mu_2) & (y_3-\mu_3)^2 & (y_3-\mu_3)(y_4-\mu_4) & (y_3-\mu_3)(y_5-\mu_5) \\
(y_4-\mu_4)(y_1-\mu_1) & (y_4-\mu_4)(y_2-\mu_2) & (y_4-\mu_4)(y_3-\mu_3) & (y_4-\mu_4)^2 & (y_4-\mu_4)(y_5-\mu_5) \\
(y_5-\mu_5)(y_1-\mu_1) & (y_5-\mu_5)(y_2-\mu_2) & (y_5-\mu_5)(y_3-\mu_3) & (y_5-\mu_5)(y_4-\mu_4) & (y_5-\mu_5)^2
\end{pmatrix}
$$

In this simple example, equating corresponding terms from the left-hand side and right-hand side of this equation gives equations such as

$$\sigma_3^2 = (y_3-\mu_3)^2, \qquad \theta_{13} = (y_1-\mu_1)(y_3-\mu_3), \qquad \theta_{23} = (y_2-\mu_2)(y_3-\mu_3),$$

$$\theta_{12} = (y_1-\mu_1)(y_2-\mu_2), \qquad \theta_{12} = (y_4-\mu_4)(y_5-\mu_5).$$

There are two equations for $\theta_{12}$, and we may obtain an estimate from their average:

$$\hat{\theta}_{12} = \tfrac{1}{2}\left[(y_1-\mu_1)(y_2-\mu_2) + (y_4-\mu_4)(y_5-\mu_5)\right].$$

Thus, in this artificially simple dataset the covariance terms are calculated as the average of the observed covariances from just one or two subjects. In a genuine dataset, the covariances would be estimated from the average of the observed covariances across many more subjects. The variance terms are estimated in the same way.
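These averages are easy to verify numerically. The sketch below forms each variance and covariance estimate exactly as above, assuming hypothetical residual values $r_i = y_i - \mu_i$ (the numbers are invented for illustration):

```python
import numpy as np

# Hypothetical residuals r_i = y_i - mu_i for the five observations:
# patient 1 contributes observations 1-3, patient 2 observations 4-5.
r = np.array([0.5, -0.3, 0.8, 0.1, -0.4])

# theta_12 appears in both patients' blocks, so average the two products.
theta12_hat = (r[0] * r[1] + r[3] * r[4]) / 2
theta13_hat = r[0] * r[2]      # only patient 1 attended visits 1 and 3
theta23_hat = r[1] * r[2]      # only patient 1 attended visits 2 and 3

# Variance terms are averaged over the patients observed at that visit.
sigma1_sq_hat = (r[0] ** 2 + r[3] ** 2) / 2
sigma2_sq_hat = (r[1] ** 2 + r[4] ** 2) / 2
sigma3_sq_hat = r[2] ** 2
```

With these residuals, $\hat{\theta}_{12}$ comes out as $[(0.5)(-0.3) + (0.1)(-0.4)]/2 = -0.095$, matching the averaging formula above.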

The approach extends to other covariance patterns. Suppose that with the same artificial dataset we wish to fit a simpler covariance pattern, with a constant variance, $\sigma^2$, and a constant covariance, $\theta$, between each visit pair (i.e. a compound symmetry pattern). Then

$$
\begin{pmatrix}
\sigma^2 & \theta & \theta & 0 & 0 \\
\theta & \sigma^2 & \theta & 0 & 0 \\
\theta & \theta & \sigma^2 & 0 & 0 \\
0 & 0 & 0 & \sigma^2 & \theta \\
0 & 0 & 0 & \theta & \sigma^2
\end{pmatrix}
=
\begin{pmatrix}
(y_1-\mu_1)^2 & (y_1-\mu_1)(y_2-\mu_2) & (y_1-\mu_1)(y_3-\mu_3) & (y_1-\mu_1)(y_4-\mu_4) & (y_1-\mu_1)(y_5-\mu_5) \\
(y_2-\mu_2)(y_1-\mu_1) & (y_2-\mu_2)^2 & (y_2-\mu_2)(y_3-\mu_3) & (y_2-\mu_2)(y_4-\mu_4) & (y_2-\mu_2)(y_5-\mu_5) \\
(y_3-\mu_3)(y_1-\mu_1) & (y_3-\mu_3)(y_2-\mu_2) & (y_3-\mu_3)^2 & (y_3-\mu_3)(y_4-\mu_4) & (y_3-\mu_3)(y_5-\mu_5) \\
(y_4-\mu_4)(y_1-\mu_1) & (y_4-\mu_4)(y_2-\mu_2) & (y_4-\mu_4)(y_3-\mu_3) & (y_4-\mu_4)^2 & (y_4-\mu_4)(y_5-\mu_5) \\
(y_5-\mu_5)(y_1-\mu_1) & (y_5-\mu_5)(y_2-\mu_2) & (y_5-\mu_5)(y_3-\mu_3) & (y_5-\mu_5)(y_4-\mu_4) & (y_5-\mu_5)^2
\end{pmatrix}
$$

The solution is then given by averaging the observed covariances over all within-patient pairs of time points, so that

$$\hat{\theta} = \tfrac{1}{4}\left[(y_1-\mu_1)(y_2-\mu_2) + (y_4-\mu_4)(y_5-\mu_5) + (y_1-\mu_1)(y_3-\mu_3) + (y_2-\mu_2)(y_3-\mu_3)\right],$$

and

$$\hat{\sigma}^2 = \tfrac{1}{5}\left[(y_1-\mu_1)^2 + (y_2-\mu_2)^2 + (y_3-\mu_3)^2 + (y_4-\mu_4)^2 + (y_5-\mu_5)^2\right].$$
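Numerically, the compound symmetry estimates are a single pooled average over the within-patient pairs and over all squared residuals. A minimal sketch, assuming hypothetical residual values $r_i = y_i - \mu_i$:

```python
import numpy as np

# Hypothetical residuals r_i = y_i - mu_i for the five observations
r = np.array([0.5, -0.3, 0.8, 0.1, -0.4])

# Within-patient visit pairs: (1,2), (1,3), (2,3) for patient 1; (4,5) for patient 2
pairs = [(0, 1), (0, 2), (1, 2), (3, 4)]

theta_hat = np.mean([r[i] * r[j] for i, j in pairs])   # pooled covariance estimate
sigma_sq_hat = np.mean(r ** 2)                          # pooled variance estimate
```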

In random effects and random coefficients models, each linear equation may involve more than one parameter. Simple averaging is then not sufficient to obtain the parameter estimates, and standard methods for solving sets of linear equations are applied instead.
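For instance, in a random intercept model each element of V is a linear combination of two parameters, and the equations can be stacked and solved by least squares. A minimal sketch, where the coefficient matrix and observed residual products are invented for illustration:

```python
import numpy as np

# In a random intercept model, the V elements are linear in two parameters:
#   diagonal element:            sigma_u^2 + sigma_e^2
#   within-patient off-diagonal: sigma_u^2
# Each observed residual product supplies one linear equation.
A = np.array([
    [1.0, 1.0],   # a diagonal product -> sigma_u^2 + sigma_e^2
    [1.0, 1.0],
    [1.0, 0.0],   # a within-patient cross-product -> sigma_u^2
    [1.0, 0.0],
])
b = np.array([1.2, 1.0, 0.3, 0.5])   # observed residual products (hypothetical)

# Solve the overdetermined system in the least squares sense
params, *_ = np.linalg.lstsq(A, b, rcond=None)
sigma_u_sq, sigma_e_sq = params
```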

For restricted iterative generalised least squares (RIGLS), which gives unbiased variance parameter estimates (as in REML), the equations are

$$(y - \mu)(y - \mu)' = V - X(X'V^{-1}X)^{-1}X',$$

where $X$ is the fixed effects design matrix and $\mu$ the vector of fitted values. The extra term corrects for the degrees of freedom used in estimating the fixed effects.
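The RIGLS correction term $X(X'V^{-1}X)^{-1}X'$ can be computed directly. A minimal numerical sketch, assuming a toy intercept-only design and a diagonal current V (all values hypothetical):

```python
import numpy as np

# Toy setup: 5 observations, intercept-only fixed effects design (hypothetical)
X = np.ones((5, 1))
V = 0.23 * np.eye(5)   # current estimate of the variance matrix (hypothetical)

V_inv = np.linalg.inv(V)
# RIGLS equates the residual products to V minus this adjustment term,
# removing the downward bias caused by estimating the fixed effects
adjustment = X @ np.linalg.inv(X.T @ V_inv @ X) @ X.T
rigls_target = V - adjustment
```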

Further details on IGLS can be found in Goldstein (2003).
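To see how the pieces fit together, the full IGLS cycle alternates a generalised least squares step for the fixed effects with the variance step that equates residual products to the elements of V. The sketch below does this for a compound symmetry model on a hypothetical two-patient dataset (the data, starting values, and iteration count are all assumptions for illustration):

```python
import numpy as np

# Hypothetical data: patient 1 contributes observations 1-3, patient 2 obs 4-5
y = np.array([5.0, 5.2, 5.1, 3.0, 2.9])
X = np.ones((5, 1))                        # intercept-only fixed effects design
pairs = [(0, 1), (0, 2), (1, 2), (3, 4)]   # within-patient visit pairs

sigma_sq, theta = 1.0, 0.0                 # starting values for the variance parameters
for _ in range(20):
    # Build the block-diagonal compound symmetry V from the current parameters
    V = np.zeros((5, 5))
    for i, j in pairs:
        V[i, j] = V[j, i] = theta
    np.fill_diagonal(V, sigma_sq)
    # GLS step: re-estimate the fixed effects given the current V
    V_inv = np.linalg.inv(V)
    beta = np.linalg.solve(X.T @ V_inv @ X, X.T @ V_inv @ y)
    r = y - X @ beta
    # Variance step: equate residual products to the elements of V and average
    sigma_sq = np.mean(r ** 2)
    theta = np.mean([r[i] * r[j] for i, j in pairs])
```

In this two-level example the within-patient residuals are strongly correlated, so the covariance estimate is positive and close to the variance, as expected for data with a large between-patient component.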