On pages 349 and 353, we discuss how multilevel models will sometimes not run when a variance component in the model is small or non-existent. Besides small variances, convergence problems also occur when two or more random effects are highly correlated (i.e., problems of multicollinearity). This problem can occur in dyadic models when there is a couple-level effect. Consider, for example, a growth-curve model in which the two members share the same growth curve: the correlation between the two slopes, and between the two intercepts, would then equal one. To diagnose the problem, examine in the output the correlation between the two random effects. If the correlation between the two slopes is very large, the two random slopes can be collapsed into a single couple-level effect; the same applies if the correlation between the two intercepts is perfect.
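As a rough illustration of this diagnostic, the sketch below simulates dyadic growth data in which the two partners share the same slope, and then checks the correlation of the partners' estimated slopes across couples. It is only a sketch under our own assumptions: it approximates each person's random slope with a per-person least-squares fit rather than extracting the random-effect correlation from actual multilevel output, and all variable names are ours.

```python
import numpy as np

rng = np.random.default_rng(0)
n_couples, n_waves = 200, 5
time = np.arange(n_waves)

# Simulate dyadic growth data in which both members of a couple share
# the SAME random slope -- the degenerate case described in the text.
couple_slope = rng.normal(0.5, 0.3, n_couples)
slopes_1 = np.empty(n_couples)
slopes_2 = np.empty(n_couples)
for c in range(n_couples):
    y1 = 2.0 + couple_slope[c] * time + rng.normal(0, 0.2, n_waves)
    y2 = 1.5 + couple_slope[c] * time + rng.normal(0, 0.2, n_waves)
    # Per-person least-squares slope as a crude stand-in for that
    # person's slope from the multilevel output.
    slopes_1[c] = np.polyfit(time, y1, 1)[0]
    slopes_2[c] = np.polyfit(time, y2, 1)[0]

r = np.corrcoef(slopes_1, slopes_2)[0, 1]
print(f"correlation of partners' slopes: {r:.2f}")
# A correlation near 1 suggests collapsing the two random slopes
# into a single couple-level slope.
```

In this simulated degenerate case the correlation comes out close to one, which is the signal that a single shared slope variance should replace the two.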
For growth-curve modeling, convergence can be aided by changing the units in which time is measured, typically to "longer" periods of time: for instance, years instead of months, or weeks instead of days. In some instances, however, we have found that shorter periods improve convergence.
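Rescaling time changes only the scale of the time-related parameters, not the substance of the model; for example, a slope expressed per year is twelve times the slope expressed per month. A minimal sketch (variable names and data are ours, and a simple least-squares fit stands in for the growth model):

```python
import numpy as np

months = np.arange(0, 24)  # time measured in months
y = 3.0 + 0.25 * months + np.random.default_rng(1).normal(0, 1, months.size)

years = months / 12.0      # same time measured in "longer" units

slope_m, icpt_m = np.polyfit(months, y, 1)
slope_y, icpt_y = np.polyfit(years, y, 1)

# Only the slope's scale changes: per-year slope = 12 x per-month slope;
# the intercept (the value at time zero) is unchanged.
print(slope_m, slope_y, icpt_m, icpt_y)
```

In an actual multilevel fit, the analogous rescaling also shrinks or enlarges the slope variance by the square of the factor, which is what can move a tiny variance component away from the boundary and help the estimation converge.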

Programs vary considerably in how readily they converge. In our limited experience, SPSS performs the worst and HLM the best. Convergence in SPSS can be improved by changing the covariance structure from UNR to UN, by using REML rather than ML, and by adding the statement /MXSTEP(50), where values other than 50 might be used.
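A sketch of what these SPSS adjustments might look like in MIXED syntax is given below. The variable names y, time, and coupleid are placeholders, and note that in the MIXED command the MXSTEP keyword is supplied on the /CRITERIA subcommand:

```
MIXED y WITH time
  /FIXED = time
  /RANDOM = INTERCEPT time | SUBJECT(coupleid) COVTYPE(UN)
  /METHOD = REML
  /CRITERIA = MXSTEP(50).
```

Here COVTYPE(UN) replaces COVTYPE(UNR), METHOD = REML replaces ML, and MXSTEP raises the maximum step-halving count from its default.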