# Congruence coefficient

In multivariate statistics, the congruence coefficient is an index of the similarity between factors that have been derived in a factor analysis. It was introduced in 1948 by Cyril Burt, who referred to it as the *unadjusted correlation*. It is also called Tucker's congruence coefficient, after Ledyard Tucker, who popularized the technique. Its values range between −1 and +1. It can be used to study the similarity of extracted factors across different samples, for example, of test takers who have taken the same test.[1][2][3]

## Definition

Let X and Y be column vectors of factor loadings for two different samples. The congruence coefficient $r_c$ is then defined as[2]

${\displaystyle r_{c}={\frac {\sum _{i}X_{i}Y_{i}}{\sqrt {\sum _{i}X_{i}^{2}\sum _{i}Y_{i}^{2}}}}}$

where the sums run over the variables (e.g., tests) on which the factor loadings are defined.
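As a minimal sketch, the coefficient can be computed directly from two loading vectors. The Python code below assumes NumPy; the function name `tucker_congruence` and the example loadings are illustrative, not taken from the sources cited above.

```python
import numpy as np

def tucker_congruence(x, y):
    """Congruence coefficient between two vectors of factor loadings."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    # Sum of cross-products over the geometric mean of the sums of squares;
    # the loadings are not centered, so deviations are measured from zero.
    return np.sum(x * y) / np.sqrt(np.sum(x**2) * np.sum(y**2))

# Hypothetical loadings of the same factor obtained in two samples
sample_1 = [0.71, 0.64, 0.58, 0.49]
sample_2 = [0.68, 0.61, 0.62, 0.45]
print(tucker_congruence(sample_1, sample_2))  # ≈ 0.999
```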

## Interpretation

A congruence coefficient of 0.90 is generally interpreted as indicating a high degree of factor similarity, while a coefficient of 0.95 or higher indicates that the factors are virtually identical. By an alternative guideline, values in the range 0.85–0.94 correspond to fair similarity, and values higher than 0.95 indicate that the factors can be considered equal.[1][2]

The congruence coefficient can also be defined as the cosine of the angle between factor axes based on the same set of variables (e.g., tests) obtained for two samples (see Cosine similarity). For example, with perfect congruence the angle between the factor axes is 0 degrees, and the cosine of 0 is 1.[2]
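As a quick geometric check, with made-up loading values: two proportional loading vectors point in the same direction, so the angle between them is 0 degrees and the coefficient is exactly 1.

```python
import numpy as np

x = np.array([0.8, 0.7, 0.6])
y = 0.5 * x  # proportional loadings: same direction, different scale

r_c = np.sum(x * y) / np.sqrt(np.sum(x**2) * np.sum(y**2))
# Angle between the factor axes implied by r_c (clip guards arccos
# against tiny floating-point overshoot beyond [-1, 1]).
angle = np.degrees(np.arccos(np.clip(r_c, -1.0, 1.0)))
print(r_c, angle)  # 1.0, 0.0 — perfect congruence, coincident axes
```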

## Comparison with Pearson's r

The congruence coefficient is preferred to Pearson's r as a measure of factor similarity, because the latter may produce misleading results. The computation of the congruence coefficient is based on the deviations of factor loadings from zero, whereas r is based on the deviations from the mean of the factor loadings. As a consequence, two sets of loadings that differ only by an additive constant have a Pearson correlation of 1 even though the factors are not identical.[2]
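A small numerical contrast, using hypothetical loadings, makes the difference concrete: loadings that differ only by a constant shift are perfectly correlated in Pearson's sense, yet their congruence is low.

```python
import numpy as np

x = np.array([-0.2, 0.0, 0.2])
y = x + 0.6  # [0.4, 0.6, 0.8]: the same loadings shifted by a constant

# Pearson's r centers each vector at its mean, so the shift vanishes
# and the two factors look identical.
r_pearson = np.corrcoef(x, y)[0, 1]  # 1.0

# The congruence coefficient measures deviations from zero instead,
# so the shift sharply lowers the similarity.
r_c = np.sum(x * y) / np.sqrt(np.sum(x**2) * np.sum(y**2))  # ≈ 0.26
print(r_pearson, r_c)
```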