Multivariate analysis of variance

In statistics, multivariate analysis of variance (MANOVA) is a procedure for comparing multivariate sample means. As a multivariate procedure, it is used when there are two or more dependent variables, and is typically followed by significance tests involving the individual dependent variables separately.

MANOVA is a generalized form of univariate analysis of variance (ANOVA), although, unlike univariate ANOVA, it uses the covariance between outcome variables in testing the statistical significance of the mean differences. Where sums of squares appear in univariate analysis of variance, certain positive-definite matrices appear in multivariate analysis of variance. The diagonal entries are the same kinds of sums of squares that appear in univariate ANOVA; the off-diagonal entries are the corresponding sums of products. Under normality assumptions about error distributions, the counterpart of the sum of squares due to error has a Wishart distribution.

MANOVA is based on the product of the model variance matrix, $\Sigma_{\text{model}}$, and the inverse of the error variance matrix, $\Sigma_{\text{res}}^{-1}$, that is, $A = \Sigma_{\text{model}} \Sigma_{\text{res}}^{-1}$. The hypothesis that $\Sigma_{\text{model}} = \Sigma_{\text{res}}$ implies that $A \sim I$. Invariance considerations imply that the MANOVA statistic should be a measure of the magnitude of the singular value decomposition of this matrix product, but there is no unique choice owing to the multi-dimensional nature of the alternative hypothesis. The most common statistics are summaries based on the roots (or eigenvalues) $\lambda_p$ of the matrix $A$: Wilks' lambda, $\Lambda = \prod_p 1/(1 + \lambda_p)$; the Pillai–Bartlett trace, $\sum_p \lambda_p/(1 + \lambda_p)$; the Lawley–Hotelling trace, $\sum_p \lambda_p$; and Roy's greatest root, $\max_p \lambda_p$.

Discussion continues over the merits of each, although the greatest root leads only to a bound on significance, which is not generally of practical interest. A further complication is that, except for Roy's greatest root, the distribution of these statistics under the null hypothesis is not straightforward and can only be approximated, except in a few low-dimensional cases. An algorithm for the distribution of Roy's largest root under the null hypothesis has been derived, and its distribution under the alternative hypothesis has also been studied. The best-known approximation for Wilks' lambda was derived by C. R. Rao.

In the case of two groups, all the statistics are equivalent and the test reduces to Hotelling's T-square. MANOVA's power is affected by the correlations of the dependent variables and by the effect sizes associated with those variables. For example, when there are two groups and two dependent variables, MANOVA's power is lowest when the correlation equals the ratio of the smaller to the larger standardized effect size.
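As a concrete illustration of the eigenvalue formulation above, the following is a minimal Python sketch (using NumPy) of the four summary statistics computed from the roots of $A$. The function name, the construction of the hypothesis and error sums-of-squares-and-products matrices, and the toy data are illustrative assumptions, not part of the article.

```python
# A minimal sketch (assumed names, toy data): the four classical MANOVA
# statistics as summaries of the eigenvalues lambda_p of
# A = Sigma_model * Sigma_res^{-1}, here formed from the hypothesis (H)
# and error (E) sums-of-squares-and-products (SSCP) matrices.
import numpy as np

def manova_statistics(H: np.ndarray, E: np.ndarray) -> dict:
    """Summaries of the roots of A, given p x p SSCP matrices H and E."""
    # eig(E^{-1} H) has the same spectrum as eig(H E^{-1}); using solve
    # avoids forming the explicit inverse.
    roots = np.real(np.linalg.eigvals(np.linalg.solve(E, H)))
    roots = np.clip(roots, 0.0, None)  # roots are nonnegative in theory
    return {
        "wilks_lambda": float(np.prod(1.0 / (1.0 + roots))),
        "pillai_trace": float(np.sum(roots / (1.0 + roots))),
        "lawley_hotelling_trace": float(np.sum(roots)),
        "roys_greatest_root": float(np.max(roots)),
    }

# Toy two-group example: H is the between-group SSCP, E the pooled
# within-group SSCP. With two groups H has rank one, so every statistic
# depends on the single nonzero root (the Hotelling T-square case).
rng = np.random.default_rng(0)
groups = [rng.normal(0.0, 1.0, (30, 2)), rng.normal(0.5, 1.0, (30, 2))]
X = np.vstack(groups)
grand_mean = X.mean(axis=0)
H = sum(len(g) * np.outer(g.mean(0) - grand_mean, g.mean(0) - grand_mean)
        for g in groups)
E = sum((g - g.mean(0)).T @ (g - g.mean(0)) for g in groups)
print(manova_statistics(H, E))
```

In the two-group example, the single nonzero root makes all four summaries monotone transformations of one another, matching the reduction to Hotelling's T-square noted above.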

[ "Multivariate analysis", "Social psychology", "Statistics", "Developmental psychology", "Multivariate statistics", "Variance inflation factor", "Normal-Wishart distribution", "Multivariate Behrens–Fisher problem", "wilks lambda", "Multivariate t-distribution" ]