Emily Lake
Might be a replicant
- Joined: Jul 7, 2014
- Messages: 8,494
- Location: It's a desert out there
- Gender: Agenderist
- Basic Beliefs: Atheist
> Orthogonal in statistics means no correlation whatsoever - which means no overlap.

Actually, orthogonal doesn't mean no overlap. The two lines intersect at a point. It simply means that they move in completely different directions. The most common usage of orthogonal refers to the axes in Cartesian coordinates, which are at right angles to one another.
I can see that interpretation, but the measurement of correlation between two variables is still based on two axes at right angles to one another. The correlation is more or less the slope of the linear regression fit between the two variables - a change of 1 in variable A results in a change of ~x in variable B. Loose analogy, of course, but a reasonable one. Independence of those two variables results in the plotted value-pairs clustering around the origin... and looking at the data from the perspective of either variable results in the plot of the other variable being a line at right angles to the plot of the first variable.
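The slope analogy can be made exact in one special case: if both variables are standardized (zero mean, unit variance), the OLS slope of one on the other equals the Pearson correlation. A minimal numerical sketch (the variable names and data here are illustrative, not from the thread):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=1000)
y = 0.6 * x + rng.normal(size=1000)  # y partly depends on x

# Standardize both variables: zero mean, unit variance.
xs = (x - x.mean()) / x.std()
ys = (y - y.mean()) / y.std()

# The OLS slope of ys on xs reduces to the mean of the products,
# which is exactly the Pearson correlation coefficient of x and y.
slope = xs @ ys / len(xs)
r = np.corrcoef(x, y)[0, 1]
print(abs(slope - r) < 1e-12)  # the two quantities agree
```

With unstandardized variables the slope picks up a factor of sd(y)/sd(x), which is why "correlation = slope" is only a loose analogy in general.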
ETA: Wikipedia
Statistics, econometrics, and economics
When performing statistical analysis, independent variables that affect a particular dependent variable are said to be orthogonal if they are uncorrelated,[13] since the covariance forms an inner product. In this case the same results are obtained for the effect of any of the independent variables upon the dependent variable, regardless of whether one models the effects of the variables individually with simple regression or simultaneously with multiple regression. If correlation is present, the factors are not orthogonal and different results are obtained by the two methods. This usage arises from the fact that if centered by subtracting the expected value (the mean), uncorrelated variables are orthogonal in the geometric sense discussed above, both as observed data (i.e., vectors) and as random variables (i.e., density functions). One econometric formalism that is alternative to the maximum likelihood framework, the Generalized Method of Moments, relies on orthogonality conditions. In particular, the Ordinary Least Squares estimator may be easily derived from an orthogonality condition between the explanatory variables and model residuals.
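The last sentence of the excerpt - that OLS follows from an orthogonality condition between the explanatory variables and the residuals - can be checked numerically. A small sketch with made-up data (the coefficients and sample size are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
# Design matrix: intercept column plus two random regressors.
X = np.column_stack([np.ones(n), rng.normal(size=n), rng.normal(size=n)])
beta_true = np.array([1.0, 2.0, -0.5])
y = X @ beta_true + rng.normal(size=n)

# The normal equations X'X b = X'y are exactly the orthogonality
# conditions X'(y - X b) = 0 between regressors and residuals.
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
residuals = y - X @ beta_hat

# Each column of X is orthogonal to the residual vector,
# up to floating-point error.
print(np.max(np.abs(X.T @ residuals)))
```

The printed value is effectively zero: the fitted residuals carry no component along any regressor, which is the geometric sense of "orthogonal" the article describes.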