orthogonal
Statistics
(adjective)
 Statistically independent, with reference to variates.
Physics
(adjective)
 Of two objects, at right angles; perpendicular to each other.
Art History
(adjective)
 Of lines in a perspective drawing: receding toward the vanishing point.
Examples of orthogonal in the following topics:
A Geometrical Picture
- Therefore all the elements in the null space are orthogonal to all the elements in the row space.
 - In mathematical terminology, the null space and the row space are orthogonal complements of one another.
 - Or, to say the same thing, they are orthogonal subspaces of $\mathbf{R}^{m}$ .
 - Similarly, vectors in the left null space of a matrix are orthogonal to all the columns of this matrix.
 - This means that the left null space of a matrix is the orthogonal complement of the column space in $\mathbf{R}^{n}$.
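A minimal numerical sketch of this complementarity (the matrix is purely illustrative, and SciPy's `null_space` helper is used for convenience):

```python
import numpy as np
from scipy.linalg import null_space

# An illustrative rank-2 matrix, so both the null space and the left null space are nontrivial.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],
              [1.0, 1.0, 1.0]])

N  = null_space(A)      # orthonormal basis for the null space
LN = null_space(A.T)    # orthonormal basis for the left null space

# Null-space vectors are orthogonal to every row of A (i.e. to the row space) ...
print(np.allclose(A @ N, 0))      # True

# ... and left-null-space vectors are orthogonal to every column of A (the column space).
print(np.allclose(A.T @ LN, 0))   # True
```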
 
Superposition and orthogonal projection
- But the solution is especially simple if the $\mathbf{x}_i$ are orthogonal.
 - In this case we can find the coefficients easily by projecting onto the orthogonal directions:
 - If the basis functions $q_i(x)$ are "orthogonal", then we should be able to compute the Fourier coefficients by simply projecting the function $f(x)$ onto each of the orthogonal "vectors" $q_i(x)$ .
 - Then we will say that two functions are orthogonal if their inner product is zero.
 - Now we simply need to show that the sines and cosines (or complex exponentials) are orthogonal.
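A small sketch of projecting onto orthogonal directions, with made-up vectors $\mathbf{x}_i$ standing in for the text's basis:

```python
import numpy as np

# Three mutually orthogonal (not unit-length) vectors in R^3.
x1 = np.array([1.0,  1.0, 0.0])
x2 = np.array([1.0, -1.0, 0.0])
x3 = np.array([0.0,  0.0, 2.0])
basis = [x1, x2, x3]

# A target vector built from some combination of the x_i.
y = 3.0 * x1 - 2.0 * x2 + 0.5 * x3

# Because the x_i are orthogonal, each coefficient comes from a single projection,
# c_i = (y . x_i) / (x_i . x_i); no linear system has to be solved.
coeffs = [np.dot(y, xi) / np.dot(xi, xi) for xi in basis]
print(coeffs)   # [3.0, -2.0, 0.5]
```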
 
Some Special Matrices
- A matrix $Q \in \mathbf{R}^{n \times n}$ is said to be orthogonal if $Q^TQ = I_n$ .
 - So why are these matrices called orthogonal?
 - An orthogonal matrix has an especially nice geometrical interpretation.
 - Therefore an orthogonal matrix maps a vector into another vector of the same norm.
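A quick check of both facts, using the $Q$ factor of a QR decomposition of a random matrix as an example orthogonal matrix:

```python
import numpy as np

rng = np.random.default_rng(0)

# Build an orthogonal matrix by taking the Q factor of a QR decomposition
# of a random square matrix (any full-rank matrix would do).
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))

print(np.allclose(Q.T @ Q, np.eye(4)))   # True: Q^T Q = I

# Geometric interpretation: Q maps a vector into another vector of the same norm.
v = rng.standard_normal(4)
print(np.isclose(np.linalg.norm(Q @ v), np.linalg.norm(v)))   # True
```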
 
The Linear Algebra of the DFT
- The matrix $Q$ is almost orthogonal.
 - We have said that a matrix $A$ is orthogonal if $A A^T = A^T A = I$, where $I$ is the $N \times N$ identity matrix.
 - For complex matrices we need to generalize this definition slightly: we will say that $A$ is orthogonal if $(A^T)^* A = A (A^T)^* = I$ .
 - Once again, orthogonality saves us from having to solve a linear system of equations: since $Q^* = Q^{-1}$ , we have
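A numerical check of this "almost" orthogonality, assuming the unnormalized DFT matrix with entries $e^{-2\pi i jk/N}$ (the source's $Q$ may use a different normalization, e.g. a factor of $1/\sqrt{N}$):

```python
import numpy as np

N = 8
rows, cols = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
Q = np.exp(-2j * np.pi * rows * cols / N)   # unnormalized DFT matrix

# "Almost" orthogonal: the conjugate transpose of Q times Q is N * I,
# so Q / sqrt(N) is orthogonal (unitary) in the complex sense defined above.
print(np.allclose(Q.conj().T @ Q, N * np.eye(N)))   # True

# Orthogonality replaces solving a linear system: to invert y = Q c,
# just multiply by the conjugate transpose and divide by N.
c = np.arange(N, dtype=complex)
y = Q @ c
print(np.allclose(Q.conj().T @ y / N, c))           # True
```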
 
Experimental Design
- Orthogonality: Orthogonality concerns the forms of comparison (contrasts) that can be legitimately and efficiently carried out.
 - Contrasts can be represented by vectors; sets of orthogonal contrasts are uncorrelated and independently distributed if the data are normal.
 - Because of this independence, each orthogonal treatment provides different information from the others.
 - If there are $T$ treatments and $T-1$ orthogonal contrasts, all the information that can be captured from the experiment is obtainable from the set of contrasts.
 - Outline the methodology for designing experiments in terms of comparison, randomization, replication, blocking, orthogonality, and factorial experiments
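An illustrative sketch with $T = 4$ treatments and $T - 1 = 3$ Helmert-style contrasts (one common choice of orthogonal contrasts, not the only one):

```python
import numpy as np

# T = 4 treatments, T - 1 = 3 orthogonal (Helmert-style) contrasts.
contrasts = np.array([[1, -1,  0,  0],
                      [1,  1, -2,  0],
                      [1,  1,  1, -3]], dtype=float)

# Each contrast's coefficients sum to zero ...
print(np.allclose(contrasts.sum(axis=1), 0))          # True

# ... and distinct contrasts are orthogonal (off-diagonal inner products vanish),
# which is what makes their estimates uncorrelated for normal data.
gram = contrasts @ contrasts.T
print(np.allclose(gram - np.diag(np.diag(gram)), 0))  # True
```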
 
Properties of Spherical Harmonics and Legendre Polynomials
- The Legendre polynomials and the spherical harmonics satisfy the following "orthogonality" relations.
 - We will see shortly that these properties are the analogs for functions of the usual orthogonality relations you already know for vectors.
 - Notice that the second relation is slightly different from the others; it says that for any given value of $m$, the polynomials $P_{\ell m}$ and $P_{\ell' m}$ are orthogonal (for $\ell \neq \ell'$).
 - To compute the coefficients of this expansion we use the orthogonality relation exactly as you would with an ordinary vector.
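A numerical sanity check of the Legendre orthogonality relation $\int_{-1}^{1} P_{\ell}(x)\,P_{\ell'}(x)\,dx = \frac{2}{2\ell+1}\,\delta_{\ell\ell'}$ (the standard normalization is assumed here; the text's convention may differ), together with the coefficient-by-projection trick:

```python
import numpy as np
from numpy.polynomial.legendre import Legendre, leggauss

# Gauss-Legendre nodes/weights: exact for polynomial integrands up to degree 2n - 1.
x, w = leggauss(20)

def inner(l1, l2):
    """Inner product of P_l1 and P_l2 on [-1, 1]."""
    return np.sum(w * Legendre.basis(l1)(x) * Legendre.basis(l2)(x))

print(np.isclose(inner(3, 5), 0.0))                # True: different degrees are orthogonal
print(np.isclose(inner(4, 4), 2.0 / (2*4 + 1)))    # True: "length squared" is 2/(2l+1)

# Using orthogonality to read off an expansion coefficient, just as with vectors:
f = lambda t: 1.0 + 2.0 * Legendre.basis(2)(t)     # a function with known P_2 content
c2 = np.sum(w * f(x) * Legendre.basis(2)(x)) / (2.0 / (2*2 + 1))
print(np.isclose(c2, 2.0))                         # True
```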
 
Orthogonal decomposition of rectangular matrices
- However, there is an amazingly useful generalization that pertains if we allow a different orthogonal matrix on each side of $A$ .
 - And since $S$ is symmetric it has orthogonal eigenvectors $\mathbf{w}_i$ with real eigenvalues $\lambda_i$.
 - Keep in mind that the matrices $U$ and $V$, whose columns are the model and data eigenvectors, are square (respectively $n \times n$ and $m \times m$) and orthogonal.
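A sketch of this decomposition (the singular value decomposition) on a made-up rectangular matrix; which dimension corresponds to the text's $n$ or $m$ depends on its convention for the shape of $A$:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((5, 3))        # an illustrative rectangular matrix

# A different orthogonal matrix on each side: A = U diag(s) V^T.
U, s, Vt = np.linalg.svd(A, full_matrices=True)

print(U.shape, Vt.shape)                           # (5, 5) (3, 3): both square
print(np.allclose(U.T @ U, np.eye(5)))             # True: U is orthogonal
print(np.allclose(Vt @ Vt.T, np.eye(3)))           # True: V is orthogonal

# Reassemble A from the decomposition.
S = np.zeros((5, 3))
np.fill_diagonal(S, s)
print(np.allclose(U @ S @ Vt, A))                  # True
```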
 
Eigenvectors and Orthogonal Projections
- Above we said that the matrices $V$ and $U$ were orthogonal so that $V^T V = V V^T = I_m$ and $U^T U = U U^T = I_n$ .
 
A Matrix Appears
 - The eigenvectors (the natural modes of vibration) are orthogonal.
 - This orthogonality is an absolutely fundamental property of the natural modes of vibration of linear mechanical systems.
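A toy illustration with a small symmetric, stiffness-like matrix (the matrix is invented for the example, not taken from the text):

```python
import numpy as np

# A symmetric matrix of the kind that arises for a chain of masses and springs.
K = np.array([[ 2.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  2.0]])

# eigh is the symmetric eigensolver: it returns real eigenvalues and
# eigenvectors packed as the columns of an orthogonal matrix.
vals, vecs = np.linalg.eigh(K)

# The modes are mutually orthogonal: the Gram matrix of the eigenvectors is the identity.
print(np.allclose(vecs.T @ vecs, np.eye(3)))   # True
```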
 
Introduction to Least Squares
- Since $A \mathbf{x_{ls}}$ is, by definition, confined to the column space of $A$, the error in fitting the data, $A\mathbf{x_{ls}} - \mathbf{y}$, must lie in the orthogonal complement of the column space.
 - The orthogonal complement of the column space is the left null space, so $A\mathbf{x_{ls}} - \mathbf{y}$ must get mapped into zero by $A^T$: $A^T(A\mathbf{x_{ls}} - \mathbf{y}) = 0$, which gives the normal equations $A^TA\,\mathbf{x_{ls}} = A^T\mathbf{y}$.
 - Before, when we did orthogonal projections, the projecting vectors/matrices were orthogonal, so the $A^TA$ term would have been the identity; here it is not, but the outer-product (projection) structure of $A\mathbf{x_{ls}} = A(A^TA)^{-1}A^T\mathbf{y}$ is still evident.
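A compact numerical illustration of these statements, with a made-up tall matrix $A$ and data $\mathbf{y}$:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((6, 2))          # tall, full-column-rank matrix
y = rng.standard_normal(6)

# Least-squares solution from the normal equations A^T A x = A^T y.
x_ls = np.linalg.solve(A.T @ A, A.T @ y)

# The residual A x_ls - y lies in the orthogonal complement of the column space
# (the left null space), so A^T maps it to zero.
r = A @ x_ls - y
print(np.allclose(A.T @ r, 0))           # True

# Same answer as NumPy's dedicated least-squares solver.
print(np.allclose(x_ls, np.linalg.lstsq(A, y, rcond=None)[0]))   # True
```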