### A.5 Inner, outer, and tensor products

Three kinds of products are defined between a pair of vectors |*v*〉 and |*w*〉: the inner product, the outer product, and the tensor product.

**A.5.1 Inner product**

The inner product of |*v*〉 and |*w*〉, two vectors in the same vector space, is represented by 〈*v*|*w*〉. Let

|*v*〉 = Σ _{i} *a _{i}* |*i*〉 and |*w*〉 = Σ _{j} *b _{j}* |*j*〉,

written with respect to the same orthonormal basis (this is a good time to get used to the abbreviated representation of basis vectors by their indices *i* and *j*, such as |*i*〉 for |*v _{i}*〉 and |*j*〉 for |*w _{j}*〉 as done here, when the context is clear). Orthonormality of the basis means that 〈*i*|*j*〉 = *δ _{ij}*, where *δ _{ij}* = 1 if *i* = *j* else *δ _{ij}* = 0. Such a basis can always be arranged, with the values of *a _{i}* and *b _{j}* adjusted to suit how the basis |*i*〉 is constructed; several construction methods exist, a popular one being the Gram-Schmidt procedure. Then

〈*v*|*w*〉 = Σ _{i} *a _{i}*^{*} *b _{i}*.

Since we can multiply any state vector by a non-zero complex number without changing its physical interpretation, we can always scale the state so that it has unit length, i.e., 〈*v*|*v*〉 = 1; such a vector is called a *unit vector*, or is said to be in a *normalized state*.

Note also that the inner product remains unaffected if each of the two vectors in the product is multiplied by the factor exp(*iθ*) (or its conjugate as applicable), where *θ* is real:

〈*v*| exp(−*iθ*) exp(*iθ*) |*w*〉 = 〈*v*|*w*〉.
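These properties are easy to check numerically. The following NumPy sketch (the vectors and the angle *θ* are arbitrary illustrative values) computes 〈*v*|*w*〉 as the sum of *a _{i}*^{*} *b _{i}*, normalizes a state, and verifies the phase invariance:

```python
import numpy as np

# Two arbitrary state vectors |v> and |w> in C^2 (illustrative values).
v = np.array([1 + 1j, 2 - 1j])
w = np.array([0.5j, 1 + 0j])

# Inner product <v|w>: np.vdot conjugates its first argument,
# giving the sum of a_i* b_i.
inner = np.vdot(v, w)

# Normalization: divide by the length sqrt(<v|v>) to get a unit vector.
v_unit = v / np.sqrt(np.vdot(v, v).real)
assert np.isclose(np.vdot(v_unit, v_unit).real, 1.0)

# Phase invariance: multiplying both vectors by exp(i*theta) leaves
# <v|w> unchanged, since the bra contributes the conjugate exp(-i*theta).
theta = 0.7
assert np.isclose(np.vdot(np.exp(1j * theta) * v,
                          np.exp(1j * theta) * w), inner)
```

Note that `np.vdot` rather than `np.dot` is used, because the components of the bra vector must be conjugated.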

**A.5.2 Outer product**

The outer product

|*w*〉〈*v*|

of |*w*〉 and |*v*〉, both drawn from an *n*-dimensional vector space, results in an *n*×*n* matrix. It is a useful way of representing linear operators. Moreover, the expression |*w*〉〈*v*|*v′*〉 can be freely given either of two meanings: (1) the result when the operator |*w*〉〈*v*| acts on |*v′*〉; and (2) the result of multiplying |*w*〉 by the complex number 〈*v*|*v′*〉. Mathematicians usually construct such clever symbolic systems for ease of manipulation and economy of expression when equivalences exist. One also notices that

Σ _{i} |*i*〉〈*i*| = *I*,

where the set of vectors represented by |*i*〉 is an orthonormal basis (i.e., 〈*i*|*j*〉 = *δ _{ij}*), so that 〈*i*|*v*〉 = *a _{i}*, and *I* is an *n*×*n* unit matrix. This equation is known as the *completeness relation*. By *complete* we mean that any state vector in the chosen vector space can be represented as a weighted sum of just the |*i*〉 vectors, e.g.,

|*v*〉 = Σ _{i} |*i*〉〈*i*|*v*〉 = Σ _{i} *a _{i}* |*i*〉.
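Both facts can be illustrated in a few lines of NumPy (the vectors are arbitrary illustrative values): the outer product |*w*〉〈*v*| acting on |*v′*〉 equals |*w*〉 scaled by 〈*v*|*v′*〉, and summing |*i*〉〈*i*| over an orthonormal basis yields the identity:

```python
import numpy as np

# |w><v| as an outer product: np.outer with the ket components of the
# bra side conjugated.
w = np.array([1 + 0j, 2j])
v = np.array([1j, 1 + 1j])
op = np.outer(w, v.conj())            # the operator |w><v|

# (|w><v|)|v'> equals |w> scaled by the complex number <v|v'>.
v_prime = np.array([0.5, 1 - 1j])
assert np.allclose(op @ v_prime, w * np.vdot(v, v_prime))

# Completeness relation: summing |i><i| over an orthonormal basis
# gives the n x n unit matrix I.
basis = np.eye(2, dtype=complex)      # standard basis |0>, |1>
I = sum(np.outer(e, e.conj()) for e in basis)
assert np.allclose(I, np.eye(2))
```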

For example, suppose *A*: *V* → *W* is a linear operator, |*v _{i}*〉 is an orthonormal basis for *V*, and |*w _{j}*〉 an orthonormal basis for *W*. Assume^{41}

*A*|*v _{j}*〉 = Σ _{i} *A _{ij}* |*w _{i}*〉.

Now apply 〈*w _{i}*| to the above equation:

〈*w _{i}*|*A*|*v _{j}*〉 = *A _{ij}*.

This allows us to extract any element of *A* with indices *i*, *j* by defining *A _{ij}* = 〈*w _{i}*|*A*|*v _{j}*〉.
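A minimal numerical sketch of this extraction, assuming for simplicity that *V* = *W* = C^{2} with the standard basis and an arbitrary illustrative operator *A*:

```python
import numpy as np

# An arbitrary linear operator A on C^2 (illustrative values).
A = np.array([[1 + 1j, 2 + 0j],
              [0 + 0j, 3 - 1j]])

# With the standard orthonormal basis, the matrix element A_ij is
# recovered as the bra-operator-ket sandwich <i| A |j>.
basis = np.eye(2, dtype=complex)
for i in range(2):
    for j in range(2):
        A_ij = np.vdot(basis[i], A @ basis[j])   # <i|A|j>
        assert np.isclose(A_ij, A[i, j])
```

In a non-standard orthonormal basis the same sandwich 〈*w _{i}*|*A*|*v _{j}*〉 yields the matrix elements of *A* with respect to that basis.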

**A.5.3 Diagonal representation of an operator or orthonormal decomposition**

A diagonal representation for an operator *A* on a vector space *V* is a representation

*A* = Σ _{i} *λ _{i}* |*i*〉〈*i*|,

where the vectors |*i*〉 form an orthonormal set of eigenvectors for *A*, with corresponding eigenvalues *λ _{i}*. The *i*-th diagonal element of *A* is *λ _{i}* and all non-diagonal elements of *A* are zero. An operator which has a diagonal representation is said to be diagonalizable. Diagonal representations are also known as *orthonormal decompositions*. A diagonal representation of *A* simplifies computations tremendously, since one need deal with only *n* elements of *A* rather than *n*^{2} elements for an *n*×*n* matrix representation of *A*. So, an immediate computational advantage of knowing the eigenvalues and eigenvectors of an operator *A* is obvious. If all the eigenvalues of *A* are distinct, they can be used to index its eigenvectors.
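The decomposition is easy to demonstrate numerically. The sketch below takes an arbitrary illustrative Hermitian operator (so that an orthonormal eigenbasis is guaranteed to exist), computes its eigenvalues and eigenvectors with `np.linalg.eigh`, and reassembles *A* as Σ _{i} *λ _{i}* |*i*〉〈*i*|:

```python
import numpy as np

# An arbitrary Hermitian operator A (illustrative values); eigh returns
# real eigenvalues and an orthonormal set of eigenvectors.
A = np.array([[2 + 0j, 1 - 1j],
              [1 + 1j, 3 + 0j]])

lam, vecs = np.linalg.eigh(A)         # columns of vecs are the kets |i>

# Reassemble A from its orthonormal decomposition: sum_i lam_i |i><i|.
A_rebuilt = sum(l * np.outer(u, u.conj())
                for l, u in zip(lam, vecs.T))
assert np.allclose(A_rebuilt, A)

# In its eigenbasis, A is diagonal with entries lam_i: only n numbers,
# not n^2, are needed to describe it.
assert np.allclose(vecs.conj().T @ A @ vecs, np.diag(lam))
```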

41 B. Zwiebach, Dirac’s bra and ket notation, 8.05 Quantum Physics II, MIT OpenCourseWare, Fall 2013, 07 October 2013. https://ocw.mit.edu/courses/physics/8-05-quantum-physics-ii-fall-2013/lecture-notes/MIT8_05F13_Chap_04.pdf