A.5 Inner, outer, and tensor products
Three kinds of products between a pair of vectors |v〉 and |w〉 are defined: the inner product, the outer product, and the tensor product.
A.5.1 Inner product
The inner product of |v〉 and |w〉 in the same vector space is represented by 〈v|w〉. Let

|v〉 = Σᵢ aᵢ|i〉 and |w〉 = Σⱼ bⱼ|j〉

written with respect to the same orthonormal basis {|i〉} (this is a good time to get used to the abbreviated notation in which basis vectors are labeled by their indices i and j, writing |i〉 for |vᵢ〉 and |j〉 for |wⱼ〉, when the context is clear). Orthonormality means 〈i|j〉 = δᵢⱼ, where δᵢⱼ = 1 if i = j and δᵢⱼ = 0 otherwise. Such a basis can always be obtained, with the values of aᵢ and bⱼ adjusted accordingly; several construction methods exist, a popular one being the Gram-Schmidt procedure. Then

〈v|w〉 = Σᵢ Σⱼ aᵢ* bⱼ 〈i|j〉 = Σᵢ aᵢ* bᵢ.
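The formula above can be checked numerically. The sketch below uses NumPy; the two vectors are arbitrary illustrative values, and `np.vdot` conjugates its first argument, matching the aᵢ* in the sum:

```python
import numpy as np

# Two vectors |v> and |w> in C^3, written in the same orthonormal
# (standard) basis; the components a_i and b_j are arbitrary examples.
v = np.array([1 + 1j, 2.0, 0 - 1j])
w = np.array([0.5, 1j, 3.0])

# <v|w> = sum_i conj(a_i) * b_i ; np.vdot conjugates its first argument.
inner = np.vdot(v, w)

# Equivalent explicit form: <v|w> = v^dagger w
assert np.isclose(inner, v.conj() @ w)
print(inner)  # → (0.5+4.5j)
```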
Since we can multiply any state vector by a non-zero complex number without changing its physical interpretation, we can always rescale the state to unit length; the result is a unit vector, and the state is said to be normalized.
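Normalization amounts to dividing by the square root of the inner product of the vector with itself. A minimal sketch, with an arbitrary example vector:

```python
import numpy as np

v = np.array([3.0, 4j])                # an unnormalized state vector
norm = np.sqrt(np.vdot(v, v).real)     # sqrt(<v|v>) = 5 for this example
v_hat = v / norm                       # unit vector for the same state

assert np.isclose(np.vdot(v_hat, v_hat).real, 1.0)
```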
Note also that the inner product is unaffected if both vectors are multiplied by the same phase factor exp(iθ), where θ is real, since the bra acquires the conjugate factor: if |v〉 → exp(iθ)|v〉 and |w〉 → exp(iθ)|w〉, then 〈v| → exp(−iθ)〈v|, so

〈v|w〉 → exp(−iθ) exp(iθ) 〈v|w〉 = 〈v|w〉.
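This phase invariance is easy to verify numerically; the vectors and the value of θ below are arbitrary examples:

```python
import numpy as np

v = np.array([1.0, 1j])
w = np.array([2.0, -1.0 + 0j])
theta = 0.7                        # any real phase works
phase = np.exp(1j * theta)

# Multiplying both vectors by the same phase leaves <v|w> unchanged,
# because np.vdot conjugates the first argument (the bra).
assert np.isclose(np.vdot(phase * v, phase * w), np.vdot(v, w))
```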
A.5.2 Outer product
The outer product

|w〉〈v|, with elements (|w〉〈v|)ᵢⱼ = wᵢ vⱼ*,

results in an n×n matrix. It is a useful way of representing linear operators. Moreover, the expression |w〉〈v|v’〉 can be given either of two meanings: (1) the result when the operator |w〉〈v| acts on |v’〉; and (2) the result of multiplying |w〉 by the complex number 〈v|v’〉. Mathematicians usually construct such clever symbolic systems for ease of manipulation and economy of expression when equivalences exist. One also notices that

Σᵢ |i〉〈i| = I,

where the set of vectors |i〉 is an orthonormal basis (i.e., 〈i|j〉 = δᵢⱼ) so that 〈i|v〉 = aᵢ, and I is the n×n identity matrix. This equation is known as the completeness relation. By complete we mean that any state vector in the chosen vector space can be represented as a weighted sum of just the |i〉 vectors, e.g.,

|v〉 = I|v〉 = Σᵢ |i〉〈i|v〉 = Σᵢ aᵢ|i〉.
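Both the two readings of |w〉〈v|v’〉 and the completeness relation can be checked directly. A sketch in NumPy, with arbitrary example vectors in C³ and the standard basis playing the role of {|i〉}:

```python
import numpy as np

n = 3
v = np.array([1.0, 2j, -1.0])
w = np.array([0.0, 1.0, 1j])

# |w><v| : an n x n matrix with elements w_i * conj(v_j)
op = np.outer(w, v.conj())
assert op.shape == (n, n)

# Meaning (1) = meaning (2): (|w><v|)|v'> equals |w> times <v|v'>
v_prime = np.array([1j, 0.5, 2.0])
assert np.allclose(op @ v_prime, w * np.vdot(v, v_prime))

# Completeness relation: sum_i |i><i| = I for the standard basis of C^n
I = sum(np.outer(e, e.conj()) for e in np.eye(n))
assert np.allclose(I, np.eye(n))
```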
For example, suppose A: V → W is a linear operator, {|vⱼ〉} is an orthonormal basis for V, and {|wᵢ〉} an orthonormal basis for W. Assume41

A|vⱼ〉 = Σᵢ Aᵢⱼ|wᵢ〉.

Now apply 〈wᵢ| (abbreviated 〈i|) to the above equation:

〈i|A|j〉 = Σₖ Aₖⱼ〈i|k〉 = Aᵢⱼ.

This allows us to extract any element of A with indices i, j by defining

Aᵢⱼ ≡ 〈i|A|j〉.
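The recipe Aᵢⱼ = 〈i|A|j〉 can be verified for a concrete matrix. In the sketch below, the operator A: C² → C³ is an arbitrary example, and the standard bases of C² and C³ play the roles of {|vⱼ〉} and {|wᵢ〉}:

```python
import numpy as np

# A hypothetical operator A: C^2 -> C^3, as a 3 x 2 matrix.
A = np.array([[1.0, 2j],
              [0.0, 1.0],
              [-1j, 3.0]])

basis_V = np.eye(2)   # rows are the basis vectors |j> of V
basis_W = np.eye(3)   # rows are the basis vectors |i> of W

# <i| A |j> picks out exactly the (i, j) element of the matrix of A.
for i in range(3):
    for j in range(2):
        a_ij = np.vdot(basis_W[i], A @ basis_V[j])
        assert np.isclose(a_ij, A[i, j])
```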
A.5.3 Diagonal representation of an operator or orthonormal decomposition
A diagonal representation for an operator A on a vector space V is a representation

A = Σᵢ λᵢ|i〉〈i|,

where the vectors |i〉 form an orthonormal set of eigenvectors of A, with corresponding eigenvalues λᵢ. The i-th diagonal element of A is λᵢ, and all off-diagonal elements of A are zero. An operator which has a diagonal representation is said to be diagonalizable. Diagonal representations are also known as orthonormal decompositions. A diagonal representation of A simplifies computations tremendously, since one need deal with only n elements of A rather than the n² elements of an n×n matrix representation of A; this is an immediate computational advantage of knowing the eigenvalues and eigenvectors of an operator A. If all the eigenvalues of A are distinct, they can be used to index its eigenvectors.
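The decomposition A = Σᵢ λᵢ|i〉〈i| can be computed and verified for a small Hermitian matrix (which always admits an orthonormal eigenbasis). The matrix below is an arbitrary example; `np.linalg.eigh` returns its eigenvalues in ascending order and its eigenvectors as columns:

```python
import numpy as np

# An example Hermitian operator on C^2.
A = np.array([[2.0, 1j],
              [-1j, 2.0]])

lam, U = np.linalg.eigh(A)       # columns of U are the eigenvectors |i>

# Rebuild A from its orthonormal decomposition: A = sum_i lambda_i |i><i|
A_rebuilt = sum(l * np.outer(U[:, i], U[:, i].conj())
                for i, l in enumerate(lam))
assert np.allclose(A_rebuilt, A)
```

For this example the eigenvalues come out as 1 and 3, so A is fully described by those two numbers plus the eigenvectors, rather than by all four matrix elements.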
41 B. Zwiebach, Dirac’s bra and ket notation, 8.05 Quantum Physics II, MIT OpenCourseWare, Fall 2013, 07 October 2013. https://ocw.mit.edu/courses/physics/8-05-quantum-physics-ii-fall-2013/lecture-notes/MIT8_05F13_Chap_04.pdf