math-primer

In quantum computing, one is interested in finite-dimensional complex vector spaces. The elements of such a vector space are represented in Dirac notation as $|\alpha\rangle$ (a "ket").

We do not use the ket notation for the zero vector; it is written simply as $0$.

Linear Independence

A set of vectors $|\alpha_1\rangle, |\alpha_2\rangle, \ldots, |\alpha_m\rangle$ is linearly independent iff$$c_1 |\alpha_1\rangle + c_2 |\alpha_2\rangle + \cdots + c_m |\alpha_m\rangle = 0$$holds only when $c_1 = c_2 = \cdots = c_m = 0$.

Inner Product
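On $\mathbb{C}^n$ the standard inner product is $\langle \alpha | \beta \rangle = \sum_i a_i^* b_i$, conjugate-linear in the first argument (the physics convention). A minimal sketch in plain Python, with example vectors chosen purely for illustration:

```python
# Inner product on C^n, conjugate-linear in the first argument
# (physics convention): <alpha|beta> = sum_i conj(a_i) * b_i
def inner(a, b):
    return sum(x.conjugate() * y for x, y in zip(a, b))

alpha = [1 + 1j, 2 - 1j]
beta = [3j, 1 + 0j]

ab = inner(alpha, beta)   # <alpha|beta> = 5 + 4j
ba = inner(beta, alpha)   # conjugate symmetry: <beta|alpha> = <alpha|beta>*
```

Note that $\langle \alpha | \alpha \rangle$ is always real and non-negative, which is what makes the norm $\sqrt{\langle \alpha | \alpha \rangle}$ well-defined.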

Cauchy-Schwarz Inequality

$$| \langle \alpha | \beta \rangle |^2 \leq \langle \alpha | \alpha \rangle \langle \beta | \beta \rangle$$

Proof: For any $|\alpha\rangle, |\beta\rangle \in V$ and $c \in \mathbb{C}$ we have$$\begin{array}{rl}\langle \alpha - c \beta | \alpha - c \beta \rangle &\geq 0 \\ \langle \alpha | \alpha \rangle - c \langle \alpha | \beta \rangle - c^* \langle \beta | \alpha \rangle + c c^* \langle \beta | \beta \rangle &\geq 0\end{array}$$We set $c = \frac{\langle \beta | \alpha \rangle}{\langle \beta | \beta \rangle}$ (assuming $|\beta\rangle \neq 0$; the inequality is trivial otherwise):$$\begin{array}{rl}\langle \alpha | \alpha \rangle - \frac{|\langle \alpha | \beta \rangle|^2}{\langle \beta | \beta \rangle} - \frac{|\langle \alpha | \beta \rangle|^2}{\langle \beta | \beta \rangle} + \frac{|\langle \alpha | \beta \rangle|^2}{\langle \beta | \beta \rangle} &\geq 0 \\ \langle \alpha | \alpha \rangle \langle \beta | \beta \rangle - |\langle \alpha | \beta \rangle|^2 &\geq 0 \\ \langle \alpha | \alpha \rangle \langle \beta | \beta \rangle &\geq |\langle \alpha | \beta \rangle|^2 \end{array}$$
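The inequality can be sanity-checked numerically; a small sketch in plain Python with arbitrary example vectors:

```python
def inner(a, b):
    return sum(x.conjugate() * y for x, y in zip(a, b))

alpha = [1 + 2j, 3 - 1j]
beta = [2j, 1 + 1j]

lhs = abs(inner(alpha, beta)) ** 2                    # |<alpha|beta>|^2
rhs = (inner(alpha, alpha) * inner(beta, beta)).real  # <alpha|alpha><beta|beta> (real)
# Cauchy-Schwarz: lhs <= rhs (here 72 <= 90)
```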

Orthonormality Condition

$|\alpha\rangle, |\beta\rangle$ are said to be orthogonal if their inner product is zero:$$\langle \alpha | \beta \rangle = 0$$A set of vectors $|\alpha_1\rangle, |\alpha_2\rangle, \ldots, |\alpha_n\rangle$ is said to be orthonormal if:$$\langle \alpha_i | \alpha_j\rangle = \delta_{ij}$$
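A sketch verifying $\langle \alpha_i | \alpha_j \rangle = \delta_{ij}$ for a concrete two-vector basis of $\mathbb{C}^2$:

```python
import math

def inner(a, b):
    return sum(x.conjugate() * y for x, y in zip(a, b))

s = 1 / math.sqrt(2)
basis = [[s, s], [s, -s]]   # an orthonormal basis of C^2

# Gram matrix of pairwise inner products; should equal delta_ij
gram = [[inner(u, v) for v in basis] for u in basis]
```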

Linear Operators

An operator $A$ maps each vector $|\alpha\rangle \in V$ into another vector $|\beta\rangle \in V$:$$|\beta\rangle = A |\alpha\rangle$$This operator $A$ is said to be linear if$$A(a|\alpha\rangle + b |\beta\rangle) = aA|\alpha\rangle + bA|\beta\rangle$$Two operators are said to be equal if$$A|\alpha\rangle = B|\alpha\rangle$$for every $|\alpha\rangle \in V$. The sum of two linear operators is defined as$$C|\alpha\rangle = (A + B) | \alpha \rangle = A |\alpha\rangle + B|\alpha\rangle$$The product of two operators is defined as$$D|\alpha\rangle = AB|\alpha\rangle = A(B|\alpha\rangle)$$
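In a finite-dimensional space a linear operator acts by matrix-vector multiplication; a sketch checking the linearity property above, using $\sigma_x$ as an arbitrary example operator:

```python
def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

A = [[0, 1], [1, 0]]                    # sigma_x as a sample operator
alpha, beta = [1 + 1j, 2 + 0j], [0j, 3j]
a, b = 2 - 1j, 1j

# A(a|alpha> + b|beta>) == a A|alpha> + b A|beta>
lhs = matvec(A, [a * x + b * y for x, y in zip(alpha, beta)])
rhs = [a * u + b * v for u, v in zip(matvec(A, alpha), matvec(A, beta))]
```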

Completeness Relation

For an orthonormal basis $\{|\alpha_i\rangle\}$ we know that $a_i = \langle \alpha_i | \alpha \rangle$, so$$\begin{array}{rl} |\alpha\rangle &= \displaystyle\sum_i a_i |\alpha_i\rangle \\ &= \displaystyle\sum_i \langle\alpha_i | \alpha\rangle | \alpha_i \rangle \\ &= \displaystyle\sum_i (|\alpha_i \rangle \langle \alpha_i |) |\alpha\rangle \end{array}$$Thus $\sum_i |\alpha_i\rangle\langle\alpha_i| = I$.
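The completeness relation can be checked by summing the outer products $|\alpha_i\rangle\langle\alpha_i|$ over an orthonormal basis; a sketch:

```python
import math

def outer(u, v):
    # |u><v| as a matrix: (|u><v|)_ij = u_i * conj(v_j)
    return [[x * y.conjugate() for y in v] for x in u]

s = 1 / math.sqrt(2)
basis = [[s, s], [s, -s]]   # orthonormal basis of C^2

total = [[0j, 0j], [0j, 0j]]
for vec in basis:
    P = outer(vec, vec)
    total = [[total[i][j] + P[i][j] for j in range(2)] for i in range(2)]
# total should be the 2x2 identity matrix
```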

Matrix Representation of an Operator

Consider a linear operator$$A |\alpha\rangle = |\beta\rangle$$Expanding over an orthonormal basis $\{|\gamma_1\rangle, \ldots, |\gamma_n\rangle\}$:$$|\alpha \rangle = \sum_i a_i | \gamma_i \rangle \hspace{2em} |\beta\rangle = \sum_i b_i | \gamma_i \rangle$$Using $b_i = \langle \gamma_i | \beta \rangle$:$$\begin{array}{rl} b_i &= \langle \gamma_i | A | \alpha\rangle \\ &= \displaystyle\sum_j \langle \gamma_i | A | \gamma_j \rangle a_j \\ &= \displaystyle\sum_j A_{ij}a_j \end{array}$$where $A_{ij} = \langle \gamma_i | A | \gamma_j \rangle$ is the matrix representation of $A$ in this basis.
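A sketch recovering the matrix elements $A_{ij} = \langle \gamma_i | A | \gamma_j \rangle$ from the operator's action, using $\sigma_y$ and the computational basis as the example:

```python
def inner(a, b):
    return sum(x.conjugate() * y for x, y in zip(a, b))

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(2)) for i in range(2)]

A = [[0, -1j], [1j, 0]]       # sigma_y
basis = [[1, 0], [0, 1]]      # orthonormal basis {|gamma_i>}

# A_ij = <gamma_i | A | gamma_j> recovers the matrix itself
A_ij = [[inner(gi, matvec(A, gj)) for gj in basis] for gi in basis]
```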

Pauli Matrices

$$\sigma_x = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix} \hspace{2em} \sigma_y = \begin{pmatrix} 0 & -i \\ i & 0 \end{pmatrix} \hspace{2em} \sigma_z = \begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix}$$

| Pauli matrix | Eigenvectors | Eigenvalues |
| --- | --- | --- |
| $\sigma_x$ | $\frac{1}{\sqrt{2}}\begin{pmatrix} 1 \\ 1 \end{pmatrix},\ \frac{1}{\sqrt{2}}\begin{pmatrix} 1 \\ -1 \end{pmatrix}$ | $\pm 1$ |
| $\sigma_y$ | $\frac{1}{\sqrt{2}}\begin{pmatrix} 1 \\ i \end{pmatrix},\ \frac{1}{\sqrt{2}}\begin{pmatrix} 1 \\ -i \end{pmatrix}$ | $\pm 1$ |
| $\sigma_z$ | $\begin{pmatrix} 1 \\ 0 \end{pmatrix},\ \begin{pmatrix} 0 \\ 1 \end{pmatrix}$ | $\pm 1$ |
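A sketch verifying the eigenvector/eigenvalue pairs in the table by direct application:

```python
import math

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(2)) for i in range(2)]

s = 1 / math.sqrt(2)
sigma_x = [[0, 1], [1, 0]]
sigma_y = [[0, -1j], [1j, 0]]
sigma_z = [[1, 0], [0, -1]]

# (matrix, eigenvector, eigenvalue) triples
cases = [
    (sigma_x, [s, s], 1), (sigma_x, [s, -s], -1),
    (sigma_y, [s, s * 1j], 1), (sigma_y, [s, -s * 1j], -1),
    (sigma_z, [1, 0], 1), (sigma_z, [0, 1], -1),
]
ok = all(
    all(abs(u - lam * x) < 1e-12 for u, x in zip(matvec(M, v), v))
    for M, v, lam in cases
)
```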

Projectors

If $|\alpha\rangle \in V$ is a unit vector, the one-dimensional projector $P_{\alpha} = |\alpha\rangle\langle\alpha|$ acts on some $|\beta\rangle$ as$$\begin{array}{rl}P_{\alpha} | \beta \rangle &= |\alpha\rangle\langle \alpha | \beta \rangle \\ &= \langle \alpha | \beta \rangle\, |\alpha\rangle\end{array}$$This operator is called a projector since it projects a generic vector $|\beta\rangle$ along the direction $|\alpha\rangle$.
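Projectors are idempotent, $P_{\alpha}^2 = P_{\alpha}$: projecting a second time changes nothing. A sketch:

```python
import math

def outer(u, v):
    return [[x * y.conjugate() for y in v] for x in u]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)] for i in range(2)]

s = 1 / math.sqrt(2)
alpha = [s, s]                 # unit vector
P = outer(alpha, alpha)        # P_alpha = |alpha><alpha|
P2 = matmul(P, P)              # idempotence: P^2 = P
```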

Eigenvalues and Eigenvectors

An eigenvector of a linear operator $A$ is a non-zero vector $|\alpha\rangle$ s.t.$$A |\alpha\rangle = \alpha | \alpha\rangle$$where the scalar $\alpha$ is the corresponding eigenvalue. Expanding over an orthonormal basis $\{|\gamma_i\rangle\}$:$$\begin{array}{rl}|\alpha\rangle &= \displaystyle \sum_{i=1}^{n} a_i | \gamma_i\rangle \\ A |\alpha\rangle &= \displaystyle \sum_{i=1}^{n} \Big(\sum_j A_{ij} a_j\Big) | \gamma_i \rangle \\ A | \alpha \rangle - \alpha | \alpha \rangle &= 0 \\ \displaystyle \sum_{i=1}^{n} \Big(\sum_{j=1}^{n} A_{ij}a_j - \alpha a_i\Big) | \gamma_i \rangle &= 0\end{array}$$Since the $|\gamma_i\rangle$ are linearly independent, every coefficient must vanish, i.e. $(A - \alpha I)\,a = 0$, which has a non-zero solution iff $\det(A - \alpha I) = 0$.
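The eigenvalue condition has a non-zero solution only when $\det(A - \alpha I) = 0$; for a $2 \times 2$ matrix this is the quadratic $\alpha^2 - \mathrm{Tr}(A)\,\alpha + \det A = 0$. A sketch solving it for $\sigma_y$:

```python
import cmath

def eigvals_2x2(M):
    # det(M - a*I) = a^2 - Tr(M)*a + det(M) = 0, solved by the quadratic formula
    tr = M[0][0] + M[1][1]
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    d = cmath.sqrt(tr * tr - 4 * det)
    return [(tr + d) / 2, (tr - d) / 2]

sigma_y = [[0, -1j], [1j, 0]]
lams = eigvals_2x2(sigma_y)    # eigenvalues +1 and -1
```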

Hermitian Operators

For any linear operator $A$ on a Hilbert space $\mathcal{H}$ there exists a unique linear operator $A^{\dagger}$ on $\mathcal{H}$, called the adjoint or Hermitian conjugate, s.t.$$\langle \alpha | A \beta \rangle = \langle A^{\dagger} \alpha | \beta \rangle$$An operator is Hermitian if $A = A^{\dagger}$. Consequently,$$\begin{array}{rl}\langle A \alpha | \beta \rangle &= \langle \beta | A \alpha\rangle^* \\ &= \langle A^{\dagger}\beta | \alpha \rangle^* \\ &= \langle \alpha | A^{\dagger} \beta \rangle\end{array}$$
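In matrix form the adjoint is the conjugate transpose, $(A^{\dagger})_{ij} = A_{ji}^*$; a sketch checking the defining property with arbitrary example vectors:

```python
def inner(a, b):
    return sum(x.conjugate() * y for x, y in zip(a, b))

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(2)) for i in range(2)]

def dagger(M):
    # conjugate transpose: (A†)_ij = conj(A_ji)
    return [[M[j][i].conjugate() for j in range(2)] for i in range(2)]

A = [[1 + 0j, 2j], [0j, 1 - 1j]]
alpha, beta = [1j, 2 + 0j], [1 + 0j, 1 + 1j]

lhs = inner(alpha, matvec(A, beta))           # <alpha | A beta>
rhs = inner(matvec(dagger(A), alpha), beta)   # <A† alpha | beta>
```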

Inverse Operators

For an operator $A$, an operator $B$ is said to be the inverse if $AB = BA = I$; we write $B = A^{-1}$.

The inverse of an operator $A$ exists iff $\det A \neq 0$.
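For a $2 \times 2$ matrix the inverse has the closed form $A^{-1} = \frac{1}{\det A}\begin{pmatrix} d & -b \\ -c & a \end{pmatrix}$; a sketch:

```python
def inv_2x2(M):
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    if det == 0:
        raise ValueError("matrix is singular: det A = 0")
    return [[ M[1][1] / det, -M[0][1] / det],
            [-M[1][0] / det,  M[0][0] / det]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)] for i in range(2)]

A = [[1, 2], [3, 4]]
AB = matmul(A, inv_2x2(A))     # A A^{-1} should be the identity
```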

Unitary Operator

An operator $U$ is said to be unitary if $$U U^{\dagger} = U^{\dagger} U = I$$
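The Pauli matrices are all unitary (as well as Hermitian); a sketch checking $UU^{\dagger} = I$ for each:

```python
def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)] for i in range(2)]

def dagger(M):
    return [[complex(M[j][i]).conjugate() for j in range(2)] for i in range(2)]

sigma_x = [[0, 1], [1, 0]]
sigma_y = [[0, -1j], [1j, 0]]
sigma_z = [[1, 0], [0, -1]]

unitary = all(
    matmul(U, dagger(U)) == [[1, 0], [0, 1]]
    for U in (sigma_x, sigma_y, sigma_z)
)
```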

Change of Basis

We can convert from one orthonormal basis $\{|\gamma_i\rangle\}$ to another $\{|\gamma_i^{'}\rangle\}$ using a unitary transformation $S$:$$|\gamma_i^{'} \rangle = \sum_j S_{ji} | \gamma_j \rangle$$
Thus if $|\alpha\rangle = \sum_i a_i |\gamma_i\rangle$ then we can write$$\begin{array}{rl}|\alpha\rangle &= \displaystyle\sum_j a_j^{'} | \gamma_j^{'} \rangle \\ &= \displaystyle\sum_{i} \Big(\sum_j S_{ij} a_j^{'}\Big) | \gamma_i\rangle \end{array}$$where $a_j^{'} = \langle \gamma_j^{'} | \alpha \rangle$, so the components transform as $a_i = \sum_j S_{ij} a_j^{'}$.
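A concrete sketch with $S$ taken to be the (real, unitary) Hadamard matrix, which maps the computational basis to the $\pm$ basis; the components transform as $a_i = \sum_j S_{ij} a_j^{'}$:

```python
import math

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(2)) for i in range(2)]

s = 1 / math.sqrt(2)
S = [[s, s], [s, -s]]            # Hadamard as the change-of-basis matrix

a_prime = [1.0, 2.0]             # components a'_j in the primed basis
a = matvec(S, a_prime)           # unprimed components: a_i = sum_j S_ij a'_j

# Cross-check: rebuild |alpha> directly from the primed basis vectors,
# where |gamma_j'> has unprimed components S[:, j] (column j of S)
gamma_prime = [[S[0][0], S[1][0]], [S[0][1], S[1][1]]]
alpha = [sum(a_prime[j] * gamma_prime[j][i] for j in range(2)) for i in range(2)]
```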

Diagonal Representation

An operator with an orthonormal basis of eigenvectors (in particular, any normal operator) can be represented using its eigenvectors as basis:$$A = \displaystyle \sum_{i=1}^{n} \lambda_i | i \rangle \langle i |$$where $\lambda_i$ is the eigenvalue of eigenvector $|i\rangle$.
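A sketch rebuilding $\sigma_x$ from its spectral decomposition $\sigma_x = (+1)\,|+\rangle\langle+| + (-1)\,|-\rangle\langle-|$:

```python
import math

def outer(u, v):
    return [[x * y.conjugate() for y in v] for x in u]

s = 1 / math.sqrt(2)
plus, minus = [s, s], [s, -s]    # eigenvectors of sigma_x

Pp, Pm = outer(plus, plus), outer(minus, minus)
A = [[(+1) * Pp[i][j] + (-1) * Pm[i][j] for j in range(2)] for i in range(2)]
# A reconstructs sigma_x = [[0, 1], [1, 0]]
```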

defn: Normal Operators

$$A A^{\dagger} = A^{\dagger} A$$

Commutators and Anti-Commutators

The commutator and anti-commutator of two operators are defined as$$[A, B] = AB - BA \hspace{2em} \{A, B\} = AB + BA$$Distinct Pauli matrices anti-commute: $\{\sigma_i, \sigma_j\} = 2\delta_{ij} I$.
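A sketch verifying both brackets for $\sigma_x, \sigma_y$, including the standard commutation relation $[\sigma_x, \sigma_y] = 2i\sigma_z$:

```python
def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)] for i in range(2)]

sigma_x = [[0, 1], [1, 0]]
sigma_y = [[0, -1j], [1j, 0]]

XY, YX = matmul(sigma_x, sigma_y), matmul(sigma_y, sigma_x)
comm = [[XY[i][j] - YX[i][j] for j in range(2)] for i in range(2)]   # [sx, sy] = 2i*sz
anti = [[XY[i][j] + YX[i][j] for j in range(2)] for i in range(2)]   # {sx, sy} = 0
```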

Trace

The trace of a matrix is the sum of its diagonal elements$$\text{Tr}(A) = \displaystyle \sum_{i=1}^{n} A_{ii}$$
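A sketch, which also checks the standard cyclic property $\mathrm{Tr}(AB) = \mathrm{Tr}(BA)$:

```python
def trace(M):
    return sum(M[i][i] for i in range(len(M)))

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)] for i in range(n)]

A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 0]]

t = trace(A)                                       # 1 + 4 = 5
cyclic = trace(matmul(A, B)) == trace(matmul(B, A))
```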