Math Primer
In quantum computing, one is interested in a finite-dimensional complex vector space. The elements of this vector space are represented in Dirac notation as kets, e.g. $|\alpha\rangle$.
- Two kets can be added to give a new vector$$\begin{array}{ll} | \gamma\rangle &= | \alpha \rangle + | \beta \rangle \\ \gamma_i &= \alpha_i + \beta_i\end{array}$$Vector addition has the following properties:$$\begin{array}{ll}|\alpha \rangle + | \beta\rangle &= |\beta \rangle + |\alpha\rangle \\ | \alpha \rangle + (|\beta \rangle + | \gamma \rangle ) &= (|\alpha\rangle + |\beta\rangle) + | \gamma \rangle \end{array}$$
- We can also multiply a ket $|\alpha\rangle$ by a complex number $c$ to obtain a new vector $c|\alpha\rangle$. The following properties hold for complex numbers $c$ and $d$:$$\begin{array}{ll}c(|\alpha\rangle + |\beta\rangle) &= c|\alpha\rangle + c | \beta \rangle \\ (c+ d) |\alpha\rangle &= c |\alpha \rangle + d|\alpha\rangle \\ (cd) | \alpha \rangle &= c(d|\alpha\rangle)\end{array}$$
- A vector space contains the zero vector $0$ with the following properties:$$\begin{array}{ll}|\alpha\rangle + 0 &= | \alpha\rangle \\ 0|\alpha\rangle &= 0 \\ 1 |\alpha \rangle &= |\alpha\rangle \\ |\alpha\rangle - |\alpha \rangle &= 0\end{array}$$
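As a quick numerical sanity check, the vector-space axioms above can be verified on component vectors. The sketch below uses plain Python complex numbers; the function and variable names are illustrative, not from the text:

```python
# Represent kets as lists of complex components.
def add(a, b):
    """Component-wise vector addition: gamma_i = alpha_i + beta_i."""
    return [x + y for x, y in zip(a, b)]

def scale(c, a):
    """Scalar multiplication of a ket by a complex number c."""
    return [c * x for x in a]

alpha = [1 + 2j, 3 - 1j]
beta = [0.5j, 2 + 0j]
c, d = 2 - 1j, 0.5 + 3j

# Commutativity of addition
assert add(alpha, beta) == add(beta, alpha)
# Distributivity: (c + d)|alpha> = c|alpha> + d|alpha>
assert scale(c + d, alpha) == add(scale(c, alpha), scale(d, alpha))
# (cd)|alpha> = c(d|alpha>)
assert scale(c * d, alpha) == scale(c, scale(d, alpha))
```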
Linear Independence
A set of vectors $\{|\alpha_i\rangle\}$ is linearly independent iff$$\displaystyle\sum_{i} c_i |\alpha_i\rangle = 0 \implies c_i = 0 \;\; \forall i$$
Inner Product
- denoted $\langle \beta | \alpha \rangle$; in components, $\langle \beta | \alpha \rangle = \sum_{i=1}^{n} \beta_i^{*} \alpha_i$
- Prop: $\langle \beta | \alpha \rangle = \langle \alpha | \beta \rangle^{*}$ (skew-symmetric)
- Prop: $\langle \beta | \left( c_1 |\alpha_1\rangle + c_2 |\alpha_2\rangle \right) = c_1 \langle \beta | \alpha_1 \rangle + c_2 \langle \beta | \alpha_2 \rangle$ (linearity)
- Prop: $\langle \alpha | \alpha \rangle \geq 0$, with equality iff $|\alpha\rangle = 0$ (positivity)
- $\langle \alpha |$ is called the dual vector/bra. The dual vector is a linear operator from the vector space to the complex numbers $\mathbb{C}$, defined by $\langle \alpha | \, (|\beta\rangle) = \langle \alpha | \beta \rangle$. For example, if $|\alpha\rangle$ is the column vector $(\alpha_1, \ldots, \alpha_n)^T$, then $\langle \alpha |$ is the row vector $(\alpha_1^{*}, \ldots, \alpha_n^{*})$
- The norm of a ket is defined as $\| |\alpha\rangle \| = \sqrt{\langle \alpha | \alpha \rangle}$. The normalised vector $|\alpha\rangle / \| |\alpha\rangle \|$ has unit norm$$\| \, | \alpha \rangle \, \| = \sqrt{\sum_{i=1}^{n} | \alpha_i|^2}$$
- Hilbert Space
A complex vector space equipped with an inner product
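The inner product, bra, and norm above translate directly into code. This is a minimal sketch in plain Python (names are illustrative):

```python
import math

def inner(beta, alpha):
    """<beta|alpha> = sum_i conj(beta_i) * alpha_i (conjugate on the bra)."""
    return sum(b.conjugate() * a for b, a in zip(beta, alpha))

def norm(alpha):
    """|| |alpha> || = sqrt(<alpha|alpha>)."""
    return math.sqrt(inner(alpha, alpha).real)

alpha = [1 + 1j, 2 - 1j]
beta = [0 + 1j, 1 + 0j]

# Skew-symmetry: <beta|alpha> = <alpha|beta>*
assert inner(beta, alpha) == inner(alpha, beta).conjugate()
# Positivity: <alpha|alpha> is real and non-negative
assert inner(alpha, alpha).imag == 0 and inner(alpha, alpha).real >= 0
# Normalising gives unit norm
unit = [x / norm(alpha) for x in alpha]
assert abs(norm(unit) - 1.0) < 1e-12
```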
Cauchy-Schwartz Inequality
$$| \langle \alpha | \beta \rangle |^2 \leq \langle \alpha | \alpha \rangle \langle \beta | \beta \rangle$$
Proof: For any $|\alpha\rangle$ and $|\beta\rangle \neq 0$, consider $|\gamma\rangle = |\alpha\rangle - \frac{\langle \beta | \alpha \rangle}{\langle \beta | \beta \rangle} |\beta\rangle$. Then$$\begin{array}{ll}\langle \gamma | \gamma \rangle = \langle \alpha | \alpha \rangle - \dfrac{|\langle \beta | \alpha \rangle|^2}{\langle \beta | \beta \rangle} &\geq 0 \\ \langle \alpha | \alpha \rangle \langle \beta | \beta \rangle &\geq | \langle \alpha | \beta \rangle |^2 \end{array}$$
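The inequality can be spot-checked numerically on arbitrary vectors; a minimal sketch (vectors chosen at random):

```python
def inner(beta, alpha):
    # <beta|alpha> with the conjugate on the first argument
    return sum(b.conjugate() * a for b, a in zip(beta, alpha))

alpha = [1 + 2j, -1j, 0.5]
beta = [2, 1 + 1j, -3j]

# |<alpha|beta>|^2 <= <alpha|alpha><beta|beta>
lhs = abs(inner(alpha, beta)) ** 2
rhs = (inner(alpha, alpha) * inner(beta, beta)).real
assert lhs <= rhs
```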
Orthonormality Condition
A set of vectors $\{|\alpha_i\rangle\}$ is orthonormal if$$\langle \alpha_i | \alpha_j \rangle = \delta_{ij}$$
- Orthonormal vectors are linearly independent
- Dimension $n$ of a vector space is given by the maximum number of linearly independent vectors
- A set of $n$ linearly independent vectors $\{|\alpha_i\rangle\}$ in an $n$-dimensional vector space is said to be a basis for the vector space. Any vector $|\alpha\rangle$ can be expanded over a basis$$|\alpha\rangle = \displaystyle\sum_{i=1}^{n} a_i |\alpha_i\rangle$$The $\{|\alpha_i\rangle\}$ are known as the complete set of vectors; $a_i$ are the components of the vector w.r.t. the basis $\{|\alpha_i\rangle\}$. The $a_i$'s are uniquely determined for an orthonormal basis$$a_i =\langle \alpha_i | \alpha\rangle$$
- The ordered collection of components $(a_1, a_2, \ldots, a_n)$ constitutes a representation of the vector $|\alpha\rangle$
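The expansion and the component formula $a_i = \langle \alpha_i | \alpha \rangle$ can be sketched in plain Python, here using the computational basis of $\mathbb{C}^2$ (names are illustrative):

```python
def inner(beta, alpha):
    """<beta|alpha> with the conjugate on the bra."""
    return sum(b.conjugate() * a for b, a in zip(beta, alpha))

# Orthonormal basis of C^2 (the computational basis)
basis = [[1, 0], [0, 1]]
alpha = [0.6, 0.8j]

# Components w.r.t. an orthonormal basis: a_i = <alpha_i|alpha>
components = [inner(e, alpha) for e in basis]
assert components == [0.6, 0.8j]

# Reconstruct |alpha> = sum_i a_i |alpha_i>
reconstructed = [sum(a * e[k] for a, e in zip(components, basis))
                 for k in range(2)]
assert reconstructed == alpha
```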
Linear Operators
An operator $A$ maps kets to kets, $A|\alpha\rangle = |\beta\rangle$. It is linear if$$A\left( c_1 |\alpha_1\rangle + c_2 |\alpha_2\rangle \right) = c_1 A|\alpha_1\rangle + c_2 A |\alpha_2\rangle$$
Completeness Relation
We know that $a_i = \langle i | \alpha \rangle$ for an orthonormal basis $\{|i\rangle\}$, so $|\alpha\rangle = \sum_i |i\rangle \langle i | \alpha \rangle$ for every $|\alpha\rangle$. Hence$$\displaystyle\sum_{i=1}^{n} |i\rangle \langle i | = I$$
Matrix Representation of an Operator
Consider a linear operator$$A |\alpha\rangle = |\beta\rangle$$Expanding over an orthonormal basis $\{|i\rangle\}$ and using the completeness relation,$$b_j = \langle j | \beta \rangle = \sum_{i=1}^{n} \langle j | A | i \rangle a_i = \sum_{i=1}^{n} A_{ji} a_i$$The $n \times n$ complex numbers $A_{ji} = \langle j | A | i \rangle$ are the matrix elements of $A$.
Pauli Matrices
$$\sigma_x = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix} \quad \sigma_y = \begin{pmatrix} 0 & -i \\ i & 0 \end{pmatrix} \quad \sigma_z = \begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix}$$
- Prop: $\sigma_x^2 = \sigma_y^2 = \sigma_z^2 = I$
- Prop: $$\begin{array}{ll}\sigma_x\sigma_y &= i\sigma_z \\ \sigma_y\sigma_z &= i\sigma_x \\ \sigma_z\sigma_x &= i\sigma_y\end{array}$$

| Pauli Matrices | Eigenvectors | Eigenvalues |
|---|---|---|
| $\sigma_x$ | $\frac{1}{\sqrt{2}}(1, 1)^T, \; \frac{1}{\sqrt{2}}(1, -1)^T$ | $+1, -1$ |
| $\sigma_y$ | $\frac{1}{\sqrt{2}}(1, i)^T, \; \frac{1}{\sqrt{2}}(1, -i)^T$ | $+1, -1$ |
| $\sigma_z$ | $(1, 0)^T, \; (0, 1)^T$ | $+1, -1$ |
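The product identities above can be verified directly on 2×2 matrices; a minimal sketch in plain Python:

```python
def matmul(A, B):
    """Multiply two 2x2 complex matrices."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def scale(c, A):
    """Multiply every entry of a matrix by the scalar c."""
    return [[c * x for x in row] for row in A]

I = [[1, 0], [0, 1]]
sx = [[0, 1], [1, 0]]
sy = [[0, -1j], [1j, 0]]
sz = [[1, 0], [0, -1]]

# sigma_i^2 = I for each Pauli matrix
for s in (sx, sy, sz):
    assert matmul(s, s) == I
# Cyclic products: sigma_x sigma_y = i sigma_z, etc.
assert matmul(sx, sy) == scale(1j, sz)
assert matmul(sy, sz) == scale(1j, sx)
assert matmul(sz, sx) == scale(1j, sy)
```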
Projectors
If $|\psi\rangle$ is a normalised vector, the projector onto $|\psi\rangle$ is defined as$$P = |\psi\rangle \langle \psi |$$
- Prop: $P^2 = P$ (idempotence)
- Prop: $P |\phi\rangle = 0$ when $|\phi\rangle$ is orthogonal to $|\psi\rangle$
- Prop: $P^{\dagger} = P$ (Hermitian)
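The projector properties can be checked numerically; a minimal sketch in plain Python with illustrative names:

```python
def matmul(A, B):
    """Multiply two square complex matrices."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def outer(psi):
    """P = |psi><psi| as a matrix: P_ij = psi_i * conj(psi_j)."""
    return [[a * b.conjugate() for b in psi] for a in psi]

def apply(A, v):
    """Apply the operator A to the ket v."""
    return [sum(A[i][j] * v[j] for j in range(len(v))) for i in range(len(A))]

psi = [1, 0]            # normalised vector
phi = [0, 1]            # orthogonal to psi
P = outer(psi)

assert matmul(P, P) == P          # idempotence: P^2 = P
assert apply(P, phi) == [0, 0]    # annihilates orthogonal vectors
```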
Eigenvalues and Eigenvectors
An eigenvector of a linear operator $A$ is a non-zero vector $|v\rangle$ such that $A|v\rangle = \lambda |v\rangle$, where the complex number $\lambda$ is the eigenvalue corresponding to $|v\rangle$.
Hermitian Operators
For any linear operator A on a Hilbert Space
- The eigenvectors of an Hermitian operator form an orthonormal set in the Hilbert Space
- Any vector in the Hilbert Space
can be expressed as a linear superposition of vectors of said basis - We know
(from #Matrix Representation of an Operator). Then $$A_{ji}^{*} = A^{\dagger}_{ij}$$ - Matrix elements of
are the complex conjugates of the matrix elements of
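The relation $A^{\dagger}_{ij} = A_{ji}^{*}$ is just a conjugate transpose; a minimal sketch in plain Python:

```python
def dagger(A):
    """Adjoint: conjugate transpose, (A†)_ij = A_ji*."""
    n = len(A)
    return [[A[j][i].conjugate() for j in range(n)] for i in range(n)]

A = [[1 + 2j, 3j], [4, 5 - 1j]]
Ad = dagger(A)

# (A†)_ij = A_ji* element by element
for i in range(2):
    for j in range(2):
        assert Ad[i][j] == A[j][i].conjugate()

# A Hermitian matrix equals its own adjoint
H = [[2, 1 - 1j], [1 + 1j, -3]]
assert dagger(H) == H
```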
Inverse Operators
For an operator $A$, the inverse $A^{-1}$, when it exists, is the operator satisfying$$A A^{-1} = A^{-1} A = I$$
Unitary Operator
An operator $\mathcal{U}$ is unitary if$$\mathcal{U}^{\dagger}\mathcal{U} = \mathcal{U}\mathcal{U}^{\dagger} = I$$
- this definition implies $\mathcal{U}^{-1} = \mathcal{U}^{\dagger}$
- The product $\mathcal{U}\mathcal{V}$ of two unitary operators is unitary$$(\mathcal{U}\mathcal{V})(\mathcal{U}\mathcal{V})^{\dagger} = \mathcal{U}\mathcal{V}\mathcal{V}^{\dagger}\mathcal{U}^{\dagger} = I$$
- Unitary operators preserve inner products$$\langle \mathcal{U} \alpha | \mathcal{U}\beta \rangle = \langle \alpha | \mathcal{U}^{\dagger}\mathcal{U} | \beta\rangle = \langle \alpha | \beta \rangle$$
- #Pauli Matrices are both Hermitian and Unitary
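Inner-product preservation can be checked on a concrete unitary; the sketch below uses the phase gate $S = \mathrm{diag}(1, i)$ as an example:

```python
def apply(U, v):
    """Apply a 2x2 operator U to the ket v."""
    return [sum(U[i][j] * v[j] for j in range(2)) for i in range(2)]

def inner(beta, alpha):
    """<beta|alpha> with the conjugate on the bra."""
    return sum(b.conjugate() * a for b, a in zip(beta, alpha))

# The phase gate S = diag(1, i) is unitary
S = [[1, 0], [0, 1j]]
alpha = [1 + 1j, 2]
beta = [3j, 1 - 1j]

# <U alpha | U beta> = <alpha|beta>
assert inner(apply(S, alpha), apply(S, beta)) == inner(alpha, beta)
```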
Change of Basis
We can convert from one orthonormal basis $\{|i\rangle\}$ to another $\{|i'\rangle\}$ using the operator $\mathcal{U} = \sum_{i} |i'\rangle \langle i |$, which is unitary. Thus if $|\alpha\rangle = \sum_i a_i |i\rangle$, the components in the new basis are $a_i' = \langle i' | \alpha \rangle = \sum_j \langle i' | j \rangle \, a_j$.
Diagonal Representation
An operator can be represented using its eigenvectors $|i\rangle$ (with eigenvalues $\lambda_i$) as a basis:$$A = \displaystyle \sum_{i=1}^{n} \lambda_i | i \rangle \langle i | $$
- An operator is said to be diagonalisable if it has a diagonal representation
- Hermitian and Unitary operators are diagonalisable
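As an illustration, the diagonal representation can be checked by rebuilding $\sigma_z$ from its eigenvectors and eigenvalues; a minimal sketch in plain Python:

```python
def outer(u, v):
    """|u><v| as a matrix: entry (i, j) is u_i * conj(v_j)."""
    return [[a * b.conjugate() for b in v] for a in u]

def madd(A, B):
    """Entry-wise matrix addition."""
    return [[x + y for x, y in zip(ra, rb)] for ra, rb in zip(A, B)]

def scale(c, A):
    """Multiply every entry of a matrix by the scalar c."""
    return [[c * x for x in row] for row in A]

# sigma_z has eigenvectors (1,0), (0,1) with eigenvalues +1, -1
eigvecs = [[1, 0], [0, 1]]
eigvals = [1, -1]

# A = sum_i lambda_i |i><i|
A = [[0, 0], [0, 0]]
for lam, v in zip(eigvals, eigvecs):
    A = madd(A, scale(lam, outer(v, v)))

assert A == [[1, 0], [0, -1]]   # recovers sigma_z
```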
Commutators and Anti-Commutators
- We say two operators commute if $AB = BA$. The commutator of two operators $A$ and $B$ is defined as$$[A, B] = AB - BA$$
- Prop: $[A, B] = -[B, A]$
- Prop: $[A, B + C] = [A, B] + [A, C]$
- Prop: $[A, BC] = [A, B]C + B[A, C]$
- The anti-commutator of two operators is defined by:$$\{ A, B\} = AB + BA$$Two operators anti-commute if $\{A, B\} = 0$
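Both brackets can be checked on the Pauli matrices, which anti-commute pairwise and satisfy $[\sigma_x, \sigma_y] = 2i\sigma_z$; a minimal sketch in plain Python:

```python
def matmul(A, B):
    """Multiply two 2x2 complex matrices."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def msub(A, B):
    return [[x - y for x, y in zip(ra, rb)] for ra, rb in zip(A, B)]

def madd(A, B):
    return [[x + y for x, y in zip(ra, rb)] for ra, rb in zip(A, B)]

def commutator(A, B):
    """[A, B] = AB - BA."""
    return msub(matmul(A, B), matmul(B, A))

def anticommutator(A, B):
    """{A, B} = AB + BA."""
    return madd(matmul(A, B), matmul(B, A))

sx = [[0, 1], [1, 0]]
sy = [[0, -1j], [1j, 0]]

# Distinct Pauli matrices anti-commute...
assert anticommutator(sx, sy) == [[0, 0], [0, 0]]
# ...and [sigma_x, sigma_y] = 2i sigma_z
assert commutator(sx, sy) == [[2j, 0], [0, -2j]]
```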
Trace
Trace of a matrix is the sum of its diagonal elements$$\text{Tr}(A) = \displaystyle \sum_{i=1}^{n} A_{ii}$$
- Prop: $\text{Tr}(A + B) = \text{Tr}(A) + \text{Tr}(B)$
- Prop: $\text{Tr}(cA) = c\,\text{Tr}(A)$
- Prop: $\text{Tr}(AB) = \text{Tr}(BA)$ (cyclic property)
- For a unitary operator $\mathcal{U}$,$$\text{Tr}(\mathcal{U}^{\dagger}A\mathcal{U}) = \text{Tr}(\mathcal{U}\mathcal{U}^{\dagger}A) = \text{Tr}(A)$$i.e. Trace is invariant under Unitary Transformations
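The cyclic property and unitary invariance can be verified on small matrices; a minimal sketch in plain Python, again using the phase gate $S = \mathrm{diag}(1, i)$ as the example unitary:

```python
def matmul(A, B):
    """Multiply two 2x2 complex matrices."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def dagger(A):
    """Conjugate transpose."""
    return [[A[j][i].conjugate() for j in range(2)] for i in range(2)]

def trace(A):
    """Sum of the diagonal elements."""
    return sum(A[i][i] for i in range(2))

A = [[1 + 1j, 2], [3j, 4 - 2j]]
B = [[0, 1], [1j, 2]]

# Cyclic property: Tr(AB) = Tr(BA)
assert trace(matmul(A, B)) == trace(matmul(B, A))

# Trace is invariant under unitary transformations
S = [[1, 0], [0, 1j]]
assert trace(matmul(matmul(dagger(S), A), S)) == trace(A)
```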