Notation: \(A^{\dagger}\) denotes the conjugate transpose of \(A\).
Let \(V\) be a finite-dimensional vector space over an algebraically closed field \(k\) and \(A \in \mathrm{End}(V)\). Then if \(W \subseteq V\) is a nonzero invariant subspace, so \(A(W) \subseteq W\), then \(A\) has an eigenvector in \(W\): the characteristic polynomial of the restriction \(A|_W\) has a root in \(k\) since \(k\) is algebraically closed.
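For a quick check that the algebraically closed hypothesis is necessary, a standard example: the rotation \begin{align*} R = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix} \in \mathrm{End}({\mathbf{R}}^2) \end{align*} has characteristic polynomial \(x^2 + 1\) and hence no eigenvector over \(k = {\mathbf{R}}\), while over \(k = {\mathbf{C}}\) it has eigenvectors \((1, \mp i)^t\) with eigenvalues \(\pm i\).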
- Hermitian (self-adjoint) matrices (i.e. \(A^{\dagger} = A\)) are diagonalizable over \({\mathbf{C}}\).
- Real symmetric matrices (i.e. \(A^t = A\)) are diagonalizable over \({\mathbf{R}}\).
In fact, a real matrix \(A\) is symmetric \(\iff\) \(A\) admits an orthonormal basis of eigenvectors, i.e. \(A\) is orthogonally diagonalizable.
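For a concrete instance of such a basis (a small example matrix, chosen just for illustration): \begin{align*} A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}, \qquad A \begin{pmatrix} 1 \\ 1 \end{pmatrix} = 3 \begin{pmatrix} 1 \\ 1 \end{pmatrix}, \qquad A \begin{pmatrix} 1 \\ -1 \end{pmatrix} = 1 \cdot \begin{pmatrix} 1 \\ -1 \end{pmatrix}, \end{align*} so \(\left\{ \frac{1}{\sqrt 2}(1,1)^t,\ \frac{1}{\sqrt 2}(1,-1)^t \right\}\) is an orthonormal eigenbasis of \({\mathbf{R}}^2\).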
- Suppose \(A\) is Hermitian.
- Since \(V\) itself is an invariant subspace, \(A\) has an eigenvector \(\mathbf{v}_1 \in V\).
- Let \(W_1 = \mathop{\mathrm{span}}_k\left\{{\mathbf{v}_1}\right\}^{\perp}\).
- Then for any \(\mathbf{w}_1 \in W_1\), \begin{align*} {\left\langle {\mathbf{v}_1},~{ A \mathbf{w}_1} \right\rangle} = {\left\langle {A \mathbf{v}_1},~{\mathbf{w}_1} \right\rangle} = \lambda {\left\langle {\mathbf{v}_1},~{\mathbf{w}_1} \right\rangle} = 0, \end{align*} so \(A(W_1) \subseteq W_1\) is an invariant subspace, etc.
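To see this invariance step in coordinates (a Hermitian matrix chosen just for illustration): take \begin{align*} A = \begin{pmatrix} 2 & i \\ -i & 2 \end{pmatrix} = A^{\dagger}, \qquad \mathbf{v}_1 = \begin{pmatrix} i \\ 1 \end{pmatrix}, \quad A\mathbf{v}_1 = 3\mathbf{v}_1, \qquad W_1 = \mathop{\mathrm{span}}_{{\mathbf{C}}}\left\{{\mathbf{v}_1}\right\}^{\perp} = \mathop{\mathrm{span}}_{{\mathbf{C}}}\left\{{(i, -1)^t}\right\}, \end{align*} and indeed \(A (i, -1)^t = (i, -1)^t \in W_1\), so \(W_1\) is \(A\)-invariant as claimed.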
- Suppose now that \(A\) is symmetric.
- Then there is an eigenvector of norm 1, \(\mathbf{v} \in V\), say with eigenvalue \(\lambda\). \begin{align*} \lambda = \lambda{\left\langle {\mathbf{v}},~{\mathbf{v}} \right\rangle} = {\left\langle {A\mathbf{v}},~{\mathbf{v}} \right\rangle} = {\left\langle {\mathbf{v}},~{A\mathbf{v}} \right\rangle} = \overline{\lambda} \implies \lambda \in {\mathbf{R}} .\end{align*} Since \(\lambda\) is real and \(A - \lambda I\) is a singular real matrix, \(A\) has a real eigenvector for \(\lambda\), and the orthogonal-complement argument above now runs entirely over \({\mathbf{R}}\).
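The eigenvalues in the earlier example do come out real: for the symmetric matrix \(A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}\), \begin{align*} \det(A - \lambda I) = (2 - \lambda)^2 - 1 = 0 \implies \lambda \in \left\{{1, 3}\right\} \subset {\mathbf{R}}, \end{align*} consistent with the computation \(\lambda = \overline{\lambda}\).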
A set of diagonalizable operators \(\left\{{A_i}\right\}\) pairwise commute \(\iff\) they are all simultaneously diagonalizable.
By induction on the number \(n\) of operators (this handles \(\implies\); the converse is immediate since diagonal matrices commute):
- \(A_n\) is diagonalizable, so \(V = \bigoplus_i E_i\) is a direct sum of eigenspaces of \(A_n\).
- Each \(E_i\) is invariant under the remaining operators: if \(A_n \mathbf{v} = \lambda_i \mathbf{v}\) and \(j < n\), then \(A_n(A_j \mathbf{v}) = A_j A_n \mathbf{v} = \lambda_i (A_j \mathbf{v})\). So restrict \(A_1, \ldots, A_{n-1}\) to a fixed eigenspace \(E_i\).
- They commute on \(V\), so they commute on \(E_i\).
- (Lemma) They were diagonalizable on \(V\), so their restrictions to the invariant subspace \(E_i\) are diagonalizable.
- So the \(n-1\) restrictions are simultaneously diagonalizable on \(E_i\) by the inductive hypothesis.
- But these common eigenvectors all lie in \(E_i\), so they are eigenvectors for \(A_n\) too.
- Doing this for each eigenspace \(E_i\) and concatenating the resulting bases gives a simultaneous eigenbasis for all of the \(A_i\).
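For a minimal worked instance (matrices chosen for illustration): the pair \begin{align*} A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}, \qquad B = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}, \qquad AB = BA = \begin{pmatrix} 1 & 2 \\ 2 & 1 \end{pmatrix}, \end{align*} commutes, and both matrices are diagonalized by the common eigenbasis \(\mathbf{v}_1 = (1,1)^t\), \(\mathbf{v}_2 = (1,-1)^t\): here \(A\mathbf{v}_1 = 3\mathbf{v}_1\), \(B\mathbf{v}_1 = \mathbf{v}_1\), \(A\mathbf{v}_2 = \mathbf{v}_2\), and \(B\mathbf{v}_2 = -\mathbf{v}_2\).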
\(M\) is diagonalizable over \({ \mathbf{F} }\iff \min_M(x, { \mathbf{F} })\) splits into distinct linear factors over \({ \mathbf{F} }\); equivalently, iff all of the roots of \(\min_M\) lie in \({ \mathbf{F} }\) and are simple.
\(\impliedby\): If \(\min_M\) factors into distinct linear factors, so does each invariant factor (each divides \(\min_M\)), so every elementary divisor is linear and \(JCF(M)\) is diagonal.
\(\implies\): If \(M\) is diagonalizable, every elementary divisor is linear, so every invariant factor factors into distinct linear pieces. But the minimal polynomial is just the largest invariant factor, so it splits into distinct linear factors.
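Two standard \(2 \times 2\) examples contrasting the two cases: \begin{align*} M_1 = \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}, \quad \min_{M_1}(x) = (x-1)^2, \qquad M_2 = \begin{pmatrix} 1 & 1 \\ 0 & 0 \end{pmatrix}, \quad \min_{M_2}(x) = x(x-1), \end{align*} so \(M_1\) is not diagonalizable over any field (the root \(1\) is repeated), while \(M_2\) is diagonalizable over \({\mathbf{Q}}\) (the roots \(0\) and \(1\) are distinct and lie in \({\mathbf{Q}}\)).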