Fall 2017 #7 #algebra/qual/completed
Let \(F\) be a field and let \(V\) and \(W\) be vector spaces over \(F\) .
Make \(V\) and \(W\) into \(F[x]{\hbox{}}\)modules via linear operators \(T\) on \(V\) and \(S\) on \(W\) by defining \(X \cdot v = T (v)\) for all \(v \in V\) and \(X \cdot w = S(w)\) for all \(w \in W\).
Denote the resulting \(F[x]{\hbox{}}\)modules by \(V_T\) and \(W_S\) respectively.

Show that an \(F[x]{\hbox{}}\)module homomorphism from \(V_T\) to \(W_S\) consists of an \(F{\hbox{}}\)linear transformation \(R : V \to W\) such that \(RT = SR\).

Show that \(V_T \cong W_S\) as \(F[x]{\hbox{}}\)modules \(\iff\) there is an \(F{\hbox{}}\)linear isomorphism \(P : V \to W\) such that \(T = P^{-1}SP\).

Recall that a module \(M\) is simple if \(M \neq 0\) and any proper submodule of \(M\) must be zero. Suppose that \(V\) has dimension 2. Give an example of \(F\), \(T\) with \(V_T\) simple.

Assume \(F\) is algebraically closed. Prove that if \(V\) has dimension 2, then any \(V_T\) is not simple.
 A representation \(M\) of a \(k{\hbox{}}\)algebra \(A\) is a morphism \(A\to \operatorname{End}_k(M)\) of \(k{\hbox{}}\)algebras, where \(\operatorname{End}_k(M)\) is the algebra of \(k{\hbox{}}\)linear endomorphisms of \(M\).
 Simple \(R{\hbox{}}\)modules \(M\) can be written as \(M\cong R/I\) for \(I\) a maximal left ideal.
Part a: note that for \({ \mathbf{F} }{\hbox{}}\)modules, i.e. representations of \({ \mathbf{F} }\), a morphism \(V\to W\) would be an \({ \mathbf{F} }{\hbox{}}\)linear map \(\phi: V\to W\) that commutes with the actions \({ \mathbf{F} }\curvearrowright V, W\) that define the module structure, i.e. \(\phi(\lambda \bullet_V v) = \lambda \bullet_W \phi(v)\) where \(\bullet_V\) is the action on \(V\) and \(\bullet_W\) is the action on \(W\), where \(\lambda \in { \mathbf{F} }, v\in V\).
Here we are considering \({ \mathbf{F} }[x]{\hbox{}}\)modules, i.e. representations of the \({ \mathbf{F} }{\hbox{}}\)algebra \({ \mathbf{F} }[x]\). Note that \({ \mathbf{F} }[x]\) is generated by \(1, x\) as an \({ \mathbf{F} }{\hbox{}}\)algebra, and so specifying a representation \(M\) is equivalent to specifying an action by scalars \({ \mathbf{F} }\curvearrowright M\) and an action \(x\curvearrowright M\).
Note that a morphism \(R: V_T\to W_S\) of \({ \mathbf{F} }[x]{\hbox{}}\)modules must in particular be a morphism of the underlying \({ \mathbf{F} }{\hbox{}}\)modules \(V\) and \(W\), since \({ \mathbf{F} }\) can be identified with the constant polynomials in \({ \mathbf{F} }[x]\). Thus it is necessary for \(R\) to be \({ \mathbf{F} }{\hbox{}}\)linear.
To see that we must have \(RT=SR\), note that given an action of \(x\) on \(W\) and \(V\), we must have \(R(x\bullet_V v) = x\bullet_W R(v)\). Since \(x\bullet_V v \coloneqq T(v)\) and \(x\bullet_W w \coloneqq S(w)\), we must have \begin{align*} R(T(v)) = R(x\bullet_V v) = x\bullet_W R(v) = S(R(v)) \implies RT = SR .\end{align*} This shows necessity; sufficiency follows because if \(RT=SR\) then a similar argument in reverse shows \(R(xv) = xR(v)\) must hold.
Part b:
\(\implies\): note that \(V_T\cong W_S\) forces \(\dim_{ \mathbf{F} }V = \dim_{ \mathbf{F} }W\), so \(T,S\) must be square matrices of the same size. By part a, such an isomorphism is an isomorphism \(P: V\to W\) of \({ \mathbf{F} }{\hbox{}}\)modules with \(PT = SP\), which has an inverse \(P^{-1}: W\to V\). Choosing \({ \mathbf{F} }{\hbox{}}\)bases for \(V\) and \(W\), \(P\) is an invertible square matrix, so \begin{align*} PT=SP \implies P^{-1}P T = P^{-1}S P \implies T = P^{-1}S P .\end{align*}
\(\impliedby\): given such a morphism \(P: V\to W\) of \({ \mathbf{F} }{\hbox{}}\)modules with \(T = P^{-1}S P\), running the above argument backwards we have \begin{align*} T = P^{-1}S P \implies PT = PP^{-1}S P \implies PT = SP ,\end{align*} and by part a \(P\) defines a morphism of \({ \mathbf{F} }[x]{\hbox{}}\)modules. A similar argument with \(P^{-1}\) produces an inverse morphism, so \(P\) yields an isomorphism of \({ \mathbf{F} }[x]{\hbox{}}\)modules.
A shorter argument: \(V_T\cong W_S \implies\) they have the same invariant factor decompositions \(\implies T, S\) have the same rational canonical form \(\implies\) they are similar matrices by linear algebra. Conversely, if \(T,S\) are similar then \(\exists P\) with \(T = P^{-1}S P \implies PT = SP\) by definition of similarity.
Part c: let \({ \mathbf{F} }= {\mathbf{R}}\); by the classification of modules over a PID, any \({\mathbf{R}}[t]{\hbox{}}\)module \(M\) has an invariant factor decomposition \begin{align*} M \cong {\mathbf{R}}[t]^n \oplus {{\mathbf{R}}[t]\over \left\langle{f_1}\right\rangle} \oplus \cdots \oplus {{\mathbf{R}}[t]\over \left\langle{f_\ell}\right\rangle }, \qquad f_1 \divides f_2 \divides \cdots \divides f_\ell \end{align*} as \({\mathbf{R}}[t]{\hbox{}}\)modules. Simple modules \(M\) correspond to \({\mathbf{R}}[t]/I\) for \(I\) a single maximal ideal, and since \({\mathbf{R}}[t]\) is a PID, maximal ideals are generated by irreducible polynomials, so \(I = \left\langle{f}\right\rangle\) for some irreducible \(f\). So let \(f(x) = x^2+1\); it then suffices to find a matrix \(A \in { \operatorname{End} }(V)\) which has \(f\) as its characteristic polynomial – this follows since the characteristic polynomial is \(\prod_{i=1}^\ell f_i\) in the decomposition above, so we’re forcing \(\ell = 1\) and \(f_1 = f_\ell = f\) with \(n=0\) to get \(M\cong {\mathbf{R}}[t]/\left\langle{f}\right\rangle\). So take the companion matrix of \(f\), \begin{align*} T \coloneqq{ \begin{bmatrix} {0} & {-1} \\ {1} & {0} \end{bmatrix} } .\end{align*}
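As a sanity check (a sympy computation, not part of the original solution), one can verify that this \(T\) has characteristic polynomial \(x^2+1\) and no real eigenvalues, hence no 1-dimensional invariant subspace:

```python
import sympy as sp

x = sp.symbols('x')
T = sp.Matrix([[0, -1], [1, 0]])

# Characteristic polynomial of T is x^2 + 1.
charpoly = T.charpoly(x).as_expr()
assert sp.expand(charpoly - (x**2 + 1)) == 0

# x^2 + 1 is irreducible over R: it has no real roots.
assert all(not r.is_real for r in sp.roots(charpoly, x))

# Hence T has no real eigenvalue, so no 1-dimensional invariant
# subspace exists, and V_T is a simple R[t]-module.
assert all(not ev.is_real for ev in T.eigenvals())
```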
Part d: suppose \(\dim_{ \mathbf{F} }V = 2\) and \(V_T\) is simple, so \(V_T\cong { \mathbf{F} }[x]/I\) for \(I\) a maximal ideal, which is generated by an irreducible polynomial. Since \({ \mathbf{F} }\) is algebraically closed, \(\chi_T(x)\) has a root \(\lambda\in { \mathbf{F} }\), so \(T\) has an eigenvector and the eigenspace \(E_\lambda\) contains a 1-dimensional \(T{\hbox{}}\)invariant subspace. However, this contradicts simplicity, since \(T{\hbox{}}\)invariant subspaces of \(V\) correspond to \({ \mathbf{F} }[x]{\hbox{}}\)submodules of \(V_T\), and such a subspace is nonzero and proper since \(\dim_{ \mathbf{F} }V = 2\).
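A minimal sketch of why algebraic closure kills simplicity: any operator over \({\mathbf{C}}\) has an eigenvector, and its span is an invariant line. The matrix below is an arbitrary choice for illustration:

```python
import sympy as sp

# Over an algebraically closed field every operator has an eigenvector,
# whose span is a 1-dimensional invariant subspace; this is the
# submodule that rules out simplicity. An arbitrary 2x2 example:
T = sp.Matrix([[2, 1], [0, 3]])

val, mult, vecs = T.eigenvects()[0]
v = vecs[0]
# T maps span(v) into itself: T v = val * v.
assert sp.simplify(T * v - val * v) == sp.zeros(2, 1)
```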
Spring 2015 #3 #algebra/qual/work
Let \(F\) be a field and \(V\) a finite dimensional \(F{\hbox{}}\)vector space, and let \(A, B: V\to V\) be commuting \(F{\hbox{}}\)linear maps. Suppose there is a basis \({\mathcal{B}}_1\) with respect to which \(A\) is diagonalizable and a basis \({\mathcal{B}}_2\) with respect to which \(B\) is diagonalizable.
Prove that there is a basis \({\mathcal{B}}_3\) with respect to which \(A\) and \(B\) are both diagonalizable.
Fall 2016 #2 #algebra/qual/work
Let \(A, B\) be two \(n\times n\) matrices with the property that \(AB = BA\). Suppose that \(A\) and \(B\) are diagonalizable. Prove that \(A\) and \(B\) are simultaneously diagonalizable.
Spring 2019 #1 #algebra/qual/completed
Let \(A\) be a square matrix over the complex numbers. Suppose that \(A\) is nonsingular and that \(A^{2019}\) is diagonalizable over \({\mathbf{C}}\).
Show that \(A\) is also diagonalizable over \({\mathbf{C}}\).
 \(A\) is diagonalizable iff \(\min_A(x)\) is separable.
If \(A \in \operatorname{GL}(m, { \mathbf{F} })\) is invertible and \(A^n/{ \mathbf{F} }\) is diagonalizable, then \(A/{ \mathbf{F} }\) is diagonalizable.

Let \(A \in \operatorname{GL}(m, { \mathbf{F} })\).

Since \(A^n\) is diagonalizable, \(\min_{A^n}(x) \in { \mathbf{F} }[x]\) is separable and thus factors as a product of \(m\) distinct linear factors (abusing notation: this \(m\) is the number of distinct eigenvalues of \(A^n\), which need not equal the matrix size): \begin{align*} \min_{A^n}(x) = \prod_{i=1}^m (x-\lambda_i), \quad \min_{A^n}(A^n) = 0 \end{align*}
where \(\left\{{\lambda_i}\right\}_{i=1}^m \subset { \mathbf{F} }\) are the distinct eigenvalues of \(A^n\).

Moreover \(A\in \operatorname{GL}(m,{ \mathbf{F} }) \implies A^n \in \operatorname{GL}(m,{ \mathbf{F} })\): \(A\) is invertible \(\iff \operatorname{det}(A) = d \in { \mathbf{F} }^{\times}\), and so \(\operatorname{det}(A^n) = \operatorname{det}(A)^n = d^n \in { \mathbf{F} }^{\times}\), using the fact that the determinant is multiplicative and \({ \mathbf{F} }^{\times}\) is closed under multiplication.

So \(A^n\) is invertible, and thus has trivial kernel, and thus zero is not an eigenvalue, so \(\lambda_i \neq 0\) for any \(i\).

Since the \(\lambda_i\) are distinct and nonzero, \(x^k\) is not a factor of \(\min_{A^n}(x)\) for any \(k\geq 1\). Thus the \(m\) terms in the product correspond to precisely \(m\) distinct linear factors.

We can now construct a polynomial that annihilates \(A\), namely \begin{align*} q_A(x) \coloneqq\min_{A^n}(x^n) = \prod_{i=1}^m (x^n-\lambda_i) \in { \mathbf{F} }[x], \end{align*}
where we can note that \(q_A(A) = \min_{A^n}(A^n) = 0\), and so \(\min_A(x) \divides q_A(x)\) by minimality.
Claim: \(q_A(x)\) has exactly \(nm\) distinct linear factors in \({\overline{{ \mathbf{F} }}}[x]\).

This reduces to showing that no pair \(x^n-\lambda_i,\, x^n-\lambda_j\) shares a root, and that \(x^n-\lambda_i\) does not have multiple roots.

For the first claim, we can factor \begin{align*} x^n - \lambda_i = \prod_{k=1}^n (x - \lambda_i^{1\over n} e^{2\pi i k \over n}) \coloneqq\prod_{k=1}^n (x-\lambda_i^{1\over n} \zeta_n^k) ,\end{align*} fixing some choice of \(n\)th root \(\lambda_i^{1\over n}\). Any root \(\alpha\) of \(x^n - \lambda_i\) satisfies \(\alpha^n = \lambda_i \neq \lambda_j\), so no term in the above product appears as a factor in \(x^n - \lambda_j\) for \(j\neq i\).

For the second claim, we can check that \({\frac{\partial }{\partial x}\,}\qty{x^n - \lambda_i} = nx^{n-1}\neq 0 \in { \mathbf{F} }[x]\), and \(\gcd(x^n-\lambda_i, nx^{n-1}) = 1\) since the latter term has only the root \(x=0\) with multiplicity \(n-1\), whereas \(\lambda_i\neq 0 \implies\) zero is not a root of \(x^n-\lambda_i\).
But now since \(q_A(x)\) has only distinct linear factors in \(\overline{{ \mathbf{F} }}[x]\) and \(\min_A(x) \divides q_A(x)\), \(\min_A(x) \in { \mathbf{F} }[x]\) can only have distinct linear factors, and \(A\) is thus diagonalizable over \({ \mathbf{F} }\).
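The construction above can be checked on a small example (assumed for illustration: \(A\) a rotation with \(A^4 = I\), and \(n = 3\)); here \(\min_{A^3}(x) = x^2+1\), so \(q_A(x) = x^6+1\):

```python
import sympy as sp

x = sp.symbols('x')

# Sketch check (assumed example, not part of the original proof):
# A invertible with A^n diagonalizable; q_A(x) := min_{A^n}(x^n)
# annihilates A and is squarefree, forcing min_A to be squarefree.
A = sp.Matrix([[0, -1], [1, 0]])   # A^4 = I, eigenvalues +-i
n = 3
An = A**n

# min poly of A^n here is x^2 + 1 (since An^2 = -I), so q_A = x^6 + 1.
assert An**2 + sp.eye(2) == sp.zeros(2, 2)
q = x**6 + 1

# q_A annihilates A ...
assert A**6 + sp.eye(2) == sp.zeros(2, 2)
# ... and is squarefree: gcd(q, q') = 1.
assert sp.gcd(q, sp.diff(q, x)) == 1

# Conclusion of the argument: A is diagonalizable over C.
assert A.is_diagonalizable()
```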
Linear Algebra: Misc
\(\star\) Spring 2012 #6 #algebra/qual/work
Let \(k\) be a field and let the group \(G = \operatorname{GL}(m, k) \times\operatorname{GL}(n, k)\) act on the set of \(m\times n\) matrices \(M_{m, n}(k)\) as follows: \begin{align*} (A, B) \cdot X = AXB^{-1} \end{align*} where \((A, B) \in G\) and \(X\in M_{m, n}(k)\).

State what it means for a group to act on a set. Prove that the above definition yields a group action.

Exhibit with justification a subset \(S\) of \(M_{m, n}(k)\) which contains precisely one element of each orbit under this action.
\(\star\) Spring 2014 #7 #algebra/qual/work
Let \(G = \operatorname{GL}(3, {\mathbf{Q}}[x])\) be the group of invertible \(3\times 3\) matrices over \({\mathbf{Q}}[x]\). For each \(f\in {\mathbf{Q}}[x]\), let \(S_f\) be the set of \(3\times 3\) matrices \(A\) over \({\mathbf{Q}}[x]\) such that \(\operatorname{det}(A) = c f(x)\) for some nonzero constant \(c\in {\mathbf{Q}}\).

Show that for \((P, Q) \in G\times G\) and \(A\in S_f\), the formula \begin{align*} (P, Q)\cdot A \coloneqq PAQ^{-1} \end{align*} gives a well defined map \(G\times G \times S_f \to S_f\) and show that this map gives a group action of \(G\times G\) on \(S_f\).

For \(f(x) = x^3(x^2+1)^2\), give one representative from each orbit of the group action in (a), and justify your assertion.
Fall 2012 #7 #algebra/qual/work
Let \(k\) be a field of characteristic zero and \(A, B \in M_n(k)\) be two square \(n\times n\) matrices over \(k\) such that \(AB - BA = A\). Prove that \(\operatorname{det}A = 0\).
Moreover, when the characteristic of \(k\) is 2, find a counterexample to this statement.
Fall 2012 #8 #algebra/qual/work
Prove that any nondegenerate matrix \(X\in M_n({\mathbf{R}})\) can be written as \(X = UT\) where \(U\) is orthogonal and \(T\) is upper triangular.
Fall 2012 #5 #algebra/qual/work
Let \(U\) be an infinite-dimensional vector space over a field \(k\), \(f: U\to U\) a linear map, and \(\left\{{u_1, \cdots, u_m}\right\} \subset U\) vectors such that \(U\) is generated by \(\left\{{u_1, \cdots, u_m, f^d(u_1), \cdots, f^d(u_m)}\right\}\) for some \(d\in {\mathbb{N}}\).
Prove that \(U\) can be written as a direct sum \(U \cong V\oplus W\) such that
 \(V\) has a basis consisting of some vectors \(v_1, \cdots, v_n, f^d(v_1), \cdots, f^d(v_n)\) for some \(d\in {\mathbb{N}}\), and
 \(W\) is finite-dimensional.
Moreover, prove that for any other decomposition \(U \cong V' \oplus W'\), one has \(W' \cong W\).
Fall 2015 #7 #algebra/qual/work

Show that two \(3\times 3\) matrices over \({\mathbf{C}}\) are similar \(\iff\) their characteristic polynomials are equal and their minimal polynomials are equal.

Does the conclusion in (a) hold for \(4\times 4\) matrices? Justify your answer with a proof or counterexample.
Fall 2014 #4 #algebra/qual/work
Let \(F\) be a field and \(T\) an \(n\times n\) matrix with entries in \(F\). Let \(I\) be the ideal consisting of all polynomials \(f\in F[x]\) such that \(f(T) =0\).
Show that the following statements are equivalent about a polynomial \(g\in I\):

\(g\) is irreducible.

If \(k\in F[x]\) is nonzero and of degree strictly less than \(g\), then \(k(T)\) is an invertible matrix.
Fall 2015 #8 #algebra/qual/work
Let \(V\) be a vector space over a field \(F\) and \(V {}^{ \vee }\) its dual. A symmetric bilinear form \(({}, {})\) on \(V\) is a map \(V\times V\to F\) satisfying \begin{align*} (av_1 + b v_2, w) = a(v_1, w) + b(v_2, w) {\quad \operatorname{and} \quad} (v_1, v_2) = (v_2, v_1) \end{align*} for all \(a, b\in F\) and \(v_1, v_2, w \in V\). The form is nondegenerate if the only element \(w\in V\) satisfying \((v, w) = 0\) for all \(v\in V\) is \(w=0\).
Suppose \(({}, {})\) is a nondegenerate symmetric bilinear form on \(V\). If \(W\) is a subspace of \(V\), define \begin{align*} W^{\perp} \coloneqq\left\{{v\in V {~\mathrel{\Big\vert}~}(v, w) = 0 \text{ for all } w\in W}\right\} .\end{align*}

Show that if \(X, Y\) are subspaces of \(V\) with \(Y\subset X\), then \(X^{\perp} \subseteq Y^{\perp}\).

Define an injective linear map \begin{align*} \psi: Y^{\perp}/X^{\perp} \hookrightarrow(X/Y) {}^{ \vee } \end{align*} which is an isomorphism if \(V\) is finite dimensional.
Fall 2018 #4 #algebra/qual/completed
Let \(V\) be a finite dimensional vector space over a field (the field is not necessarily algebraically closed).
Let \(\phi : V \to V\) be a linear transformation. Prove that there exists a decomposition of \(V\) as \(V = U \oplus W\), where \(U\) and \(W\) are \(\phi{\hbox{}}\)invariant subspaces of \(V\), \({\left.{{\phi}} \right|_{{U}} }\) is nilpotent, and \({\left.{{\phi}} \right|_{{W}} }\) is nonsingular.
\todo[inline]{Revisit.}
Let \(m(x)\) be the minimal polynomial of \(\phi\). If the polynomial \(f(x) = x\) doesn’t divide \(m\), then \(0\) is not an eigenvalue of \(\phi\), so \(\phi\) is nonsingular, and \(V = 0 \oplus V\) works since the zero map on the zero subspace is nilpotent.
Otherwise, write \(m(x) = x^r \rho(x)\) where \(\gcd(x, \rho(x)) = 1\).
Since \(x^r\) and \(\rho(x)\) are coprime, the Chinese Remainder theorem (applied to each invariant factor of \(V\) as a \(k[x]{\hbox{}}\)module) yields a decomposition into \(\phi{\hbox{}}\)invariant subspaces \begin{align*} V \cong \ker(\phi^r) \oplus \ker(\rho(\phi)) \coloneqq U \oplus W .\end{align*}
We can now note that \({\left.{{\phi}} \right|_{{U}} }\) is nilpotent because \(\phi^r\) annihilates \(U\), and \({\left.{{\phi}} \right|_{{W}} }\) is nonsingular since \(\lambda = 0\) is not a root of \(\rho\) and thus not an eigenvalue of \({\left.{{\phi}} \right|_{{W}} }\).
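A numerical illustration of this decomposition (the operator below is an assumed example, not from the original): take \(\phi\) with minimal polynomial \(x^2(x-2)\), so \(U = \ker \phi^2\) and \(W = \ker(\phi - 2I)\):

```python
import sympy as sp

# Assumed sample operator with minimal polynomial x^2 (x - 2):
# a nilpotent Jordan block J_2(0) plus the eigenvalue 2.
phi = sp.Matrix([
    [0, 1, 0],
    [0, 0, 0],
    [0, 0, 2],
])
r = 2                                  # m(x) = x^2 * rho(x), rho = x - 2
U = (phi**r).nullspace()               # ker(phi^2)
W = (phi - 2*sp.eye(3)).nullspace()    # ker(rho(phi))

# The pieces are complementary: their bases together span V.
assert len(U) + len(W) == 3
P = sp.Matrix.hstack(*U, *W)
assert P.rank() == 3                   # V = U (+) W

# phi restricted to U is nilpotent: phi^2 kills U.
assert all(phi**r * u == sp.zeros(3, 1) for u in U)
# phi restricted to W is nonsingular: phi acts by 2 on W.
assert all(phi * w == 2*w for w in W)
```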
Fall 2018 #5 #algebra/qual/completed
Let \(A\) be an \(n \times n\) matrix.

Suppose that \(v\) is a column vector such that the set \(\{v, Av, . . . , A^{n-1} v\}\) is linearly independent. Show that any matrix \(B\) that commutes with \(A\) is a polynomial in \(A\).

Show that there exists a column vector \(v\) such that the set \(\{v, Av, . . . , A^{n-1} v\}\) is linearly independent \(\iff\) the characteristic polynomial of \(A\) equals the minimal polynomial of A.
 Powers of \(A\) commute with polynomials in \(A\).
 The image of a linear map is determined by the image of a basis
 Use CayleyHamilton to relate the minimal polynomial to a linear dependence.
 Get a lower bound on the degree of the minimal polynomial.
 Use \(k[x]\curvearrowright V\) to decompose \(V\) into cyclic \(k[x]{\hbox{}}\)modules, and use the special form of the denominators in the invariant factor decomposition.
 Reduce to monomials.
Letting \(\mathbf{v}\) be fixed, since \(\left\{{A^j \mathbf{v}}\right\}\) spans \(V\) we have \begin{align*} B\mathbf{v} = \sum_{j=0}^{n-1}c_j A^j \mathbf{v} .\end{align*}
So let \(p(x) = \sum_{j=0}^{n-1}c_jx^j\). Then consider how \(B\) acts on any basis vector \(A^k \mathbf{v}\).
We have \begin{align*} BA^k \mathbf{v} &= A^k B\mathbf{v} \\ &= A^k p(A) \mathbf{v} \\ &= p(A) A^k \mathbf{v} ,\end{align*}
so \(B = p(A)\) as operators since their actions agree on every basis vector in \(V\).
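A quick numerical sketch of this argument (the matrices are assumed examples): pick \(A\) with a cyclic vector \(\mathbf{v}\) and a commuting \(B\), then solve for the coefficients \(c_j\):

```python
import numpy as np

# Assumed example: A with a cyclic vector v, and B a polynomial in A
# (hence commuting with A); solving B v = sum_j c_j A^j v recovers the
# polynomial p with B = p(A).
A = np.array([[0., 1., 0.],
              [0., 0., 1.],
              [2., 1., 0.]])
v = np.array([1., 0., 0.])
B = 3*np.eye(3) + 2*A + A @ A                # a commuting B for the demo

K = np.column_stack([v, A @ v, A @ A @ v])   # Krylov basis {v, Av, A^2 v}
assert abs(np.linalg.det(K)) > 1e-9          # basis is linearly independent

c = np.linalg.solve(K, B @ v)                # coefficients c_j of p
P = c[0]*np.eye(3) + c[1]*A + c[2]*(A @ A)   # p(A)
assert np.allclose(P, B)                     # B = p(A) as operators
```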

If \(\left\{{A^j \mathbf{v} {~\mathrel{\Big\vert}~}0\leq j \leq n-1}\right\}\) is linearly independent, then no polynomial of degree \(d < n\) can annihilate \(\mathbf{v}\), so \(A\) does not satisfy any polynomial of degree \(d < n\).

So \(\deg m_A(x) = n\), and since \(m_A(x)\) divides \(\chi_A(x)\) and both are monic polynomials of degree \(n\), they must be equal.

Let \(k[x]\curvearrowright V\) by \(p(x) \curvearrowright\mathbf{v} \coloneqq p(A)\mathbf{v}\). This induces an invariant factor decomposition \(V \cong \bigoplus k[x]/(f_i)\).

Since the product of the invariant factors is the characteristic polynomial, the largest invariant factor is the minimal polynomial, and these two are equal, there can only be one invariant factor and thus the invariant factor decomposition is \begin{align*} V\cong \frac{k[x]}{(\chi_A(x))} \end{align*} as an isomorphism of \(k[x]{\hbox{}}\)modules.

So \(V\) is a cyclic \(k[x]\) module, which means that \(V = k[x]\curvearrowright\mathbf{v}\) for some \(\mathbf{v}\in V\) such that \(\operatorname{Ann}(\mathbf{v}) = \chi_A(x)\), i.e. there is some element \(\mathbf{v}\in V\) whose orbit is all of \(V\).

But then noting that monomials span \(k[x]\) as a \(k{\hbox{}}\)module, we can write \begin{align*} V &\cong k[x] \curvearrowright\mathbf{v} \\ &\coloneqq\left\{{f(x) \curvearrowright\mathbf{v} {~\mathrel{\Big\vert}~}f \in k[x]}\right\} \\ &= \mathop{\mathrm{span}}_k \left\{{x^k \curvearrowright\mathbf{v} {~\mathrel{\Big\vert}~}k \geq 0}\right\} \\ &\coloneqq\mathop{\mathrm{span}}_k \left\{{A^k\mathbf{v} {~\mathrel{\Big\vert}~}k \geq 0}\right\} ,\end{align*} where we’ve used that \(x\) acts by \(A\) and thus \(x^k\) acts by \(A^k\).

Moreover, we can note that if \(\ell \geq \deg \chi_A(x)\), then \(A^\ell\) is a linear combination of \(\left\{{A^j {~\mathrel{\Big\vert}~}0 \leq j \leq n-1}\right\}\), and so \begin{align*} V &\cong \mathop{\mathrm{span}}_k \left\{{A^\ell\mathbf{v} {~\mathrel{\Big\vert}~}\ell \geq 0}\right\} \\ &= \mathop{\mathrm{span}}_k \left\{{A^\ell \mathbf{v} {~\mathrel{\Big\vert}~}0 \leq \ell \leq n-1}\right\} .\end{align*}
Fall 2019 #8 #algebra/qual/work
Let \(\{e_1, \cdots, e_n \}\) be a basis of a real vector space \(V\) and let \begin{align*} \Lambda \coloneqq\left\{{ \sum r_i e_i {~\mathrel{\Big\vert}~}r_i \in {\mathbf{Z}}}\right\} \end{align*}
Let \(\cdot\) be a nondegenerate (\(v \cdot w = 0\) for all \(w \in V \iff v = 0\)) symmetric bilinear form on \(V\) such that the Gram matrix \(M = (e_i \cdot e_j )\) has integer entries.
Define the dual of \(\Lambda\) to be \begin{align*} \Lambda {}^{ \vee }\coloneqq\{v \in V {~\mathrel{\Big\vert}~}v \cdot x \in {\mathbf{Z}}\text{ for all } x \in \Lambda \} .\end{align*}

Show that \(\Lambda \subset \Lambda {}^{ \vee }\).

Prove that \(\operatorname{det}M \neq 0\) and that the rows of \(M^{-1}\) span \(\Lambda {}^{ \vee }\).
 Prove that \(\operatorname{det}M = [\Lambda {}^{ \vee }: \Lambda]\), the index of \(\Lambda\) in \(\Lambda {}^{ \vee }\).
\todo[inline]{Todo, missing part (c).}

Let \(\mathbf{v} \in \Lambda\), so \(\mathbf{v} = \sum_{i=1}^n r_i \mathbf{e}_i\) where \(r_i \in {\mathbf{Z}}\) for all \(i\).

Then if \(\mathbf{x} = \sum_{j=1}^n s_j \mathbf{e}_j \in \Lambda\) is arbitrary, we have \(s_j \in {\mathbf{Z}}\) for all \(j\) and \begin{align*} {\left\langle {\mathbf{v}},~{\mathbf{x}} \right\rangle} &= {\left\langle {\sum_{i=1}^n r_i \mathbf{e}_i},~{\sum_{j=1}^n s_j \mathbf{e}_j } \right\rangle} \\ &= \sum_{i=1}^n \sum_{j=1}^n r_i s_j {\left\langle {\mathbf{e}_i},~{\mathbf{e}_j } \right\rangle} \in {\mathbf{Z}} \end{align*} since this is a sum of products of integers (since \({\left\langle {\mathbf{e}_i},~{\mathbf{e}_j} \right\rangle} \in {\mathbf{Z}}\) for each \(i, j\) pair by assumption) so \(\mathbf{v} \in \Lambda {}^{ \vee }\) by definition.
The determinant is nonzero.

Suppose \(\operatorname{det}M = 0\). Then \(\ker M \neq \mathbf{0}\), so let \(\mathbf{v} \in \ker M\) be given by \(\mathbf{v} = \sum_{i=1}^n v_i \mathbf{e}_i \neq \mathbf{0}\).

Note that \begin{align*} M\mathbf{v} = 0 &\implies \left[ \begin{array}{ccc} \mathbf{e}_1 \cdot \mathbf{e}_1 & \mathbf{e}_1 \cdot \mathbf{e}_2 & \cdots \\ \mathbf{e}_2 \cdot \mathbf{e}_1 & \mathbf{e}_2 \cdot \mathbf{e}_2 & \cdots \\ \vdots & \vdots & \ddots \end{array} \right] \left[\begin{array}{c} v_1 \\ v_2 \\ \vdots \end{array}\right] = \mathbf{0} \\ \\ &\implies \sum_{j=1}^n v_j{\left\langle {\mathbf{e}_k},~{\mathbf{e}_j} \right\rangle} = 0 \quad \text{for each fixed } k .\end{align*}

We can now note that \({\left\langle {\mathbf{e}_k},~{\mathbf{v}} \right\rangle} = \sum_{j=1}^n v_j {\left\langle {\mathbf{e}_k},~{\mathbf{e}_j} \right\rangle} = 0\) for every \(k\) by the above observation, which forces \(\mathbf{v} = 0\) by nondegeneracy of \({\left\langle {{}},~{{}} \right\rangle}\), a contradiction.
\todo[inline]{Missing work!}
Write \(M = A^tA\) where \(A\) has the \(\mathbf{e}_i\) as columns. Then \begin{align*} M\mathbf{x} = 0 &\implies A^t A \mathbf{x} = 0 \\ &\implies \mathbf{x}^t A^t A \mathbf{x} = 0 \\ &\implies {\left\lVert {A \mathbf{x}} \right\rVert}^2 = 0 \\ &\implies A\mathbf{x} = 0 \\ &\implies \mathbf{x} = 0 ,\end{align*}
since \(A\) has full rank because the \(\mathbf{e}_i\) are linearly independent.
Let \(A = [\mathbf{e}_1, \cdots, \mathbf{e}_n]\) be the matrix with \(\mathbf{e}_i\) as its \(i\)th column.
The rows of \(A^{-1}\) span \(\Lambda {}^{ \vee }\). Equivalently, the columns of \(A^{-t}\) span \(\Lambda {}^{ \vee }\).

Let \(B = A^{-t}\) and let \(\mathbf{b}_i\) denote the columns of \(B\), so \(\operatorname{im}B = \mathop{\mathrm{span}}{\left\{{\mathbf{b}_i}\right\}}\).

Since the \(\mathbf{e}_i\) form a basis, \(A \in \operatorname{GL}(n, {\mathbf{R}})\), and so \(A^{-1}, A^t, A^{-t}\) are invertible as well. \begin{align*} \mathbf{v} \in \Lambda {}^{ \vee } &\implies {\left\langle {\mathbf{e}_i},~{\mathbf{v}} \right\rangle} = z_i \in {\mathbf{Z}}\quad \forall i \\ &\implies A^t \mathbf{v} = \mathbf{z} \coloneqq[z_1, \cdots, z_n] \in {\mathbf{Z}}^n \\ &\implies \mathbf{v} = A^{-t} \mathbf{z} \coloneqq B\mathbf{z} \\ &\implies \mathbf{v} \in \operatorname{im}B \\ &\implies \Lambda {}^{ \vee }\subseteq \operatorname{im}B ,\end{align*} and \begin{align*} B^t A = (A^{-t})^t A = A^{-1}A = I \\ \implies \mathbf{b}_i \cdot \mathbf{e}_j = \delta_{ij} \in {\mathbf{Z}}\\ \implies \operatorname{im}B \subseteq \Lambda {}^{ \vee } .\end{align*}
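The two inclusions can be illustrated numerically, assuming the standard dot product and an assumed example matrix \(A\) whose columns are the \(\mathbf{e}_i\):

```python
import numpy as np

# Assumed example: columns of A are the basis e_i, M = A^T A is the
# (integer) Gram matrix, and the columns of A^{-T} form the dual basis.
A = np.array([[2., 1.],
              [0., 1.]])           # e_1 = (2, 0), e_2 = (1, 1)
M = A.T @ A
assert np.allclose(M, np.array([[4., 2.], [2., 2.]]))  # integer entries

B = np.linalg.inv(A).T             # columns b_i span the dual lattice
assert np.allclose(B.T @ A, np.eye(2))   # b_i . e_j = delta_ij

# In the basis {e_i}, the dual basis vectors have coordinates given by
# M^{-1} (symmetric, so its rows equal its columns): B = A M^{-1}.
assert np.allclose(A @ np.linalg.inv(M), B)
```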
Spring 2013 #6 #algebra/qual/completed
Let \(V\) be a finite dimensional vector space over a field \(F\) and let \(T: V\to V\) be a linear operator with characteristic polynomial \(f(x) \in F[x]\).

Show that \(f(x)\) is irreducible in \(F[x] \iff\) there are no proper nonzero subspaces \(W< V\) with \(T(W) \subseteq W\).

If \(f(x)\) is irreducible in \(F[x]\) and the characteristic of \(F\) is 0, show that \(T\) is diagonalizable when we extend the field to its algebraic closure.
\todo[inline]{Is there a proof without matrices? What if $V$ is infinite dimensional?} \todo[inline]{How to extend basis?}

Every nonzero \(\mathbf{v}\in V\) is \(T{\hbox{}}\)cyclic \(\iff \chi_T(x)/{\mathbf{k}}\) is irreducible.
 \(\implies\): Same as argument below.
 \(\impliedby\): Suppose \(f\) is irreducible, then \(f\) is equal to the minimal polynomial of \(T\).
 Characterization of diagonalizability: \(T\) is diagonalizable over \(F \iff \min_{T, F}\) splits over \(F\) and is squarefree.
Let \(f\) be the characteristic polynomial of \(T\).
\(\implies\):
 By contrapositive, suppose there is a proper nonzero invariant subspace \(W<V\) with \(T(W) \subseteq W\), we will show the characteristic polynomial \(f \coloneqq\chi_{V, T}(x)\) is reducible.
 Since \(T(W)\subseteq W\), the restriction \(T_W \coloneqq{\left.{{T}} \right|_{{W}} }: W\to W\) is a linear operator on \(W\); let \(g\coloneqq\chi_{W, T_W}(x)\) be its characteristic polynomial.
\(g\) divides \(f\) in \({ \mathbf{F} }[x]\) and \(\deg(g) < \deg(f)\).

Choose an ordered basis for \(W\), say \({\mathcal{B}}_W \coloneqq\left\{{\mathbf{w}_1, \cdots, \mathbf{w}_k}\right\}\) where \(k=\dim_F(W)\)

Claim: this can be extended to a basis of \(V\), say \({\mathcal{B}}_V \coloneqq\left\{{\mathbf{w}_1, \cdots, \mathbf{w}_k, \mathbf{v}_1, \cdots, \mathbf{v}_j}\right\}\) where \(k+j = \dim_F(V)\).
 Note that since \(W<V\) is proper, \(j\geq 1\).

Restrict \(T\) to \(W\) to get \(T_W\), then let \(B = [T_W]_{{\mathcal{B}}_W}\) be the matrix representation of \(T_W\) with respect to \({\mathcal{B}}_W\).

Now consider the matrix representation \([T]_{{\mathcal{B}}_V}\), in block form this is given by \begin{align*} [T]_{{\mathcal{B}}_V} = \begin{bmatrix} B & C \\ 0 & D \end{bmatrix} \end{align*} where we’ve used that \(W<V\) is proper to get the existence of \(C, D\) (there is at least one additional row/column since \(j\geq 1\) in the extended basis.)
\todo[inline]{Why?}

Since \([T]_{{\mathcal{B}}_V} - xI\) is block upper triangular, its determinant is the product of the determinants of the diagonal blocks: \begin{align*} \chi_{T, V}(x) \coloneqq\operatorname{det}([T]_{{\mathcal{B}}_V} - xI) = \operatorname{det}(B - xI)\cdot \operatorname{det}(D - xI) \coloneqq\chi_{T, W}(x) \cdot \operatorname{det}(D-xI) .\end{align*}

Claim: \(\operatorname{det}(D - xI) \in { \mathbf{F} }[x]\) is nonconstant, since \(D\) is a \(j\times j\) block with \(j \geq 1\).

The claim follows because this forces \(\deg(\operatorname{det}(D-xI)) \geq 1\) and so \(\chi_{T, W}(x)\) is a proper divisor of \(\chi_{T, V}(x)\).

Thus \(f\) is reducible.
\(\impliedby\)
 Suppose \(f\) is reducible, then we will produce a proper \(T{\hbox{}}\)invariant subspace.
 Claim: if \(f\) is reducible, there exists a nonzero, noncyclic vector \(\mathbf{v}\).
 Then \(\mathop{\mathrm{span}}_k\left\{{T^j\mathbf{v}}\right\}_{j=0}^d\) is a \(T{\hbox{}}\)invariant subspace that is nonzero, and not the entire space since \(\mathbf{v}\) is not cyclic.
 Let \(\min_{T, F}(x)\) be the minimal polynomial of \(T\) and \(\chi_{T, F}(x)\) be its characteristic polynomial.
 By Cayley-Hamilton, \(\min_{T, F}(x)\) divides \(\chi_{T, F}(x)\).
 Since \(\chi_{T, F}\) is irreducible, these polynomials are equal.
 Claim: \(T/F\) is diagonalizable \(\iff \min_{T, F}\) splits over \(F\) and is squarefree.
 Replace \(F\) with its algebraic closure, then \(\min_{T, F}\) splits.

Claim: in characteristic zero, every irreducible polynomial is separable
 Proof: it must be the case that either \(\gcd(f, f') = 1\) or \(f' \equiv 0\), where the second case only happens in characteristic \(p>0\).
 The first case is true because \(\deg f' < \deg f\), and if \(\gcd(f, f') = p\), then \(\deg p < \deg f\) and \(p\divides f\) forces \(p=1\) since \(f\) is irreducible.
 So \(\min_{T, F}\) splits into distinct linear factors
 Thus \(T\) is diagonalizable.
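A small check of part (b) of this argument, using the companion matrix of an assumed irreducible polynomial \(f = x^2 - 2\) over \({\mathbf{Q}}\):

```python
import sympy as sp

x = sp.symbols('x')

# Assumed example: companion matrix of the irreducible f = x^2 - 2 over
# Q. Its min poly equals its char poly, and in characteristic zero f is
# separable (gcd(f, f') = 1), so T is diagonalizable over the closure.
f = x**2 - 2
T = sp.Matrix([[0, 2], [1, 0]])    # companion matrix of x^2 - 2

assert sp.expand(T.charpoly(x).as_expr() - f) == 0
assert sp.gcd(f, sp.diff(f, x)) == 1      # f is separable
assert T.is_diagonalizable()              # diagonalizable over Qbar
```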
Fall 2020 #8 #algebra/qual/work
Let \(A\in \operatorname{Mat}(n\times n, {\mathbf{C}})\) such that the group generated by \(A\) under multiplication is finite. Show that \begin{align*} \operatorname{Tr}(A^{-1}) ={\overline{{\operatorname{Tr}(A) }}} ,\end{align*} where \({\overline{{({})}}}\) denotes taking the complex conjugate and \(\operatorname{Tr}({})\) is the trace.
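Though the solution is omitted here, the key fact is that a finite-order matrix has roots of unity as eigenvalues, so \(\lambda^{-1} = \overline{\lambda}\). A numerical check on an assumed example:

```python
import numpy as np

# Assumed example: conjugate a finite-order diagonal matrix (eighth
# roots of unity) by an arbitrary invertible P; <A> is then finite,
# and each eigenvalue satisfies 1/lambda = conj(lambda).
D = np.diag(np.exp(2j * np.pi * np.array([1, 2, 5]) / 8))  # D^8 = I
P = np.array([[1., 2., 0.],
              [0., 1., 3.],
              [1., 0., 1.]])
A = P @ D @ np.linalg.inv(P)

# A has finite order ...
assert np.allclose(np.linalg.matrix_power(A, 8), np.eye(3))
# ... and the claimed trace identity holds.
assert np.allclose(np.trace(np.linalg.inv(A)), np.conj(np.trace(A)))
```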