| Date | Material covered |
| 1/3 |
Announcements. What the class is and what it isn't.
Introduction: linear algebra is about vector
spaces and linear maps. The first week or
two will be about vector spaces, the rest of the class
about linear maps. Properties of R
and C: they are both fields
(i.e. they satisfy the commutativity, associativity, identity,
additive inverse, multiplicative inverse, and distributive
properties listed for C on pages 2-3). In
Axler's book, F denotes the real or
complex numbers, but we shall work a little more generally
and let it be any field. Then I gave the first main
definition of the class: the axioms of a vector space
over F. Stated and proved some
elementary properties (pages 11-12). Gave examples.
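For reference, the axioms in symbols (my summary of pages 11-12, not verbatim): a vector space over F is a set V with an addition and a scalar multiplication such that, for all u, v, w in V and all a, b in F:

$$
\begin{aligned}
&u + v = v + u, \qquad (u + v) + w = u + (v + w),\\
&\text{there is } 0 \in V \text{ with } v + 0 = v, \qquad \text{each } v \text{ has some } w \text{ with } v + w = 0,\\
&1v = v, \qquad (ab)v = a(bv),\\
&a(u + v) = au + av, \qquad (a + b)v = av + bv.
\end{aligned}
$$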
|
| 1/5 |
The first homework set is posted.
More examples of vector spaces. Linear maps (definition in
Ch. 3). Subspaces of a vector space. Sum of subspaces.
Direct sum of subspaces: If U_1, ..., U_m are subspaces of a
vector space V, we say that their sum is direct if the only
way to write 0 = u_1 + ... + u_m with u_i in U_i for all i is
to have all u_i be 0 (a small example is worked below).
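A minimal example (mine, not from the lecture): in F^2, take U_1 = {(x, 0) : x in F} and U_2 = {(0, y) : y in F}. Then

$$
0 = (x, 0) + (0, y) = (x, y) \implies x = y = 0,
$$

so the sum U_1 + U_2 is direct. Replacing U_2 by U_1 itself destroys this: 0 = (1, 0) + (-1, 0) is a decomposition with nonzero summands, so that sum is not direct.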
|
| 1/7 |
Criteria for sums being direct: I stated (and proved)
modified versions of propositions 1.8 and 1.9 in the book,
which I called propositions 1.8' and 1.9'. In 1.8' I omitted
assumption (a), so one can only conclude that the sum is
direct. In 1.9' I omitted the assumption that U + W = V, and
again one can only conclude that the sum is direct (1.9' is
restated in symbols below). Linear
combinations. Span of a list of vectors. Finite dimensional
vector spaces. Example F^n is finite dimensional.
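For the record, 1.9' in symbols (my phrasing): for subspaces U and W of V,

$$
U \cap W = \{0\} \iff \text{the sum } U + W \text{ is direct},
$$

since a relation u + w = 0 with u in U and w in W says exactly that u = -w lies in U ∩ W.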
|
| 1/10 |
Linear independence. Linear dependence. Basis. Proved
the "linear dependence lemma" and the (very important) theorem
2.6: Any spanning list of vectors in V is at least as long as
any linearly independent list.
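The linear dependence lemma, paraphrased: if v_1, ..., v_m is linearly dependent and v_1 is not 0, then there is a j in {2, ..., m} with

$$
v_j \in \operatorname{span}(v_1, \dots, v_{j-1}),
$$

and removing v_j from the list does not change the span.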
|
| 1/12 |
Two bases for the same vector space V have the same
number of elements. Any finite dimensional space has a
basis. In fact, we can obtain a basis by discarding vectors
from any given spanning list. Definition of dimension: the
number of elements in a basis. Any linearly independent list
in a finite dimensional vector space can be extended to a
basis. The dimension of a subspace of a finite dimensional
vector space V is at most as large as the dimension of V.
Criteria for lists to be bases (proposition 2.16 and 2.17):
If V has dimension n, then any linearly independent list of
length n is a basis, and any spanning list of length n is a
basis.
|
| 1/14 |
The dimension of a sum. Criteria for sums being
direct. Linear maps (definition and a few examples).
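The formula behind "the dimension of a sum": for subspaces U and W of a finite dimensional vector space,

$$
\dim(U + W) = \dim U + \dim W - \dim(U \cap W).
$$

In particular the sum U + W is direct if and only if dim(U + W) = dim U + dim W.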
|
| 1/19 |
More examples of linear maps. The set of linear maps
from V to W is itself a vector space. Composition of linear
maps. Null space and range of a linear map. Dimension
formula. Injective and surjective maps, and how the null space
and the range detect injectivity and surjectivity.
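The dimension formula, stated here in symbols: if V is finite dimensional and T : V -> W is linear, then

$$
\dim V = \dim \operatorname{null} T + \dim \operatorname{range} T.
$$

The detection referred to above: T is injective if and only if null T = {0}, and surjective if and only if range T = W.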
|
| 1/21 |
Invertible linear maps. Isomorphic vector spaces.
Dimension criteria: Finite dimensional vector spaces V and W
are isomorphic if and only if dim(V) = dim(W). In that case,
a linear map T:V -> W is invertible if and only if it is
injective or surjective (i.e. if the dimensions agree, it
suffices to check either surjectivity or injectivity; the
other is then automatic).
|
| 1/24 |
Matrices: Definition of Mat(m,n,F).
Vector space structure on Mat(m,n,F). The
matrix M(T) of a linear map T: V -> W with respect to
bases of V and W. L(V,W) is isomorphic to
Mat(m,n,F) when V and W are finite
dimensional, V has dimension n and W has
dimension m. In particular, L(V,W) is finite
dimensional, and dim(L(V,W)) = (dim(V))(dim(W)).
Multiplication of matrices. The matrix of a composition:
M(TS) = M(T)M(S), i.e. "the
matrix of TS is the product of the matrix of T and the matrix
of S".
|
| 1/26 |
The matrix of a vector. M(Tv) =
(M(T))(M(v)). Review of polynomials
(following Axler Ch 4, covering only the part about complex
numbers). Invariant subspaces. Eigenvalues and eigenvectors.
|
| 1/28 |
Eigenvalues and eigenvectors. lambda is an eigenvalue of
T if and only if Null(T-lambda I) is non-zero. Eigenvectors
corresponding to different eigenvalues are linearly
independent. An operator on V has at most dim(V) distinct
eigenvalues. Polynomials evaluated on operators.
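Evaluating a polynomial on an operator means the obvious substitution: for p(z) = a_0 + a_1 z + ... + a_m z^m and an operator T on V,

$$
p(T) = a_0 I + a_1 T + a_2 T^2 + \cdots + a_m T^m ,
$$

where T^k denotes the k-fold composition of T with itself.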
|
| 1/31 |
Every linear operator on a finite dimensional non-zero
complex vector space has an eigenvalue. Upper triangular
matrices. For every operator T on a finite dimensional
complex vector space V, there is a basis of V with respect to
which the matrix of T is upper triangular.
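A sketch of the existence proof, reconstructed from Axler Ch. 5: pick v not 0 in V, with n = dim V. The n + 1 vectors v, Tv, ..., T^n v must be linearly dependent, so 0 = a_0 v + a_1 Tv + ... + a_n T^n v with not all a_i zero. Over C the polynomial a_0 + a_1 z + ... + a_n z^n factors into linear terms, giving

$$
0 = c\,(T - \lambda_1 I)(T - \lambda_2 I) \cdots (T - \lambda_m I)\, v, \qquad c \neq 0,
$$

so some T - lambda_j I fails to be injective, and that lambda_j is an eigenvalue.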
|
| 2/2 |
Class taught by Pete Storm.
|
| 2/4 |
Finished proof of proposition 5.21. Example of a linear
transformation T in L(V) such that V does not have
a basis consisting of eigenvectors of T, namely V = C^2,
T(x,y) = (y,0). Started Chapter 6 on inner product
spaces: some motivation from the dot product in R^n. Basic
definitions.
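A quick check that this example works (my computation): T(x, y) = lambda (x, y) means

$$
y = \lambda x \quad \text{and} \quad 0 = \lambda y .
$$

If lambda is not 0 this forces y = 0 and then x = 0, so the only eigenvalue is 0, with eigenspace span{(1, 0)}. A basis of C^2 consisting of eigenvectors would require two linearly independent vectors in this one-dimensional eigenspace, which is impossible.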
|
| 2/7 |
Inner product spaces: immediate consequences of the
axioms. Orthogonal decomposition. The Cauchy-Schwarz
inequality. The triangle inequality.
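For the record, the two inequalities in symbols (standard statements): for u and v in an inner product space,

$$
|\langle u, v \rangle| \le \|u\|\,\|v\| \qquad \text{and} \qquad \|u + v\| \le \|u\| + \|v\| .
$$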
|
| 2/9 |
Orthonormal lists and their properties. Orthonormal
bases. Gram-Schmidt. As a corollary, every finite
dimensional inner-product space has an orthonormal basis.
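The Gram-Schmidt recursion, paraphrased: given a linearly independent list v_1, ..., v_m, set e_1 = v_1/||v_1|| and, for j = 2, ..., m,

$$
e_j = \frac{v_j - \sum_{k=1}^{j-1} \langle v_j, e_k \rangle e_k}{\Big\| v_j - \sum_{k=1}^{j-1} \langle v_j, e_k \rangle e_k \Big\|} .
$$

The resulting list e_1, ..., e_m is orthonormal and satisfies span(e_1, ..., e_j) = span(v_1, ..., v_j) for each j.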
|
| 2/11 |
Orthogonal complement. Orthogonal projection. Formula
for orthogonal projection onto U, given orthonormal basis of
U. Every linear operator T on a finite dimensional inner
product space V has upper triangular matrix with respect to
some basis of V.
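The projection formula referred to above: if e_1, ..., e_m is an orthonormal basis of U, then the orthogonal projection P_U onto U is given by

$$
P_U v = \sum_{j=1}^{m} \langle v, e_j \rangle\, e_j .
$$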
|
| 2/14 |
Interpretation of the Gram-Schmidt formula: In the induction
step we subtract from v_j its orthogonal
projection onto the span of the previous vectors.
Orthogonal projection onto U gives the nearest point.
Functionals. Adjoints of linear maps.
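The defining property of the adjoint, in symbols: for a linear map T : V -> W between inner product spaces, T* : W -> V is the unique map satisfying

$$
\langle T v, w \rangle = \langle v, T^* w \rangle \quad \text{for all } v \in V,\ w \in W .
$$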
|
| 2/16 |
Properties of adjoints. Matrices of linear maps between
inner product spaces. With respect to orthonormal bases, the
matrix of the adjoint of T is the conjugate transpose of the
matrix of T. Self-adjoint operators. Normal operators.
Introduction to the spectral theorem.
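In symbols: T is self-adjoint if T = T*, and normal if TT* = T*T (so every self-adjoint operator is normal). The matrix statement above says that, with respect to orthonormal bases,

$$
M(T^*) = \overline{M(T)}^{\,t} ,
$$

the conjugate transpose of M(T).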
|
| 2/18 |
Polarization identity. If T is self-adjoint, then T=0
if and only if <Tv,v> = 0 for all v. If T is normal,
then eigenvectors corresponding to distinct eigenvalues are
orthogonal. Statement of the (complex) spectral theorem.
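The polarization identity, with Axler's convention that the inner product is linear in the first slot: for u, v in a complex inner product space,

$$
\langle u, v \rangle = \tfrac{1}{4}\left( \|u+v\|^2 - \|u-v\|^2 + i\,\|u+iv\|^2 - i\,\|u-iv\|^2 \right) .
$$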
|
| 2/21 |
President's Day, no lecture.
|
| 2/23 |
Proof of the complex spectral theorem. Statement of the
real spectral theorem and warmup to its proof: real
polynomials.
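Statement of the complex spectral theorem, paraphrased: if V is a finite dimensional complex inner product space and T is an operator on V, then T is normal if and only if V has an orthonormal basis consisting of eigenvectors of T, i.e.

$$
T T^* = T^* T \iff M(T) \text{ is diagonal with respect to some orthonormal basis of } V .
$$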
|
| 2/25 |
Any self-adjoint operator has an eigenvalue. Proof of
the spectral theorem for operators on real inner product spaces.
|
| 2/28 |
Generalized eigenvectors. The null space of (T - lambda I)^j
equals the null space of (T - lambda I)^(dim V) whenever j is
greater than or equal to dim V. Nilpotent operators. Stated
theorem 8.10 and began the proof.
|
| 3/2 |
Proof of theorem 8.10. Multiplicity of an eigenvalue,
defined as the dimension of the null space of (T - lambda I)
to the power dim V. Theorem 8.10 shows that the multiplicity
equals the number of times lambda occurs on the diagonal of
M(T), if M(T) is upper triangular. Example: V is the
polynomials of degree at most 3, T is differentiation. We
calculated the multiplicity of 0 (it is 4) in two ways: from
the definition, and from an upper triangular matrix.
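Both computations, spelled out (my reconstruction of the example): with respect to the basis 1, x, x^2, x^3 of V, differentiation has the upper triangular matrix

$$
M(T) = \begin{pmatrix} 0 & 1 & 0 & 0 \\ 0 & 0 & 2 & 0 \\ 0 & 0 & 0 & 3 \\ 0 & 0 & 0 & 0 \end{pmatrix},
$$

on whose diagonal 0 appears four times; and directly from the definition, differentiating a polynomial of degree at most 3 four times gives 0, so null (T - 0I)^(dim V) = null T^4 = V, which has dimension 4.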
|
| 3/4 |
Sum of multiplicities equals the dimension of the
vector space. The Cayley-Hamilton theorem.
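The Cayley-Hamilton theorem in the form used here (complex case, paraphrased): if lambda_1, ..., lambda_m are the distinct eigenvalues of T with multiplicities d_1, ..., d_m, the characteristic polynomial of T is

$$
q(z) = (z - \lambda_1)^{d_1} \cdots (z - \lambda_m)^{d_m} ,
$$

and the theorem asserts that q(T) = 0.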
|
| 3/7 |
Decomposition theorem (thm 8.23): If T is an operator
on V, then V is the direct sum of subspaces U_i, where U_i is
the null space of (T - lambda_i I) to the power dim(V). Block
diagonal matrices and the
matrix of a decomposed operator. Theorem 8.28: for any
operator on a finite dimensional complex vector space V, there
exists a basis
consisting of generalized eigenvectors, such that the matrix
of T is block diagonal, and each block is upper triangular and
has the same number on the diagonal. Definition of minimal
polynomial.
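The decomposition in symbols (paraphrasing thm 8.23): if lambda_1, ..., lambda_m are the distinct eigenvalues of T on the finite dimensional complex vector space V, then

$$
V = U_1 \oplus \cdots \oplus U_m, \qquad U_i = \operatorname{null}\,(T - \lambda_i I)^{\dim V} ,
$$

and each U_i is invariant under T; choosing a basis adapted to this decomposition is what produces the block diagonal matrix.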
|
| 3/9 |
Minimal polynomial: The minimal polynomial of T divides a
polynomial p if and only if p(T) = 0. By Cayley-Hamilton, the
minimal polynomial divides the characteristic polynomial.
Jordan normal form: proof in the case T is nilpotent.
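For reference, the shape being established (standard definition, not verbatim from the lecture): a Jordan block is a square matrix of the form

$$
\begin{pmatrix}
\lambda & 1 & & \\
 & \lambda & \ddots & \\
 & & \ddots & 1 \\
 & & & \lambda
\end{pmatrix},
$$

with lambda on the diagonal and 1 on the superdiagonal; Jordan normal form provides a basis making M(T) block diagonal with blocks of this shape, and in the nilpotent case treated today every lambda is 0.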
|
| 3/11 |
Last lecture! Finished the proof of Jordan normal form. Said
goodbye and wished everyone good luck on the exam.
|