Math 113 - Evolving Syllabus

Date    Material covered
9/23 Announcements. What the class is and what it isn't. Introduction: linear algebra is about vector spaces and linear maps. The first week or two will be about vector spaces, the rest of the class about linear maps. Properties of R and C: they are both fields (i.e. they satisfy the commutativity, associativity, identity, additive inverse, multiplicative inverse, and distributive properties listed for C on pages 2-3). In Axler's book, F denotes the real or complex numbers, but we shall work a little more generally and let it be any field. The axioms of a vector space over F. Elementary properties (pages 11-12) and examples.
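For reference, the vector space axioms over F, in the book's notation (u, v, w in V; a, b in F):
    u + v = v + u, \quad (u + v) + w = u + (v + w), \quad v + 0 = v, \quad v + (-v) = 0,
    1v = v, \quad a(bv) = (ab)v, \quad a(u + v) = au + av, \quad (a + b)v = av + bv.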
9/25 Subspaces of a vector space. New subspaces from old: the intersection of subspaces is again a subspace, the sum of subspaces. Examples. Direct sums of subspaces. Criteria for sums being direct (Prop 1.8 and 1.9). Span of a list of vectors, finitely generated vector spaces.
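In symbols, the two criteria (U, W and U_1, ..., U_n subspaces of V):
    U_1 + \cdots + U_n \text{ is direct} \iff \big[\, 0 = u_1 + \cdots + u_n \text{ with } u_j \in U_j \implies u_1 = \cdots = u_n = 0 \,\big],
    U + W \text{ is direct} \iff U \cap W = \{0\}.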
9/30 Linear combinations. Span of a list of vectors in a vector space. Finitely generated vector spaces. Linear independence and linear dependence. The "linear dependence lemma" and the (very important) Theorem 2.6: any spanning list of vectors in V is at least as long as any linearly independent list. Basis for a vector space. Dimension.
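Theorem 2.6 in symbols:
    (u_1, \dots, u_m) \text{ linearly independent in } V, \quad \operatorname{span}(w_1, \dots, w_n) = V \implies m \le n.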
10/2 Two bases for the same vector space have the same number of elements. Dimension of a finite dimensional vector space. Properties and applications of dimension. A subspace of a finite dimensional space is finite dimensional, and its dimension is at most the dimension of the surrounding space. To check whether some list is a basis for V it suffices to check either that it is spanning or that it is linearly independent, provided you already know it has the right number of elements. To check if V is the direct sum of some subspaces, it suffices to check either that the sum is direct or that the sum is all of V, provided you already know the sum of the dimensions of the subspaces is the dimension of V.
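In symbols, the two shortcuts, assuming dim V = n and dim U_1 + ... + dim U_m = dim V respectively:
    (v_1, \dots, v_n) \text{ spans } V \iff (v_1, \dots, v_n) \text{ is linearly independent} \iff (v_1, \dots, v_n) \text{ is a basis of } V,
    U_1 + \cdots + U_m = V \iff \text{the sum is direct} \iff V = U_1 \oplus \cdots \oplus U_m.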
10/7 Linear maps between vector spaces (the linear maps from V to W form a subspace of the space of all maps from V to W). Invertible maps. Isomorphic vector spaces. Isomorphic vector spaces have the same dimension. A linear map is invertible if and only if it is injective and surjective.
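In symbols, for T in L(V, W):
    T \text{ invertible} \iff T \text{ injective and surjective}; \qquad V \cong W \implies \dim V = \dim W.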
10/9 Null space and range of a linear map, and their relation to injectivity and surjectivity. An important formula (Theorem 3.4 in Axler) relating their dimensions, as well as several corollaries (to check that a linear map between finite dimensional spaces of the same dimension is invertible, it suffices to check either surjectivity or injectivity; the other property follows automatically).
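The formula of Theorem 3.4 (often called rank-nullity), for T in L(V, W) with V finite dimensional:
    \dim V = \dim \operatorname{null} T + \dim \operatorname{range} T.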
10/14 Matrices. The matrix of a linear map T from V to W with respect to bases of the two vector spaces. As a corollary, the dimension of the space of linear maps is (dim(V))(dim(W)). Matrix product and its relation to composition of linear maps.
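In symbols, with M(T) denoting the matrix of T with respect to fixed bases:
    \dim \mathcal{L}(V, W) = (\dim V)(\dim W), \qquad \mathcal{M}(S \circ T) = \mathcal{M}(S)\,\mathcal{M}(T).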
10/16 Polynomials with complex coefficients (this is chapter 4 of Axler's book, but we'll skip the parts about real coefficients for now): division with remainder, and the fact that complex polynomials can be factored into degree one factors. Operators, i.e. linear maps from V to itself. Polynomials applied to operators. Invariant subspaces for a linear operator.
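The factorization, for a polynomial p of degree m with complex coefficients:
    p(z) = c\,(z - \lambda_1) \cdots (z - \lambda_m), \qquad c, \lambda_1, \dots, \lambda_m \in \mathbf{C}.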
10/21 Eigenvectors and eigenvalues. Eigenvectors corresponding to distinct eigenvalues are automatically linearly independent. Bases consisting of eigenvectors, diagonal matrices, and several other equivalent conditions.
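In symbols: λ is an eigenvalue of T with eigenvector v when
    T v = \lambda v, \quad v \neq 0,
and eigenvectors v_1, ..., v_m corresponding to distinct eigenvalues λ_1, ..., λ_m form a linearly independent list.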
10/23 Criteria for existence of bases consisting of eigenvectors. Weaker results for general operators. Existence of eigenvalues for operators on complex finite dimensional vector spaces. Upper triangular matrices. For any operator on a complex finite dimensional vector space there exists a basis such that M(T) is upper triangular.
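The upper triangular form, with respect to a suitable basis:
    \mathcal{M}(T) = \begin{pmatrix} \lambda_1 & & * \\ & \ddots & \\ 0 & & \lambda_n \end{pmatrix}.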
10/28 If the matrix of an operator is upper triangular, the eigenvalues are precisely the entries on the diagonal. Generalized eigenvectors. Eigenvectors are generalized eigenvectors. Eigenspaces and generalized eigenspaces. Midterm 7pm - 9pm in 420-041.
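For reference: v ≠ 0 is a generalized eigenvector of T corresponding to λ when
    (T - \lambda I)^{j} v = 0 \text{ for some } j \ge 1 \quad \big(\text{equivalently, } (T - \lambda I)^{\dim V} v = 0\big),
and the generalized eigenspace corresponding to λ is null (T - λI)^{dim V}.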
10/30 The sum of generalized eigenspaces corresponding to distinct eigenvalues is direct (see HW6 for an outline of the argument presented in class). Dimension of generalized eigenspaces.
11/4 The dimension of the generalized eigenspace of an operator T is equal to the number of times the eigenvalue appears on the diagonal of the matrix of T when this matrix is upper triangular. Many consequences: this number is independent of the choice of basis; decomposition of an operator; Cayley-Hamilton; S+N decomposition.
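The resulting picture, for T an operator on a complex finite dimensional V with distinct eigenvalues λ_1, ..., λ_m:
    V = U_1 \oplus \cdots \oplus U_m, \quad U_j = \operatorname{null}(T - \lambda_j I)^{\dim V}, \qquad q(T) = 0 \text{ where } q(z) = \prod_{j=1}^{m} (z - \lambda_j)^{\dim U_j}.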
11/6 Lecture taught by Jeremy Miller. Inner product spaces, triangle inequality, Cauchy-Schwarz.
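The two inequalities, for u, v in an inner product space:
    |\langle u, v \rangle| \le \|u\|\,\|v\| \quad \text{(Cauchy-Schwarz)}, \qquad \|u + v\| \le \|u\| + \|v\| \quad \text{(triangle inequality)}.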
11/11 Orthogonal and orthonormal lists and bases. Orthogonal lists of non-zero vectors are automatically linearly independent. Formula for how to write a vector as a linear combination of the vectors in an orthonormal basis. Formula for the entries of the matrix of an operator, with respect to an orthonormal basis. Gram-Schmidt.
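The two formulas, for an orthonormal basis (e_1, ..., e_n) of V:
    v = \sum_{j=1}^{n} \langle v, e_j \rangle\, e_j, \qquad \mathcal{M}(T)_{j,k} = \langle T e_k, e_j \rangle.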
11/13 Orthogonal projections. Linear functionals in the presence of inner products. (Linear functionals are the elements in the vector space called V* in some of the homework problems.)
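In symbols, with (e_1, ..., e_m) an orthonormal basis of the subspace U and φ a linear functional on V:
    P_U\, v = \sum_{j=1}^{m} \langle v, e_j \rangle\, e_j, \qquad \varphi(v) = \langle v, u \rangle \text{ for a unique } u \in V.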
11/18 Adjoint and self-adjoint operators. Spectral theorem for self-adjoint operators on complex inner-product spaces (there is an orthonormal basis consisting of eigenvectors, and all the eigenvalues are real).
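In symbols (the first identity, required for all v and w, defines the adjoint T*):
    \langle T v, w \rangle = \langle v, T^{*} w \rangle, \qquad T = T^{*} \implies V \text{ has an orthonormal basis of eigenvectors of } T, \text{ all eigenvalues real}.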
11/20 The spectral theorem for self-adjoint operators on real inner-product spaces (there is an orthonormal basis consisting of eigenvectors). The spectral theorem for normal operators on complex inner-product spaces (there is an orthonormal basis consisting of eigenvectors).
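Here T normal means
    T T^{*} = T^{*} T,
and on a complex inner-product space the spectral theorem makes this condition equivalent to the existence of an orthonormal basis of eigenvectors.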
12/2 Positive operators (example: T*T is always positive, just as T* + T is always self-adjoint). Criteria for positivity (namely: self-adjoint with non-negative eigenvalues). A positive operator has a unique positive square root. Isometries. Polar decomposition for invertible operators.
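In symbols: T is positive when T = T* and ⟨Tv, v⟩ ≥ 0 for all v, and the polar decomposition of an invertible T is
    T = S\, \sqrt{T^{*} T}, \qquad S \text{ an isometry}.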
12/4 Review and overview of some topics from chapters 5-8.