Summaries


Lecture summaries:
  1. Monday, September 22. Introduction. Informal definition of a field, examples: Q, R, C, the field of two elements. Definition of a vector space, examples. Definition of a linear transformation (see Axler Chapter 3), matrices as examples of linear transformations.
  2. Wednesday, September 24. Examples of linear transformations; the notions of injective, surjective, and bijective maps. Subspaces: definitions and examples, including the kernel and image of a linear map, and lines or planes through the origin in 3-space. Direct sum: the abstract direct sum (more general than the notion in Axler); R^m direct sum R^n is isomorphic to R^{m+n}. Special case: the "internal" direct sum, as in Axler.
  3. Friday, September 26: More on direct sums. A vector space V is the internal direct sum of U1 and U2 exactly when U1 + U2 = V and the intersection of U1 and U2 is trivial. The proof used a lemma: if T : V --> W is a linear map, then T is injective if and only if null(T) = 0. Span of a list of vectors; examples.
  4. Monday, September 29: Span and linear independence. A basis is a list that is spanning and linearly independent. A maximal linearly independent list, or a minimal spanning list, is a basis. Bases exist in any finite-dimensional vector space. (Finite-dimensional means: some finite list of vectors spans.) Examples of vector spaces without a countable basis.
  5. Wednesday, October 1. Bases, continued: the Steinitz lemma; any two bases have the same length. (Theorem 2.14 in Axler; page 12 of Katznelson.) Definition of dimension. Any finite-dimensional vector space is isomorphic to F^n, where n is the dimension. If V is finite-dimensional and T : V --> V is injective, then T is also surjective.
  6. Friday, October 3: Statement and proof of the rank-nullity theorem, Theorem 3.4 in Axler, relating the dimension of the null space, the dimension of the image, and the dimension of the "source space". (Be sure you understand this proof!) Application: polynomial interpolation (consider the evaluation map as a linear map from a space of polynomials to R^n). I gave a very confusing discussion of inverting algebraic numbers, which you should feel free to ignore.
  7. Monday, October 6: Linear maps (Chapter 3 of Axler). Examples, including differentiation, translation, and multiplication on polynomials. Linear maps from V to W form a vector space in their own right. Linear maps can be composed. Composition is NOT COMMUTATIVE (example: multiplication by x and differentiation do not commute on the space of polynomials).
  8. Wednesday, October 8: A linear map is bijective if and only if it is invertible as a linear map. This holds if and only if it carries some basis to a basis, and also if and only if it carries every basis to a basis. Two finite-dimensional vector spaces are isomorphic if and only if they have the same dimension. Linear maps can be written, in terms of bases, as matrices; this identification carries composition of linear maps to multiplication of matrices. The dimension of L(V, W) is (dim V) x (dim W). All in Chapter 3 of Axler, although not quite in the same order.
  9. Friday, October 10: Dual spaces! NOT IN AXLER. A finite-dimensional space and its dual have the same dimension. A space is naturally isomorphic to its double dual. "Naturally" means "without making choices." The adjoint of a linear map goes between the duals, in the opposite direction. Definition of a bilinear form. Composition of linear maps, as an example of a bilinear map from L(V, W) x L(W, X) to L(V, X).
  10. Monday, October 13. (Not examinable.) More on the definition of a bilinear form. A bilinear map V x W --> F gives rise to a map V --> W* and a map W --> V*. If one is an isomorphism, so is the other; we then call the pairing a perfect pairing. The bilinear map V x V* --> F given by evaluation is a perfect pairing. This is a symmetric version of duality. (Examinable again:) Definition of the adjoint of a linear map; see Katznelson Chapter 3.
  11. Wednesday October 15. The dual basis. The matrix of the adjoint of T is the transpose of the matrix of T. Informal discussion of determinants.
  12. Friday, October 17: Midterm.
  13. Monday, October 20. (Mainly motivation for Katznelson 4.3.) We want to define the determinant as a "volume distortion factor." The volume of a parallelepiped in R^n is multilinear up to signs and symmetric in the inputs. The "signed" volume is multilinear and alternating in the inputs. These properties alone allow us to compute it explicitly. This motivates the definition of an alternating n-form on an n-dimensional vector space. Statement of the theorem (first form): alternating n-forms are in bijection with the field F.
  14. Wednesday, October 22: Proof that alternating n-forms are in bijection with F. Some -- possibly confusing?? -- comments about the signs of orderings of (1, 2, ..., n); we'll revisit this issue. Katznelson 4.3, 4.1.
  15. Friday, October 24: A nonzero alternating n-form, evaluated at v1, ..., vn, is nonzero if and only if v1, ..., vn form a basis. Definition of the determinant in terms of alternating n-forms. Proof that det(A) det(B) = det(AB). Proof that det(A) is nonzero if and only if A is invertible.
  16. Monday, October 27. How to compute the determinant: (1) it is unchanged when you add a multiple of one row to another, and it switches sign when you swap two rows (same for columns); (2) cofactor expansion.
  17. Wednesday, October 29: Back to Axler, starting from Chapter 5! Our goal is now to study linear transformations, in particular, to find a basis of a vector space "adapted" to a linear transformation. Definition of eigenvectors, eigenvalues. Basic result: over the complex numbers, any linear transformation has an eigenvector. This need not be true over the real numbers; nor is it true that there always exists a basis of eigenvectors!
  18. Friday, October 31. (Still in Axler Chapter 5.) Second proof of the existence of an eigenvector for any transformation over C. Eigenvectors with distinct eigenvalues are linearly independent. So a linear transformation whose characteristic polynomial has n distinct roots has a basis of eigenvectors. (Note: characteristic polynomials are not described in Axler Chapter 5; skip ahead to Chapter 10 if you want a reference.)
  19. November lectures: I have not been posting summaries because Ilya Sherman kindly made his lecture notes available; see the main page. We have been covering Chapters 6 and 7 of Axler: inner products, self-adjoint operators, orthonormal bases, etc.
  20. Nov 21 -- Dec 1: We covered Chapter 8 of Axler: generalized eigenvectors and eigenspaces, the Cayley-Hamilton theorem, and the Jordan form. (We did not discuss the section on square roots.)
  21. Wednesday, December 3: Completion of the proof of Jordan form. Example of computing the Jordan form for a sample 3 x 3 matrix.
  22. Friday, December 5: Review of the various good bases for a linear operator. Example of computing the SVD, to show that its bases need not agree with the others.
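A few numerical illustrations of the results above follow; they use Python with numpy (not part of the course, and no substitute for the proofs), with made-up matrices throughout. First, the rank-nullity theorem from lecture 6, for a 3 x 4 matrix, i.e. a map T : R^4 --> R^3:

```python
import numpy as np

# A numerical sanity check (not a proof) of the rank-nullity theorem:
# dim null(T) + dim range(T) = dim of the source space.
A = np.array([[1., 2., 0., 1.],
              [0., 1., 1., 0.],
              [1., 3., 1., 1.]])   # row 3 = row 1 + row 2, so the rank is 2

rank = np.linalg.matrix_rank(A)    # dimension of the image
_, s, vt = np.linalg.svd(A)        # full SVD: vt is 4 x 4
null_basis = vt[np.sum(s > 1e-10):]            # rows spanning null(A)
assert np.allclose(A @ null_basis.T, 0)        # they really map to 0
print(rank, len(null_basis), A.shape[1])       # 2 + 2 == 4
```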
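Lecture 6's interpolation application can be illustrated the same way: a nonzero polynomial of degree < n has fewer than n roots, so the evaluation map at n distinct points is injective, hence (by rank-nullity) invertible. A sketch with three made-up points and values:

```python
import numpy as np

# Any prescribed values at 3 distinct points are matched by a unique
# polynomial of degree < 3; polyfit with deg = 2 recovers it exactly.
xs = np.array([0., 1., 2.])        # distinct sample points
ys = np.array([1., 3., 11.])       # arbitrary target values
coeffs = np.polynomial.polynomial.polyfit(xs, ys, deg=2)
assert np.allclose(np.polynomial.polynomial.polyval(xs, coeffs), ys)
```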
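Lecture 7's non-commutativity example can be checked with numpy's polynomial class. In fact the two compositions differ by exactly p, i.e. D(xp) - x Dp = p for every polynomial p:

```python
import numpy as np
from numpy.polynomial import Polynomial as P

# Multiplication by x and differentiation do not commute on polynomials.
p = P([3., 0., 2.])            # a sample polynomial, 3 + 2x^2
x = P([0., 1.])                # the polynomial "x"

lhs = (x * p).deriv()          # first multiply by x, then differentiate
rhs = x * p.deriv()            # first differentiate, then multiply by x
print(lhs - rhs)               # equals p itself, so the two maps differ
```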
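Lecture 11's statement that the adjoint is represented by the transpose can be checked numerically once R^3 is identified with its dual via the dot product (an identification the lecture makes through dual bases):

```python
import numpy as np

# With the dot-product identification, the adjoint of A acts as A^T:
# (Av) . w = v . (A^T w) for all vectors v, w.
rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))            # a random sample matrix
v, w = rng.standard_normal(3), rng.standard_normal(3)
assert np.isclose((A @ v) @ w, v @ (A.T @ w))
```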
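The two determinant facts proved in lecture 15 can be spot-checked on random matrices (an illustration, not a proof):

```python
import numpy as np

# det(AB) = det(A) det(B), and det vanishes for a non-invertible matrix.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))
assert np.isclose(np.linalg.det(A @ B), np.linalg.det(A) * np.linalg.det(B))

S = np.zeros((4, 4))               # a visibly non-invertible matrix
assert np.isclose(np.linalg.det(S), 0.0)
```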
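Lecture 16's computation rules, checked on a sample matrix:

```python
import numpy as np

# Adding a multiple of one row to another leaves the determinant
# unchanged; swapping two rows flips its sign.
A = np.array([[2., 1., 0.],
              [1., 3., 1.],
              [0., 1., 4.]])
B = A.copy(); B[1] += 5 * B[0]     # row operation: row 2 += 5 * row 1
C = A[[1, 0, 2]]                   # swap the first two rows
assert np.isclose(np.linalg.det(B), np.linalg.det(A))
assert np.isclose(np.linalg.det(C), -np.linalg.det(A))
```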
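Lecture 17's cautionary example over R: rotation by 90 degrees moves every nonzero vector off its own line, so it has no real eigenvector; its eigenvalues are the non-real numbers +-i.

```python
import numpy as np

# Rotation by 90 degrees in the plane: char. polynomial t^2 + 1.
R = np.array([[0., -1.],
              [1.,  0.]])
eigenvalues = np.linalg.eigvals(R)
print(eigenvalues)                 # both eigenvalues are non-real (+-i)
```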
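Lecture 18's conclusion, on a made-up 2 x 2 example: distinct eigenvalues give linearly independent eigenvectors, hence an eigenbasis, hence a diagonalization.

```python
import numpy as np

# A has distinct eigenvalues 2 and 3, so eig returns a basis of
# eigenvectors (the columns of V), and V diagonalizes A.
A = np.array([[2., 1.],
              [0., 3.]])
w, V = np.linalg.eig(A)
assert np.linalg.matrix_rank(V) == 2           # eigenvectors independent
assert np.allclose(np.linalg.inv(V) @ A @ V, np.diag(w))
```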
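The generalized eigenspaces of lectures 20-21, on the simplest non-diagonalizable example: the only eigenvalue of A below is 1, its eigenspace is 1-dimensional, but the generalized eigenspace null((A - I)^2) is all of R^2, matching a single 2 x 2 Jordan block.

```python
import numpy as np

# A single Jordan block with eigenvalue 1.
A = np.array([[1., 1.],
              [0., 1.]])
N = A - np.eye(2)
print(np.linalg.matrix_rank(N))        # 1: eigenspace has dimension 1
print(np.linalg.matrix_rank(N @ N))    # 0: (A - I)^2 = 0
```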
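Finally, lecture 22's point about the SVD: it writes A = U S V^T with orthonormal bases U and V, and for a non-symmetric sample matrix these generally differ from any eigenbasis.

```python
import numpy as np

# A has 1 as its only eigenvalue, yet its singular values are not 1,
# so the SVD bases cannot be an eigenbasis for A.
A = np.array([[1., 2.],
              [0., 1.]])
U, s, Vt = np.linalg.svd(A)
assert np.allclose(U @ np.diag(s) @ Vt, A)     # the factorization holds
print(s)                           # singular values 1 + sqrt(2), sqrt(2) - 1
```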

Akshay Venkatesh
Department of Mathematics Rm. 383-E
Stanford University
Stanford, CA
email: akshay at stanford math