We study
u' = Au. (*)
Again, we'll try for exponential solutions. Unfortunately, e^{lambda t} is not a vector-valued function. To make it one we multiply by a constant vector, say c: so we look for solutions of the form e^{lambda t}c. Equivalently, we are looking for solutions which lie along a straight line (as long as we require lambda to be real - an assumption that will prove too restrictive, as we'll see).
Now (e^{lambda t}c)' = lambda (e^{lambda t}c) and A(e^{lambda t}c) = e^{lambda t}Ac, so we are requiring that Ac = lambda c. This is surely satisfied by c = 0, but this case is uninteresting. What are we asking if we require c to be nonzero?
We'll use the principle that [a b; c d][c1; c2] = c1[a;c] + c2[b;d]: the product Bv is a linear combination of the columns of B. So Bv = 0 has a nontrivial solution exactly when the columns of B are linearly dependent - that is, exactly when det B = 0.
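As a quick numerical sanity check of this principle (a sketch using numpy; the matrix B below is just an illustrative example, not one from the text): when det B = 0 the columns are dependent, and a nontrivial null vector exists.

```python
import numpy as np

# B has dependent columns: the second column is twice the first.
B = np.array([[1.0, 2.0],
              [2.0, 4.0]])

print(np.linalg.det(B))   # 0 (up to rounding): columns are dependent

# A nontrivial solution of Bv = 0: v = [2, -1].
v = np.array([2.0, -1.0])
print(B @ v)              # the zero vector
```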
Time for an example: x' = x + 3y, y' = 3x + y: A = [1 3; 3 1]. I can rewrite Ac = lambda c as [1-lambda 3; 3 1-lambda]c = 0. So we'll require that
det(A - lambda I) = 0. (**)
This determinant is the "characteristic polynomial" of A. In our case it comes to lambda^2 - 2 lambda - 8, which has roots lambda1 = 4, lambda2 = -2. These roots are the "eigenvalues" of A.
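We can confirm these eigenvalues numerically (a sketch using numpy): the roots of the characteristic polynomial lambda^2 - 2 lambda - 8 agree with the eigenvalues computed directly from A.

```python
import numpy as np

A = np.array([[1.0, 3.0],
              [3.0, 1.0]])

# Roots of the characteristic polynomial lambda^2 - 2 lambda - 8:
print(np.roots([1, -2, -8]))   # 4 and -2

# The same values, computed as eigenvalues of A:
print(np.linalg.eigvals(A))    # 4 and -2
```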
Already we can say quite a lot about the solutions to (*): there are straight-line solutions of the form e^{4t}c1 and e^{-2t}c2, and the general solution is a linear combination of these.
To find c1 and c2 - the "eigenvectors" - we substitute each eigenvalue into (A - lambda I)c = 0 and solve:
lambda1 = 4 leads to c1 = [1;1] (or any nonzero multiple),
and
lambda2 = -2 leads to c2 = [1;-1] (or any nonzero multiple).
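A quick check of these eigenvectors (a numpy sketch): each should satisfy Ac = lambda c.

```python
import numpy as np

A = np.array([[1.0, 3.0],
              [3.0, 1.0]])
c1 = np.array([1.0, 1.0])    # eigenvector for lambda1 = 4
c2 = np.array([1.0, -1.0])   # eigenvector for lambda2 = -2

print(A @ c1)    # equals 4 * c1
print(A @ c2)    # equals -2 * c2
print(c1 @ c2)   # 0: the two eigenvectors are perpendicular
```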
(General fact: symmetric matrices have perpendicular eigenvectors, and real eigenvalues.)
So there are straight line solutions e^{4t}[1;1] and e^{-2t}[1;-1], and the general solution is a e^{4t}[1;1] + b e^{-2t}[1;-1]; that is x = ae^{4t} + be^{-2t}, y = ae^{4t} - be^{-2t}.
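We can verify the general solution directly (a numpy sketch, with values of a and b chosen only for illustration): differentiating u(t) = a e^{4t}[1;1] + b e^{-2t}[1;-1] term by term gives u'(t) = 4a e^{4t}[1;1] - 2b e^{-2t}[1;-1], which should agree with A u(t) at every t.

```python
import numpy as np

A = np.array([[1.0, 3.0],
              [3.0, 1.0]])
c1 = np.array([1.0, 1.0])
c2 = np.array([1.0, -1.0])
a, b = 2.0, -3.0             # arbitrary constants, for illustration only

def u(t):
    """General solution a e^{4t} c1 + b e^{-2t} c2."""
    return a * np.exp(4 * t) * c1 + b * np.exp(-2 * t) * c2

def du(t):
    """Its exact derivative."""
    return 4 * a * np.exp(4 * t) * c1 - 2 * b * np.exp(-2 * t) * c2

for t in [0.0, 0.5, 1.0]:
    print(np.allclose(du(t), A @ u(t)))   # True at every t: u' = Au
```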
The straight-line solutions lie along the "eigenspaces": E_4, the multiples of [1;1], and E_{-2}, the multiples of [1;-1]. Since the e^{-2t} component decays while the e^{4t} component grows, other solutions trace out curves that pull away from E_{-2} and become parallel to E_4. We can draw the "phase plane."
Geometrically, A multiplies by 4 along E_4 and by -2 along E_{-2}, and does more complicated things elsewhere.
If we had started with a second order equation y'' + ay' + by = 0, with companion matrix A = [0 1; -b -a], then the characteristic polynomial of A is
det(A - lambda I) = (-lambda)(-a-lambda) + b = lambda^2 + a lambda + b,
that is, the characteristic polynomial of the companion matrix equals the characteristic polynomial of the higher order linear operator.
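As a check (a numpy sketch with sample coefficients a = 3, b = 2, chosen only for illustration): the eigenvalues of the companion matrix should be exactly the roots of lambda^2 + a lambda + b.

```python
import numpy as np

a, b = 3.0, 2.0                 # sample coefficients for y'' + 3y' + 2y = 0
A = np.array([[0.0, 1.0],
              [-b, -a]])        # companion matrix [0 1; -b -a]

print(np.sort(np.linalg.eigvals(A)))   # -2 and -1
print(np.sort(np.roots([1, a, b])))    # the same roots of lambda^2 + 3 lambda + 2
```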