Class Log

Friday 12/6/13: Orthogonal projection gives the closest point in a subspace. The least squares method.
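
The least squares method can be sketched in Python via the normal equations A^T A x = A^T b; the data points here are made up for illustration:

```python
# Least squares line fit y = c0 + c1*t through the (made-up) points
# (0, 1), (1, 2), (2, 4), by solving the normal equations A^T A x = A^T b.

ts = [0.0, 1.0, 2.0]
ys = [1.0, 2.0, 4.0]

# Design matrix A has rows (1, t); form A^T A (2x2) and A^T b (2-vector).
m00 = sum(1.0 for t in ts)               # sum of 1
m01 = sum(t for t in ts)                 # sum of t
m11 = sum(t * t for t in ts)             # sum of t^2
b0 = sum(y for y in ys)                  # sum of y
b1 = sum(t * y for t, y in zip(ts, ys))  # sum of t*y

# Solve the 2x2 system with Cramer's rule.
det = m00 * m11 - m01 * m01
c0 = (b0 * m11 - m01 * b1) / det  # best-fit intercept, 5/6
c1 = (m00 * b1 - m01 * b0) / det  # best-fit slope, 3/2
```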

Wednesday 12/4/13: 5.2 The Gram-Schmidt process.
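
A minimal sketch of the Gram-Schmidt process (the input vectors are arbitrary examples):

```python
import math

def gram_schmidt(vectors):
    """Orthonormalize a list of linearly independent vectors (lists of floats)."""
    basis = []
    for v in vectors:
        w = list(v)
        for u in basis:
            # Subtract the component of w along each earlier unit vector u.
            coeff = sum(a * b for a, b in zip(w, u))
            w = [a - coeff * b for a, b in zip(w, u)]
        # Normalize what remains to a unit vector.
        norm = math.sqrt(sum(a * a for a in w))
        basis.append([a / norm for a in w])
    return basis

q = gram_schmidt([[1.0, 1.0, 0.0], [1.0, 0.0, 1.0]])  # orthonormal pair in R^3
```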

Monday 12/2/13: 5.1 Orthonormal bases and orthogonal projections.

Friday 11/29/13: No meeting (Thanksgiving).

Wednesday 11/27/13: The linear algebra of Google search.
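
The idea behind Google's PageRank is power iteration on a damped, column-stochastic link matrix; a toy sketch on a made-up three-page web:

```python
# Toy PageRank: power iteration on a damped link structure.
# Made-up web: page 0 links to 1, page 1 links to 2, page 2 links to 0 and 1.
links = {0: [1], 1: [2], 2: [0, 1]}
n = 3
d = 0.85  # damping factor used in the standard PageRank model

rank = [1.0 / n] * n
for _ in range(100):
    new = [(1 - d) / n] * n
    for page, outs in links.items():
        share = rank[page] / len(outs)   # page splits its rank among its links
        for q in outs:
            new[q] += d * share
    rank = new
```

Since each step redistributes the total rank, the vector stays a probability distribution, and the iteration converges to the dominant eigenvector of the damped matrix.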

Monday 11/25/13: Discrete dynamical systems x(t+1)=Ax(t). Explicit formula for x(t) via diagonalization. Behaviour for large t. Examples.
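
A worked sketch of the explicit formula, on a small example chosen for illustration:

```python
# x(t+1) = A x(t) with A = [[1, 1], [0, 2]]: eigenvalues 1 and 2 with
# eigenvectors (1, 0) and (1, 1).  Writing x(0) = (3, 2) = 1*(1,0) + 2*(1,1)
# gives the closed form x(t) = 1**t * (1, 0) + 2 * 2**t * (1, 1).

def step(x):
    return [x[0] + x[1], 2 * x[1]]  # multiply by A

x = [3, 2]
for t in range(10):
    x = step(x)

closed = [1 + 2 * 2**10, 2 * 2**10]  # the diagonalization formula at t = 10
```

For large t the 2^t term dominates, so x(t) grows along the direction of the eigenvector (1, 1).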

Friday 11/22/13: Computing powers of a matrix via diagonalization.
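
A sketch checking a closed form for A^n obtained by diagonalization against repeated multiplication (the matrix is an arbitrary example):

```python
# A = [[2, 1], [1, 2]] has eigenvalues 3 and 1 with eigenvectors (1, 1) and
# (1, -1).  From A = S D S^{-1} one gets the closed form
# A^n = (1/2) [[3^n + 1, 3^n - 1], [3^n - 1, 3^n + 1]].

def mat_mul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[2, 1], [1, 2]]
P = [[1, 0], [0, 1]]
n = 8
for _ in range(n):
    P = mat_mul(P, A)  # repeated multiplication: P = A^n

closed = [[(3**n + 1) // 2, (3**n - 1) // 2],
          [(3**n - 1) // 2, (3**n + 1) // 2]]
```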

Wednesday 11/20/13: Complex eigenvalues. If A is a 2x2 matrix with complex (not real) eigenvalues then there is a basis B of R2 such that the B-matrix of the linear transformation T(x)=Ax is a rotation-scaling matrix.
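
A rotation-scaling matrix [[a, -b], [b, a]] scales by r = sqrt(a^2 + b^2) and rotates by theta = atan2(b, a); a quick numerical check with a = b = 1:

```python
import math

a, b = 1.0, 1.0
r = math.hypot(a, b)       # scaling factor, sqrt(2)
theta = math.atan2(b, a)   # rotation angle, pi/4

# r times the standard rotation matrix should reproduce [[a, -b], [b, a]].
rot_scale = [[r * math.cos(theta), -r * math.sin(theta)],
             [r * math.sin(theta),  r * math.cos(theta)]]
```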

Monday 11/18/13: Eigenspaces, algebraic and geometric multiplicity of an eigenvalue.

Friday 11/15/13: For A an nxn matrix, the characteristic polynomial fA(x)=det(A-xI) is a polynomial of degree n. So A has at most n eigenvalues. If A has n different eigenvalues then A is diagonalizable (there is a basis of Rn consisting of eigenvectors of A).

Wednesday 11/13/13: Eigenvectors and eigenvalues of 2x2 matrices.

Monday 11/11/13: No class (Veterans Day).

Friday 11/8/13: 7.2 Finding the eigenvalues of a matrix. A real number c is an eigenvalue of a square matrix A <=> det(A-cI)=0 (the "characteristic equation"). We can solve this equation for c, then solve (A-cI)v=0 to find the corresponding eigenvectors v.
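
For a 2x2 matrix the characteristic equation is c^2 - trace(A)c + det(A) = 0, so the quadratic formula gives the eigenvalues; a sketch on an example matrix:

```python
import math

A = [[4, 1], [2, 3]]
tr = A[0][0] + A[1][1]                       # trace = 7
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]  # determinant = 10

# Solve the characteristic equation c^2 - tr*c + det = 0.
disc = math.sqrt(tr * tr - 4 * det)
c1 = (tr + disc) / 2   # 5.0
c2 = (tr - disc) / 2   # 2.0

# Eigenvector for c1 = 5: (A - 5I)v = 0 is [[-1, 1], [2, -2]]v = 0,
# so v = (1, 1).  Check that A v = 5 v.
v = [1.0, 1.0]
Av = [A[0][0] * v[0] + A[0][1] * v[1],
      A[1][0] * v[0] + A[1][1] * v[1]]
```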

Wednesday 11/6/13: 7.1 Eigenvalues and eigenvectors. If T:Rn -> Rn is a linear transformation we say a nonzero vector v in Rn is an eigenvector of T with eigenvalue c if T(v)=cv. Examples.

Monday 11/4/13: det(AB)= det(A)det(B). Laplace expansion of a determinant along a row or column.

Friday 11/1/13: General formula for nxn determinant using permutations (called "patterns" in the textbook). Computing determinants using Gaussian elimination.
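
Computing a determinant by Gaussian elimination can be sketched as follows: reduce to upper triangular form, flipping the sign on each row swap, then multiply the diagonal:

```python
def det_gauss(M):
    """Determinant via Gaussian elimination with partial pivoting."""
    A = [row[:] for row in M]
    n = len(A)
    sign = 1.0
    for col in range(n):
        # Pick the largest available pivot in this column (partial pivoting).
        pivot = max(range(col, n), key=lambda r: abs(A[r][col]))
        if abs(A[pivot][col]) < 1e-12:
            return 0.0  # a zero column below the diagonal: singular
        if pivot != col:
            A[col], A[pivot] = A[pivot], A[col]
            sign = -sign  # a row swap negates the determinant
        for r in range(col + 1, n):
            factor = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= factor * A[col][c]
    # Determinant of a triangular matrix = product of diagonal entries.
    prod = sign
    for i in range(n):
        prod *= A[i][i]
    return prod

d = det_gauss([[2.0, 1.0, 3.0], [0.0, 4.0, 1.0], [1.0, 2.0, 2.0]])  # 1.0
```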

Wednesday 10/30/13: 6.1 Introduction to determinants. A square matrix A is invertible <=> det A is not equal to zero. |det A| is the "expansion factor" of the associated linear transformation. Algebraic formula for 2x2 and 3x3 determinants (Sarrus' rule).

Monday 10/28/13: Change of basis for linear spaces.

Friday 10/25/13: 4.3 B-matrix of a linear transformation from a linear space to itself.

Wednesday 10/23/13: Kernel and image. Criteria for a linear transformation to be an isomorphism.

Monday 10/21/13: Infinite dimensional linear spaces. 4.2 Linear transformations between linear spaces. Examples.

Friday 10/18/13: 4.1 Linear spaces. Definition and examples.

Wednesday 10/16/13: Diagonalization of a square matrix A. Computing powers of A via diagonalization.

Monday 10/14/13: The B-matrix of a linear transformation T : Rn -> Rn for a basis B of Rn.

Friday 10/11/13: 3.4 Coordinates. The coordinates for a subspace determined by a basis of the subspace. Examples. Simplifying the formula for a linear transformation by choosing good coordinates.

Wednesday 10/9/13: 3.3 Computing a basis of the image and kernel of a linear transformation. For a linear transformation T : Rn -> Rm, T(x)=Ax, the dimension of the image equals the number of pivots of RREF(A) (the rank of A) and the dimension of the kernel equals n-rank(A).

Monday 10/7/13: Linearly independent vectors. Basis of a subspace. Dimension.

Friday 10/4/13: Computing the kernel and the image of a linear transformation. 3.2 Subspaces of Rn. Examples. Formal definition. Image and kernel are subspaces.

Wednesday 10/2/13: 3.1 The image and kernel of a linear transformation. The image is the span of the columns of the matrix of the linear transformation. For a linear transformation T, T(v)=T(w) precisely when the difference v-w lies in the kernel of T. In particular, T is one-to-one precisely when the kernel of T equals {0}.

Monday 9/30/13: The inverse of a 2x2 matrix. The inverse of a product of matrices.

Friday 9/27/13: 2.4 Inverse of a linear transformation. An invertible matrix is square. A square matrix is invertible precisely when the number of pivots is equal to the number of rows or columns. Algorithm to compute the inverse of a matrix A using Gaussian elimination applied to the matrix (A I).
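
The (A I) algorithm can be sketched in Python: row-reduce the augmented matrix until the left half is the identity, and the right half is then A^{-1}:

```python
def inverse(M):
    """Invert a square matrix by Gauss-Jordan elimination on (A | I)."""
    n = len(M)
    # Augment A with the identity matrix on the right.
    A = [row[:] + [1.0 if i == j else 0.0 for j in range(n)]
         for i, row in enumerate(M)]
    for col in range(n):
        # Partial pivoting for numerical stability.
        pivot = max(range(col, n), key=lambda r: abs(A[r][col]))
        if abs(A[pivot][col]) < 1e-12:
            raise ValueError("matrix is not invertible")
        A[col], A[pivot] = A[pivot], A[col]
        p = A[col][col]
        A[col] = [x / p for x in A[col]]          # scale pivot row to 1
        for r in range(n):
            if r != col:                           # clear the rest of the column
                f = A[r][col]
                A[r] = [x - f * y for x, y in zip(A[r], A[col])]
    # Left half is now I; the right half is the inverse.
    return [row[n:] for row in A]

inv = inverse([[1.0, 2.0], [3.0, 4.0]])  # approximately [[-2, 1], [1.5, -0.5]]
```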

Wednesday 9/25/13: 2.3 Matrix products. Composition of linear transformations corresponds to product of matrices. Matrix multiplication is not commutative in general.
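
A quick check of non-commutativity on a shear and a rotation (the matrices are standard examples):

```python
# (AB)x = A(Bx), so the product AB means "apply B first, then A".
def mul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[1, 1], [0, 1]]   # horizontal shear
B = [[0, -1], [1, 0]]  # rotation by 90 degrees

AB = mul(A, B)  # rotate, then shear
BA = mul(B, A)  # shear, then rotate
```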

Monday 9/23/13: 2.2 Linear transformations in geometry. Scaling, orthogonal projection, rotation, reflection, shear.

Friday 9/20/13: Examples of linear transformations.

Wednesday 9/18/13: 2.1 Linear transformations. Definition in terms of matrix. The columns of the matrix are the images of the standard basis vectors. Showing the effect of a 2x2 linear transformation by drawing the image of the unit square.

Monday 9/16/13: 1.3 Second geometric picture of system of linear equations in terms of linear combinations of vectors. The rank of a matrix.

Friday 9/13/13: 1.3 Vectors: addition, scalar multiplication, dot product. Product of a matrix and a vector. Matrix form of system of linear equations.

Wednesday 9/11/13: 1.2 Gaussian elimination (continued).

Monday 9/9/13: 1.2 Gaussian elimination.

Friday 9/6/13: 1.1 More examples. Expected number of solutions.

Wednesday 9/4/13: 1.1 Solving systems of linear equations. Geometric picture.