During Linear Algebra on Monday, I began class by answering homework questions. I clarified how two different-looking parameterizations can represent the same general solution to a linear system. Somehow the discussion turned to applications of the techniques we are learning. Thus far, we have basically covered how to use Gauss-Jordan elimination to solve linear systems. I pointed to examples from engineering (such as structural analysis of trusses) and computational fluid dynamics (discretization procedures for solving PDEs). I understand that these are a little outside the scope of this class, but I chose them because the students asking have particular interests in mechanical engineering and aeronautics.
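For anyone curious what Gauss-Jordan elimination looks like computationally, here is a minimal sketch in Python. This is my own illustration, not anything we wrote in class; the function name and the example system are made up for the demonstration.

```python
def rref(M):
    """Reduce a matrix (a list of row lists of floats) to reduced row echelon form."""
    M = [row[:] for row in M]          # work on a copy
    rows, cols = len(M), len(M[0])
    pivot_row = 0
    for col in range(cols):
        if pivot_row >= rows:
            break
        # find a row with a nonzero entry in this column to serve as the pivot
        candidates = [r for r in range(pivot_row, rows) if abs(M[r][col]) > 1e-12]
        if not candidates:
            continue
        M[pivot_row], M[candidates[0]] = M[candidates[0]], M[pivot_row]
        # scale the pivot row so the pivot entry is 1
        p = M[pivot_row][col]
        M[pivot_row] = [x / p for x in M[pivot_row]]
        # eliminate the pivot column from every other row
        for r in range(rows):
            if r != pivot_row:
                f = M[r][col]
                M[r] = [a - f * b for a, b in zip(M[r], M[pivot_row])]
        pivot_row += 1
    return M

# Augmented matrix for the system x + 2y = 5, 3x + 4y = 6
print(rref([[1.0, 2.0, 5.0], [3.0, 4.0, 6.0]]))
# → [[1.0, 0.0, -4.0], [0.0, 1.0, 4.5]], i.e. x = -4, y = 4.5
```

The row swap and the small tolerance check are there so the routine doesn't divide by a (near-)zero pivot, which is exactly the kind of detail that separates the blackboard algorithm from a working one.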

After chasing a couple of rabbits, we continued working with operations on matrices, namely matrix addition, scalar multiplication, and matrix multiplication. We went over the properties of these operations, such as commutativity of addition, associativity of all three, the distributive laws, etc. We proved a couple of these.
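Those properties are also easy to sanity-check numerically. Here is a quick sketch of my own (the matrices are arbitrary examples, not from the lecture) that confirms commutativity of addition, associativity of multiplication, and a distributive law on a few small matrices:

```python
def add(A, B):
    # entrywise sum of two matrices of the same shape
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def mul(A, B):
    # the (i, j) entry is the dot product of row i of A with column j of B
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
C = [[9, 0], [1, 2]]

print(add(A, B) == add(B, A))                           # A + B = B + A
print(mul(mul(A, B), C) == mul(A, mul(B, C)))           # (AB)C = A(BC)
print(mul(A, add(B, C)) == add(mul(A, B), mul(A, C)))   # A(B + C) = AB + AC
# all three print True
```

Of course, checking examples is not a proof, which is exactly the distinction I want the class to absorb.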

I recognize that many of these students have little or no background in formulating a formal proof. In light of this, I chose to first prove that matrix addition commutes. Then I showed them one of the longer (though not really much more difficult) proofs: the associativity of matrix multiplication. The real challenge was helping them see, through the cumbersome notation, that it simply hinges on the associativity of multiplication of real numbers. I'm not sure that the next time I teach this class I will want to bother with this proof. I think it is important that they see proofs of these fundamental concepts, but I can accomplish that with a couple of simpler ones.
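For my own notes, the heart of that proof fits in one line. This is my compressed version, not the exact form presented in class:

```latex
% Key step: compare the (i, l) entries of (AB)C and A(BC).
\[
\bigl((AB)C\bigr)_{il}
  = \sum_{k}\Bigl(\sum_{j} a_{ij} b_{jk}\Bigr) c_{kl}
  = \sum_{j}\sum_{k} a_{ij}\,(b_{jk} c_{kl})
  = \sum_{j} a_{ij}\Bigl(\sum_{k} b_{jk} c_{kl}\Bigr)
  = \bigl(A(BC)\bigr)_{il},
\]
% The middle equalities use only the associativity of multiplication of
% real numbers and the distributive law (to interchange the finite sums).
```

Everything else in the classroom version is bookkeeping about indices and dimensions, which is precisely what buried the idea for the students.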

We just got into inverses and identities and will finish up this section on matrix algebra next time. We'll then be ready to start talking about partitioning matrices and doing some applications, such as traffic flow, balancing chemical equations, and search engines.
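Looking ahead to inverses, here is a quick check I may use in class: compute a 2x2 inverse with the familiar 1/(ad - bc) formula and verify that multiplying gives the identity. Again, an illustration of my own, not material from the lecture:

```python
def inverse_2x2(M):
    # inverse of [[a, b], [c, d]] via the adjugate divided by the determinant
    (a, b), (c, d) = M
    det = a * d - b * c
    if det == 0:
        raise ValueError("matrix is not invertible")
    return [[d / det, -b / det], [-c / det, a / det]]

def mul(A, B):
    # 2x2 matrix product
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[2.0, 1.0], [5.0, 3.0]]
print(mul(A, inverse_2x2(A)))  # → [[1.0, 0.0], [0.0, 1.0]], the identity
```

The determinant-zero branch is a natural hook for the upcoming discussion of when an inverse exists at all.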
