Topic outline

  • General

    • Forum Description: This forum is available for everyone to post messages to, but it is aimed mainly at students to discuss the module amongst themselves. Students are therefore encouraged to post to this forum and should feel free to reply to other students if they can.

  • Week 1

    • This is an external YouTube link to a fun tutorial; the first half hour is mostly revision at the level of Linear Algebra I. Ignore the bit about \(L^p\) norms as that will be too advanced for now, but we will touch upon \(L^2\) by the end of our course. The full tutorial is intended for people working in neural networks, so do ignore the last part ... or keep watching for a bit of fun to see linear algebra in action in the real world.

    • LECTURE 0 (11 am Monday)

       This is normally a tutorial slot to cover the preceding week, but in Week 1 we will use it to introduce the module structure and our way of working, and to take a first look at the May 2019 exam so you can get a sense of what to aim for. We will also discuss how this module follows on from Linear Algebra I.


      LECTURE 1 LESSON PLAN

      Rough outline of the module. Definitions: field, vector space. Terminology. Why this generality? Examples: vectors in \(\Bbb R^2\); more generally \(\Bbb K^n\) for an arbitrary field \(\Bbb K\) (verify a couple of the axioms for this example); functions \(S \to \Bbb R\); polynomials of degree at most \(n - 1\).
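
      (As a preview of the sort of check involved, with the example \(\Bbb K^2\) chosen here just for brevity, one of the axioms, distributivity of scalar multiplication over vector addition, reads \[ \lambda\big((u_1,u_2)+(v_1,v_2)\big)=\lambda(u_1+v_1,\,u_2+v_2)=(\lambda u_1+\lambda v_1,\,\lambda u_2+\lambda v_2)=\lambda(u_1,u_2)+\lambda(v_1,v_2), \] which holds because the field axioms of \(\Bbb K\) apply in each component.)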

      LECTURE 2 LESSON PLAN

       Definitions: what it means for a list of vectors to be linearly dependent, independent or spanning, and what it means to form a basis. Examples. Span of a list of vectors. Finite dimensional vector space.

      LECTURE 3 LESSON PLAN

      Thinning out a dependent list of vectors (Lemma 1.12). Example.  Exchange Lemma (Lemma 1.13).  Theorem 1.5: Any two bases of a finite dimensional vector space have the same cardinality, etc.


      TASKS:

      (1) DOWNLOAD AND READ PRINTED LECTURES PAGES 1-6

      (2) CHECK OUT NON-ASSESSED CWK1 AND ITS SOLUTIONS (bring questions to next week's tutorial)

      (3) CHECK OUT MAY 2019 EXAM AND ITS SOLUTIONS (think about where your weaknesses are going to be and revise previous courses a bit in light of this)

      (4) CHECK OUT EXTERNAL YOUTUBE REVISION TUTORIAL


    • This was an on-campus exam from before the pandemic, and your January exam will have a broadly similar format.

      We will go through this exam in the introductory lecture but consider doing it yourself and marking it to see how you are at the start of the module. Try the same at the end of the module in December!

      Also revise Linear Algebra I in any areas of weakness that you find when we look through this exam, and/or check out the YouTube tutorial below.

    • Modern thinking is to have a go at the exam at the start of the course, expect to fail, look at the solutions (which will be puzzling at this stage), and then repeat the process at the end of the course. (I will also make some previous years' exams and their solutions available for revision.)
  • Week 2



    • COURSEWORK TUTORIAL

      Look over printed cwks 1 and 2 and their solutions and ask the module organiser and TA about these or about lectures.


      LECTURE 4 LESSON PLAN

      Coordinate representation of a vector relative to a basis. Any n-dimensional vector space over a field \(\Bbb K\) is isomorphic to (is essentially the same as) the vector space \(\Bbb K^n\). Translating between different bases, transition matrices and their properties (covered swiftly, as this is revision of material in Lin. Alg. I).

      LECTURE 5 LESSON PLAN

       Subspace of a vector space. Alternative characterisation as a subset closed under vector addition and scalar multiplication.

      LECTURE 6 LESSON PLAN

      Examples of subspaces: the span of a set of vectors, intersection and sum of subspaces.  A non-example: union of subspaces. Relationship between the dimensions of U, W, U ∩ W and U + W (Theorem 1.25).
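
      For reference, the dimension relationship in question takes the usual form \[ \dim(U+W)=\dim U+\dim W-\dim(U\cap W). \] For instance, two distinct planes U and W through the origin in \(\Bbb R^3\) have \(\dim U=\dim W=2\) and \(\dim(U+W)=3\), so they must meet in a line, \(\dim(U\cap W)=1\).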

      TASKS:


      (1) READ THE PRINTED NOTES PAGES 7-11

      (2) DO CWKS 2 AND BRING TO THE NEXT TUTORIAL FOR HELP

      (3) PREVIEW AND WORK OUT YOUR ANSWERS TO  QUIZ  1 ONCE IT GOES LIVE


    • Covers material  from Weeks 1-2  

      Solutions will be automatically visible once the deadline expires. This also means that it's not possible to allow answers after the deadline.

  • Week 3



    • COURSEWORK TUTORIAL

      Look over printed cwks and their solutions and ask the module organiser and TA about these or about lectures. Also work on your answers for the upcoming quiz. 

      LECTURE 7 LESSON PLAN

      The case when dim(U ∩ W) = 0 is of special interest. In this case, every element is uniquely expressible as a sum of an element of U and an element of W. We then call U + W the direct sum of U and W. The direct sum of more than two subspaces; dimension of a direct sum of several subspaces.

      LECTURE 8 LESSON PLAN

      Matrix algebra (revision of Lin. Alg. I). Elementary row and column operations, and the corresponding elementary matrices. Definition:  row and column spaces, and row and column ranks.

      LECTURE 9 LESSON PLAN

      Lemma 2.9: Elementary column operations preserve the column space and row rank, and symmetrically for row operations. Proof. Theorem 2.10: Every matrix can be reduced using elementary row and column operations to a matrix in canonical form for equivalence. Rank of a matrix. Example. For every matrix A there are invertible matrices P and Q such that PAQ is in the canonical form for equivalence. Worked example of the construction of P and Q.
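
      For orientation, the canonical form referred to here is the usual block form: if A has rank r then there are invertible P and Q (products of elementary matrices) with \[ PAQ=\begin{pmatrix} I_r & 0 \\ 0 & 0 \end{pmatrix}, \] where \(I_r\) denotes the \(r\times r\) identity matrix.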


      TASKS:

      (1) READ THE PRINTED NOTES PAGES 12-19

      (2) SUBMIT QUIZ 1 BY THE DEADLINE

      (3) DO cwks RELEVANT for QUIZ 2 



  • Week 4

    • COURSEWORK TUTORIAL

      Look over printed cwks and their solutions and ask the module organiser and TA about these or about lectures. Also review your answers to the last quiz. 

      LECTURE 10 LESSON PLAN

      Definition of equivalence of matrices; equivalence is an equivalence relation. An n × n matrix is invertible iff it has rank n. Every invertible matrix is a product of elementary matrices and can be reduced to the identity matrix by elementary row (or column) operations alone.

      LECTURE 11 LESSON PLAN

      Inversion algorithm and example. Two matrices are equivalent iff they have the same rank. New topic: determinants. Sign of a permutation, examples. 

      LECTURE 12 LESSON PLAN

       Impact on the sign of a permutation resulting from composing with a transposition. Leibniz formula for the determinant.  Example: 3 × 3 determinant.  Three properties (D1)–(D3) that a function on square matrices might satisfy.   Proof that det(.), as defined by the Leibniz formula, satisfies (D1)–(D3). [To gain intuition, the proof for the (D2) part will be preceded by a 3 × 3 example while the easier parts (D1) and (D3) will be left as reading exercises.]
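
      For reference, the Leibniz formula in its standard form is \[ \det(A)=\sum_{\sigma\in S_n}\mathrm{sgn}(\sigma)\,a_{1\sigma(1)}a_{2\sigma(2)}\cdots a_{n\sigma(n)}, \] which for \(n=2\) recovers the familiar \(\det\begin{pmatrix} a & b \\ c & d \end{pmatrix}=ad-bc\), the two terms coming from the identity permutation and the single transposition.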


      TASKS:

      (1) READ THE PRINTED NOTES PAGES 20-27

      (2) DO relevant printed cwks

      (3) Preview and work out answers to QUIZ 2 once it goes live


    • This covers material from Weeks 3 and 4.

      Solutions will be automatically visible once the deadline expires. This also means that it's not possible to allow answers after the deadline.

  • Week 5

    • COURSEWORK TUTORIAL

      Look over printed cwks and their solutions and ask the module organiser and TA about these or about lectures. Also work on your answers for the upcoming quiz. 


      LECTURE 13 LESSON PLAN

      Theorem: There is a unique function on matrices satisfying (D1)–(D3); this function is det(.). First step: investigate the effect of elementary row operations on D, an arbitrary function satisfying (D1)–(D3). Finish with a case analysis according to whether A is invertible or not. Some corollaries: a square matrix A is invertible iff det(A) ≠ 0; effect of elementary row operations on the determinant; det(AB) = det(A)det(B). Everything we stated for rows can be stated for columns; det(A^T) = det(A).

      LECTURE 14 LESSON PLAN

      Example of parallelogram. Definition: minor and cofactor. Laplace or cofactor expansion. [Proof of equivalence to the Leibniz formula is partial; see printed notes.] 
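
      In its usual form, the expansion along row \(i\) reads \[ \det(A)=\sum_{j=1}^{n}(-1)^{i+j}a_{ij}M_{ij}, \] where \(M_{ij}\) is the minor obtained by deleting row \(i\) and column \(j\), and \((-1)^{i+j}M_{ij}\) is the corresponding cofactor.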

      LECTURE 15 LESSON PLAN

      Adjugate matrix. Theorem: A.Adj(A) = det(A) I. Example. Matrices with polynomial entries vs. polynomials with matrix coefficients. Characteristic polynomial p_A(x) of a matrix A. Cayley-Hamilton Theorem: p_A(A) = O.  Proof. Example. Invariance under similarity.
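
      As a quick sanity check of Cayley-Hamilton on a small example (any \(2\times 2\) matrix will do, and the convention \(p_A(x)=\det(xI-A)\) is assumed): for \(A=\begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}\) we get \(p_A(x)=x^2-5x-2\), and indeed \[ A^2-5A-2I=\begin{pmatrix} 7 & 10 \\ 15 & 22 \end{pmatrix}-\begin{pmatrix} 5 & 10 \\ 15 & 20 \end{pmatrix}-\begin{pmatrix} 2 & 0 \\ 0 & 2 \end{pmatrix}=O. \]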



      TASKS:


      (1) READ THE PRINTED NOTES PAGES 27-36

      (2) DO RELEVANT CWKS

      (3) SUBMIT QUIZ 2



  • Week 6

    • COURSEWORK TUTORIAL

      Look over printed cwks and their solutions and ask the module organiser and TA about these or about lectures. Also review your answers to the last quiz. 

      LECTURE 16 LESSON PLAN

      Linear maps between vector spaces. Rank-nullity theorem. Proof. Corresponding matrix relative to a choice of bases. Example. Applying the linear map corresponds to applying the matrix. Example.
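
      In symbols, for a linear map \(\alpha\colon V\to W\) with V finite dimensional, \[ \mathrm{rank}(\alpha)+\mathrm{nullity}(\alpha)=\dim\mathrm{Im}(\alpha)+\dim\mathrm{Ker}(\alpha)=\dim V. \] For example, projection of \(\Bbb R^3\) onto the xy-plane has rank 2 and nullity 1, and \(2+1=3\).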

      LECTURE 17 LESSON PLAN

      Definition of sum and product of linear maps, corresponding sum and product of matrices. Change of bases: how does the matrix representing a linear map change as the bases of the vector spaces change? Answer: the new matrix is related to the old by the relation of equivalence. Proof. 

      LECTURE 18 LESSON PLAN

      Linear maps from a vector space to itself. Definition of projection. For a projection π on V, V is the direct sum of Im(π) and Ker(π). Proof. Converse. For several projections on V satisfying certain conditions, V is the direct sum of the images of the projections. Converse. Matrix representations and similarity.



      TASKS:


      (1) READ THE PRINTED NOTES PAGES 37-47

      (2) DO RELEVANT PRINTED CWKS

      (3) Preview and work out answers to QUIZ 3 once it goes live



    • Covers material from Weeks 5 and 6

      Solutions will be automatically visible once the deadline expires. This also means that it's not possible to allow answers after the deadline.

      Due to the Week 7 break, this quiz is actually due two weeks after it goes live.

  • Week 8

    • COURSEWORK TUTORIAL

      Look over printed cwks and their solutions and ask the module organiser and TA about these or about lectures. Also work on your answers for the upcoming quiz. 


      LECTURE 19 LESSON PLAN

      Definitions: eigenvectors, eigenvalues, eigenspaces. Example.  Definition: diagonalisable linear map. Lemma. A linear map on V is diagonalisable if there is a basis of V consisting of eigenvectors. Example.

      LECTURE 20 LESSON PLAN

       Lemma: eigenvectors with distinct eigenvalues are linearly independent. Proof. Theorem: the following are equivalent for a linear map α on V: (a) α is diagonalisable, (b) V is the direct sum of the eigenspaces of α, and (c) α is a linear combination of projections satisfying certain properties.

      LECTURE 21 LESSON PLAN

      Proof. Examples. Definitions of determinant, characteristic polynomial and minimal polynomial of a linear map. Demonstration that these are well-defined. 



      TASKS:


      (1) READ THE PRINTED NOTES PAGES 48-50

      (2) DO RELEVANT PRINTED CWKS  

      (3) SUBMIT QUIZ 3 



  • Week 9

    • COURSEWORK TUTORIAL

      Look over printed cwks and their solutions and ask the module organiser and TA about these or about lectures. Also work on your answers for the upcoming quiz. 


      LECTURE 22 LESSON PLAN

        Proposition: the minimal polynomial divides the characteristic polynomial. Proof. Theorem: the following are equivalent: λ is an eigenvalue of α; λ is a root of the characteristic polynomial of α; λ is a root of the minimal polynomial of α. Proof.

      LECTURE 23 LESSON PLAN

      Theorem: α is diagonalisable if and only if its minimal polynomial is a product of distinct linear factors. Outline of proof. Examples of matrices that are not diagonalisable, either because the field does not allow the characteristic polynomial to be decomposed into linear factors, or because the minimal polynomial has a repeated root. Brief treatment of Jordan form. Theorem: over \(\Bbb C\), every matrix is similar to one in Jordan form.
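
      The smallest example of the second kind of failure (a standard one, not necessarily the one used in the lecture) is \[ A=\begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix},\qquad m_A(x)=(x-1)^2, \] whose minimal polynomial has the repeated root 1, so A is not diagonalisable over any field; it is already a single Jordan block.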

      LECTURE 24 LESSON PLAN

       Definition of the trace of a matrix. Lemma: Tr(AB) = Tr(BA), and Tr(A) = Tr(A') whenever A and A' are similar. Proposition: let α be a linear map on a vector space V of dimension n. Then the coefficient of \(x^{n-1}\) in \(p_\alpha(x)\) is \(-\mathrm{Tr}(\alpha)\), and the constant term of \(p_\alpha(x)\) is \((-1)^n\det(\alpha)\). If α is diagonalisable then the sum of the eigenvalues is Tr(α) and the product is det(α). Proof.
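
      As a concrete check (again with the convention \(p_A(x)=\det(xI-A)\), and any small matrix would do), take \(A=\begin{pmatrix} 2 & 1 \\ 1 & 3 \end{pmatrix}\), so that \(p_A(x)=x^2-5x+5\): the coefficient of \(x\) is \(-5=-\mathrm{Tr}(A)\) and the constant term is \(5=(-1)^2\det(A)\), while the eigenvalues \((5\pm\sqrt{5})/2\) indeed sum to 5 and multiply to 5.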



      TASKS:


      (1) READ THE PRINTED NOTES PAGES 

      (2) Do relevant printed cwks

      (3) Preview and work on Quiz 4



    • Covers material from Weeks 8 and 9

      Solutions will be automatically visible once the deadline expires. This also means that it's not possible to allow answers after the deadline.

  • Week 10


    • COURSEWORK TUTORIAL

      Look over printed cwks and their solutions and ask the module organiser and TA about these or about lectures. Also work on your answers for the upcoming quiz. 


      LECTURE 25 LESSON PLAN

      New topic: Linear and quadratic forms as functions in n variables. Change of basis: congruence of matrices. Reduction to diagonal form. Proof.

      LECTURE 26 LESSON PLAN

      Example. Abstract view of quadratic forms as bilinear forms.  
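
      As a small illustration of the correspondence (a standard example, not necessarily the one used in the lecture), the quadratic form \[ q(x,y)=x^2+4xy+y^2=\begin{pmatrix} x & y \end{pmatrix}\begin{pmatrix} 1 & 2 \\ 2 & 1 \end{pmatrix}\begin{pmatrix} x \\ y \end{pmatrix} \] corresponds to the symmetric matrix obtained by splitting the coefficient of each cross term equally between the two off-diagonal entries.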

      LECTURE 27 LESSON PLAN

      Sylvester's Law of Inertia and associated indices s,t. Rank and signature. Examples. Positive definite, semidefinite and negative definite quadratic forms.
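
      For instance (with the caveat that some texts record the signature as the pair \((s,t)\) and others as the difference \(s-t\)), the diagonalised form \(x_1^2+x_2^2-x_3^2\) has \(s=2\) positive squares and \(t=1\) negative square, hence rank \(s+t=3\), and it is neither positive nor negative definite.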



      TASKS:


      (1) READ THE PRINTED NOTES PAGES 

      (2) DO RELEVANT CWKS 

      (3) SUBMIT QUIZ 4




  • Week 11


    • COURSEWORK TUTORIAL

      Look over printed cwks and their solutions, e.g. continue cwk 9. Have a go at cwk 10 if you want (with help from the module organiser and TA if the material has not yet been covered). Review whatever you got wrong in Quiz 4.




      LECTURE 28 LESSON PLAN

      New topic: Inner product spaces. Definition: inner product, inner product space. Lengths, angles. Cauchy-Schwarz inequality: \((v \cdot w)^2 \le (v \cdot v)(w \cdot w)\). Orthonormal basis. Nonzero orthogonal vectors are linearly independent. Proof.

      LECTURE 29 LESSON PLAN

      Every inner product space has an orthonormal basis, constructed using the Gram-Schmidt process. Definition of the adjoint linear map, with an explanation invoking the Riesz Representation Theorem. Proposition: if α is represented by matrix A relative to some orthonormal basis then the adjoint α* is represented by the transpose of A. Definition of self-adjoint and orthogonal maps. Theorem: the following are equivalent for a linear map α: (a) α is orthogonal, (b) α preserves the inner product, and (c) α maps any orthonormal basis to an orthonormal basis. Proof. Example: α is a rotation clockwise by π/4; the adjoint α* is rotation anticlockwise by π/4. Note that α is an orthogonal map since \(\alpha^* = \alpha^{-1}\). Note also that α preserves the inner product and maps orthonormal bases to orthonormal bases.
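
      For reference, the key Gram-Schmidt step in its usual form replaces \(v_k\) by \[ u_k=v_k-\sum_{j<k}\frac{\langle v_k,u_j\rangle}{\langle u_j,u_j\rangle}\,u_j, \] which is orthogonal to \(u_1,\dots,u_{k-1}\); normalising each \(u_k\) then yields an orthonormal basis.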


      LECTURE 30 LESSON PLAN

      Corollary: α is self-adjoint iff A is symmetric, and α is orthogonal iff \(A^TA = AA^T = I\). Corollary: if A represents an orthogonal linear map with respect to an orthonormal basis then the columns of A are themselves orthonormal. Proof. Example. Definition of orthogonal similarity. Definition of the orthogonal complement of a subspace. Example.


      TASKS:


      (1) READ THE PRINTED NOTES PAGES  

      (2) DO RELEVANT PRINTED CWKS


      (3) Preview and work out QUIZ 5


    • Covers mainly material from Weeks 10 and 11, and possibly the first lecture of Week 12

      Solutions will be automatically visible once the deadline expires. This also means that it's not possible to allow answers after the deadline.

  • Module description

    This module is a mixture of abstract theory, with rigorous proofs, and concrete calculations with matrices. The abstract component builds on the theory of vector spaces and linear maps to construct the theory of bilinear forms (linear functions of two variables), dual spaces (which map the original space to the underlying field) and determinants. The concrete applications involve ways to reduce a matrix of some specific type (such as symmetric or skew-symmetric) to as near diagonal form as possible. 

  • Syllabus

      1. Vector spaces. Definition, basis, Exchange Lemma, dimension, coordinate representation, subspaces and direct sum.
      2. Matrices. Matrix algebra, elementary row and column operations, equivalence of matrices, rank, canonical form for equivalence.
      3. Determinants. Axiomatic definition of the determinant function, properties of the determinant, cofactor (Laplace) expansion, adjugate matrix, Cayley-Hamilton Theorem.
      4. Linear maps between vector spaces. Image, kernel, rank-nullity Theorem, representation by matrices, change of bases and equivalent matrices.
      5. Linear maps on a vector space. Projections and direct sums, similarity, eigenvalues and eigenvectors, diagonalisability of matrices/linear maps, characteristic and minimal polynomials, conditions for diagonalisability, Jordan form (statement).
      6. Quadratic forms. Change of basis, congruence, canonical form, bilinear forms, Sylvester’s Law of Inertia.
      7. Inner product spaces. Inner products, orthonormal bases. Adjoints, self-adjoint and orthogonal linear maps. Orthogonally similar matrices.
      8. Symmetric matrices/self adjoint linear maps. Orthogonal projections and orthogonal decompositions. Spectral Theorem.

  • Module aims and learning outcomes


    • This is a sequel to Linear Algebra I. It will be a mixture of abstract theory, with rigorous proofs, and concrete calculations with matrices.

      The primary learning outcome in this module is that you will think about learning as a mindset and a process - it has no end point.

      By the end of this module you will:

      Have gained an understanding of:

      • The theory of bilinear forms, dual spaces and determinants.
      • Concrete applications such as simultaneous diagonalisation of matrices, maps and forms.

      Be able to:

      • Define standard terms in the theory of bilinear forms, vector spaces and matrices, such as bilinear form, quadratic form, congruency of symmetric matrices, dual basis, inner product, cofactor, adjugate, equivalent matrices, characteristic polynomial and minimal polynomial.
      • Define the rank, nullity, image and kernel of a matrix.
      • State results given in the course, such as Sylvester’s Law of Inertia or the Cayley-Hamilton theorem.
      • Compute the canonical form for equivalence for a matrix over a suitable field K.
      • Compute the matrix which corresponds to a given real quadratic form.
      • Find a diagonal matrix congruent to a given real symmetric matrix.
      • Carry out Gram-Schmidt orthogonalisation.
      • Carry out simultaneous diagonalisation of two forms over the reals.
      • Reproduce proofs of basic results of linear algebra, such as the Steinitz Exchange Lemma.
      • Reproduce the formula for multiplying a square matrix by its adjugate.

      Developed with respect to the following attributes:

      • Grasp the principles and practices of their field of study.
      • Acquire substantial bodies of new knowledge.
      • Explain and argue clearly and concisely.
      • Acquire and apply knowledge in a rigorous way.
      • Connect information and ideas within their field of study.

  • assessment

    • There is an on-line January final exam worth 80%. The scheduled time is 3 hours, which includes uploading and submitting your answers. Late submissions will not be accepted.

      In addition there are 5 in-term on-line quizzes, each worth 4%, so make sure you do not miss one. These are normally due by 11.59pm on Thursdays in weeks 3, 5, 8, 10 and 12. But ALWAYS see the quiz itself for the exact deadline, as these may be adapted as we go.

      Your marks and the solutions become visible once the quiz closes, which means that it is not possible to grant any extensions under any circumstances, even for IT delays at your end. So make sure you submit in good time. ONLY THE LAST ATTEMPT COUNTS, and any attempt in progress at the deadline is automatically submitted.

      Each quiz will be listed in the weekly topic section in the week before it is due, normally going live by 9am on the Thursday of that week.

  • teaching team

    The module is taught by Professor Shahn Majid (s.majid@qmul.ac.uk), with teaching assistance for tutorials provided by Itamar Mor (i.a.mor@qmul.ac.uk).

  • hints and tips


    • I advise you each week to read the assigned pages of the Printed Lecture Notes. Make good notes – don’t just highlight.
    • Try to do a first read-through before the lecture and review the pages again after the relevant lecture. You can also browse books and online resources to enhance your understanding of the week’s topic.
    • Bring the printed courseworks to the weekly tutorial and try to solve them (and/or understand the provided solutions).
    • Identify any other questions to bring to the coursework tutorial (or to my office hour if you want).  
    • You are welcome to ask questions once you have looked at the quiz, but we can't give hints towards solutions for an upcoming quiz, only general advice.
    • You can also email me if your query is super-specific (e.g. including a page and line number).


  • where to get help

    Come to the Q&A tutorial sessions or come by my advertised Learning Support Hour. You can also email for help, but be as precise as possible about your issue.

  • module handbook

  • general course materials

  • coursework

    There is non-assessed coursework with solutions in the weekly topic blocks on the Module Content tab. This complements the in-term Quizzes, which are the assessed element of coursework for the module.

  • exam papers

    Below you can find many past exams and their solutions, at least in outline form sufficient for self-marking. The designated Sample Exam, which we went through in detail in lectures starting in Week 1, is the May 2019 exam; it appears below along with its solutions (and is also posted in the Week 1 section).

    You can also search the QMPLUS repository for past exams.

  • Week 12

    • COURSEWORK TUTORIAL

      Look over printed cwks and their solutions and ask the module organiser and TA about these or about lectures. Also work on your answers for the upcoming quiz, and look at past exams and their answers and ask us about them.




      LECTURE 31 LESSON PLAN

      Proposition: V is the direct sum of U and the orthogonal complement of U. Spectral Theorem.  Ideas from the Proof.  Alternative statement in terms of eigenspaces/symmetric matrices. Example.  
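
      In matrix language, the symmetric-matrix version of the statement is that every real symmetric matrix can be orthogonally diagonalised: \[ A=A^T\ \Longrightarrow\ A=PDP^T \] with P orthogonal and D diagonal, so the columns of P form an orthonormal basis of eigenvectors and the diagonal entries of D are the corresponding eigenvalues.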

      LECTURE 32 LESSON PLAN

      Revision lecture I

      LECTURE 33 LESSON PLAN

      Revision lecture II


      TASKS:


      (1) READ THE PRINTED NOTES PAGES  

      (2) START YOUR REVISION 

      (3) SUBMIT QUIZ 5




    • Here Lectures 32-33 are REVISION LECTURES

  • Assessment information

    • Assessment Pattern - Final Exam in January 80% and in-term tests total 20% (4% each)

      Format and dates for the in-term assessments - these are 5 multiple-choice on-line quizzes in weeks 3, 5, 8, 10 and 12; see the weekly topic on the Module Content tab for the link to each quiz.

      Format of final assessment - on-line

      Link to past papers - see the bottom of the Important Module Information tab for many past papers and their solutions. The designated Sample Exam is the May 2019 exam, which we went through in lectures and which is available, along with solutions, in the week 1 downloads as well as with the other past exam papers.

      Description of Feedback - feedback is provided by your in-term quiz marks. During the tutorials we can also provide comments on your solutions to the quizzes, if you show them to us, and to the non-assessed courseworks, which you can download each week, along with their solutions, from the weekly topic block of the Module Content tab.

  • Reading List Online

  • Q-Review