Graded assignments and exams required for the certificate will not be shared in any way, in accordance with the Coursera honor code.
Please note that GitHub sometimes fails to load an .ipynb file or renders a matrix diagram incorrectly; feel free to download the file and open it in your own reader.
The aim of this repository is to give students learning the basic math for machine learning (especially those taking the Imperial College London Mathematics for Machine Learning specialization) some helpful resources and guidance for the practice exercises available in the course.
This repository covers the following topics:
- Linear Algebra;
- Multivariate Calculus;
- Principal Component Analysis (PCA).
Basic course description (from the Imperial College London course)
For a lot of higher level courses in Machine Learning and Data Science, you find you need to freshen up on the basics in mathematics - stuff you may have studied before in school or university, but which was taught in another context, or not very intuitively, such that you struggle to relate it to how it’s used in Computer Science. This specialization aims to bridge that gap, getting you up to speed in the underlying mathematics, building an intuitive understanding, and relating it to Machine Learning and Data Science.
In the first course on Linear Algebra we look at what linear algebra is and how it relates to data. Then we look through what vectors and matrices are and how to work with them.
The second course, Multivariate Calculus, builds on this to look at how to optimize fitting functions to get good fits to data. It starts from introductory calculus and then uses the matrices and vectors from the first course to look at data fitting.
The third course, Dimensionality Reduction with Principal Component Analysis, uses the mathematics from the first two courses to compress high-dimensional data. This course is of intermediate difficulty and will require Python and numpy knowledge.
At the end of this specialization you will have gained the prerequisite mathematical knowledge to continue your journey and take more advanced courses in machine learning.
Linear Algebra
- Week 1:
Solving simultaneous equations
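For a quick taste (a minimal sketch of my own, not taken from the course notebooks), a pair of simultaneous equations can be solved with numpy once it is written in matrix form:

```python
import numpy as np

# Solve  x + 2y = 5  and  3x - y = 1, written as A @ v = b.
A = np.array([[1.0, 2.0],
              [3.0, -1.0]])
b = np.array([5.0, 1.0])

v = np.linalg.solve(A, b)
print(v)  # [1. 2.], i.e. x = 1, y = 2
```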
- Week 2:
Modulus and inner products
Cosine and dot products
Scalar and vector projections
Basis change
Linear dependence
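A minimal numpy sketch of this week's scalar and vector projections (the example vectors are made up for illustration):

```python
import numpy as np

# Project r onto s.
r = np.array([3.0, 4.0])
s = np.array([1.0, 0.0])

scalar_proj = r @ s / np.linalg.norm(s)  # |r| cos(theta)
vector_proj = (r @ s) / (s @ s) * s      # the component of r along s
print(scalar_proj, vector_proj)          # 3.0 [3. 0.]
```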
- Week 3:
Matrix multiplication
Matrix properties
Identity matrix
Matrix transformation
Solving simultaneous equations through matrix method
Inverse matrix
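A small sketch of the inverse matrix and the matrix method for simultaneous equations (example values are arbitrary):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([3.0, 5.0])

A_inv = np.linalg.inv(A)
print(np.allclose(A @ A_inv, np.eye(2)))  # True: A times its inverse is I
print(A_inv @ b)                          # [0.8 1.4] solves A @ v = b
```

In practice np.linalg.solve(A, b) is preferred over forming the inverse explicitly, but the inverse makes the matrix method visible.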
- Week 4:
Einstein summation
Symmetry of the dot product
Notes on non-square matrix multiplication
Changing basis in matrices
Transformation in changed basis
Orthogonal matrices
Gram-Schmidt process
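A minimal sketch of the Gram-Schmidt process, assuming the input columns are linearly independent (a generic illustration, not the course lab code):

```python
import numpy as np

def gram_schmidt(V):
    """Orthonormalise the columns of V, assumed linearly independent."""
    U = np.zeros_like(V, dtype=float)
    for i in range(V.shape[1]):
        u = V[:, i].astype(float)
        for j in range(i):
            u = u - (u @ U[:, j]) * U[:, j]  # subtract components along earlier vectors
        U[:, i] = u / np.linalg.norm(u)      # normalise what is left
    return U

V = np.array([[1.0, 1.0],
              [0.0, 1.0]])
U = gram_schmidt(V)
print(np.allclose(U.T @ U, np.eye(2)))  # True: the columns are orthonormal
```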
- Week 5:
Eigenvalues
Eigenvectors
Special eigen-cases
Changing to the eigenbasis
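A short sketch of eigen-decomposition and changing to the eigenbasis (T is an arbitrary example matrix):

```python
import numpy as np

T = np.array([[2.0, 1.0],
              [0.0, 3.0]])

vals, C = np.linalg.eig(T)  # the columns of C are the eigenvectors
D = np.diag(vals)

# In the eigenbasis T = C D C^-1, so powers become cheap: T^n = C D^n C^-1.
print(np.allclose(T, C @ D @ np.linalg.inv(C)))  # True
print(np.allclose(np.linalg.matrix_power(T, 5),
                  C @ np.diag(vals**5) @ np.linalg.inv(C)))  # True
```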
Multivariate Calculus
- Week 1:
Differentiation and definition of a derivative;
Sum rule;
Power rule;
Special-case derivatives;
Product rule;
Chain rule;
All-around applications.
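As a small illustration of the definition of a derivative (the function and step size here are arbitrary choices of mine):

```python
def derivative(f, x, h=1e-6):
    """Central-difference approximation of f'(x) from the limit definition."""
    return (f(x + h) - f(x - h)) / (2 * h)

f = lambda x: x**3          # the power rule gives f'(x) = 3x^2
print(derivative(f, 2.0))   # approximately 12.0
```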
- Week 2:
Dependent and independent variables;
Extension to multivariate differentiation;
A more complex multivariate example;
Multivariate partial differentiation;
Jacobian vector;
Hessian matrix.
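A minimal sketch of building the Jacobian of a scalar function by finite differences (the Hessian can be assembled the same way from the partial derivatives; the function is a made-up example):

```python
import numpy as np

def jacobian(f, x, h=1e-6):
    """Row vector of partial derivatives of a scalar-valued f, by central differences."""
    x = np.asarray(x, dtype=float)
    J = np.zeros(x.size)
    for i in range(x.size):
        e = np.zeros(x.size)
        e[i] = h
        J[i] = (f(x + e) - f(x - e)) / (2 * h)
    return J

f = lambda v: v[0]**2 + v[1]**3  # partials: 2x and 3y^2
print(jacobian(f, [1.0, 2.0]))   # approximately [ 2. 12.]
```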
- Week 3:
Multivariate chain rule;
Neural network in matrix form;
Applying the chain rule to a neural network.
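A minimal sketch of the chain rule applied to a one-layer network written in matrix form (the shapes, weights and squared-error loss are made-up examples, not the course's exact notation):

```python
import numpy as np

sigma = lambda z: 1 / (1 + np.exp(-z))  # logistic activation

rng = np.random.default_rng(0)
W, b = rng.normal(size=(2, 3)), rng.normal(size=2)
x, t = rng.normal(size=3), np.array([0.0, 1.0])

z = W @ x + b  # pre-activation
a = sigma(z)   # network output; cost C = sum((a - t)^2)

# Chain rule: dC/dW = dC/da * da/dz * dz/dW.
delta = 2 * (a - t) * a * (1 - a)  # dC/da times the sigmoid derivative
grad_W = np.outer(delta, x)        # dz/dW brings in the input x
grad_b = delta
print(grad_W.shape, grad_b.shape)  # (2, 3) (2,)
```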
- Week 4:
Why we approximate functions;
Power series;
Maclaurin series;
Taylor series;
Linearisation;
Multivariate Taylor series.
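A minimal sketch of a truncated Maclaurin series for e^x, checked against the library value (the number of terms is an arbitrary choice):

```python
from math import exp, factorial

def maclaurin_exp(x, terms=10):
    """Maclaurin series of e^x: the sum of x^n / n! for n = 0, 1, 2, ..."""
    return sum(x**n / factorial(n) for n in range(terms))

print(maclaurin_exp(1.0), exp(1.0))  # both approximately 2.718281828
```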
- Week 5:
One-dimensional Newton-Raphson;
Gradient descent;
Constrained optimisation.
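A minimal sketch of gradient descent on a simple convex function (the learning rate and step count are arbitrary choices):

```python
import numpy as np

def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Minimise a function by repeatedly stepping against its gradient."""
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x - lr * grad(x)
    return x

# Minimise f(x, y) = (x - 1)^2 + (y + 2)^2, whose gradient is [2(x-1), 2(y+2)].
grad = lambda v: np.array([2 * (v[0] - 1), 2 * (v[1] + 2)])
print(gradient_descent(grad, [0.0, 0.0]))  # approximately [ 1. -2.]
```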
- Week 6: not provided.
Principal Component Analysis (PCA)
- Week 1:
Mean of a dataset;
One dimensional variance;
Covariance matrix;
Linear transformation properties for the mean, variance and covariance;
Numpy tutorial (from the course lab);
A little gift for you, if you've come this far :)
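A minimal numpy sketch of the mean and (biased) covariance matrix, assuming rows are data points and columns are features (the data is made up):

```python
import numpy as np

X = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 0.0]])

mean = X.mean(axis=0)
centred = X - mean
cov = centred.T @ centred / X.shape[0]  # divide by N (biased estimator)
print(np.allclose(cov, np.cov(X, rowvar=False, bias=True)))  # True
```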
- Week 2:
Dot product, angles and distance between vectors;
Inner products;
Inner products and length of vectors;
Inner products, orthogonality and angle between vectors.
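A tiny sketch of recovering the angle between two vectors from their inner product (the vectors are made-up examples):

```python
import numpy as np

# cos(theta) = <x, y> / (|x| |y|)
x = np.array([1.0, 1.0])
y = np.array([1.0, 0.0])

cos_theta = x @ y / (np.linalg.norm(x) * np.linalg.norm(y))
print(np.degrees(np.arccos(cos_theta)))  # 45.0
```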
- Week 3:
Projections onto a 1-D subspace;
Projections in higher dimensions (N-D subspace).
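A minimal sketch of projecting onto the subspace spanned by the columns of a matrix B, using the projection matrix B(BᵀB)⁻¹Bᵀ (example values are arbitrary):

```python
import numpy as np

B = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 1.0]])
x = np.array([6.0, 0.0, 0.0])

P = B @ np.linalg.inv(B.T @ B) @ B.T  # projection onto span(B)
print(P @ x)                          # [ 4.  2. -2.], the closest point in span(B)
```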
- Week 4:
PCA objective and key ideas;
Coordinates of projected data;
Derivation of the average square reconstruction error;
Finding the basis vectors that span the principal subspace;
Summary of key equations;
My personal notes on how to create a PCA function (to help with the final assignment); a generic sketch follows below.
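To close, a minimal, generic PCA sketch via eigen-decomposition of the covariance matrix. This is only an illustration of the key equations above, not the graded assignment solution:

```python
import numpy as np

def pca(X, n_components):
    """Sketch of PCA: reconstruct N x D data from its top principal components."""
    mean = X.mean(axis=0)
    centred = X - mean
    cov = centred.T @ centred / X.shape[0]
    vals, vecs = np.linalg.eigh(cov)    # eigh, since the covariance is symmetric
    order = np.argsort(vals)[::-1]      # largest eigenvalues first
    B = vecs[:, order[:n_components]]   # basis of the principal subspace
    return centred @ B @ B.T + mean     # project, then map back to the original space

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 3))
print(pca(X, 2).shape)  # (100, 3): the data reconstructed on a 2-D principal subspace
```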