Mathematical Foundations for AI

(CTU-MATH307.AJ1)
Skills You’ll Get

Lessons

1. Vector and Matrix Operations

  • What is a vector space?
  • The basis
  • Vectors in practice
  • Norms and distances
  • Inner products, angles, and lots of reasons to care about them
  • Vectors in NumPy
  • Matrices, the workhorses of linear algebra
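As a taste of the NumPy topics listed above, here is a minimal sketch of norms, inner products, and a matrix acting on a vector; the vectors and matrix are arbitrary illustrations, not course material:

```python
import numpy as np

# Two example vectors in R^3
u = np.array([1.0, 2.0, 2.0])
v = np.array([2.0, 0.0, 1.0])

# Euclidean norm: ||u|| = sqrt(1 + 4 + 4) = 3
norm_u = np.linalg.norm(u)

# Inner product, and the angle it induces:
# cos(theta) = <u, v> / (||u|| ||v||)
dot_uv = np.dot(u, v)
cos_theta = dot_uv / (np.linalg.norm(u) * np.linalg.norm(v))

# A matrix acting on a vector: matrices as linear maps
A = np.array([[1.0, 0.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 3.0]])
Av = A @ u   # array([1., 4., 6.])
```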
2. Solving Systems of Linear Equations

  • What is a linear transformation?
  • Change of basis
  • Linear transformations in the Euclidean plane
  • Determinants, or how linear transformations affect volume
  • Linear equations
  • The LU decomposition
  • Determinants in practice
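The linear-equations and determinant topics above can be sketched with NumPy's built-in solver; the 2×2 system here is an arbitrary example:

```python
import numpy as np

# Solve Ax = b for a small system:
#   2x + y = 3
#    x + 3y = 5
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([3.0, 5.0])

x = np.linalg.solve(A, b)    # solution [0.8, 1.4]

# Determinant: how the linear map A scales area in the plane
det_A = np.linalg.det(A)     # 2*3 - 1*1 = 5

# Sanity check: A @ x should reproduce b
residual = A @ x - b
```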
3. Partial Derivatives and Gradients

  • Functions in theory
  • Functions in practice
  • Numbers
  • Sequences
  • Series
  • Topology
  • Limits
  • Continuity
  • Differentiation in theory
  • Differentiation in practice
  • Minima, maxima, and derivatives
  • The basics of gradient descent
  • Why does gradient descent work?
  • Integration in theory
  • Integration in practice
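The basics of gradient descent, listed above, fit in a few lines. This sketch minimizes the arbitrary example function f(x) = (x − 3)², whose derivative is f′(x) = 2(x − 3); the learning rate and starting point are illustrative choices:

```python
def grad(x):
    # Derivative of f(x) = (x - 3)^2
    return 2.0 * (x - 3.0)

x = 0.0      # starting point
lr = 0.1     # learning rate (step size)
for _ in range(200):
    # Step against the gradient to decrease f
    x -= lr * grad(x)

# x is now very close to the minimizer x* = 3
```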
4. Chain Rule and Composite Functions

  • What is a multivariable function?
  • Linear functions in multiple variables
  • The curse of dimensionality
  • Derivatives of vector-valued functions
  • Multivariable functions in code
  • Minima and maxima, revisited
  • Gradient descent in its full form
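Gradient descent in its full multivariable form follows the same pattern, with the derivative replaced by the gradient vector. This sketch uses the arbitrary example f(x, y) = x² + 10y², with gradient ∇f = (2x, 20y); starting point and learning rate are illustrative:

```python
import numpy as np

def grad(p):
    # Gradient of f(x, y) = x^2 + 10 y^2
    return np.array([2.0 * p[0], 20.0 * p[1]])

p = np.array([4.0, 1.0])   # starting point
lr = 0.04                  # learning rate
for _ in range(500):
    p = p - lr * grad(p)

# p converges toward the unique minimizer (0, 0)
```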
5. Advanced Applications of Vector and Matrix Operations

  • Eigenvalues of matrices
  • Finding eigenvalue-eigenvector pairs
  • Eigenvectors, eigenspaces, and their bases
  • Special transformations
  • Self-adjoint transformations and the spectral decomposition theorem
  • The singular value decomposition
  • Orthogonal projections
  • Computing eigenvalues
  • The QR algorithm
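Eigenvalue-eigenvector pairs can be computed and verified in NumPy; the symmetric (self-adjoint) matrix below is an arbitrary example:

```python
import numpy as np

# A symmetric matrix with eigenvalues 1 and 3
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.linalg.eig returns eigenvalues and a matrix whose
# columns are the matching eigenvectors
eigenvalues, eigenvectors = np.linalg.eig(A)

# Verify the defining property A v = lambda v for each pair
for lam, v in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ v, lam * v)
```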

Lab

1. Vector and Matrix Operations

  • Implementing Tuple and List Operations
  • Performing NumPy Array and Vector Operations
  • Analyzing Vectors and Distances
  • Evaluating Vector Norms and Operations
  • Applying Matrix Computations Using NumPy
  • Representing Images and Text Using Vectors and Matrices
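Representing text as vectors, as in the last exercise above, can be sketched with word-count vectors and cosine similarity; the texts and vocabulary are made up for illustration:

```python
import numpy as np

# Two short texts as word-count vectors over a shared vocabulary
vocab = ["the", "cat", "dog", "sat"]
doc1 = "the cat sat".split()
doc2 = "the dog sat".split()

v1 = np.array([doc1.count(w) for w in vocab], dtype=float)  # [1, 1, 0, 1]
v2 = np.array([doc2.count(w) for w in vocab], dtype=float)  # [1, 0, 1, 1]

# Cosine similarity: <v1, v2> / (||v1|| ||v2||)
cosine = v1 @ v2 / (np.linalg.norm(v1) * np.linalg.norm(v2))  # 2/3
```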
2. Solving Systems of Linear Equations

  • Solving Linear Equations Using Gaussian Elimination
  • Solving Linear Equations Using Determinants and Inverses
  • Solving Linear Models in Machine Learning
  • Performing LU Decomposition
  • Computing the Determinant Using LU Decomposition
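Computing the determinant via LU decomposition, as in the last exercise above, can be sketched with a bare-bones Doolittle factorization (no pivoting, so it only handles matrices that need no row swaps); the matrix is an arbitrary example:

```python
import numpy as np

def lu(A):
    # Doolittle LU decomposition without pivoting: A = L @ U,
    # L unit lower triangular, U upper triangular
    n = A.shape[0]
    L = np.eye(n)
    U = A.astype(float).copy()
    for k in range(n - 1):
        for i in range(k + 1, n):
            L[i, k] = U[i, k] / U[k, k]
            U[i, k:] -= L[i, k] * U[k, k:]
    return L, U

A = np.array([[4.0, 3.0],
              [6.0, 3.0]])
L, U = lu(A)

# det(A) = det(L) * det(U) = 1 * prod(diag(U)) = 4 * (-1.5) = -6
det_A = np.prod(np.diag(U))
```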
3. Partial Derivatives and Gradients

  • Implementing Callable Functions
  • Visualizing Mathematical Sequences and Approximations
  • Visualizing the Harmonic Series
  • Analyzing Openness, Closedness, and Compactness of Sets
  • Applying the Chain Rule
  • Implementing Gradient Descent for Model Training
  • Approximating Integrals Using the Trapezoidal Rule
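The trapezoidal-rule exercise above can be sketched as follows, using the arbitrary example ∫₀¹ x² dx, whose exact value is 1/3:

```python
import numpy as np

def trapezoid(f, a, b, n):
    # Composite trapezoidal rule on [a, b] with n subintervals
    x = np.linspace(a, b, n + 1)
    y = f(x)
    h = (b - a) / n
    return h * (0.5 * y[0] + y[1:-1].sum() + 0.5 * y[-1])

approx = trapezoid(lambda x: x**2, 0.0, 1.0, 1000)
# approx differs from 1/3 by O(h^2), here about 1.7e-7
```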
4. Chain Rule and Composite Functions

  • Plotting Multivariable Function Landscapes
  • Plotting Linear Mappings and Hyperplanes for ML Models
  • Evaluating Composite Functions
  • Analyzing Gradients, Jacobians, and Hessians in ML Optimization
  • Implementing Backpropagation in Neural Networks
  • Training Machine Learning Models with Gradient Descent
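Backpropagation, as in the exercises above, is the chain rule applied layer by layer. This sketch spells it out for a single sigmoid neuron with a squared-error loss on one made-up training example; all values are illustrative:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

x, target = 2.0, 1.0   # one training example
w, b = 0.5, 0.0        # initial parameters
lr = 0.5               # learning rate

for _ in range(1000):
    # Forward pass: y = sigmoid(w*x + b), loss L = (y - target)^2
    z = w * x + b
    y = sigmoid(z)
    # Backward pass via the chain rule:
    # dL/dw = dL/dy * dy/dz * dz/dw, and dz/dw = x, dz/db = 1
    dL_dy = 2.0 * (y - target)
    dy_dz = y * (1.0 - y)
    w -= lr * dL_dy * dy_dz * x
    b -= lr * dL_dy * dy_dz * 1.0

loss = (sigmoid(w * x + b) - target) ** 2
```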
5. Advanced Applications of Vector and Matrix Operations

  • Finding Eigenvalues and Eigenvectors of Matrices
  • Analyzing Matrices Using Characteristic Polynomials
  • Visualizing Eigenvectors and Eigenspaces in Linear Algebra
  • Implementing Spectral Decomposition and PCA
  • Performing Feature Extraction and Dimensionality Reduction
  • Performing Singular Value Decomposition
  • Implementing QR Decomposition
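QR decomposition also powers a classic eigenvalue method: the (unshifted) QR iteration repeatedly factors A = QR and forms RQ. A minimal sketch, using an arbitrary symmetric matrix whose eigenvalues are 1 and 3:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Unshifted QR iteration: the iterates A_k converge (for this
# symmetric matrix) to a diagonal matrix of eigenvalues
Ak = A.copy()
for _ in range(50):
    Q, R = np.linalg.qr(Ak)
    Ak = R @ Q

eigenvalues = np.sort(np.diag(Ak))   # approaches [1, 3]
```

Production code uses shifts and deflation for speed, but this is the core idea behind the QR algorithm.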