Complexity Legend
Name          Example
Constant      Θ(1)
Log           Θ(log(n))
Linear        Θ(n)
Log-Linear    Θ(n log(n))
Square        Θ(n**2)
Polynomial    Θ(n**k)
Exponential   Θ(k**n)
Factorial     Θ(n!)

Below is a table summarizing the key components of some essential machine learning algorithms.

This is a work in progress; you can send me corrections or suggestions by email or Twitter DM. Alternatively, all the information is in the data directory of my website repo (here), so feel free to submit a PR.

Each entry lists: Name · Labeled · Type · Prediction function $y = f(\Omega, X)$ · Loss · Update Rule · Parameters (the output $\Omega$ of training) · Hyper-parameters · Complexity (Train, Predict).
Linear SVM (Support Vector Machine)
  Labeled: Supervised
  Type: Regression, Classification
  Prediction function: $y = w^Tx + b \geq 0$
  Loss: todo
  Update Rule:
    • todo
  Parameters:
    • Support Vectors
  Hyper-parameters:
    • Soft Margin
  Complexity: Train Θ(n**3) [1], Predict Θ(k)
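The loss and update rule above are still marked todo; a common choice for linear SVM classification is subgradient descent on the L2-regularized hinge loss, and the sketch below assumes exactly that. The learning rate `lr`, regularization strength `lam`, and labels in {-1, +1} are all assumptions for illustration, not entries from the table.

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, lr=0.1, epochs=200):
    """Subgradient descent on the (assumed) L2-regularized hinge loss:
    lam/2 * ||w||^2 + mean(max(0, 1 - y_i (w^T x_i + b)))."""
    n, k = X.shape
    w, b = np.zeros(k), 0.0
    for _ in range(epochs):
        margins = y * (X @ w + b)
        mask = margins < 1                      # margin violators
        if mask.any():
            grad_w = lam * w - (y[mask][:, None] * X[mask]).mean(axis=0)
            grad_b = -y[mask].mean()
        else:
            grad_w, grad_b = lam * w, 0.0
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

def predict(w, b, X):
    # Matches the table's decision rule: +1 where w^T x + b >= 0, else -1.
    return np.where(X @ w + b >= 0, 1, -1)
```

On two well-separated Gaussian clusters this converges quickly; a real implementation would also decay the learning rate.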
OLS (Ordinary Least Squares)
  Labeled: Supervised
  Type: Regression
  Prediction function: $y = X\beta$
  Loss: Sum of squared residuals $$\sum_{i=1}^{n}(y_i - x_i^T\beta)^2$$
  Update Rule:
    • Pseudoinverse: $\beta = (X^TX)^{-1}X^Ty$
    • Stochastic Gradient Descent
  Parameters:
    • $\beta$: Coefficients
  Hyper-parameters: -
  Complexity: Train Θ(nk**2) [1], Predict Θ(k)
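The pseudoinverse update rule above can be sketched directly in NumPy by solving the normal equations $X^TX\beta = X^Ty$; forming $X^TX$ is the Θ(nk²) training step. (In practice `np.linalg.lstsq` is the numerically safer route; the direct solve is shown only to mirror the formula.)

```python
import numpy as np

def ols_fit(X, y):
    # Solve the normal equations X^T X beta = X^T y
    # (equivalent to beta = (X^T X)^{-1} X^T y when X^T X is invertible).
    return np.linalg.solve(X.T @ X, X.T @ y)

def ols_predict(X, beta):
    return X @ beta            # y = X beta, Θ(k) per row

# Recover known coefficients from noiseless synthetic data.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
beta_true = np.array([2.0, -1.0, 0.5])
y = X @ beta_true
beta = ols_fit(X, y)
```

With noiseless data the fitted `beta` matches `beta_true` to machine precision.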
KNN (K Nearest Neighbors)
  Labeled: Supervised
  Type: Regression, Classification
  Prediction function: Set based on $y$ of the $k$ nearest neighbors in the labeled data.
  Loss: -
  Update Rule:
    • Keeps an array of training examples.
  Parameters:
    • Array of training examples
  Hyper-parameters:
    • $k$: the number of neighbors
    • Distance metric
  Complexity: Train Θ(1) [1], Predict Θ(nk) [1]
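A minimal sketch of the KNN entry, assuming Euclidean distance as the metric and majority vote for classification: "training" only stores the array of examples (the Θ(1) train cost), while each prediction scans all n stored examples across k features, which is where the Θ(nk) predict cost comes from.

```python
import numpy as np
from collections import Counter

class KNN:
    def fit(self, X, y):
        self.X, self.y = X, y      # just keep the array of training examples
        return self

    def predict_one(self, x, k=3):
        # Distance from x to every stored example: the Θ(nk) scan.
        d = np.linalg.norm(self.X - x, axis=1)
        nearest = self.y[np.argsort(d)[:k]]          # labels of k nearest
        return Counter(nearest).most_common(1)[0][0]  # majority vote
```

For regression, the majority vote would simply be replaced by the mean of the k neighbors' $y$ values.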