Complexity Legend

| Name | Example |
|------|---------|
| Constant | Θ(1) |
| Log | Θ(log(n)) |
| Linear | Θ(n) |
| Log-Linear | Θ(n log(n)) |
| Square | Θ(n^2) |
| Polynomial | Θ(n^k) |
| Exponential | Θ(k^n) |
| Factorial | Θ(n!) |

Below is a table summarizing key components of some essential machine learning algorithms.

This is a work in progress; you can send me corrections or suggestions by email or Twitter DM. Alternatively, all of the information is in the data directory of my website repo (here), so feel free to submit a PR.

| Name | Labeled | Type | Prediction function $y = f(\Omega, X)$ | Loss | Update Rule | Parameters ($\Omega$) | Hyper-parameters | Train Complexity | Predict Complexity |
|------|---------|------|----------------------------------------|------|-------------|------------------------|------------------|------------------|---------------------|
| KNN (K Nearest Neighbors) | Supervised | Regression, Classification | Set based on $y$ for the $k$ nearest neighbors in the labeled data | - | Keeps an array of training examples | Array of training examples | $k$: the number of neighbors; distance metric | Θ(1) [1] | Θ(nk) [1] |
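As the KNN row notes, "training" is just storing the examples; all of the work happens at prediction time. A minimal sketch in plain Python, assuming Euclidean distance and majority vote (the distance metric is a hyper-parameter, so any other choice would do):

```python
from collections import Counter
import math

def knn_predict(train_X, train_y, x, k=3):
    """Classify x by majority vote among the k nearest training points.

    Training is just keeping (train_X, train_y) around: Θ(1).
    Prediction computes a distance to every stored example: Θ(n) distances,
    then takes the k closest.
    """
    # (distance, label) pairs, sorted by distance to the query point.
    dists = sorted((math.dist(xi, x), yi) for xi, yi in zip(train_X, train_y))
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]
```

For regression, the vote would be replaced by the mean of the $k$ neighbors' $y$ values.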
| Name | Labeled | Type | Prediction function $y = f(\Omega, X)$ | Loss | Update Rule | Parameters ($\Omega$) | Hyper-parameters | Train Complexity | Predict Complexity |
|------|---------|------|----------------------------------------|------|-------------|------------------------|------------------|------------------|---------------------|
| OLS (Ordinary Least Squares) | Supervised | Regression | $y = X\beta$ | Sum of squared residuals: $\sum_{i=1}^{n}(y_i - x_i^T\beta)^2$ | Pseudoinverse: $\beta = (X^TX)^{-1}X^Ty$ | $\beta$: coefficients | - | Θ(nk^2) [1] | Θ(k) |
| | | | $y = w^Tx + b \geq 0$ | | | | | Θ(n^3) [1] | Θ(k) |
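The OLS update rule is a single closed-form solve rather than an iterative procedure. A minimal sketch using NumPy; it solves the normal equations $X^TX\beta = X^Ty$ directly, which is equivalent to the pseudoinverse formula but numerically preferable to forming an explicit inverse:

```python
import numpy as np

def ols_fit(X, y):
    """Closed-form OLS via the normal equations X^T X beta = X^T y.

    Forming X^T X dominates the cost at Θ(n k^2) for n examples and
    k features, matching the train complexity in the table.
    """
    return np.linalg.solve(X.T @ X, X.T @ y)

def ols_predict(X, beta):
    # Prediction is one matrix-vector product: Θ(k) per example.
    return X @ beta
```

An intercept is handled, as usual, by prepending a column of ones to $X$.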