Machine Learning Algorithms Cheatsheet
Complexity Legend

Name | Example
---|---
Constant | Θ(1)
Log | Θ(log(n))
Linear | Θ(n)
Log-Linear | Θ(n log(n))
Square | Θ(n^2)
Polynomial | Θ(n^k)
Exponential | Θ(k^n)
Factorial | Θ(n!)
Below is a table summarizing key components of some essential machine learning algorithms; minimal code sketches for each row follow the table.
This is a work in progress; you can send me corrections or suggestions by email or Twitter DM. Alternatively, all of the information lives in the data directory of my website repo (here), so feel free to submit a PR.
Name | Learning | Output type | Prediction function $y = f(\Omega, X)$ | Loss | Update rule | Parameters ($\Omega$) | Hyper-parameters | Train complexity | Predict complexity
---|---|---|---|---|---|---|---|---|---
KNN (K Nearest Neighbors) | Supervised | Regression, Classification | Set from the $y$ values of the $k$ nearest neighbors in the labeled data. | - | | | | Θ(1) [1] | Θ(nk) [1]
Linear SVM (Support Vector Machine) | Supervised | Regression, Classification | Predict the positive class when $w^Tx + b \geq 0$ | todo | | | | Θ(n^3) [1] | Θ(k)
OLS (Ordinary Least Squares) | Supervised | Regression | $y = X\beta$ | Sum of squared residuals $\sum_{i=1}^{n}(y_i - x_i^T\beta)^2$ | | | | Θ(nk^2) [1] | Θ(k)
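
A minimal sketch of the KNN row above, assuming Euclidean distance and a majority vote for classification; the function name (`knn_predict`) and the toy data are illustrative, not part of the cheatsheet. Training is just storing the data (Θ(1)); for clarity this sketch uses a full sort at prediction time, so it runs in Θ(n log n) rather than the Θ(nk) selection quoted in the table.

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=3):
    """Predict the label of a single query point x by majority vote
    over its k nearest neighbors (Euclidean distance).
    For regression, replace the vote with y_train[nearest].mean()."""
    # Distance from x to every stored training point.
    dists = np.linalg.norm(X_train - x, axis=1)
    # Indices of the k closest points (full sort, for clarity).
    nearest = np.argsort(dists)[:k]
    # Majority vote over their labels.
    labels, counts = np.unique(y_train[nearest], return_counts=True)
    return labels[np.argmax(counts)]

# Toy usage (illustrative data)
X_train = np.array([[0.0, 0.0], [1.0, 1.0], [0.1, 0.2], [0.9, 1.1]])
y_train = np.array([0, 1, 0, 1])
print(knn_predict(X_train, y_train, np.array([0.2, 0.1]), k=3))  # expected: 0
```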
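A minimal sketch of the Linear SVM row, assuming the loss left as "todo" is the regularized hinge loss and that it is minimized by sub-gradient descent; the function names and hyper-parameter values (`lam`, `lr`, `epochs`) are illustrative assumptions. The prediction rule matches the table: positive class when $w^Tx + b \geq 0$, which costs Θ(k) per sample.

```python
import numpy as np

def svm_train(X, y, lam=0.01, lr=0.1, epochs=100):
    """Sub-gradient descent on the regularized hinge loss
    (an assumption: the table leaves the SVM loss as "todo").
    Labels y are expected in {-1, +1}."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        for i in range(n):
            margin = y[i] * (X[i] @ w + b)
            if margin < 1:
                # Point violates the margin: step toward classifying it correctly.
                w -= lr * (lam * w - y[i] * X[i])
                b += lr * y[i]
            else:
                # Only the regularizer contributes to the sub-gradient.
                w -= lr * lam * w
    return w, b

def svm_predict(X, w, b):
    """Prediction rule from the table: +1 when w^T x + b >= 0, else -1."""
    return np.where(X @ w + b >= 0, 1, -1)

# Toy usage (illustrative, linearly separable data)
X = np.array([[2.0, 2.0], [1.5, 2.5], [-2.0, -1.0], [-1.0, -2.0]])
y = np.array([1, 1, -1, -1])
w, b = svm_train(X, y)
print(svm_predict(X, w, b))  # expected: [ 1  1 -1 -1]
```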
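A minimal sketch of the OLS row: minimizing the sum of squared residuals $\sum_{i=1}^{n}(y_i - x_i^T\beta)^2$ leads to the normal equations $X^TX\beta = X^Ty$; forming $X^TX$ dominates training at Θ(nk^2), and predicting $y = X\beta$ is Θ(k) per sample. The function names and toy data are illustrative.

```python
import numpy as np

def ols_fit(X, y):
    """Solve the normal equations (X^T X) beta = X^T y.
    Forming X^T X costs Θ(n k^2), matching the training complexity in the table."""
    return np.linalg.solve(X.T @ X, X.T @ y)

def ols_predict(X, beta):
    """Prediction from the table: y = X beta."""
    return X @ beta

# Toy usage: y = 2*x1 + 3*x2 is recovered exactly
X = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [2.0, 1.0]])
y = np.array([2.0, 3.0, 5.0, 7.0])
beta = ols_fit(X, y)
print(np.round(beta, 3))  # expected: [2. 3.]
```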