CS 229 ― Machine Learning
My twin brother Afshine and I created this set of illustrated Machine Learning cheatsheets covering the content of the CS 229 class, which I TA-ed in Fall 2018 at Stanford. They can (hopefully!) be useful to all future students of this course as well as to anyone else interested in Machine Learning.
Cheatsheets
- Loss function, gradient descent, likelihood
- Linear models, Support Vector Machines, generative learning
- Tree and ensemble methods, k-NN, learning theory
- Expectation-Maximization, k-means, hierarchical clustering
- Clustering assessment metrics
- Principal component analysis, independent component analysis
- Architecture, activation function, backpropagation, dropout
- Convolutional layer, batch normalization, types of gates
- Markov decision processes, Bellman equation, Q-learning
- Confusion matrix, accuracy, precision, recall, F1 score, ROC
- R squared, Mallows' Cp, AIC, BIC
- Cross-validation, regularization, bias/variance tradeoff, error analysis
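As a tiny taste of the very first topic in the list (loss functions and gradient descent), here is a minimal sketch, not taken from the cheatsheets themselves: it fits a slope by gradient descent on the mean squared error, with made-up data and a made-up learning rate.

```python
# Minimal gradient descent for 1-D least squares: fit y ≈ w * x.
# Illustrative sketch only; data, learning rate, and step count are assumptions.

def fit_slope(xs, ys, lr=0.01, steps=1000):
    """Minimize the loss L(w) = (1/n) * sum((w*x - y)^2) by gradient descent."""
    w = 0.0
    n = len(xs)
    for _ in range(steps):
        # dL/dw = (2/n) * sum((w*x - y) * x)
        grad = 2.0 / n * sum((w * x - y) * x for x, y in zip(xs, ys))
        w -= lr * grad  # step against the gradient
    return w

xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]  # exactly y = 2x
print(round(fit_slope(xs, ys), 3))  # prints 2.0
```

The cheatsheets cover the general version of this update rule (arbitrary loss, many parameters); this sketch only shows the one-parameter case.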
Refreshers
- Axioms of probability, permutation, Bayes' rule, independence, conditional probability
- Random variable, expectation, variance
- Central Limit Theorem, parameter estimation
- Main matrices, multiplication, transpose, inverse, trace, determinant
- Eigenvalue, eigenvector, norm, gradient, Hessian
- Positive semi-definite matrix, spectral theorem, singular-value decomposition
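To illustrate the eigenvalue/eigenvector refresher, here is a small power-iteration sketch (an assumed example, not taken from the refreshers) that estimates the dominant eigenvalue of a 2×2 matrix by repeatedly applying it to a vector and renormalizing.

```python
# Power iteration on a 2x2 matrix: illustrative sketch only,
# the matrix and iteration count below are made up.

def power_iteration(A, iters=100):
    """Estimate the largest-magnitude eigenvalue of a 2x2 matrix A."""
    v = [1.0, 1.0]
    for _ in range(iters):
        # Apply A, then renormalize so v stays a unit vector.
        w = [A[0][0] * v[0] + A[0][1] * v[1],
             A[1][0] * v[0] + A[1][1] * v[1]]
        norm = (w[0] ** 2 + w[1] ** 2) ** 0.5
        v = [w[0] / norm, w[1] / norm]
    # Rayleigh quotient v^T A v gives the eigenvalue estimate.
    Av = [A[0][0] * v[0] + A[0][1] * v[1],
          A[1][0] * v[0] + A[1][1] * v[1]]
    return v[0] * Av[0] + v[1] * Av[1]

A = [[2.0, 0.0], [0.0, 1.0]]  # eigenvalues 2 and 1
print(round(power_iteration(A), 6))  # prints 2.0
```

The refreshers treat eigendecomposition in full generality (the spectral theorem, SVD); this sketch only shows the simplest iterative way to find one eigenvalue.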
Would you like to see this set of cheatsheets in your native language? You can help us translate them on GitHub!