Statistical Learning Theory Index
Navigation hub for the statistical learning theory notes.
Notes
Formal theory
- Supervised Learning — task setup, loss functions, generalization
- PAC Learning — probably approximately correct framework, sample complexity, agnostic PAC
- VC Dimension — shattering, growth function, fundamental theorem of statistical learning
- Generalization Bounds and Rademacher Complexity — empirical risk minimization (ERM) bound, uniform convergence, structural risk minimization (SRM), PAC-Bayes
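As a quick numeric anchor for the PAC Learning entry above, here is a minimal sketch of the standard realizable-case sample-complexity bound for a finite hypothesis class, m ≥ (1/ε)(ln|H| + ln(1/δ)). The function name and the example numbers are illustrative, not from the linked notes:

```python
import math

def pac_sample_complexity(h_size: int, epsilon: float, delta: float) -> int:
    """Realizable-case PAC bound for a finite hypothesis class:
    m >= (1/epsilon) * (ln|H| + ln(1/delta)) samples suffice so that,
    with probability at least 1 - delta, ERM returns a hypothesis
    with true error at most epsilon."""
    return math.ceil((math.log(h_size) + math.log(1 / delta)) / epsilon)

# Hypothetical example: |H| = 2**20, accuracy 0.05, confidence 0.99
m = pac_sample_complexity(2**20, epsilon=0.05, delta=0.01)  # 370 samples
```

Note the logarithmic dependence on |H| and 1/δ versus the linear dependence on 1/ε, which is why even very large finite classes stay learnable with modest data.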
Applied learning theory
- Bias-Variance Analysis — decomposition, underfitting vs overfitting
- Data Splits and Distribution — train/dev/test strategy, distribution mismatch
- Error Analysis — ceiling analysis, error categorization
- Evaluation Metrics — precision, recall, F1, AUC-ROC
- Multi-Task Learning — shared representations, hard/soft parameter sharing
- Orthogonalization — separating tuning concerns so each adjustment targets one performance criterion
- Transfer Learning — pretrain/fine-tune, domain adaptation
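For the Evaluation Metrics entry above, a minimal sketch of precision, recall, and F1 computed directly from confusion counts (function name and example counts are illustrative):

```python
def precision_recall_f1(tp: int, fp: int, fn: int) -> tuple[float, float, float]:
    """Precision, recall, and F1 from true positives, false positives,
    and false negatives, with zero-division guarded as 0.0."""
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

# Hypothetical counts: 8 true positives, 2 false positives, 4 false negatives
p, r, f1 = precision_recall_f1(8, 2, 4)  # p = 0.8, r = 2/3, F1 = 8/11
```

F1 is the harmonic mean of precision and recall, so it is pulled toward whichever of the two is lower.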
Links
Navigation: ← Optimization | Foundations Index | Deep Learning Theory →