01 — MLP and Representation Learning
Fully connected feedforward networks — the simplest deep learning building blocks and the foundation for understanding depth, activation functions, and backpropagation.
Notes
- Multi-Layer Perceptron (MLP) — architecture, activation functions, forward and backward pass, depth vs width trade-offs
- Deep Learning Training Patterns (PyTorch) — MLP, CNN, RNN training loops with PyTorch, early stopping, validation loop, model serialisation (in 08_implementations)
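As a companion to the notes above, the forward and backward pass of an MLP can be sketched from scratch in NumPy (the PyTorch training-loop patterns themselves live in 08_implementations). The two-layer architecture, XOR data, and hyperparameters below are illustrative choices, not taken from the notes.

```python
import numpy as np

# From-scratch two-layer MLP on XOR, showing the forward and backward pass
# by hand. Layer width, learning rate, and step count are illustrative.
rng = np.random.default_rng(0)

X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

W1 = rng.normal(0.0, 1.0, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 1.0, (8, 1)); b2 = np.zeros(1)
lr = 0.5

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

first_loss = None
for step in range(5000):
    # Forward pass: affine -> tanh -> affine -> sigmoid.
    h = np.tanh(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)
    loss = -np.mean(y * np.log(p + 1e-12) + (1 - y) * np.log(1 - p + 1e-12))
    if first_loss is None:
        first_loss = loss

    # Backward pass: with sigmoid + binary cross-entropy, dL/dlogits = (p - y)/N.
    d_logits = (p - y) / len(X)
    dW2 = h.T @ d_logits
    db2 = d_logits.sum(axis=0)
    d_h = (d_logits @ W2.T) * (1.0 - h ** 2)  # tanh'(z) = 1 - tanh(z)^2
    dW1 = X.T @ d_h
    db1 = d_h.sum(axis=0)

    # Plain gradient-descent update.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(f"loss: {first_loss:.3f} -> {loss:.3f}")
```

The same loop maps directly onto PyTorch: the manual backward pass is replaced by `loss.backward()` and the update by an optimiser step.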