04 — Deep Learning

Neural network architectures, from multi-layer perceptrons through convolutional, recurrent, and transformer models to multimodal systems.

Sublayers

01 — MLP and Representation Learning

Multi-layer perceptrons, activation functions, universal approximation, backpropagation.
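As an illustration of the forward pass named above, a minimal sketch in plain Python; the network shape, weights, and biases here are hypothetical, not taken from the course materials:

```python
import math

def relu(x):
    return max(0.0, x)

def dense(inputs, weights, biases, act):
    # one fully connected layer: y_j = act(sum_i w[j][i] * x[i] + b[j])
    return [act(sum(w * x for w, x in zip(row, inputs)) + b)
            for row, b in zip(weights, biases)]

# toy 2-2-1 network with hand-picked illustrative weights
hidden = dense([1.0, 2.0], [[0.5, -0.25], [0.1, 0.4]], [0.0, 0.1], relu)
output = dense(hidden, [[1.0, -1.0]], [0.0],
               lambda x: 1 / (1 + math.exp(-x)))  # sigmoid output unit
```

Backpropagation would then push the loss gradient through these same layers in reverse, but the forward computation above is all a prediction needs.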

02 — Convolutional Networks

Convolution, pooling, CNN architectures (LeNet, VGG, ResNet), object detection, face recognition, neural style transfer.
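A minimal sketch of the convolution operation itself (technically cross-correlation, as deep learning libraries implement it); the image and kernel values are illustrative only:

```python
def conv2d(image, kernel):
    # "valid" 2D cross-correlation: slide the kernel over the image,
    # summing elementwise products at each position
    kh, kw = len(kernel), len(kernel[0])
    oh = len(image) - kh + 1
    ow = len(image[0]) - kw + 1
    return [[sum(image[i + a][j + b] * kernel[a][b]
                 for a in range(kh) for b in range(kw))
             for j in range(ow)]
            for i in range(oh)]

# a 2x2 difference kernel responds strongly to the vertical edge
# in this 4x4 image with a left/right intensity step
img = [[10, 10, 0, 0]] * 4
k = [[1, -1], [1, -1]]
out = conv2d(img, k)
```

The output peaks where the kernel straddles the intensity step, which is the intuition behind learned edge detectors in early CNN layers.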

03 — Sequence Models

Recurrent networks, LSTMs, GRUs, vanishing gradients, trigger word detection.
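A single-unit recurrent cell, sketched in plain Python with hypothetical weights, to show the hidden-state recurrence these models share:

```python
import math

def rnn_step(x, h, w_x, w_h, b):
    # vanilla RNN cell: h_t = tanh(w_x * x_t + w_h * h_{t-1} + b)
    return math.tanh(w_x * x + w_h * h + b)

# unroll over a short input sequence, carrying the hidden state forward
h = 0.0
for x in [1.0, 0.5, -0.5]:
    h = rnn_step(x, h, w_x=0.8, w_h=0.5, b=0.0)
```

Because each step multiplies by `w_h` and squashes through tanh, gradients shrink geometrically over long sequences; LSTMs and GRUs add gating to mitigate exactly this vanishing-gradient problem.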

04 — Transformers

Attention mechanism, transformer architecture, word embeddings, sequence-to-sequence, overview of large-scale transformers.
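A minimal sketch of scaled dot-product attention for a single query vector; the query, key, and value vectors here are illustrative, not from the course materials:

```python
import math

def attention(q, keys, values):
    # scaled dot-product attention: softmax(q . k / sqrt(d)) weights the values
    d = len(q)
    scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
              for k in keys]
    m = max(scores)                      # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    weights = [e / total for e in exps]
    # output is the attention-weighted average of the value vectors
    return [sum(w * v[i] for w, v in zip(weights, values))
            for i in range(len(values[0]))]

out = attention([1.0, 0.0],
                keys=[[1.0, 0.0], [0.0, 1.0]],
                values=[[1.0, 2.0], [3.0, 4.0]])
```

A transformer runs this in parallel for every query position (multi-head, with learned projections), which is what lets it model long-range dependencies without recurrence.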

05 — Multimodal Models

Vision-language models, cross-modal alignment, CLIP-style architectures.
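The core of CLIP-style retrieval is cosine similarity between image and text embeddings in a shared space; a sketch with hypothetical two-dimensional embeddings standing in for real encoder outputs:

```python
import math

def cosine(u, v):
    # cosine similarity between two embedding vectors
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

# hypothetical embeddings: one image vector scored against two captions,
# as a zero-shot classifier would do
image_emb = [0.9, 0.1]
text_embs = {"a photo of a dog": [1.0, 0.0],
             "a photo of a cat": [0.0, 1.0]}
best = max(text_embs, key=lambda t: cosine(image_emb, text_embs[t]))
```

Training aligns the two encoders contrastively so that matching image-text pairs score high and mismatched pairs score low under exactly this similarity.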


Five sublayers under this section.