The basics of all that is LLM


Introductory Concepts (a minimal code sketch for each follows the list):

Tokenization

Embeddings & Positional Encoding

Self-Attention Mechanism

Multi-Head Attention Mechanism

Transformer Block
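
Tokenization. A minimal character-level sketch of what every tokenizer does: map raw text to integer IDs and back. Production LLMs use learned subword vocabularies (e.g. byte-pair encoding) rather than single characters; the CharTokenizer class and the toy corpus here are purely illustrative.

```python
# A minimal character-level tokenizer: text -> integer IDs -> text.
# Real LLMs use subword schemes (e.g. BPE); this only shows the round trip
# that every tokenizer provides.

class CharTokenizer:
    def __init__(self, corpus: str):
        # Vocabulary: every distinct character seen in the corpus.
        chars = sorted(set(corpus))
        self.stoi = {ch: i for i, ch in enumerate(chars)}
        self.itos = {i: ch for ch, i in self.stoi.items()}

    def encode(self, text: str) -> list[int]:
        return [self.stoi[ch] for ch in text]

    def decode(self, ids: list[int]) -> str:
        return "".join(self.itos[i] for i in ids)

tok = CharTokenizer("hello world")
ids = tok.encode("hello")
print(ids)              # [3, 2, 4, 4, 5] for this corpus
print(tok.decode(ids))  # "hello"
```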
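
Embeddings & Positional Encoding. A sketch assuming the fixed sinusoidal encoding from the original Transformer paper: each token ID is looked up in an embedding table (learned in practice, random here), and a position-dependent signal is added so the model can distinguish word order. The dimensions and example token IDs are arbitrary.

```python
import numpy as np

vocab_size, seq_len, d_model = 50, 8, 16
rng = np.random.default_rng(0)
embedding_table = rng.normal(size=(vocab_size, d_model))  # learned in practice

def sinusoidal_positional_encoding(seq_len, d_model):
    pos = np.arange(seq_len)[:, None]        # (seq_len, 1)
    i = np.arange(d_model // 2)[None, :]     # (1, d_model/2)
    angles = pos / (10000 ** (2 * i / d_model))
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)  # even dimensions: sine
    pe[:, 1::2] = np.cos(angles)  # odd dimensions: cosine
    return pe

token_ids = np.array([3, 14, 15, 9, 2, 6, 5, 35])  # output of a tokenizer
x = embedding_table[token_ids] + sinusoidal_positional_encoding(seq_len, d_model)
print(x.shape)  # (8, 16): one position-aware vector per token
```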
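
Self-Attention Mechanism. A minimal NumPy sketch of scaled dot-product self-attention: the same input is projected into queries, keys, and values, and each token's output is a weighted average of all value vectors, with weights given by softmaxed query-key dot products. All weight matrices are random stand-ins for learned parameters; decoder-style models would additionally apply a causal mask to the scores.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)  # subtract max for stability
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x, Wq, Wk, Wv):
    # Project the same input into queries, keys, and values.
    Q, K, V = x @ Wq, x @ Wk, x @ Wv
    d_k = Q.shape[-1]
    # Each token scores every token, scaled by sqrt(d_k).
    scores = Q @ K.T / np.sqrt(d_k)     # (seq, seq)
    weights = softmax(scores, axis=-1)  # each row sums to 1
    return weights @ V                  # weighted mix of value vectors

rng = np.random.default_rng(0)
seq_len, d_model, d_k = 4, 8, 8
x = rng.normal(size=(seq_len, d_model))
Wq, Wk, Wv = (rng.normal(size=(d_model, d_k)) for _ in range(3))
print(self_attention(x, Wq, Wk, Wv).shape)  # (4, 8)
```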
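
Multi-Head Attention Mechanism. The same attention computation, but the projections are split into several independent heads that attend in parallel and are then re-combined by an output projection, letting different heads specialize in different relationships. A self-contained sketch with arbitrary sizes:

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(x, Wq, Wk, Wv, Wo, n_heads):
    seq_len, d_model = x.shape
    d_head = d_model // n_heads
    # Project once, then split into heads: (n_heads, seq, d_head).
    def split(M):
        return (x @ M).reshape(seq_len, n_heads, d_head).transpose(1, 0, 2)
    Q, K, V = split(Wq), split(Wk), split(Wv)
    # Scaled dot-product attention, computed per head in parallel.
    att = softmax(Q @ K.transpose(0, 2, 1) / np.sqrt(d_head)) @ V
    # Re-join the heads and mix them with the output projection.
    return att.transpose(1, 0, 2).reshape(seq_len, d_model) @ Wo

rng = np.random.default_rng(0)
seq_len, d_model, n_heads = 4, 8, 2
x = rng.normal(size=(seq_len, d_model))
Wq, Wk, Wv, Wo = (rng.normal(size=(d_model, d_model)) for _ in range(4))
print(multi_head_attention(x, Wq, Wk, Wv, Wo, n_heads).shape)  # (4, 8)
```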
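
Transformer Block. Putting the pieces together: a block wraps multi-head attention and a position-wise feed-forward network, each with a residual connection and layer normalization. This sketch uses the post-norm arrangement from the original Transformer paper; the parameter dictionary p and all sizes are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def layer_norm(x, eps=1e-5):
    # Normalize each token's vector to zero mean / unit variance.
    mu = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    return (x - mu) / np.sqrt(var + eps)

def multi_head_attention(x, Wq, Wk, Wv, Wo, n_heads):
    seq_len, d_model = x.shape
    d_head = d_model // n_heads
    split = lambda M: (x @ M).reshape(seq_len, n_heads, d_head).transpose(1, 0, 2)
    Q, K, V = split(Wq), split(Wk), split(Wv)
    att = softmax(Q @ K.transpose(0, 2, 1) / np.sqrt(d_head)) @ V
    return att.transpose(1, 0, 2).reshape(seq_len, d_model) @ Wo

def transformer_block(x, p, n_heads=2):
    # Sub-layer 1: multi-head attention with a residual connection.
    x = layer_norm(x + multi_head_attention(x, p["Wq"], p["Wk"], p["Wv"], p["Wo"], n_heads))
    # Sub-layer 2: position-wise feed-forward network, also residual.
    h = np.maximum(0, x @ p["W1"] + p["b1"])  # ReLU MLP
    return layer_norm(x + h @ p["W2"] + p["b2"])

seq_len, d_model, d_ff = 4, 8, 32
p = {name: rng.normal(size=shape) for name, shape in {
    "Wq": (d_model, d_model), "Wk": (d_model, d_model),
    "Wv": (d_model, d_model), "Wo": (d_model, d_model),
    "W1": (d_model, d_ff), "b1": (d_ff,),
    "W2": (d_ff, d_model), "b2": (d_model,)}.items()}

x = rng.normal(size=(seq_len, d_model))
print(transformer_block(x, p).shape)  # (4, 8): same shape in, same shape out
```

Stacking many of these blocks, plus the embedding layer in front and a final projection back to vocabulary size, gives the overall decoder-style architecture.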


Core Concepts (work in progress):


Additional Concepts (work in progress):