The basics of all things LLM


Introductory Concepts:

Tokenization
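A minimal sketch of the idea behind tokenization: mapping text to integer IDs and back. This toy word-level tokenizer (the `encode`/`decode` helpers and the vocabulary here are illustrative, not from any real library) skips what production tokenizers like BPE or WordPiece actually do, but shows the core text-to-IDs round trip.

```python
# Toy word-level tokenizer: builds a vocabulary from a sample text,
# then converts strings to ID lists and back. Real LLM tokenizers
# operate on subword units instead of whole words.
text = "the cat sat on the mat"
vocab = {word: idx for idx, word in enumerate(sorted(set(text.split())))}
inv_vocab = {idx: word for word, idx in vocab.items()}

def encode(s):
    return [vocab[w] for w in s.split()]

def decode(ids):
    return " ".join(inv_vocab[i] for i in ids)

ids = encode("the cat sat")
print(ids)          # [4, 0, 3]
print(decode(ids))  # the cat sat
```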

Embeddings & Positional Encoding
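A minimal NumPy sketch of these two steps together: token embeddings as a lookup table of vectors (randomly initialized here as a stand-in for learned weights), plus the sinusoidal positional encoding from "Attention Is All You Need". The sizes (`vocab_size`, `d_model`, the sample token IDs) are arbitrary illustration choices.

```python
import numpy as np

vocab_size, seq_len, d_model = 10, 4, 8
rng = np.random.default_rng(0)
# Stand-in for a learned embedding table: one d_model-dim vector per token ID.
embedding_table = rng.normal(size=(vocab_size, d_model))

def positional_encoding(seq_len, d_model):
    # Fixed sine/cosine waves of varying frequency, one row per position.
    pos = np.arange(seq_len)[:, None]           # (seq_len, 1)
    i = np.arange(d_model // 2)[None, :]        # (1, d_model/2)
    angles = pos / (10000 ** (2 * i / d_model))
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)                # even dims: sine
    pe[:, 1::2] = np.cos(angles)                # odd dims: cosine
    return pe

token_ids = np.array([1, 5, 2, 7])
# The model's input is the embedding lookup plus the positional signal.
x = embedding_table[token_ids] + positional_encoding(seq_len, d_model)
print(x.shape)  # (4, 8)
```

Adding (rather than concatenating) the positional signal keeps the dimensionality at `d_model` while letting attention distinguish positions.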

Self-Attention Mechanism
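A minimal single-head sketch of scaled dot-product self-attention in NumPy. The projection matrices `Wq`, `Wk`, `Wv` are random stand-ins for learned weights; shapes and seeds are illustrative.

```python
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x, Wq, Wk, Wv):
    # x: (seq_len, d_model). Project to queries, keys, values.
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    # Similarity of every query with every key, scaled by sqrt(d_k).
    scores = q @ k.T / np.sqrt(k.shape[-1])   # (seq_len, seq_len)
    weights = softmax(scores, axis=-1)        # each row sums to 1
    return weights @ v, weights               # weighted sum of values

rng = np.random.default_rng(0)
d_model = 8
x = rng.normal(size=(4, d_model))
Wq, Wk, Wv = (rng.normal(size=(d_model, d_model)) for _ in range(3))
out, weights = self_attention(x, Wq, Wk, Wv)
print(out.shape, weights.shape)  # (4, 8) (4, 4)
```

Each output row is a position-specific mixture of the value vectors, with mixture weights determined by query-key similarity.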

Multi-Head Attention Mechanism
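A minimal sketch of multi-head attention, assuming the single-head mechanism above: the model dimension is split into `n_heads` smaller heads, attention runs independently per head, and the head outputs are concatenated and mixed by an output projection `Wo`. All weight matrices here are random stand-ins for learned parameters.

```python
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(x, Wq, Wk, Wv, Wo, n_heads):
    seq_len, d_model = x.shape
    d_head = d_model // n_heads
    def split(t):
        # (seq_len, d_model) -> (n_heads, seq_len, d_head)
        return t.reshape(seq_len, n_heads, d_head).transpose(1, 0, 2)
    q, k, v = split(x @ Wq), split(x @ Wk), split(x @ Wv)
    # Scaled dot-product attention computed independently in each head.
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_head)
    weights = softmax(scores, axis=-1)
    heads = weights @ v                                  # (n_heads, seq_len, d_head)
    # Concatenate heads back to (seq_len, d_model), then apply output projection.
    concat = heads.transpose(1, 0, 2).reshape(seq_len, d_model)
    return concat @ Wo

rng = np.random.default_rng(0)
d_model, n_heads = 8, 2
x = rng.normal(size=(4, d_model))
Wq, Wk, Wv, Wo = (rng.normal(size=(d_model, d_model)) for _ in range(4))
out = multi_head_attention(x, Wq, Wk, Wv, Wo, n_heads)
print(out.shape)  # (4, 8)
```

Splitting into heads lets each head attend to different relationships (e.g. different positions or features) at the same overall parameter cost.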


Core Concepts (wip):


Additional Concepts (wip):