Context Window
The total amount of text, code, and conversation history an AI model can hold in active memory during a single session. Measured in tokens, not words.
A context window is working memory, not storage. Everything the model reads on each turn (your messages, its previous replies, pasted files, tool output, and hidden system instructions) counts against it. When the window fills up, the model either truncates older material or degrades in quality. Bigger windows help, but a messy million-token session still performs worse than a clean, short one.
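The accounting described above can be sketched in a few lines of Python. This is a minimal illustration, not a real model's logic: the history format, token counts, and the 8,000-token limit are all hypothetical, and it shows only the simplest strategy (drop the oldest turns first, keep system instructions).

```python
CONTEXT_WINDOW = 8000  # hypothetical model limit, in tokens

def trim_to_window(history, limit=CONTEXT_WINDOW):
    """Drop the oldest turns until the total fits in the window.

    `history` is a list of (role, token_count) pairs; the first entry
    is assumed to be the hidden system instructions and is always kept.
    """
    system, turns = history[0], history[1:]
    total = system[1] + sum(count for _, count in turns)
    while turns and total > limit:
        _, dropped = turns.pop(0)  # truncate oldest material first
        total -= dropped
    return [system] + turns

# Illustrative session: system prompt, a pasted file, and a few replies.
history = [
    ("system", 500),
    ("user", 3000),       # pasted file
    ("assistant", 2500),
    ("user", 1500),
    ("assistant", 1200),
]
trimmed = trim_to_window(history)
print(sum(count for _, count in trimmed))  # 5700, now under the limit
```

Note that the system instructions survive trimming even though they consume tokens like everything else, which is why long hidden prompts shrink the space left for conversation.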
Related Terms
AI Token
The basic unit of text that AI language models process. Roughly 0.75 words per token in English, though the ratio varies by language and content type.
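The ratio above gives a quick back-of-envelope estimate. This sketch assumes the ~0.75 words-per-token figure for English; real tokenizers split text differently depending on language, code, and punctuation.

```python
WORDS_PER_TOKEN = 0.75  # rough English average, per the definition above

def estimate_tokens(text):
    """Approximate token count from a whitespace word count."""
    words = len(text.split())
    return round(words / WORDS_PER_TOKEN)

print(estimate_tokens("The quick brown fox jumps over the lazy dog"))  # 9 words -> 12 tokens
```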
AI Session
A single continuous conversation thread with an AI model, from the first message to the last. Each session has its own context window that resets when a new session starts.
System Instructions
Hidden directives loaded into an AI session that shape the model's behavior, tone, and constraints. They consume context window tokens like any other input.