Context Window

The total amount of text, code, and conversation history an AI model can hold in active memory during a single session. Measured in tokens, not words.

A context window is working memory, not storage. Everything the model reads on each turn (your messages, its previous replies, pasted files, tool output, and hidden system instructions) counts against it. When the window fills up, the model either truncates older material or degrades in quality. Bigger windows help, but a messy million-token session still performs worse than a clean, short one.
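The accounting described above can be sketched in a few lines. This is a simplified illustration, not any model's real tokenizer or eviction policy: the ~4-characters-per-token heuristic and the oldest-first trimming rule are assumptions chosen for clarity.

```python
# Rough illustration of context-window accounting. The tokens-per-character
# ratio and the trimming policy are simplifying assumptions, not any
# particular model's actual tokenizer or eviction strategy.

def estimate_tokens(text: str) -> int:
    # Crude heuristic: roughly 4 characters per token for English text.
    return max(1, len(text) // 4)

def trim_history(messages: list[dict], budget: int) -> list[dict]:
    """Drop the oldest messages until the running total fits the budget.
    Every message counts: system prompt, user turns, replies, tool output."""
    total = sum(estimate_tokens(m["content"]) for m in messages)
    trimmed = list(messages)
    while trimmed and total > budget:
        dropped = trimmed.pop(0)  # evict the oldest message first
        total -= estimate_tokens(dropped["content"])
    return trimmed

history = [
    {"role": "system", "content": "You are a helpful assistant." * 10},
    {"role": "user", "content": "Summarize this file." * 50},
    {"role": "assistant", "content": "Here is a summary..." * 50},
    {"role": "user", "content": "Now refactor it."},
]
kept = trim_history(history, budget=300)
print(len(kept), "of", len(history), "messages fit the budget")
```

Note that trimming is lossy: once the system prompt or an early instruction is evicted, the model simply never sees it again, which is one reason long, cluttered sessions drift in quality.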
