AI Token
The basic unit of text that AI language models process. Roughly 0.75 words per token in English, though the ratio varies by language and content type.
Tokens are how models see text. A common word like 'design' is typically one token; a longer word like 'accessibility' may be split into two or more. Punctuation, whitespace, and code all consume tokens. Every input and output in a session burns tokens from the context window, which is why verbose tool output and long assistant replies compound the cost of each subsequent turn.
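The 0.75-words-per-token rule of thumb above can be sketched as a quick estimator. This is an approximation only, not a real tokenizer; actual token counts come from the model's own tokenizer and vary by language and content.

```python
def estimate_tokens(text: str, words_per_token: float = 0.75) -> int:
    """Rough token estimate for English prose using the
    ~0.75 words-per-token heuristic. Real tokenizers (e.g. BPE)
    will give different counts, especially for code and non-English text."""
    words = len(text.split())
    return round(words / words_per_token)
```

For example, a 750-word English article would come out to roughly 1,000 tokens under this heuristic.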
Related Terms
Context Window
The total amount of text, code, and conversation history an AI model can hold in active memory during a single session. Measured in tokens, not words.
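Because the context window is a fixed token budget, long sessions eventually have to shed history. A minimal sketch of one common strategy, dropping the oldest messages first, might look like this (the `count_tokens` callback is a hypothetical stand-in for a real tokenizer):

```python
def fit_to_window(messages, budget, count_tokens):
    """Drop the oldest messages until the total token count
    fits within the context-window budget."""
    kept = list(messages)
    while kept and sum(count_tokens(m) for m in kept) > budget:
        kept.pop(0)  # evict the oldest message first
    return kept
```

Real systems often combine this with summarization so evicted history is compressed rather than lost outright.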
Token Reuse
The compounding effect where each new AI response requires reprocessing all previous conversation tokens, increasing latency and cost with every turn.
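The compounding effect is easy to quantify: if each turn adds some tokens to the history, turn *n* reprocesses everything from turns 1 through *n*, so total processing grows quadratically with conversation length. A small illustration, assuming each list entry is the tokens added by one turn:

```python
def cumulative_tokens(tokens_per_turn):
    """Total tokens processed across a session, counting the
    reprocessing of all prior history on every turn."""
    history = 0  # running size of the conversation so far
    total = 0    # tokens actually processed, summed over turns
    for added in tokens_per_turn:
        history += added
        total += history  # each turn reprocesses the full history
    return total
```

Three turns of 100 tokens each cost 100 + 200 + 300 = 600 tokens of processing, not 300, which is why trimming verbose turns pays off more than their face-value size suggests.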