What Is a Token? Simple Explanation (2026)

AI Glossary

What Is a Token?

Definition: In AI, a token is the unit of text a model reads and generates. Tokens can be whole words, parts of words, or individual characters. Tokens matter because AI tools bill by token usage and cap how much text fits in a single request (the context window).

How Do Tokens Work?

AI models split text into tokens before processing it. English text averages about 4 characters per token, or roughly 0.75 words per token (about 1.3 tokens per word). For example, "Hello world" is 2 tokens, while "extraordinary" might be split into "extra" + "ordinary" = 2 tokens.
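The rules of thumb above can be turned into a quick estimator. This is only a heuristic sketch, not a real tokenizer: actual models use learned subword vocabularies (e.g. byte-pair encoding), so exact counts require the provider's own tokenizer. The function name `estimate_tokens` is ours, not a standard API.

```python
def estimate_tokens(text: str) -> int:
    """Rough token estimate for English text using the rules of thumb:
    ~4 characters per token and ~1.3 tokens per word."""
    by_chars = len(text) / 4          # 1 token per ~4 characters
    by_words = len(text.split()) * 1.3  # ~1.3 tokens per word
    # Average the two heuristics and round to a whole token count.
    return max(1, round((by_chars + by_words) / 2))

print(estimate_tokens("Hello world"))    # heuristic says 3; a real tokenizer reports 2
print(estimate_tokens("extraordinary"))  # heuristic says 2, matching the split above
```

Note how the estimate can be off by a token or two on short strings; the heuristics are most useful for ballpark-sizing long prompts against a context window.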

Examples

Context windows: GPT-4o supports 128K tokens, Claude 3.5 supports 200K, and Gemini 1.5 supports 1M. API pricing example: GPT-4o costs about $2.50 per 1M input tokens.
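The per-million-token pricing cited above makes cost estimation a one-line calculation. A minimal sketch, using the ~$2.50/1M GPT-4o input price from this article (output tokens are typically priced separately and are not modeled here); `prompt_cost_usd` is a hypothetical helper, not a provider API:

```python
def prompt_cost_usd(input_tokens: int, price_per_million: float = 2.50) -> float:
    """Input cost in USD for a prompt, given a per-1M-token price."""
    return input_tokens * price_per_million / 1_000_000

# Filling GPT-4o's entire 128K context window with input tokens:
print(f"${prompt_cost_usd(128_000):.2f}")  # $0.32
```

This is why trimming unnecessary context matters: at these rates, every repeated or redundant chunk of prompt text has a direct, linear cost.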
