News
Anthropic recently upgraded its Claude Sonnet 4 model to support up to 1 million tokens of context, thereby ...
Anthropic upgrades Claude Sonnet 4 to a 1M token context window and adds memory, enabling full codebase analysis, long ...
Claude Sonnet 4 can now support up to one million tokens of context, a fivefold increase over the prior limit of 200,000 tokens, ...
Overview: Claude 4 achieved record scores on SWE-bench and Terminal-bench, demonstrating leading coding performance. Claude Sonnet 4 now supports a one-million-token context ...
To account for the extra computing power required for large requests, Anthropic will increase the cost for Claude Sonnet 4 ...
The company today revealed that Claude Sonnet 4 now supports up to 1 million tokens of context in the Anthropic API — a five-fold increase over the previous limit.
Dan Shipper in Vibe Check: Today, Anthropic is releasing a version of Claude Sonnet 4 that has a 1-million-token context window.
Anthropic AI has increased the context window for their Claude Sonnet 4 model to 1 million tokens, which is 5 times more than ...
Claude Sonnet 4, meanwhile, is described as a more practical, efficient upgrade from its predecessor, Claude 3.7 Sonnet, and is now available to free-tier users.
The model’s usage share on AI marketplace OpenRouter hit 20 per cent as of mid-August, behind only Anthropic’s coding model.
Anthropic has upgraded Claude Sonnet 4 with a 1M token context window, competing with OpenAI's GPT-5 and Meta's Llama 4.
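For developers, the change is exposed at the API level. As a rough illustration only, the sketch below shows how a long-context request might be sent with the Anthropic Python SDK; the model ID "claude-sonnet-4-20250514", the "context-1m-2025-08-07" beta flag, and the file name "codebase_dump.txt" are assumptions not confirmed by the articles above.

# Minimal sketch: sending a long-context request to Claude Sonnet 4.
# Assumptions (not confirmed by the coverage above): the Anthropic Python SDK
# is installed, ANTHROPIC_API_KEY is set in the environment, the model ID is
# "claude-sonnet-4-20250514", and the 1M-token window is gated behind the
# "context-1m-2025-08-07" beta flag.
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

# A large prompt, e.g. an entire codebase concatenated into one string.
with open("codebase_dump.txt") as f:
    codebase = f.read()

response = client.beta.messages.create(
    model="claude-sonnet-4-20250514",   # assumed model ID
    betas=["context-1m-2025-08-07"],    # assumed beta flag for the 1M window
    max_tokens=4096,
    messages=[
        {
            "role": "user",
            "content": f"Here is our codebase:\n\n{codebase}\n\nSummarize its architecture.",
        }
    ],
)
print(response.content[0].text)

In this sketch the whole codebase is sent as one user message, which is the use case the articles highlight (full-codebase analysis); actual pricing and rate limits for requests above 200,000 tokens differ, as noted above.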