# Long context
Latest news and articles about long context

## Chinese AI Lab DeepSeek Trials 1‑Million‑Token Context Window in App — API Still Capped at 128K
DeepSeek is testing a new long‑context model in its web and app interfaces that supports a context window of roughly one million tokens, while its public API remains limited to a 128K‑token context on version 3.2. The trial highlights the commercial and technical trade‑offs involved in bringing ultra‑long context windows to production and signals intensifying competition in China's AI landscape.
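For readers working against the 128K API cap mentioned above, here is a minimal sketch of a client-side guard that estimates prompt size before sending a request. It relies on the fact that DeepSeek's API is OpenAI-compatible (base URL `https://api.deepseek.com`, model `deepseek-chat`); the characters-per-token ratio and the `estimate_tokens` helper are rough illustrative assumptions, not official figures from the article.

```python
# Minimal sketch: guard a request against the reported API-side context cap.
# Assumptions (not from the article): a ~4-characters-per-token heuristic
# stands in for a real tokenizer, and "YOUR_API_KEY" is a placeholder.
from openai import OpenAI

MAX_CONTEXT_TOKENS = 128_000   # API cap reported in the article
CHARS_PER_TOKEN = 4            # rough heuristic for English text (assumption)

def estimate_tokens(text: str) -> int:
    """Very rough token estimate; swap in a real tokenizer if one is available."""
    return len(text) // CHARS_PER_TOKEN + 1

def ask(prompt: str) -> str:
    # Fail fast on the client if the prompt likely exceeds the API cap.
    if estimate_tokens(prompt) > MAX_CONTEXT_TOKENS:
        raise ValueError("Prompt likely exceeds the 128K-token API context cap")
    client = OpenAI(base_url="https://api.deepseek.com", api_key="YOUR_API_KEY")
    resp = client.chat.completions.create(
        model="deepseek-chat",
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content
```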

## China’s DeepSeek Pushes Context Limits — and Triggers a Backlash Over a Colder, ‘Faster’ Model
DeepSeek activated a staged (“grayscale”) rollout extending context length to 1 million tokens, prompting user complaints that the assistant sounds colder and less personalised. Industry sources say the build is a speed‑focused variant intended to stress‑test long‑context performance ahead of a V4 launch, highlighting trade‑offs between throughput and conversational quality. The episode illustrates a wider tension in scaling LLMs: architectural gains can come at the cost of user experience and trust.

## DeepSeek's Quiet Leap: 1‑Million‑Token Context and a May‑2025 Knowledge Cutoff Hint at a Next‑Gen Chinese LLM
DeepSeek has begun limited testing of a model that supports a 1‑million‑token context window and uses training data up to May 2025, a significant expansion from its previous 128K limit. The change suggests material architectural or pipeline upgrades and signals intensified competition among Chinese AI providers to ship more capable, enterprise‑ready models.