# long context

Latest news and articles about long context

Total: 3 articles found

Technology

Chinese AI Lab DeepSeek Trials 1‑Million‑Token Context Window in App — API Still Capped at 128K

DeepSeek is testing a new long‑context model in its web and app interfaces that supports roughly one million tokens, while its public API remains limited to a 128K‑token context on version 3.2. The trial highlights the commercial and technical trade‑offs involved in bringing ultra‑long context windows to production and signals intensifying competition in China’s AI landscape.

NeTe · February 13, 2026, 13:04
#DeepSeek #long context #1M tokens
Technology

China’s DeepSeek Pushes Context Limits — and Triggers a Backlash Over a Colder, ‘Faster’ Model

DeepSeek has begun a staged ("grayscale") rollout of an update that extends context length to 1 million tokens, prompting user complaints that the assistant sounds colder and less personalised. Industry sources say the build is a speed‑focused variant intended to stress‑test long‑context performance ahead of a V4 launch, highlighting the trade‑off between throughput and conversational quality. The episode illustrates a wider tension in scaling LLMs: architectural gains can come at the cost of user experience and trust.

NeTe · February 12, 2026, 17:04
#DeepSeek #large language model #long context
Technology

DeepSeek's Quiet Leap: 1‑Million‑Token Context and May‑2025 Knowledge Cutoff Hint at a Next‑Gen Chinese LLM

DeepSeek has begun limited testing of a model that supports a 1‑million‑token context window and draws on training data through May 2025, a significant expansion from its previous 128K limit. The change suggests material architectural or pipeline upgrades and signals intensified competition among Chinese AI providers to ship more capable, enterprise‑ready models.

NeTe · February 11, 2026, 14:55
#DeepSeek #long context #1M tokens