News
China's DeepSeek has released a 685-billion parameter open-source AI model, DeepSeek V3.1, challenging OpenAI and Anthropic ...
China’s new "hybrid" DeepSeek model outshines OpenAI’s hyped GPT-OSS, delivering fiction, logic, and code. But OpenAI's model ...
Discover how the DeepSeek 3.1 update transforms AI workflows with hybrid reasoning, 128k token context, and more, while tackling key ...
The Register on MSN · 3d
DeepSeek's new V3.1 release points to potent new Chinese chips coming soon
Point release retuned with a new FP8 datatype for better compatibility with homegrown silicon. Chinese AI darling DeepSeek ...
DeepSeek V3.1 is finally here, and while it performs significantly better than R1, it doesn't outperform GPT-5 Thinking or ...
Chinese startup DeepSeek has released its largest AI model to date, a 685-billion-parameter model that industry observers say ...
DeepSeek launches V3.1 with faster reasoning, domestic chip support, open-source release, and new API pricing, marking its ...
The Chinese start-up has introduced only a few incremental updates in recent months, while competitors have released new ...
Chinese AI firm adds longer memory to its flagship model but still faces chip shortages that stall bigger ambitions ...
DeepSeek launches V3.1 with doubled context, advanced coding, and math abilities. Featuring 685B parameters under MIT Licence ...
In a quiet yet impactful move, DeepSeek, the Hangzhou-based AI research lab, has unveiled DeepSeek V3.1, an upgraded version ...
On August 20, DeepSeek announced the open-sourcing of its new V3.1-Base model on Hugging Face. According to the company, the model has approximately 685 ...