Chinese AI startup DeepSeek introduces new method to make large models more efficient, reducing costs and boosting scalability.
DeepSeek, a Chinese AI startup, has unveiled a new training method called Manifold-Constrained Hyper-Connections, designed to make large AI models more efficient and scalable while reducing computational and energy costs.
The technique, detailed in a paper co-authored by founder Liang Wenfeng and published on arXiv, addresses training instability and memory issues in prior models, enabling stable training across 3 billion to 27 billion parameter systems with minimal added compute.
Building on ByteDance’s earlier work, the approach reflects China’s push for AI innovation despite U.S. semiconductor restrictions.
The release fuels anticipation for DeepSeek’s next major model, possibly R2, expected around the Spring Festival in February.