DeepSeek has released a new AI training method that analysts say is a "breakthrough" for scaling large language models.
The Chinese AI lab may have just found a way to train advanced LLMs in a manner that's practical and scalable, even for more cash-strapped developers.
DeepSeek has published a technical paper co-authored by founder Liang Wenfeng proposing a rethink of its core deep learning ...
DeepSeek published a paper outlining a more efficient approach to developing AI, illustrating the Chinese artificial ...
DeepSeek has released new research showing that a promising but fragile neural network design can be stabilised at scale, ...
DeepSeek researchers have developed a technique called Manifold-Constrained Hyper-Connections, or mHC, that improves the performance of artificial intelligence models. The Chinese AI lab debuted ...
The paper comes at a time when most AI start-ups have been focused on turning LLM capabilities into agents and other ...
Chinese AI company DeepSeek has unveiled a new training method, Manifold-Constrained Hyper-Connections (mHC), which will make it possible to train large language models more efficiently and at lower ...
China’s DeepSeek has published new research showing how AI training can be made more efficient despite chip constraints.
Chinese AI and semiconductor stocks have rallied since the breakout success of the China-made DeepSeek-R1 AI model in January 2025.