Last month, China's DeepSeek released a research paper that rattled global markets after claiming its AI model was trained at a fraction of the cost of leading AI players and on less-advanced ...
The developer of the chatbot that shocked U.S. incumbents had access to Nvidia chips that its parent company providentially ...
Mixture-of-experts (MoE) is an architecture used in some AI systems and large language models (LLMs). DeepSeek, which garnered big headlines, uses MoE; a rough sketch of the idea follows this item. Here are ...
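As a rough illustration only (this is not DeepSeek's code, and the layer sizes, weights, and function names below are invented for the example), an MoE layer can be sketched in a few lines of NumPy: a learned router scores each token, only the top-k experts are run on it, and their outputs are combined using the renormalized router probabilities, so most expert weights stay idle for any given token.

```python
import numpy as np

# Toy mixture-of-experts (MoE) layer: a router picks top-k experts per token
# and mixes their outputs. Sizes and weights are random, for illustration only.
rng = np.random.default_rng(0)
d_model, n_experts, top_k = 8, 4, 2
router_w = rng.standard_normal((d_model, n_experts))          # router weights
expert_w = rng.standard_normal((n_experts, d_model, d_model))  # one matrix per expert

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def moe_layer(tokens):
    """tokens: (n_tokens, d_model) -> (n_tokens, d_model)"""
    probs = softmax(tokens @ router_w)                 # router probabilities per token
    out = np.zeros_like(tokens)
    for i, (tok, p) in enumerate(zip(tokens, probs)):
        chosen = np.argsort(p)[-top_k:]                # indices of the top-k experts
        weights = p[chosen] / p[chosen].sum()          # renormalize over chosen experts
        for e, w in zip(chosen, weights):
            out[i] += w * (expert_w[e] @ tok)          # only k of n_experts do work
    return out

print(moe_layer(rng.standard_normal((3, d_model))).shape)      # -> (3, 8)
```

The point of the routing step is the cost profile often cited in coverage of DeepSeek: the model can hold many experts' worth of parameters while activating only a small fraction of them per token, which is one way training and inference compute can be kept down.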
Stanford’s S1 and Berkeley’s TinyZero are two examples of how researchers are increasingly using Alibaba tech to lower AI ...
The Chinese tech giant behind TikTok has quietly unveiled an advanced AI model for generating video that raises new concerns ...
Despite strong interest in using artificial intelligence to make research faster, easier and more accessible, researchers say ...
After the Chinese startup DeepSeek shook Silicon Valley and Wall Street, efforts have begun to reproduce its cost-efficient ...
At least four current DeepSeek employees, including a key department chief, previously worked at Microsoft Research ... published papers in top conferences or journals in related AI fields ...
Researchers at Fudan University in Shanghai have developed a technology that could dramatically extend the life span of ...
who specialises in technology equities research. Breakthroughs in AI technology, coupled with the falling prices of hardware components – an area where China may have an advantage as it ...
For engineering positions in deep-learning systems, data research and so-called full ... is expected to heat up the battle for young AI talent among China's Big Tech firms. Demand for natural ...