QwQ-32B challenges AI giants with innovative techniques, open-source accessibility, and exceptional reasoning capabilities.
The Register on MSN
DeepSeek-R1-beating perf in a 32B package? El Reg digs its claws into Alibaba's QwQ. How to tame its hypersensitive hyperparameters and get it running on your PC. Hands on: How much can reinforcement learning - and a bit of extra verification - improve large language models, aka LLMs?
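The hands-on piece above concerns running QwQ-32B locally and tuning its hypersensitive sampling hyperparameters. A minimal sketch of what that looks like in practice, assuming the model is served behind an OpenAI-compatible local endpoint (e.g. llama.cpp's server or Ollama) and using the commonly reported recommended sampling values (temperature 0.6, top_p 0.95) as a starting point - the endpoint name, model alias, and exact values here are illustrative assumptions, not settings taken from the article:

```python
import json

def build_request(prompt: str) -> dict:
    """Assemble a chat-completion payload with conservative sampling
    settings for a locally served QwQ-32B (illustrative values)."""
    return {
        "model": "qwq-32b",  # alias as registered with the local server (assumed)
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.6,   # reasoning models tend to loop or ramble if this is off
        "top_p": 0.95,
        "max_tokens": 4096,   # QwQ emits long reasoning traces; leave headroom
    }

payload = build_request("How many prime numbers are there below 30?")
print(json.dumps(payload, indent=2))
```

The payload would then be POSTed to the server's `/v1/chat/completions` route; the point is that the sampling block is where most of the "taming" happens.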
"It reflects the broader competitiveness of China's frontier AI ecosystem," says Scott Singer, a visiting scholar in the Technology and International Affairs Program at the Carnegie Endowment for ...
Alibaba has unveiled its latest AI reasoning model, QwQ-32B, whose performance is on par with DeepSeek's R1 model despite ...
Alibaba just unveiled its latest reasoning model, QwQ-32B. It's said to rival DeepSeek at a much lower cost.
Enter the Qwen QwQ 32B, a local reasoning model that’s rewriting the rules of what’s possible in AI. With 32 billion parameters packed into a dense, efficient architecture, this model is ...
Qwen Team — a division of Chinese e-commerce giant Alibaba developing its growing family of open-source Qwen large language models (LLMs) — has introduced QwQ-32B, a new 32-billion ...
QwQ-32B, an AI model rivaling OpenAI and DeepSeek with 98% lower compute costs. A game-changer in AI efficiency, boosting Alibaba's market position.
Alibaba surged 7% on Thursday after it made one of its AI models public, signaling how the party isn't over for Chinese tech stocks.
Alibaba Cloud on Thursday launched QwQ-32B, a compact reasoning model built on its latest large language model (LLM), Qwen2.5-32B, one it says delivers performance comparable to other large ...