
Alibaba releases Qwen3-Max-Thinking, challenging OpenAI and Google's top AI models

DISCLOSURE: this article contains affiliate links. If you purchase through these links, we may receive a small commission at no additional cost to you. This helps us keep the site free and independent. Our opinions remain unbiased.

Alibaba has unveiled Qwen3-Max-Thinking, its most advanced AI model for complex reasoning tasks. According to benchmarks published by the Chinese company, performance is on par with OpenAI’s GPT-5.2-Thinking, Anthropic’s Claude Opus 4.5, and Google’s Gemini 3 Pro.

The most interesting feature for everyday users is the automatic use of built-in tools: the model decides on its own when to search the web, access memory from previous conversations, or run code, without requiring users to toggle anything manually. This should reduce hallucinations and provide more accurate, up-to-date responses.

For developers, the API is compatible with both the OpenAI and Anthropic formats, meaning you can swap in Qwen without rewriting your code.
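As a minimal sketch of what "OpenAI-compatible" means in practice: you point your existing chat-completion request at Qwen's endpoint and change only the model name. The endpoint URL and the `qwen3-max-thinking` model id below are illustrative assumptions; check Alibaba's documentation for the actual values.

```python
# Sketch: calling a Qwen model through an OpenAI-format chat-completions API.
# API_URL and the model id are assumptions for illustration, not confirmed values.
import json
import urllib.request

API_URL = "https://dashscope-intl.aliyuncs.com/compatible-mode/v1/chat/completions"  # assumed
API_KEY = "YOUR_API_KEY"

def build_request(prompt: str, model: str = "qwen3-max-thinking") -> urllib.request.Request:
    """Build an OpenAI-format chat-completion request targeting a Qwen model."""
    payload = {
        "model": model,  # hypothetical model id; consult the Qwen docs
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# The request body is identical to what an OpenAI client would send;
# only the URL and model name differ.
req = build_request("Summarize this article in one sentence.")
print(req.get_full_url())
```

Because the request shape is unchanged, existing OpenAI client libraries can typically be reused by overriding only the base URL and API key.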

Qwen3-Max-Thinking is already available for free at chat.qwen.ai and through their API. As reported on the official Qwen blog, the improvements come from scaling up model parameters and new training techniques.

If you’d rather use a more privacy-focused alternative built on European open-source models, Proton Lumo is an AI assistant integrated into the Proton suite.

SOURCE: qwen.ai


