Will GigaChat release an open-weights model with ≥100B parameters by the end of 2026?
50% chance

In June 2025, the GigaChat team released three open-source models based on a Mixture of Experts (MoE) architecture, each with 20 billion total parameters (3.3 billion active). GigaChat is developed by SaluteDevices, a subsidiary of Sberbank, one of Russia's largest financial institutions and a leading player in Russian AI development. While the team has released smaller open-weights models, it maintains proprietary versions (GigaChat Pro and GigaChat MAX) for which no architectural details or weights have been disclosed.

A progression to releasing significantly larger open-weights models would provide a signal about several important developments in Russia's AI ecosystem:

Computational capacity: Training a model above 100B parameters requires substantial GPU clusters and energy infrastructure. Delivering a ≥100B-parameter model judged worthwhile to release as open weights would suggest access to AI infrastructure at a reasonable scale (a rough back-of-envelope sketch appears after this list).

Technical sophistication: While GigaChat relies heavily on existing open architectures and techniques, marshalling the data curation, pre-training, post-training, and underlying infrastructure at a much larger scale would suggest a reasonably high level of technical sophistication.

Ecosystem development: Larger open-weights Russian-language models from a leading player could suggest that Russia is increasingly embracing an open-source AI strategy as the pathway for its broader ecosystem.
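
To give a sense of the compute involved, here is a minimal back-of-envelope sketch. The token budget, accelerator throughput, and utilization below are illustrative assumptions, not figures from this question, and for MoE models training compute scales with active rather than total parameters.

```python
def training_gpu_days(n_params: float, n_tokens: float,
                      flops_per_gpu: float = 989e12,  # ~H100 dense BF16 peak (assumed)
                      mfu: float = 0.4) -> float:
    """Approximate GPU-days using the ~6*N*D training-FLOPs rule of thumb.

    For MoE models, pass the *active* parameters per token as n_params,
    since training compute scales with active, not total, parameters.
    """
    total_flops = 6 * n_params * n_tokens
    gpu_seconds = total_flops / (flops_per_gpu * mfu)
    return gpu_seconds / 86_400

# A dense 100B-parameter model trained on 2T tokens:
print(training_gpu_days(100e9, 2e12))  # ~35,000 GPU-days, i.e. roughly a month on 1,024 GPUs
```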

Resolution Criteria:

This question resolves Yes if, before January 1, 2027, 00:00 UTC, all of the following conditions are met:

  1. Entity: The GigaChat team, SaluteDevices, Sberbank, or a direct successor organization publicly releases model weights for a large language model

  2. Accessibility: The weights are downloadable without requiring individual per-user approval, payment, or access controls beyond standard user agreements (e.g., accepting a license agreement is acceptable; waitlists or application processes are not)

  3. Parameter threshold: The model has ≥100 billion total parameters according to official documentation (e.g., the Hugging Face model card at https://huggingface.co/ai-sage, the technical report at https://arxiv.org/pdf/2506.09440, or official announcements)

For Mixture of Experts architectures, total parameter count across all experts is used, not active parameters per forward pass
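
One way to check the total-parameter condition without downloading the full weights is to read the sharded-safetensors index that Hugging Face repos typically publish. This is a minimal sketch, not a resolution mechanism: it assumes the release is hosted on Hugging Face with a model.safetensors.index.json file and bf16/fp16 weights (2 bytes per parameter), none of which the question guarantees.

```python
import json
from huggingface_hub import hf_hub_download  # pip install huggingface_hub

def estimate_total_params(repo_id: str, bytes_per_param: int = 2) -> float:
    """Estimate total parameters from the safetensors shard index's total_size.

    Assumes sharded safetensors weights stored in bf16/fp16 (2 bytes/param);
    adjust bytes_per_param for fp32 (4) or 8-bit (1) checkpoints.
    """
    index_path = hf_hub_download(repo_id, "model.safetensors.index.json")
    with open(index_path) as f:
        index = json.load(f)
    total_bytes = index["metadata"]["total_size"]  # size of all shards, in bytes
    return total_bytes / bytes_per_param

# Illustrative usage with an existing GigaChat repo; a future ≥100B release
# would have its own repo name under https://huggingface.co/ai-sage:
# print(estimate_total_params("ai-sage/GigaChat-20B-A3B-instruct") / 1e9, "B total params")
```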

If SaluteDevices or Sberbank undergoes restructuring, a "direct successor" is defined as an organization that acquires ≥50% of the AI research assets and continues the GigaChat project under any name

Pre-release versions (alpha, beta) count if they meet all criteria and weights are publicly accessible

Joint ventures where GigaChat provides the primary technical contribution count; models where GigaChat only provides fine-tuning or minor contributions do not
