Gradients is a decentralized artificial intelligence (AI) training platform operating as Subnet 56 within the Bittensor ecosystem. The project aims to provide enterprise-grade AI model training services by creating a transparent and competitive environment for developing and validating training methodologies for both text- and image-based models. [1] [2]
Gradients is designed to address a key challenge in the adoption of decentralized AI: the lack of transparency and trust. Many enterprise clients are hesitant to use decentralized networks for training proprietary models due to the "black-box" nature of the process, where the methods used by anonymous network participants (miners) are not verifiable. An enterprise client was quoted as stating, "If there was more transparency and trust with what was happening with the data, there would be a clear path to using Gradients." To solve this, Gradients shifted its model to one of "competitive transparency." [1]
The platform functions as an Automated Large Language Model (AutoLLM) system that leverages a decentralized network of compute providers. Its core innovation, introduced in the Gradients 5.0 update, is a tournament-based system where developers compete to create the most effective AI training scripts. The winning scripts from these competitions are made open-source, providing full visibility into the methodology. This allows Gradients to offer validated, high-performance, and transparent training solutions to enterprise customers. The project's development is associated with an entity named "rayonlabs," as indicated by its code repository. [1] [3]
The project's vision is to create a commercial platform that offers not only performance but also verifiable trust. By open-sourcing the best methodologies and planning for future integration with trusted compute services, Gradients aims to provide a value proposition that combines the innovation of a competitive ecosystem with the security and transparency required by enterprise clients. As stated by the project's spokesperson, "The path to commercial viability runs through openness. Gradients 5.0 is that path." [1]
The public discourse around Gradients began in early 2025, with a series of announcements outlining its capabilities and progress. In an April 6, 2025 post, the platform was described as a high-performing AutoLLM system where decentralized miners compete to train models. By May 9, 2025, the project claimed to have achieved leadership in training both text and image models, highlighting its multi-modal capabilities. [4]
A significant development occurred in June 2025 with the release of a research paper titled "G.O.D.: Training Foundation Models as a Differentiable Game," which reportedly detailed "world-leading results" achieved by the platform. This was followed by the announcement of Gradients 5.0 on July 6, 2025. This major update was framed as a strategic pivot to address enterprise concerns about transparency, with the tagline "Opening the Black Box — Unlocking Enterprise AI Training." The rollout of the first stage of Gradients 5.0 was scheduled to begin on July 21, 2025. [1] [4]
The technological foundation of Gradients is its decentralized network for AI model training, which evolved significantly with the introduction of Gradients 5.0. This update shifted the platform from a conventional miner-based system to a structured, tournament-based competition designed to foster transparency and identify superior training methods. [1]
The core of Gradients 5.0 is a bi-weekly tournament where AutoML practitioners compete to produce the best training scripts. The winning script from each tournament is released as an open-source asset, which then becomes a validated commercial product that Gradients can offer to its clients. This model is designed to replace the opaque system of individual miners with a transparent framework where the best methodologies are proven through open competition. [1]
Each tournament cycle runs for two weeks and is divided into two main categories, text and image, reflecting the two AI domains the platform addresses.
The tournament progresses through several stages to identify a definitive winner.
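The cycle structure described above can be summarized in a brief, purely illustrative sketch. The class and field names below are hypothetical and are not drawn from the Gradients codebase; only the two-week cadence, the text and image categories, and the open-sourcing of the winning script come from the announcement.

```python
# Illustrative data model of a tournament cycle; all names are hypothetical.
from dataclasses import dataclass, field
from datetime import date, timedelta

CATEGORIES = ("text", "image")  # the two domains the platform trains models for


@dataclass
class TournamentCycle:
    category: str                                           # "text" or "image"
    start: date
    submissions: list[str] = field(default_factory=list)    # repo URL @ commit
    winner: str | None = None                                # set after evaluation

    @property
    def end(self) -> date:
        # Each cycle runs for two weeks before a winner is declared.
        return self.start + timedelta(days=14)

    def close(self, winning_submission: str) -> str:
        """Record the winner; per Gradients 5.0, the winning script is then
        released as open source and offered as a commercial product."""
        self.winner = winning_submission
        return self.winner
```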
The process for participation is designed to be straightforward and secure. Miners do not submit their proprietary code directly to the system. Instead, they register their existing GitHub repositories by providing the repository URL and a specific commit hash. The tournament's automated system then clones the specified repository version and executes the training script on standardized GPU infrastructure. This ensures that the competition is based on the quality of the methodology, not on hardware advantages, as the system dynamically allocates GPU resources based on model complexity.
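The registration-and-execution mechanism can be illustrated with a minimal sketch, assuming hypothetical helper and field names; only the mechanism itself, a repository URL plus a commit hash that is cloned and checked out exactly as registered, is taken from the description above.

```python
# Sketch of how a pinned submission could be fetched for evaluation.
# The real Gradients tooling is not shown here; names are hypothetical.
import subprocess
from dataclasses import dataclass


@dataclass
class Submission:
    repo_url: str      # public GitHub repository registered by the miner
    commit_hash: str   # exact revision the tournament will execute


def fetch_pinned_repo(sub: Submission, workdir: str) -> None:
    """Clone the registered repository and check out the pinned commit,
    so every run uses exactly the code version the miner registered."""
    subprocess.run(["git", "clone", sub.repo_url, workdir], check=True)
    subprocess.run(["git", "-C", workdir, "checkout", sub.commit_hash], check=True)


if __name__ == "__main__":
    entry = Submission(
        repo_url="https://github.com/example-miner/training-scripts",   # placeholder
        commit_hash="0123456789abcdef0123456789abcdef01234567",         # placeholder
    )
    fetch_pinned_repo(entry, workdir="./submission")
```

Pinning a commit hash rather than a branch means miners and validators agree on exactly which code was evaluated, which supports the verifiability goal described above.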
The evaluation process is managed entirely by network validators. They are responsible for running the scripts, evaluating the performance of the resulting trained models, and scoring the participants. This creates a trustless environment where results are verifiable and the competition is fair. After evaluation, the trained models are uploaded to HuggingFace. [1]
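As a rough illustration of this validator-side flow, the sketch below ranks submissions by a single evaluation-loss score and uploads the resulting model directory with the huggingface_hub client. The ranking rule and all identifiers are assumptions; the source does not specify the validators' actual scoring formula.

```python
# Simplified validator-side sketch: score, rank, and publish to HuggingFace.
# The real scoring mechanism is not public; here, lower evaluation loss wins.
from huggingface_hub import HfApi


def rank_submissions(eval_losses: dict[str, float]) -> list[str]:
    """Return submission IDs ordered from best (lowest loss) to worst."""
    return sorted(eval_losses, key=eval_losses.get)


def publish_model(local_dir: str, repo_id: str, token: str) -> None:
    """Upload an evaluated model directory to HuggingFace (placeholder repo/token)."""
    api = HfApi(token=token)
    api.create_repo(repo_id=repo_id, repo_type="model", exist_ok=True)
    api.upload_folder(folder_path=local_dir, repo_id=repo_id, repo_type="model")


if __name__ == "__main__":
    scores = {"miner-a": 1.92, "miner-b": 1.74, "miner-c": 2.10}  # placeholder losses
    print(rank_submissions(scores))  # ['miner-b', 'miner-a', 'miner-c']
```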
The transition to the Gradients 5.0 model was planned in three distinct stages to ensure a stable and validated migration.
Gradients has a native cryptocurrency token with the ticker symbol SN56, corresponding to its designation as Subnet 56 on the Bittensor network. The token is categorized under the AI and Bittensor Ecosystem tags on market data platforms. [2] [3]
Key token metrics are listed on the same market data platforms. The token is primarily traded on decentralized exchanges within the Bittensor ecosystem, such as Subnet Tokens; the most active trading pairs include SN56/SN0 and SN56/TAO, reflecting the token's integration within its native network. [2] [3]
Gradients is positioned as a commercial platform for enterprise AI, with a value proposition centered on transparency, performance, and security. The open-sourcing of tournament-winning training scripts is a key differentiator, allowing clients to inspect and trust the methodologies used. A project representative noted, "When we tell enterprises 'this approach won against 24 competitors, and we’ll run it for you with cryptographic proof of integrity', that’s a fundamentally different value proposition than any competitor can offer." [1]
The platform's commercial offerings are designed to cater to enterprise needs.
Specific details about the founding team or corporate structure behind Gradients are not widely publicized. However, some entities are publicly associated with the project. The GitHub repository for the project is maintained under the name "rayonlabs." A key public voice for the project is an individual or group operating under the pseudonym "WanderingWeights," who authored the detailed announcements regarding the Gradients 5.0 update and the project's strategic direction. [3] [1]