On March 16, BlockBeats reported that Bittensor's Templar Subnet (SN3) had completed the largest decentralized LLM pre-training to date, finishing the Covenant-72B run on March 10. Community supporters believe this achievement demonstrates Bittensor's capability as decentralized infrastructure for producing top-tier AI models, rather than being merely a 'concept coin.'
Covenant-72B is a 72-billion-parameter language model pre-trained by the Templar team on Bittensor Subnet 3. Training ran entirely over the public internet, with no centralized data center. On MMLU (zero-shot), the model scored 67.1, surpassing centralized baselines such as LLaMA-2-70B and LLM360 K2 under the same evaluation conditions. It is the largest collaboratively trained language model to date with permissionless participation: more than 70 distinct nodes contributed compute over the course of the run. The team has released all weights and checkpoints under the Apache License.
Following the announcement, Bittensor (TAO) and its subnet tokens saw significant price increases. TAO has risen 54.8% over the past two weeks, while the subnet token τemplar has surged 194% in the last seven days and is currently priced at $19.30.