Templar (SN3) Completes Pre-Training of Covenant72B, the Largest Fully Decentralized LLM to Date

Templar, Bittensor Subnet 3, has completed pre-training of Covenant72B, a 72-billion-parameter language model and the largest frontier-scale model ever trained in a fully permissionless, decentralized setting. The run was coordinated across a global network of independent GPUs with no central datacenter, no single owner, and no gatekeeping. Pre-training processed roughly 1.2 trillion tokens,…

Read More

Bittensor Enters Academic Spotlight with Templar’s NeurIPS Paper

For the first time, research from the Bittensor ecosystem has been recognized at the prestigious Neural Information Processing Systems (NeurIPS) 2025 conference, one of the world's leading venues in machine learning. Covenant AI, known for pioneering decentralized training through its Templar, Basilica, and Grail subnets on Bittensor, built its NeurIPS papers on concepts already proven in…

Read More

Templar Makes History with the World’s First 70B Decentralized Training Run

Templar, Subnet 3 on Bittensor, has officially launched the first-ever decentralized training run of a 70-billion-parameter AI model, a milestone that redefines what is possible outside the walls of Big Tech. Templar is a decentralized AI training system that connects computers worldwide to collaboratively train models. It rewards contributors for providing computing power…

Read More