SUBNETS
From Narrative to Substance: How Bittensor Is Building Its First Real Economic Engine
By: CryptoZPunisher
When Substance Replaces Narrative: A Structural Turning Point for Bittensor
Last week, several major announcements were made publicly during the DNA x Sequire Investor Summit. They mark a clear turning point in how certain key actors are approaching long-term value creation around Bittensor, far removed from the opportunistic structures and speculative storytelling that…
Synthdata Launches API and Subscription Plans for AI-Powered Market Forecasts
Synthdata (Bittensor Subnet 50) has officially launched its Synth API alongside new subscription plans. The release marks a shift from internal tooling to a revenue-generating intelligence product for financial markets. The Synth API allows traders, bots, and institutions to plug AI-powered volatility and probability forecasts directly into trading and…
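To make the idea of a "volatility and probability forecast" concrete, the sketch below turns an annualized volatility number into the probability that a price finishes above a given level, assuming a drift-free lognormal price model. Every name and parameter here is a hypothetical illustration; none of it reflects the actual Synth API schema.

```python
import math


def norm_cdf(x: float) -> float:
    """Standard normal CDF, computed via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))


def prob_price_above(spot: float, level: float, ann_vol: float, horizon_years: float) -> float:
    """P(price > level at the horizon) under a drift-free lognormal model.

    `ann_vol` stands in for the kind of annualized volatility a forecast
    service might return. Illustrative only, not Synth's real interface.
    """
    d = (math.log(level / spot) + 0.5 * ann_vol**2 * horizon_years) / (
        ann_vol * math.sqrt(horizon_years)
    )
    return 1.0 - norm_cdf(d)


# Example: 60% annualized vol, one-week horizon, level equal to spot.
p = prob_price_above(spot=100.0, level=100.0, ann_vol=0.60, horizon_years=7 / 365)
```

A trading bot consuming such a forecast could compare `p` against the implied probability baked into a market price to decide whether an edge exists.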
Ridges AI Partners With Latent Holdings to Fast-Track Subnet 62
Ridges AI has announced a strategic collaboration with Latent Holdings to accelerate development of AI software engineering agents and push SN62 toward real product adoption. The move is aimed at solving a clear bottleneck. Ridges admitted that while its tech has matured over more than a year, scaling the subnet and shipping a competitive product…
“It’s AI” Detects AI-Written Text With 99 Percent Accuracy
As AI writing tools like ChatGPT become common, telling human writing from AI-generated text gets harder. Students submit AI-written essays. Job applicants use AI for resumes. Content creators pass off AI articles as original work. Teachers, employers, and publishers need reliable ways to spot AI text. It’s AI solves this problem through a decentralized detection…
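One weak signal that detectors in this space often cite is "burstiness": human prose tends to mix short and long sentences, while machine-generated text is often more uniform. The toy heuristic below measures that variation; it is purely an illustration of the signal class, not It's AI's actual detection method.

```python
import statistics


def burstiness(text: str) -> float:
    """Coefficient of variation of sentence length, in words.

    Higher values mean more varied sentence lengths. A toy single-feature
    heuristic for illustration only; real detectors combine many signals.
    """
    sentences = [s.strip() for s in text.split(".") if s.strip()]
    lengths = [len(s.split()) for s in sentences]
    if len(lengths) < 2:
        return 0.0
    return statistics.stdev(lengths) / statistics.mean(lengths)


varied = "I ran. The storm that had been building all afternoon finally broke over the harbor. We waited."
uniform = "The cat sat on the mat. The dog lay on the rug. The bird stood on the perch."
```

On these samples the varied passage scores well above the uniform one, which is the direction such a feature is meant to point.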
Hone Is Training AI to Actually Think Like Humans Through Decentralized Research
Current AI is impressive but fundamentally limited. ChatGPT can write essays and code, but it sometimes fails at simple reasoning puzzles that children solve easily. These models memorize patterns from massive datasets rather than truly understanding how the world works. Hone is trying to fix this. Operating as Subnet 5 on Bittensor, Hone is a…
Numinous Uses Competing AI Agents to Predict the Future Better Than Humans
Predicting the future is valuable. Whether you’re betting on elections, trading stocks, or making business decisions, knowing what’s likely to happen gives you an edge. Numinous (SN6) is building a network of AI agents that compete to make the most accurate predictions about real-world events. Operating as Subnet 6 on Bittensor, it’s creating what they…
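A competition like this needs a way to rank agents once events resolve. A standard metric for probabilistic forecasts is the Brier score, sketched below with two hypothetical agents; Numinous's actual incentive mechanism may differ.

```python
def brier_score(probs: list[float], outcomes: list[int]) -> float:
    """Mean squared error between predicted probabilities and 0/1 outcomes.

    Lower is better: confident, correct forecasts beat hedged ones.
    Illustrative scoring only, not Numinous's real mechanism.
    """
    assert len(probs) == len(outcomes)
    return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)


# Three resolved yes/no events (1 = happened, 0 = did not).
outcomes = [1, 0, 1]
agent_a = [0.9, 0.2, 0.7]  # confident and mostly right
agent_b = [0.6, 0.5, 0.5]  # hedged, uninformative
```

Here `agent_a` earns the lower (better) score, which is exactly the pressure a prediction competition wants: agents are rewarded for being both confident and correct.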
Bittensor Subnet 103 – Djinn Protocol: When Suppressed Intelligence Finally Finds an Exit
By: Punisher
1. The problem – a market that actively rejects intelligence
Modern sports betting markets are not inefficient by accident. They are designed to function this way. As soon as a participant demonstrates consistent profitability, their account is limited or shut out. Meanwhile, millions of losing accounts remain fully active and are encouraged to keep betting. This creates a structural mismatch:…
Targon Makes Renting Powerful AI Computing Secure and Decentralized
Training AI models or running large-scale AI applications requires serious computing power, the kind most people and even small companies can’t afford to buy. Usually, you’d rent that power from big cloud providers like Amazon or Google. But what if your AI work involves sensitive data you don’t want anyone else to see? Targon solves…
Introducing Manako
Full article source: Score Subnet
Manako is the flagship product from Score, the team behind Bittensor Subnet 44 (SN44), focused on building an open and permissionless computer vision layer — initially with a strong emphasis on sports, but designed to extend far beyond. Announced publicly on January 22, 2026, Manako represents the culmination of months…
Templar (SN3) Completes Pre-Training of Covenant72B, the Largest Fully Decentralized LLM to Date
Templar, Bittensor Subnet 3, has completed pre-training for Covenant72B, a 72-billion-parameter language model. This is the largest frontier-scale model ever trained in a fully permissionless and decentralized setting. The run was coordinated across a global network of independent GPUs with no central datacenter, no single owner, and no gatekeeping. Pre-training processed roughly 1.2 trillion tokens,…