Published on: 25 Oct 2023 · Last edit: Mar 03
Exploring Bittensor's New Subnets: A Deep Dive
The Bittensor network, with its variety of subnets catering to different domains of machine intelligence, ushers in a new era of collaborative AI research and development. By bridging the gap between producers and consumers of machine intelligence, Bittensor can play a pivotal role in advancing the field of AI, making it more accessible, inclusive, and innovative.
Introduction
Bittensor introduces a network where computers validate and reward the efforts of their counterparts based on the value they contribute to the group. This initiative serves as a catalyst for open-source developers and AI research labs, offering financial rewards to those who work on refining open foundational models. If you want to learn more about Bittensor, check out our deep dive here.
A notable innovation within Bittensor is the concept of "subnets." Bittensor has developed its own language specifically aimed at crafting incentive mechanisms, allowing developers to design their incentive systems on Bittensor and harness its expansive intelligence web to create markets that align with their preferences.
At its core, a subnet is an arena where TAO is mined under a structured reward landscape orchestrated by validators. Rather than being owned solely by the foundation, subnets can now be claimed by various individuals or groups. Each subnet is identified by a unique identifier (UID), and its operations and interactions can be programmed by its owner. The coding specifics have been separated from the central repository, enabling governance at the subnet level.
Registering a subnet requires participants to lock TAO for the lifetime of the subnet, with the lock-up amount adapting to demand. For example, the initial lock-up cost stands at 2,500 TAO, a figure that can fluctuate over time. When a subnet is deregistered, the locked TAO is refunded. Subnet owners receive 18% of the emissions generated through their respective subnets.
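As a rough illustration of demand-based pricing, the sketch below models a lock cost that doubles after each registration and then decays back toward a floor over time; the constants and the exact doubling/decay behaviour are assumptions for illustration, not Bittensor's on-chain implementation.

```python
# Illustrative model of a demand-driven subnet lock cost (assumed behaviour,
# not the exact on-chain logic): each new registration doubles the cost,
# which then decays linearly back toward a floor over time.

MIN_LOCK = 2_500.0          # assumed floor, in TAO
DECAY_BLOCKS = 100_000      # assumed decay window, in blocks

def lock_cost(last_cost: float, blocks_since_last_reg: int) -> float:
    """Current cost to lock for a new subnet registration."""
    decay = (last_cost - MIN_LOCK) * min(1.0, blocks_since_last_reg / DECAY_BLOCKS)
    return max(MIN_LOCK, last_cost - decay)

def register(last_cost: float, blocks_since_last_reg: int) -> tuple[float, float]:
    """Pay the current cost; return (amount_locked, base cost for the next registration)."""
    cost = lock_cost(last_cost, blocks_since_last_reg)
    return cost, cost * 2  # demand pushes the next registration's price up

if __name__ == "__main__":
    paid, next_base = register(MIN_LOCK, blocks_since_last_reg=0)
    print(f"locked {paid} TAO; next registration starts from {next_base} TAO")
```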
In this article, we will also delve into the different TAO Subnets, exploring the intricacies and potential they hold in enriching the Bittensor network and expanding the horizons of machine intelligence and incentive systems.
Subnet 1 - Root
Root (website) is the backbone of the Bittensor network. It underpins the entire incentive structure built on Bittensor's TAO token by determining what proportion of each block's emission is distributed to every subnet. The total block emission is currently set to 1 TAO for every block mined.
Like other subnetworks, the root network consists of a set of validators that set weights (W). These weights are then processed by Yuma Consensus to determine an emission vector (E). The root network also doubles as the network senate: the top 12 keys on this network, which have been granted veto power over proposals submitted by the triumvirate.
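To make the weights-to-emissions step concrete, here is a simplified Python sketch in which validator weights are combined by stake and normalised into an emission vector. Real Yuma Consensus adds consensus-based weight clipping and other safeguards, so the numbers and mechanics here are illustrative only.

```python
import numpy as np

# Simplified sketch of how root-network weights could become an emission
# vector. Actual Yuma Consensus clips weights around a stake-weighted
# consensus; this shows only the basic stake-weighted aggregation step.

stake = np.array([0.5, 0.3, 0.2])   # root validators' stake (normalised)
W = np.array([                      # rows: validators, cols: subnets
    [0.6, 0.3, 0.1],
    [0.5, 0.4, 0.1],
    [0.7, 0.2, 0.1],
])

ranks = stake @ W                   # stake-weighted subnet ranks
E = ranks / ranks.sum()             # emission vector: shares of the 1 TAO per block
print({f"subnet_{i}": round(e, 3) for i, e in enumerate(E)})
```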
Subnet 2 - Text Generation
The first Bittensor subnet dedicated to text generation is known as the Finney Prompt Subnetwork (website). This subnetwork is specially designed to facilitate the operation of prompt neural networks like GPT-3, GPT-4, and ChatGPT in a decentralized manner. By doing so, it allows users to query validators on the network for output from the top-performing models, which can then be used to power various applications.
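As a rough sketch of what a prompt miner's core logic might look like, the snippet below wraps a small Hugging Face text-generation model behind a forward function. The model choice and the function's shape are assumptions, not the subnet's actual codebase.

```python
from transformers import pipeline

# Illustrative core of a prompt-subnet miner: turn an incoming prompt into a
# completion with a locally hosted language model. Model and interface are
# assumed; the real subnet defines its own request/response types.
generator = pipeline("text-generation", model="gpt2")

def forward(prompt: str, max_new_tokens: int = 64) -> str:
    out = generator(prompt, max_new_tokens=max_new_tokens, do_sample=True)
    return out[0]["generated_text"]

if __name__ == "__main__":
    print(forward("Explain what a Bittensor subnet is in one sentence:"))
```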
Subnet 3 - Machine Translation
The Machine Translation subnet (website) in Bittensor is devoted to translating text from one language to another using machine learning algorithms. This not only enriches the network with multilingual capabilities but also fosters universal understanding within the ecosystem. Through this subnet, Bittensor aims to bridge language barriers and create a more inclusive environment for users and developers across the globe.
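A minimal sketch of the kind of model a translation miner could serve, using an off-the-shelf Hugging Face pipeline; the specific model and language pair are assumptions for illustration.

```python
from transformers import pipeline

# Illustrative translation step a miner might expose; model choice is assumed.
translator = pipeline("translation_en_to_fr", model="t5-small")

def translate(text: str) -> str:
    return translator(text)[0]["translation_text"]

print(translate("Decentralized networks reward useful work."))
```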
Subnet 4 - Multi Modality
This subnet (website) enhances AI systems' ability to process and generate information across various data types and formats, leading to a deeper understanding of context and relationships and improving human-AI interactions. Multi-modal AI systems in this setup are more resilient and reliable: by leveraging data from multiple sources, they can better handle inconsistencies and errors, enhancing output quality and performance.
Subnet 5 - Image Generation
The Image Generation subnet in Bittensor transforms text prompts into images, akin to MidJourney. By doing so, it makes this technology accessible to the public. This subnet likely utilizes advanced machine learning models to interpret text prompts and generate corresponding visual representations, facilitating a user-friendly way to engage with image generation technology.
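A hedged sketch of what an image-generation miner might run locally, using an off-the-shelf diffusion model from the diffusers library; the model id, sampling settings, and GPU assumption are illustrative, not the subnet's mandated setup.

```python
import torch
from diffusers import StableDiffusionPipeline

# Illustrative text-to-image step a miner might serve. The model id and the
# CUDA requirement are assumptions for this sketch.
pipe = StableDiffusionPipeline.from_pretrained(
    "stabilityai/stable-diffusion-2-1", torch_dtype=torch.float16
).to("cuda")

def generate(prompt: str):
    return pipe(prompt, num_inference_steps=30).images[0]

generate("a watercolor painting of a decentralized network").save("output.png")
```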
Subnet 6 - Storage
The Bittensor Storage Subnet (website) is designed to reward miners proportionally based on the amount of storage space they can prove they possess, while also allowing encrypted data to be stored by validators. In this prototype, the emphasis is on incentivizing individuals or entities (miners) to provide storage space, making storage resources available within the network. The validators, on the other hand, can store encrypted data in the provided space, ensuring security and privacy of the information. This structure creates a decentralized storage solution, contributing to the overall decentralized machine learning ecosystem that Bittensor aims to build.
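One way to picture "proving" storage is a hash-based challenge-response, sketched below. This is a generic simplification, not the subnet's actual protocol, and it assumes the challenger can recompute the expected answer.

```python
import hashlib
import os

# Conceptual challenge-response check (not the subnet's real protocol): the
# challenger sends a random nonce, and the storer must hash nonce + data,
# which it can only do if it still holds the full (encrypted) blob.

def challenge() -> bytes:
    return os.urandom(16)

def prove(data: bytes, nonce: bytes) -> str:
    return hashlib.sha256(nonce + data).hexdigest()

def verify(expected_data: bytes, nonce: bytes, proof: str) -> bool:
    return proof == hashlib.sha256(nonce + expected_data).hexdigest()

data = b"encrypted payload stored by a miner"
nonce = challenge()
assert verify(data, nonce, prove(data, nonce))
```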
Subnet 7 - Price Prediction
The Bittensor subnet (website) for price prediction, as implied by the name, focuses on predicting prices, potentially in financial markets or other domains with fluctuating values. A video on YouTube mentions a "Trading-Bot Price Predicting Subnet" within the Bittensor Network, suggesting the application of machine learning algorithms to forecast price movements and provide insights for trading activities.
Subnet 8 - Pre Training
Pre-training in the context of neural networks often involves training a model on a large dataset before fine-tuning it on a smaller, task-specific dataset. This process leverages transfer learning to improve model performance and reduce training time. A Bittensor subnet (website) focused on pre-training serves as a platform for training models on large-scale generic datasets before they are fine-tuned in other subnets dedicated to specific tasks.
Within this subnet, miners and validators collaborate to provide the necessary computational resources and verify the pre-training process, ensuring the models are adequately trained and ready for fine-tuning.
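A hedged sketch of how a validator could score a submitted pre-trained model: lower language-modelling loss on shared evaluation text suggests better pre-training. The model id and evaluation text are placeholders, and the real subnet may score models quite differently.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Illustrative scoring step: compute a model's language-modelling loss on a
# shared text sample. Model and text are assumptions for this sketch.
tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

def score(text: str) -> float:
    batch = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        loss = model(**batch, labels=batch["input_ids"]).loss
    return loss.item()

print("eval loss:", score("Pre-training teaches a model general language patterns."))
```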
Subnet 9 - Map Reduce
MapReduce (website) is a programming model used for processing and generating large datasets with a parallel, distributed algorithm on a cluster. In the context of Bittensor, a subnet dedicated to MapReduce could potentially be designed to facilitate distributed data processing tasks within the network. This subnet could allow participants (miners and validators) to collaboratively process large datasets across multiple nodes in a decentralized manner, following the MapReduce paradigm of mapping, shuffling, and reducing phases.
The design of such a subnet could harness the decentralized and distributed nature of the Bittensor network to efficiently process large-scale data tasks. By doing so, this subnet could significantly enhance the data processing capabilities of the Bittensor network, enabling complex computations and analyses on large datasets in a decentralized and collaborative manner.
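The toy example below mirrors the map/shuffle/reduce flow on a single machine with a process pool; in a real subnet, these shards would be spread across miners rather than local processes.

```python
from collections import Counter
from functools import reduce
from multiprocessing import Pool

# Toy MapReduce word count: 'map' runs in parallel over data shards and
# 'reduce' merges the partial results, mirroring the paradigm's phases.

def map_shard(shard: str) -> Counter:
    return Counter(shard.split())

def reduce_counts(a: Counter, b: Counter) -> Counter:
    return a + b

if __name__ == "__main__":
    shards = ["tao tao subnet", "subnet validators miners", "tao miners"]
    with Pool() as pool:
        partials = pool.map(map_shard, shards)   # map phase, in parallel
    print(reduce(reduce_counts, partials, Counter()))  # reduce phase
```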
Subnet 10 - Text Training
The text training subnet (website) within Bittensor serves as a hub for training machine learning models on text data. Initially, text data, sourced from public datasets or from contributions by the Bittensor network's users, is collected and distributed across the subnet to various nodes, including miners and validators. Each node could take on the task of training machine learning models on a portion of this distributed text data, employing supervised, unsupervised, or semi-supervised training approaches depending on the goals of the subnet and the availability of labeled data.
Once the training phase reaches a satisfactory level, the models could be shared across the Bittensor network for further fine-tuning or direct utilization in other subnets.
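As a small illustration of distributing text data across nodes, the sketch below gives each node a deterministic shard of a public corpus using the Hugging Face datasets library; the corpus, node count, and sharding scheme are assumptions for illustration.

```python
from datasets import load_dataset

# Illustrative data-sharding step: each training node takes a deterministic
# slice of a public text corpus. Corpus and node count are assumed.
NUM_NODES = 8
node_id = 3

corpus = load_dataset("wikitext", "wikitext-2-raw-v1", split="train")
my_shard = corpus.shard(num_shards=NUM_NODES, index=node_id)
print(f"node {node_id} trains on {len(my_shard)} of {len(corpus)} examples")
```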
Final Thoughts
The trajectory of Bittensor, with its innovative subnet framework, paints a promising picture for the future of decentralized machine learning and artificial intelligence. By fostering a collaborative environment, Bittensor is not just decentralizing machine intelligence but is also incentivizing a community of developers, miners, and validators to contribute towards a common goal. The diverse range of subnets, each with its unique focus, reflects the vast potential of Bittensor in addressing various facets of machine intelligence and data processing. If you want to learn more about subnets, check out our article here.
The structured reward landscape within each subnet, coupled with the ability to channel resources efficiently, sets a solid foundation for a thriving ecosystem. The adaptability in the registration and deregistration of subnets, along with the unique incentive mechanisms, showcases a well-thought-out strategy to maintain an active and contributing community.
Disclaimer: Nothing on this site should be construed as a financial investment recommendation. It’s important to understand that investing is a high-risk activity. Investments expose money to potential loss.