Sustainable Development of the Token Ecosystem: A Comprehensive Perspective and Practical Tools
The sustainable development of the token ecosystem is a key issue. A recently released video delves into the main challenges facing token ecosystems and provides practical solutions and tools.
The video emphasizes the principles and methods of token engineering, offering a new perspective for planning and building token systems. It also introduces a range of practical tools, such as agent-based simulation tools and the QTM, which provide valuable information at different stages and help projects make informed decisions. With the help of these tools, Web3 startups have a real opportunity for sustainable growth.
The video also highlights the key role that token engineering and related tools play in helping project teams respond to change. These tools have proven to be powerful instruments for adapting to an ever-changing token ecosystem. This understanding, formed through in-depth research on and practice with token ecosystems, enables participants to better comprehend ecosystem dynamics and make more informed, far-sighted decisions.
Three Stages of Token Design and Optimization
Discovery Phase
Building a successful token ecosystem requires executing several key steps at a macro level; these steps are the essential building blocks of a successful token ecosystem.
Design Phase
Parameterization is another key step, involving quantitative tools such as spreadsheets, cadCAD, Token Spice, Machinations, and other simulation tools. These tools help produce optimized, validated models for risk analysis and forecasting, providing in-depth insight into token supply and valuation trends. Using them gives a better understanding of how the ecosystem operates and provides strong support for its design and optimization.
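As a rough illustration of what such a parameterized model looks like in practice, here is a minimal spreadsheet-style supply projection in Python; it is not cadCAD or Token Spice, and the emission, burn, and supply figures are purely hypothetical assumptions.

```python
# Minimal monthly supply projection (hypothetical parameters, for illustration only).
MONTHS = 120                      # 10-year horizon, monthly steps
TOTAL_SUPPLY = 1_000_000_000      # assumed maximum supply
INITIAL_CIRCULATING = 50_000_000  # assumed unlock at token generation event
MONTHLY_EMISSION = 5_000_000      # assumed flat monthly emission to incentives
MONTHLY_BURN_RATE = 0.002         # assumed share of circulating supply burned per month

circulating = INITIAL_CIRCULATING
for month in range(1, MONTHS + 1):
    emitted = min(MONTHLY_EMISSION, TOTAL_SUPPLY - circulating)  # cap at max supply
    burned = circulating * MONTHLY_BURN_RATE
    circulating = circulating + emitted - burned
    if month % 12 == 0:
        print(f"Year {month // 12}: circulating supply ~ {circulating:,.0f}")
```

Even a toy projection like this makes supply trends explicit and can later be replaced by a full agent-based simulation.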
Deployment Phase
The deployment phase puts the preceding theoretical analysis and design into practice and actually deploys the ecosystem onto the blockchain. This phase requires various tools, including programming languages such as Solidity and Rust and deployment environments such as Hardhat. Through this process, the actual ecosystem tokens or products are created and can truly run on the blockchain.
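As a rough sketch of this step, the following Python snippet uses web3.py to deploy a pre-compiled contract to a local node (for example, a running Hardhat node). The RPC endpoint, ABI, and bytecode are placeholders; a real deployment would normally go through Hardhat or a similar toolchain.

```python
# Sketch: deploying a pre-compiled contract with web3.py (placeholder RPC, ABI, and bytecode).
from web3 import Web3

w3 = Web3(Web3.HTTPProvider("http://127.0.0.1:8545"))  # e.g. a local Hardhat node
deployer = w3.eth.accounts[0]                           # unlocked local account

abi = [...]          # ABI produced by the Solidity compiler (placeholder)
bytecode = "0x..."   # compiled contract bytecode (placeholder)

contract = w3.eth.contract(abi=abi, bytecode=bytecode)
tx_hash = contract.constructor().transact({"from": deployer})
receipt = w3.eth.wait_for_transaction_receipt(tx_hash)
print("Token contract deployed at:", receipt.contractAddress)
```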
Token Design Tools
Across the discovery, design, and deployment stages, a series of tools needs to be used, and the focus and types of these tools may vary by domain. They are applicable not only in the DeFi sector but also in application projects, infrastructure, gaming, and other areas.
When it comes to the level of detail, there are two viewpoints: one holds that the ecosystem can be treated qualitatively and that market standards suffice without any simulation; the other holds that a digital twin should be built to simulate the entire ecosystem 1:1, since a significant amount of financial risk is involved. As precision and resource intensity increase, so does the required programming knowledge. This raises the bar for users, who need programming skills to handle more complex models, which may affect user-friendliness.
There are various tools in the token ecosystem that can help with understanding and designing the system. On the left side of the spectrum are spreadsheet models and qualitative tools, such as problem statements, stakeholder problem statements, stakeholder mapping, and specific value streams. AI-driven reasoning can even be used, for example employing machine learning models to draft an initial token design.
In the middle of the spectrum, the QTM (Quantitative Token Model) is also a spreadsheet model, but it covers multiple different fields and is not limited to DeFi. This broad coverage may cost some accuracy, but it helps startups gain first-hand insight and a preliminary understanding of their token ecosystem.
On the right side, there are simulation tools like cadCAD that can model ecosystems in a 1:1 ratio in complex environments. Choosing the right tools and methods is crucial for the success of startups. Different types of tools can provide valuable information at different stages, helping businesses make informed decisions and promoting the sustainable development of ecosystems.
![Outlier Ventures: Data-Driven Token Design and Optimization](https://img-cdn.gateio.im/webp-social/moments-44a07d89e581fbc1ec48304130f7388d.webp)
QTM Overview
QTM is a quantitative token model that uses a fixed simulation horizon of 10 years with monthly time steps, so it is more of a macro-level simulation than a highly precise model. At the start of each time step, tokens are emitted into the ecosystem, so the model includes incentive modules, token ownership modules, airdrop modules, and so on. These tokens are then allocated into several primary buckets, from which a more fine-grained redistribution into general utilities takes place; reward payouts, among other things, are then defined from these utilities. On the off-chain business side, the model also considers the general financial situation of the business, such as possible burns or buybacks, and it can measure user adoption rates or define user adoption scenarios.
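The following toy Python loop illustrates the kind of monthly allocation step described above: emit tokens, split them into primary buckets, and redistribute the incentives bucket into finer-grained utilities. All shares and amounts are invented assumptions, not QTM defaults.

```python
# Toy monthly allocation step in the spirit of QTM (all figures are assumptions).
MONTHLY_EMISSION = 2_000_000  # tokens emitted at the start of each monthly time step (assumed)

# Primary bucket shares of each month's emission (assumed)
primary_shares = {"incentives": 0.50, "team": 0.20, "foundation": 0.20, "airdrops": 0.10}

# Finer-grained utility redistribution of the incentives bucket (assumed)
utility_shares = {"staking_rewards": 0.6, "liquidity_mining": 0.3, "burn": 0.1}

balances = {bucket: 0.0 for bucket in primary_shares}
utilities = {utility: 0.0 for utility in utility_shares}

for month in range(120):  # fixed 10-year horizon, one-month time steps
    for bucket, share in primary_shares.items():
        balances[bucket] += MONTHLY_EMISSION * share
    # Drain this month's incentive emission into utilities, from which rewards are paid
    incentive_flow = MONTHLY_EMISSION * primary_shares["incentives"]
    balances["incentives"] -= incentive_flow
    for utility, share in utility_shares.items():
        utilities[utility] += incentive_flow * share

print("Primary bucket balances after 10 years:", {k: round(v) for k, v in balances.items()})
print("Cumulative utility allocations:", {k: round(v) for k, v in utilities.items()})
```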
It is important to emphasize that the quality of the model's output depends on the quality of the input. Therefore, thorough market research must be conducted before using QTM to obtain more accurate input information and gain a deeper understanding of what is happening. This can lead to output results that are closer to reality. QTM is viewed as an educational tool for early-stage startups, helping them to initially understand their ecosystem, but no financial advice should be drawn from it, nor should one solely rely on its results.
![Outlier Ventures: Data-Driven Token Design and Optimization](https://img-cdn.gateio.im/webp-social/moments-078e9fa8aa8974144f5994da1dce8355.webp)
Data Analysis
From a data-analysis perspective, different types of data can be extracted. First, the overall development of the market can be observed from a macro perspective, including the DeFi market and the broader cryptocurrency market. Next, fundraising-round indicators can be examined to understand how projects are financed, such as the amount raised, valuation, and the share of supply sold in different rounds. Finally, the behavior patterns of participants can be studied to gain deeper insight into the investment habits of others.
Compared to traditional finance, on-chain data is fundamentally different: it is publicly visible to everyone, so almost every transaction in the ecosystem can be inspected. As a result, various metrics can be obtained, such as user growth, total value locked (TVL), trading volume, and so on. More interestingly, one can also observe how different incentive mechanisms affect the operation of the ecosystem. In addition, social media platforms such as Twitter, Reddit, Discord, and Telegram play an important role in the token economy and in project performance.
This information is publicly available and extremely valuable, and it should be fully utilized to better understand ecosystem parameters and validate models.
For example, vesting-period data can be examined. Generally, the vesting periods of different stakeholder groups can be observed: the minimum, average, median, and maximum vesting periods, aggregated across all sectors. The same data can also be segmented by industry sector, showing that the distribution may differ significantly from field to field. Although these values are not always optimal, they provide a starting point.
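A minimal example of this kind of vesting-period analysis, using pandas; the figures below are made up for illustration.

```python
# Summarizing vesting periods (in months) by stakeholder group; all data here is invented.
import pandas as pd

vesting = pd.DataFrame({
    "sector":      ["DeFi", "DeFi", "Gaming", "Gaming", "Infrastructure", "Infrastructure"],
    "stakeholder": ["team", "investors", "team", "investors", "team", "investors"],
    "months":      [36, 24, 30, 18, 48, 36],
})

# Min / mean / median / max across all sectors, per stakeholder group
print(vesting.groupby("stakeholder")["months"].agg(["min", "mean", "median", "max"]))

# The same view segmented by industry sector
print(vesting.groupby(["sector", "stakeholder"])["months"].agg(["min", "mean", "median", "max"]))
```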
![Outlier Ventures: Data-Driven Token Design and Optimization](https://img-cdn.gateio.im/webp-social/moments-d44dc739d197fa1ef14d72cc7b8dd11d.webp)
Let's take another example regarding the historical balance of token buckets. Taking a certain financial platform as an example, you can check the status of its native token and track all transactions within the entire ecosystem, categorizing them into specific "token buckets", such as addresses related to the platform, addresses related to centralized exchanges, and decentralized exchange addresses, etc. In this way, we can view the balance of each stakeholder and observe what is happening throughout the ecosystem.
In the token ecosystem, observing the behavior of specific addresses can provide important information about token liquidity. For example, when tokens are sent from a staking contract to a specific address, we can learn how the recipient handles them: do they reinvest the tokens, send them back to the staking contract, sell them, or deploy them elsewhere? Analyzing these questions reveals the behavior of each stakeholder, and the resulting data can be fed back into the model to calibrate it.
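A simplified sketch of how such behavior can be classified from transfer data; the addresses, labels, and transfers below are invented, and a real analysis would work from indexed on-chain data.

```python
# Classifying where tokens go after leaving a staking contract (illustrative, simplified).
# Each transfer is (from_address, to_address, amount); address labels are assumed known.
ADDRESS_LABELS = {
    "0xstaking": "staking_contract",
    "0xcex1": "centralized_exchange",
    "0xdex1": "decentralized_exchange",
}

transfers = [
    ("0xstaking", "0xuserA", 100.0),
    ("0xuserA", "0xstaking", 100.0),   # re-staked
    ("0xstaking", "0xuserB", 50.0),
    ("0xuserB", "0xcex1", 50.0),       # likely sold on a centralized exchange
]

def first_destination(recipient, transfers):
    """Label of the recipient's first outgoing transfer, or 'held/other' if none."""
    for sender, to, _ in transfers:
        if sender == recipient:
            return ADDRESS_LABELS.get(to, "other")
    return "held/other"

# For every payout from the staking contract, report what the recipient did next
for sender, recipient, amount in transfers:
    if ADDRESS_LABELS.get(sender) == "staking_contract":
        print(recipient, "->", first_destination(recipient, transfers))
```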
![Outlier Ventures: Data-Driven Token Design and Optimization](https://img-cdn.gateio.im/webp-social/moments-8deeda92b35f1f80f00597ed566715f5.webp)
This analysis can be applied to the behavior of token recipients at the level of individual addresses, as well as to representative aggregated stakeholder groups. For instance, analyzing multiple token projects shows that approximately 38% of tokens received from staking contracts are sent back to the staking contract in the first transaction; for centralized exchanges the proportion is about 8%, and for decentralized exchanges around 14%. By reviewing the token bucket allocation at a given point in time in the QTM, the circulating supply of tokens can be understood. These values can be applied to the model's parameters, providing an initial picture of the ecosystem's behavior.
Using this data, predictions can be made, such as forecasting the supply balances of the different buckets in the ecosystem over the next ten years, including allocations for the foundation, team, staking, overall circulating supply, and liquidity pools. Price simulations or forecasts can also be run. It is important to emphasize that these predictions are not intended for speculation or as financial advice; rather, they help us understand the relationship between supply ownership and token demand, and thus the balance between the two.
In addition, we can analyze other aspects, such as the distribution of different utility parts. For example, we can understand how many tokens are staked, how many are used for liquidity mining incentive programs, or if there is a burn mechanism, how many tokens are burned. If tokens can be used in stores or elsewhere, we can also observe the monthly utility rewards to understand the value of these incentives in dollar terms. Understanding the overall usage of tokens is very important, especially when considering cost factors in incentivizing the ecosystem.
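As a small illustration of expressing this utility split in dollar terms, the token amounts and price below are hypothetical.

```python
# Rough monthly utility snapshot (hypothetical token amounts and price).
token_price_usd = 0.25              # assumed spot price
monthly_utility_tokens = {
    "staked": 4_000_000,
    "liquidity_mining_rewards": 1_200_000,
    "burned": 300_000,
    "in_store_spending": 150_000,
}

for utility, tokens in monthly_utility_tokens.items():
    print(f"{utility}: {tokens:,} tokens ~ ${tokens * token_price_usd:,.0f}")

# Dollar cost of this month's liquidity-mining incentives
incentive_cost = monthly_utility_tokens["liquidity_mining_rewards"] * token_price_usd
print(f"Monthly liquidity-mining incentive cost ~ ${incentive_cost:,.0f}")
```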
![Outlier Ventures: Data-Driven Token Design and Optimization](https://img-cdn.gateio.im/webp-social/moments-a6b71e12aa773a7e432108b8a646dc5e.webp)
Data-Driven Model
Another theme is a new way of thinking about vesting schedules. It is sometimes assumed that a very long vesting schedule is necessary, but this is not always good, because it means the initial circulating supply is very low, which invites speculation and potential hype in the market. Therefore, an adaptive vesting mechanism is proposed: instead of having to predict the ecosystem's demand in advance, vesting releases are governed by a controller based on predefined key performance indicators. These KPIs can include TVL, transaction volume, user adoption rate, business profitability, and so on. In this example, the token price is simply used as the KPI.
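A minimal sketch of such a KPI-driven release controller, using the token price as the KPI as in the example above; the target price, thresholds, and release amounts are assumptions for illustration only.

```python
# Sketch of a KPI-driven vesting controller (price as the KPI; all figures are assumptions).
BASE_MONTHLY_RELEASE = 1_000_000   # tokens released per month under neutral conditions (assumed)
TARGET_PRICE = 1.00                # assumed reference price in USD

def monthly_release(observed_price: float, remaining_locked: float) -> float:
    """Scale the monthly release up or down depending on how price compares to the target."""
    if observed_price >= 1.2 * TARGET_PRICE:
        factor = 1.5    # strong demand: release more supply
    elif observed_price <= 0.8 * TARGET_PRICE:
        factor = 0.5    # weak demand: slow down unlocks
    else:
        factor = 1.0
    return min(BASE_MONTHLY_RELEASE * factor, remaining_locked)

locked = 50_000_000
for month, price in enumerate([1.10, 0.70, 1.30], start=1):
    release = monthly_release(price, locked)
    locked -= release
    print(f"Month {month}: price ${price:.2f} -> release {release:,.0f}, locked {locked:,.0f}")
```

The same structure works with TVL, volume, or adoption as the KPI; only the observation fed into the controller changes.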
In the token ecosystem, the relationship between vesting and price can be understood by analyzing real token examples. For instance, in the first year after an ecosystem launches, a large supply may enter the market through vesting releases, but because the product may not yet be mature, market demand can be insufficient and adoption limited, leading to a decline in the token price.