Tokenomics simulator: Simulate your own tokenomics with Cenit's free engine
In the swiftly evolving web3 landscape, understanding the tokenomics of a blockchain project is crucial. However, many projects fail to communicate theirs to outsiders. A clear showcase of a project’s tokenomics makes it far easier for potential investors to decide whether the project is right for them.
With this tool, you can create a visual report with which to communicate your token's potential to investors in an unprecedented way. Go beyond simple pie charts, incorporate realistic buying pressures into your tokenomics while showcasing your project, and clearly convey your token's potential.
Our report tool enables you to seamlessly weave your business KPIs into your tokenomics story in just three simple steps:
Input the vesting schedule
Input the effect of your main business KPI
Run the simulation
Tokenomics modeling using the Cenit Tool
Step 1: Add the vesting schedule
Input your allocations and vesting schedules, specify the maximum supply of your project, and define who receives tokens. You can also specify the expected proportion of tokens to be sold at vesting. For the groups you believe won't sell their tokens, assign a smaller “Sold at vesting” percentage; they will hold every token that is not sold at vesting time.
Step 2: Introduce the token pressure generated by your business KPIs
Next, add how your protocol's growth generates buying pressure.
First, identify your primary business metric, usually linked with your growth hypothesis (for example, "Total amount (in USD) of Loans", "Daily active users", "TVL", etc.). Then, articulate how this KPI influences the market pressure on the token. Examples include, "How many tokens can an active user use monthly?" or "How many tokens are bought/burnt from loan fees?"
Optionally, incorporate a secondary KPI if your protocol engages in two main activities that relate to the token utility. You might also set the market pressure as negative if a specific token mechanic is expected to cause selling pressure.
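The way these inputs combine can be sketched in a few lines. The function and parameter names below are illustrative, not Cenit's actual API: each KPI contributes its value multiplied by its per-unit token pressure, and a negative per-unit pressure models a mechanic that sells tokens.

```python
def net_token_pressure(main_kpi_value, main_pressure_per_unit,
                       secondary_kpi_value=0.0, secondary_pressure_per_unit=0.0):
    """Net tokens bought (positive) or sold (negative) per month.

    Each KPI contributes its value times the tokens bought/burnt per
    unit of that KPI; a negative per-unit pressure represents a token
    mechanic expected to cause selling pressure.
    """
    return (main_kpi_value * main_pressure_per_unit
            + secondary_kpi_value * secondary_pressure_per_unit)

# Hypothetical example: 100k daily active users each using 5 tokens
# monthly, plus a secondary mechanic selling 0.5 tokens per unit.
print(net_token_pressure(100_000, 5.0, 10_000, -0.5))  # 495000.0
```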
Step 3: Get your tokenomics model
Finally, insert your project's name and run the simulation!
To demonstrate its application in real projects, we'll examine the tokenomics of three projects: Neon EVM, SubQuery, and Blast Royale.
Neon EVM is an Ethereum virtual machine built on Solana, allowing developers to create decentralized applications using Ethereum tooling while leveraging Solana's liquidity and scalability.
Neon main KPI & token pressure
Neon’s tokenomics are rooted in its primary business KPI: transactions carried out within the EVM.
The expected growth in transaction count is modeled after Solana’s, with a curve that peaks at 1 billion transactions per month. The cost per transaction is approximately 0.02 NEON tokens, half of which goes to the treasury. If we assume that the Solana ledger, which we do not control, sells back all of its revenue while the treasury holds its tokens, each transaction creates a sustained buying pressure of 0.01 NEON. This acts as the “Main KPI buying pressure” in the tool.
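Under those assumptions, the per-transaction arithmetic can be checked with a few lines (a sketch using the article's figures, not on-chain data):

```python
cost_per_tx_neon = 0.02   # NEON paid per EVM transaction
treasury_share = 0.5      # half of the fee goes to the treasury

# The Solana ledger's half is assumed to be sold back, while the
# treasury's half is assumed to be held, so only the treasury share
# counts as net buying pressure.
buy_pressure_per_tx = cost_per_tx_neon * treasury_share

peak_tx_per_month = 1_000_000_000
print(buy_pressure_per_tx)                      # 0.01 NEON per transaction
print(buy_pressure_per_tx * peak_tx_per_month)  # 10 million NEON/month at peak
```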
By feeding these figures along with the vesting calendar into Cenit's tool, we derive Neon EVM's tokenomics projection, available here:
Blast Royale is a mobile-based battle royale game that maximizes the advantages of blockchain technology. The game is based on a 'last man standing' format where players compete against each other until only one remains. This format creates an engaging, suspense-filled gaming experience for players.
The game has several sources of revenue, including:
Continuous NFT sales
Revenue from in-app purchases
Trading fee: a 2.5% fee on equipment traded on the secondary market, applied to MATIC/$BLST. 35% of it goes to the Company and 65% to the Treasury
Utility expenditures: MATIC/$BLST spent on utilities also go to the Company/Treasury in a 35%/65% split. Utilities include the upgrading, repairing, and crafting systems
Tournament fees: a 5% fee collected in tokens on tournament buy-ins
Blast Royale KPI & token pressure
The primary KPI for Blast Royale, as for any project in the gaming industry, is Monthly Active Users (MAUs). With a projected growth to 2 million MAUs and an average expenditure of 3 USD per user per month, we can make some assumptions regarding how that expenditure is distributed:
NFT and non-NFT equipment purchases (50% of expenditure): On average, each user spends $1.5 per month on equipment purchases. This revenue goes straight to the company.
Equipment trading (15% of expenditure): Users spend an average of $0.45 per month on equipment trading. Of this, the 2.5% fee ($0.01125) goes to Blast Royale: $0.004 to the company (to be sold back to the market) and $0.0073 to the treasury. As long as the Treasury does not sell these tokens, this generates buying pressure.
Utilities (35% of expenditure): Users spend an average of $1.05 per month on utilities. From this, $0.3675 goes to the company and $0.6825 goes to the treasury.
Given the latest round price of 0.195 USD per BLST, the total revenue to the treasury amounts to 0.6898 USD per user per month, or 3.53 BLST. This represents the buying pressure generated per unit of the main KPI (MAUs) for Blast Royale.
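The per-user breakdown above can be reproduced with a short script (a sketch using the article's assumed figures; the article rounds the final result down to 3.53):

```python
monthly_spend_usd = 3.0   # average expenditure per MAU
blst_price_usd = 0.195    # latest round price of BLST

# Equipment trading: 15% of spend, 2.5% trading fee, 65% to treasury.
trading_fee = monthly_spend_usd * 0.15 * 0.025
trading_to_treasury = trading_fee * 0.65

# Utilities: 35% of spend, 65% to treasury.
utilities_to_treasury = monthly_spend_usd * 0.35 * 0.65

treasury_usd_per_user = trading_to_treasury + utilities_to_treasury
buy_pressure_blst = treasury_usd_per_user / blst_price_usd

print(round(treasury_usd_per_user, 4))  # ~0.6898 USD per user per month
print(round(buy_pressure_blst, 2))      # ~3.54 BLST per MAU per month
```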
The vesting schedule, along with all the tokenomics projections, can be visualized here.
Finally, we have SubQuery, the most complex of the three projects in this article to translate into our framework. If you read to the end, though, you will be able to use our tokenomics report tool for any project you have in mind.
SubQuery is a data indexing framework for web3. The tokenomics of the SubQuery Network are centered around three main players: consumers, indexers, and delegators, all propelled by SubQuery tokens (SQT).
Consumers: protocol users who make requests for specific data. They pay an advertised amount of SubQuery tokens for the data they need. They also have the ability to establish new projects through a "purchase orders" mechanism. Here, they can post a contract on-chain for a fixed price and a specified number of requests, which indexers can satisfy.
Indexers: these are the participants who host SubQuery projects on their infrastructure. They run both the node and the query service to index data and answer GraphQL queries. They receive the payments made by the consumers. To maximize their rewards, they can stake tokens.
Delegators (stakers): participants who delegate their spare SubQuery tokens to indexers. This generates rewards for themselves and maximizes the rewards of the indexers at the same time.
SubQuery KPI & token pressure
There are two main KPIs: the number of queries per month and the number of purchase orders made per month.
The number of queries is clearly the main indicator of success for the project. The secondary KPI, the number of purchase orders, also impacts SubQuery's value: the more content indexed, the higher the value. According to industry projections, we can anticipate 10 billion queries per month.
Unlike many projects that apply a treasury fee derived from their main utility, SubQuery follows a distinct approach. All fees from queries go towards indexers and delegators. As a result, the transaction cycle does not yield any net positive market pressure for the token.
However, even if there is no direct buying pressure from each query/transaction, there are some secondary effects which do create token buying pressure:
Staking: staking rewards come from query fees, which are proportional to the amount of queries over time. We can therefore assume that new tokens will be bought for staking when the rate of queries in the protocol increases, since that would increase yields and attract more capital to the staking utility.
Now, let’s look at the numbers. According to the market, the price of a query could be close to $0.001. Given the next-round SQT price of 0.0275 USD, this translates into 0.036 SQT per query, which is revenue for both the indexers and the stakers. Assuming it is split equally between them, 0.018 SQT per query goes to the stakers.

Now, assuming stakers are happy with an expected ROI of 10% annually, if at some point the query volume in SubQuery generates a higher yield, more stakers will buy the token. This means that for each increase of one query per month, approximately 0.018 * 12 / 0.1 = 2.16 SQT would be bought for staking. However, this pressure comes from the growth of monthly queries over time, and we need to translate it into actual monthly queries. Because the growth in our example is exponential, there is a linear relationship between the query volume and its growth; without going into too much detail, this works out to 0.31 SQT bought per query per month.

Let’s also add the tokens burnt due to indexer misbehavior. For a robust protocol, we could say that only 0.1% of the tokens generated by queries should be burnt for misbehavior, i.e. 0.0000091 SQT per query, which is comparatively insignificant. In total, the buying pressure is 0.31 SQT per query.
Misbehavior of indexers: misbehavior by indexers is more likely when there is more traffic. SubQuery penalizes it by burning tokens, so an additional buying pressure results from the penalty imposed on unfulfilled purchase orders. Let’s assume a penalty of 10,000 SQT for every unfulfilled order, and that only 0.1% of purchase orders go unfulfilled. This results in a total buying pressure of 10 SQT per contract.
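The SubQuery numbers can be put together as follows. This is a sketch: the monthly growth rate `g` is an assumption implied by the article's 0.31 SQT figure, not a value the article states explicitly.

```python
query_price_usd = 0.001
sqt_price_usd = 0.0275
staker_share = 0.5   # query fees assumed split equally: indexers vs stakers
target_roi = 0.10    # annual yield stakers are assumed to accept

sqt_per_query = query_price_usd / sqt_price_usd                # ~0.036 SQT
staker_rev_per_query = round(sqt_per_query * staker_share, 3)  # 0.018 SQT

# SQT bought per unit *increase* in the monthly query rate:
# annualized staker revenue divided by the target yield.
staking_per_growth = staker_rev_per_query * 12 / target_roi    # 2.16 SQT

# With exponential growth, growth is proportional to the query level.
# g ~ 0.1435/month is the rate implied by the article's figures.
g = 0.1435
staking_pressure_per_query = staking_per_growth * g            # ~0.31 SQT

# Burns from indexer misbehavior on purchase orders.
penalty_sqt = 10_000
unfulfilled_rate = 0.001
burn_per_contract = penalty_sqt * unfulfilled_rate             # 10 SQT

print(round(staking_pressure_per_query, 2))  # 0.31
print(burn_per_contract)                     # 10.0
```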
Adding the vesting schedule, we end up with the following dashboard:
We have now gained a deeper understanding of the three projects, Neon EVM, Blast Royale, and SubQuery, which is particularly valuable considering their upcoming ICOs.
Furthermore, we've examined the process of creating an insightful tokenomics report. This tool can be an efficient method to articulate the benefits of a project to potential investors. It’s your turn now to go to the next level and create your own tokenomics report.
Finally, if you are interested in knowing how to model your token economy in further depth and with high precision, you can read our previous post or contact us here.