Recently, StarkWare kicked off its highly anticipated airdrop. Like most airdrops, it has sparked a lot of controversy. Sadly, that is no longer a surprise.
So why does this happen again and again? You might hear some of these opinions:
· Insiders just want to cash out billions of dollars and walk away
· The team didn’t know what to do and didn’t have the right consultants
· Whales should be given more priority because they bring TVL
· Airdrops are about the democratization of cryptocurrency
· Without farmers, there would be no protocol usage or stress testing
· Inconsistent airdrop incentives will continue to have strange side effects
None of these views is entirely wrong, but none is the whole story on its own. Let's do a bit of analysis to make sure we have a comprehensive understanding of the issue at hand.
There is a basic tension when it comes to conducting an airdrop, and you need to choose between the following three factors:
· Capital efficiency
· Decentralization
· Retention
You'll often find airdrops that work well on one dimension, but striking a good balance between two, let alone all three, is hard. Retention is the hardest dimension of all; a retention rate above 15% is practically unheard of.
· Capital efficiency measures how many tokens you give out per dollar of participant capital. The more ruthlessly you optimize for it, the closer your airdrop drifts toward liquidity mining (one token for every dollar deposited), which benefits whales.
· Decentralization defines who gets your tokens, and under what criteria. Recent airdrops have adopted deliberately broad criteria to put tokens into as many hands as possible. This is usually a good thing: it keeps you out of legal trouble and buys you goodwill, because you are making people rich.
· Retention is defined as how many users continue to hold the airdropped tokens after the airdrop. In a sense, it gauges whether users are aligned with your intent: the lower the retention rate, the more your users are at odds with it. With a 10% retention rate as the industry benchmark, only 1 in 10 addresses is here for the right reasons!
Leaving retention rates aside, let’s look at the first two factors in more detail: capital efficiency and decentralization.
Capital efficiency
To understand capital efficiency, let's introduce a term: the "sybil coefficient". It measures how much extra reward a participant can earn by splitting one pool of capital across many accounts.
Your sybil coefficient ultimately determines how wasteful the airdrop is. A sybil coefficient of 1 technically means you're running a liquidity mining scheme, which will irritate a lot of users.
However, when you get to something like Celestia, where the sybil coefficient swelled to 143, you see extremely wasteful behavior and rampant farming.
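To make this concrete, here is a minimal sketch (my own framing, not a standard formula) of one way to compute a sybil coefficient: the reward from splitting a fixed amount of capital across many wallets, relative to keeping it all in one.

```python
def sybil_coefficient(reward_fn, capital, n_wallets):
    """Reward from splitting `capital` across n_wallets wallets,
    relative to the reward for keeping it in a single wallet."""
    split_reward = sum(reward_fn(capital / n_wallets) for _ in range(n_wallets))
    return split_reward / reward_fn(capital)

# Liquidity-mining-style reward, linear in capital: splitting gains nothing.
linear = lambda capital: capital * 0.1

# Flat per-address reward above a small eligibility threshold:
# splitting into many wallets multiplies the payout.
flat = lambda capital: 100 if capital >= 10 else 0

print(sybil_coefficient(linear, 1000, 50))  # 1.0
print(sybil_coefficient(flat, 1000, 50))    # 50.0
```

A coefficient of 1 is the liquidity-mining extreme; a large coefficient means flat, per-address criteria that sybils can multiply almost for free.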
Decentralization
This brings us to the second factor: decentralization. You ultimately want to help the “little people” who are the real users, and want them to take the opportunity to use your product early – even though they’re not rich. If your sybil coefficient is too close to 1, then you will not be able to give anything to the “little people” and will be more likely to benefit the “whales”.
This is where the airdrop controversy gets heated. There are three types of users here:
- "Little guys" who make some quick money and leave (often using multiple wallets along the way).
- "Little guys" who like the product and will stick around.
- Industrial farmers who operate at scale to look like a crowd of little guys; they will take most of your incentive tokens, sell, and move on to the next target.
The third category is the worst, the first is acceptable, and the second is the most valuable. Distinguishing between these three types of users is the central challenge of airdrop design.
While I don’t have a specific solution, I have an idea of how to solve this problem, and I’ve spent a couple of years thinking about it and seeing it firsthand: project segmentation.
Let me explain what I mean. Think about a fundamental question: you have all your users, and you need to divide them into groups based on some value judgment. That value is observer-specific and will therefore vary from project to project; reaching for some "magic airdrop filter" will never be enough. By studying the data, you can begin to understand what your users really look like and start making data-driven decisions to segment your airdrops.
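As a toy illustration only (the field names and thresholds here are my assumptions, not a recipe), a first pass at segmentation might be rule-based labeling over per-wallet features:

```python
def segment(wallet):
    """Toy rule-based segmentation into the three user types above.
    `linked_wallets` and `days_until_sold` are hypothetical features
    you would derive from on-chain data."""
    if wallet["linked_wallets"] >= 5 and wallet["days_until_sold"] <= 1:
        return "industrial farmer"
    if wallet["days_until_sold"] <= 7:
        return "quick flipper"
    return "genuine user"

print(segment({"linked_wallets": 12, "days_until_sold": 0}))   # industrial farmer
print(segment({"linked_wallets": 1, "days_until_sold": 3}))    # quick flipper
print(segment({"linked_wallets": 1, "days_until_sold": 400}))  # genuine user
```

Real segmentation would replace these hand-picked rules with features learned from your own data, which is exactly why it varies from project to project.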
I’ll talk about this in a future article, but it’s a hard data problem that requires data expertise, time, and money. Not many teams are willing or able to do this.
Retention
The final dimension is retention. Before we discuss retention, it’s a good idea to define what retention means. I summarize it as follows:
The number of recipients still holding the airdropped tokens / the total number of recipients
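In code, with balances keyed by recipient address, the definition above is simply:

```python
def retention_rate(balances):
    """Share of airdrop recipients whose token balance is still nonzero."""
    holders = sum(1 for balance in balances.values() if balance > 0)
    return holders / len(balances)

# Two of four recipients still hold tokens: 50% retention.
print(retention_rate({"0xaaa": 120.0, "0xbbb": 0.0, "0xccc": 35.5, "0xddd": 0.0}))  # 0.5
```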
The typical mistake that most airdrops make is to treat them as one-offs.
To prove it, I thought some data might help! Luckily, Optimism has executed multiple rounds of airdrops! I was hoping to find some simple Dune dashboards that would give me the retention data I wanted, but no luck. So I decided to roll up my sleeves and collect the data myself.
I don’t want to overcomplicate things, I want to understand one simple thing: how the percentage of users who hold OP balances changes during the continuous airdrop process.
I went to GitHub and found a list of all the addresses that participated in each Optimism airdrop. Then I built a small scraper to pull the OP balance of every address on the list and did some data cleanup.
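A sketch of that collection step, with the on-chain fetcher abstracted away (with web3.py it would wrap an ERC-20 `balanceOf` call against an RPC endpoint; the helper names here are mine, not the article's code):

```python
from typing import Callable, Dict, Iterable

def collect_balances(addresses: Iterable[str],
                     get_balance: Callable[[str], int]) -> Dict[str, int]:
    """Pull the current OP balance for every airdrop address.
    `get_balance` is whatever fetcher you have; keeping it injectable
    makes the collation logic easy to test offline."""
    return {address: get_balance(address) for address in addresses}

# Offline usage example with a fake fetcher standing in for an RPC call:
fake_chain = {"0xaaa": 500, "0xbbb": 0}
print(collect_balances(fake_chain, fake_chain.get))  # {'0xaaa': 500, '0xbbb': 0}
```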
Before we move on, it's important to note that each OP airdrop is independent of the previous one: there is no reward for, or link to, holding tokens from an earlier round. Why, I don't know, but let's move on.
The first round of airdrops
A total of 248,699 addresses received airdrop tokens, under the following eligibility criteria:
- OP mainnet users (92,000 addresses)
- Repeat OP mainnet users (19,000 addresses)
- DAO voters (84,000 addresses)
- Multisig signers (19,500 addresses)
- Gitcoin donors on L1 (24,000 addresses)
- Users priced out of Ethereum (74,000 addresses)
After analyzing all these users and their OP balances, I got the following distribution. A balance of 0 means the user got rid of the tokens, since unclaimed OP was sent directly to eligible addresses at the end of the claim period.
In any case, the first round held up surprisingly well compared to the other airdrops I have observed! Only 40% of recipients had a zero balance, which is a very good result.
Next, I wanted to understand how each criterion affects whether a user is likely to keep the tokens. The one problem with this approach is that addresses can fall into multiple categories, which can skew the data. So don't take this at face value; treat it as a rough indicator:
OP mainnet users had the highest share of zero balances, followed by users priced out of Ethereum. Clearly, neither is a great segment to allocate to. Multisig signers had the lowest share, which I think makes sense: setting up a multisig is not an obvious move for an airdrop farmer, since it is hard to script at scale just to farm an airdrop.
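The per-category breakdown can be computed like this (a sketch with made-up sample rows; because an address can appear in several categories, the rates overlap and are only rough indicators):

```python
from collections import defaultdict

def zero_balance_rate_by_category(rows):
    """rows: iterable of (address, category, balance) tuples.
    Returns the share of zero-balance addresses within each category."""
    totals = defaultdict(int)
    zeros = defaultdict(int)
    for _, category, balance in rows:
        totals[category] += 1
        if balance == 0:
            zeros[category] += 1
    return {category: zeros[category] / totals[category] for category in totals}

rows = [
    ("0xaaa", "op_user", 0),
    ("0xbbb", "op_user", 50),
    ("0xaaa", "multisig_signer", 0),  # same address, counted in a second category
    ("0xccc", "multisig_signer", 10),
    ("0xddd", "multisig_signer", 5),
]
print(zero_balance_rate_by_category(rows))
# {'op_user': 0.5, 'multisig_signer': 0.3333333333333333}
```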
Second round of airdrops
This airdrop was allocated to 307,000 addresses, but in my opinion it was far less thoughtful.
- Governance delegation rewards are based on the number of OPs delegated and the length of time delegated.
- Partial gas rebates are offered to active Optimism users who pay gas fees above a certain amount.
- Additional rewards based on further governance- and usage-related attributes.
To me, those are terrible criteria, because governance voting is easy for bots and fairly predictable. As we'll see below, my gut feeling wasn't far off. I was amazed at how low the retention rate was!
Nearly 90% of addresses hold zero OP balances! I’d love to get to the bottom of the situation, but I’m more concerned about the airdropped tokens that are left behind.
The third round of airdrops
This is the best airdrop the OP team has executed so far. The criteria are more complex than before and include a "linear scaling" element. The tokens were distributed to about 31,000 addresses, so while smaller, this round was more efficient.
- Delegated OP-days = OP delegated × days delegated (for example, 20 OP delegated for 100 days: 20 × 100 = 2,000 OP-days).
- Delegates must have voted on-chain through OP governance during the snapshot window (from 0:00 UTC on January 20, 2023 to 0:00 UTC on July 20, 2023).
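The delegation criterion above amounts to a simple OP-days sum over each wallet's delegation periods:

```python
def op_days(delegations):
    """Total OP-days over a list of (amount_delegated, days_delegated) periods."""
    return sum(amount * days for amount, days in delegations)

print(op_days([(20, 100)]))           # 2000, matching the example above
print(op_days([(20, 100), (5, 40)]))  # 2200 across two separate periods
```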
A key detail: the on-chain voting window falls after the previous round of airdrops. A farmer who came for the earlier rounds would have thought "okay, I'm done here, time to move on to the next target", and would therefore miss this criterion entirely. It's a smart, hard-to-game choice. Take a look at the retention data!
Only 22% of these recipients have a zero token balance! In my opinion, this shows that far less of this airdrop was wasted than any previous round. It confirms my view that retention is crucial, and it's more evidence that multiple rounds of airdrops are more effective than one might expect.
Fourth round of airdrops
This round of airdrops was allocated to a total of about 23,000 addresses, with much more interesting criteria. At first I thought retention for this airdrop would be high, but after thinking it over I suspected it might be lower than expected. Why?
- You created an NFT contract on the Superchain. The total gas spent on the OP Chains (OP Mainnet, Base, Zora) by NFT-transfer transactions involving contracts your address created is measured over the 365 days before the airdrop deadline (January 10, 2023 to January 10, 2024).
- You created an NFT contract on Ethereum mainnet. The total gas spent on Ethereum L1 by NFT-transfer transactions involving contracts your address created is measured over the same 365-day window.
You’d think that people creating NFT contracts is a good indicator, right? Unfortunately, that’s not the case. The data shows a very different picture.
While this airdrop wasn’t as bad as the second round, we have taken a big step backwards in terms of retention compared to the third round.
My thinking is that an extra spam or legitimacy filter on the NFTs would have raised retention significantly; the criterion is simply too broad. Also, since the tokens were airdropped directly to these addresses (rather than having to be claimed), you end up with the creator of a scam NFT thinking "Wow, free money, time to dump it."
Conclusion
As I wrote this article and dug up the data myself, I managed to confirm or refute some hypotheses, which turned out to be very valuable. In particular, the quality of your airdrop is directly related to the quality of your filtering criteria. People who try to create a generic "airdrop score" or reach for advanced machine learning models will fail: they are prone to inaccurate data and a lot of noise. Machine learning is great, right up until you have to explain how it arrived at its answers.
The main lessons that the airdrop team should take away from this are:
- Don't do a one-time airdrop! You're shooting yourself in the foot. You want to deploy incentives the way you'd run A/B tests, iterating and using lessons from past rounds to guide future ones.
- Build on the criteria of previous airdrops to increase efficiency. In particular, give more tokens to wallets that held tokens from earlier rounds. Teach your users that they should stick with one wallet and change only when absolutely necessary.
- Get better data to ensure smarter, higher-quality segments. Bad data equates to bad results. As we’ve seen above, the lower the “predictability” of the criteria, the better the retention results.
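One hedged sketch of the second lesson above, boosting allocations for wallets that kept earlier rounds (the multiplier schedule here is purely illustrative, not any project's actual rule):

```python
def allocation(base_tokens, rounds_held):
    """Scale a wallet's base allocation by how many previous airdrop
    rounds it held onto (hypothetical loyalty bonus of +50% per round)."""
    return base_tokens * (1.0 + 0.5 * rounds_held)

print(allocation(100, 0))  # 100.0, a fresh wallet gets the base amount
print(allocation(100, 2))  # 200.0, a wallet that held two previous rounds
```

Any scheme like this makes "hold and stay in one wallet" the profitable strategy, which is exactly the alignment the retention metric is trying to measure.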