7 Apr 2026
Why Neon Invested in Game State Labs
At some point, almost everyone has seen this game.
You’re sitting in a cab, waiting at a clinic, or standing in a queue, and the person next to you is swiping on their phone.
Three colourful tiles line up. They disappear. Something breaks. New pieces fall into place. It looks harmless. Almost mindless. Just a way to pass a few minutes.
What they’re playing on their phone is Candy Crush Saga.
Bright colours, simple rules, swipe and match.
By 2014, this simple game had quietly become one of the biggest games in the world. More than 90 million people were playing it every single day, and it was making over a billion dollars a year for its parent company, King, a European mobile gaming company.
At that size, the business becomes extremely sensitive.
When you have tens of millions of daily users, even the smallest shift in behaviour starts to matter. If just 2-3% fewer players return the next day, that can mean millions of lost sessions. Fewer sessions mean fewer ad views and fewer in-app purchases. Revenue starts slipping quickly.
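To make that sensitivity concrete, here is a back-of-envelope sketch in Python. The 90 million daily players figure comes from the story above; the sessions-per-player and revenue-per-session numbers are illustrative assumptions, not King's actual figures.

```python
# Back-of-envelope: what a small retention dip costs at this scale.
# DAU is taken from the article; sessions per player and revenue per
# session are illustrative assumptions, not King's real numbers.
daily_active_users = 90_000_000
retention_dip = 0.025            # midpoint of the 2-3% drop described above
sessions_per_player = 3          # assumed average daily sessions
revenue_per_session = 0.01       # assumed blended ad + IAP revenue, in USD

lost_players = daily_active_users * retention_dip
lost_sessions = lost_players * sessions_per_player
lost_daily_revenue = lost_sessions * revenue_per_session

print(f"{lost_players:,.0f} fewer players returning each day")  # 2,250,000
print(f"{lost_sessions:,.0f} lost sessions per day")            # 6,750,000
print(f"${lost_daily_revenue * 365:,.0f} lost per year")        # $24,637,500
```

Even with deliberately modest assumptions, a 2.5% dip compounds into tens of millions of dollars a year.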
So a “small” drop in player retention is no longer a minor design issue that can be fixed later. It becomes a serious business problem.
During this time, a team at King noticed something unsettling. Players were quitting at exactly the same point in the game.
Level 65.
This mattered because Candy Crush was free to play. Long-term revenue depended on players staying in the game long enough to eventually spend.
Players who quit early almost never came back. And if they left before getting deeply involved, they almost never spent money. So every time someone stopped at Level 65, King wasn’t just losing a player. It was losing future income.
So King looked closely at what was happening inside Level 65.
They analyzed how many attempts players made, where they failed, how long they waited before trying again, and which players eventually moved forward versus those who quit entirely. The conclusion was clear. The level worked fine for experienced players, but a large group ran into a sudden spike in difficulty that brought their progress to a halt.
King didn’t simply make the level easier.
Instead, they began closely monitoring how players progressed through each level. When unusual failure patterns appeared, the system recommended specific changes. These included adding a small number of extra moves, reducing obstacles, adjusting how new pieces appeared, or slightly changing the board layout so early progress felt achievable.
Each change was tested and measured before being applied more widely. The idea was to reduce repeated failure for players who were struggling, but keep the game challenging for those who were more skilled.
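King's internal tooling is not described here, so the following is only a sketch of what such a monitoring loop might look like: track per-level failure rates, flag levels that spike well above a baseline, and propose a small change to test. Every threshold, name, and suggested tweak below is hypothetical.

```python
# Hypothetical sketch of the monitoring loop described above: flag levels
# where players fail far more often than the baseline, and propose a
# small, testable adjustment. All thresholds are illustrative.
from dataclasses import dataclass

@dataclass
class LevelStats:
    level: int
    attempts: int
    failures: int

    @property
    def failure_rate(self) -> float:
        return self.failures / self.attempts if self.attempts else 0.0

def propose_tweak(stats: LevelStats, baseline_rate: float) -> str | None:
    """Suggest a change only when failure is well above the norm."""
    if stats.failure_rate > baseline_rate * 1.5:  # unusual failure spike
        return f"Level {stats.level}: grant +2 extra moves, test vs control"
    return None

level_65 = LevelStats(level=65, attempts=1_000_000, failures=820_000)
print(propose_tweak(level_65, baseline_rate=0.40))
# -> Level 65: grant +2 extra moves, test vs control
```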
Over time, these small adjustments started to compound.
Level 65 stopped being a dead end. Fewer players dropped off. More players stayed, progressed, and slowly got pulled deeper into the game. What looked like a minor design issue at first turned out to be one of the most important levers in the entire business.
Because in a free-to-play game, retention is everything.
If players stay, they eventually spend. If they leave early, they are almost always gone for good.
By fixing a single point of friction, King wasn’t just improving gameplay. It was protecting future revenue. One level, quietly optimized, had a direct impact on millions of players and millions of dollars.
And this is where the story becomes more interesting.
Because what King did here was not just fix Level 65.
It changed how the game itself worked.
The game was no longer a fixed system where every player experienced the same levels in the same way. It became something dynamic. Something that could observe, learn, and adjust based on how people were actually playing.
At that point, the real question was no longer: What are players doing?
The real question became: What is happening to the player when they are doing it?
Because behavior alone can be misleading.
Two players might take the exact same action. Watch an ad. Retry a level. Buy an item. On the surface, it looks identical.
But underneath, their situations can be completely different.
One player might be stuck, frustrated, running out of moves, trying to survive. Another might be progressing smoothly, simply choosing to move faster.
To a traditional analytics system, both actions look the same. But inside the game, they mean very different things. And that difference is what most systems fail to capture.
A game is not just a sequence of clicks. It is a constantly evolving system, where every player is moving through it in a different state. Their level, progress, inventory, recent wins or failures, all of it shapes what each action actually means.
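A minimal sketch of that idea, with hypothetical field names: the same raw action, attached to two different player states, reads very differently.

```python
# Illustrative only: one raw event interpreted against two different
# player states. Field names are made up, not any real SDK's schema.
from dataclasses import dataclass

@dataclass
class PlayerState:
    level: int
    fails_in_a_row: int
    moves_left: int

@dataclass
class Event:
    action: str          # e.g. "retry_level"
    state: PlayerState   # snapshot of context when the action happened

def interpret(e: Event) -> str:
    if e.action == "retry_level" and e.state.fails_in_a_row >= 5:
        return "frustrated: at risk of churning"
    return "engaged: progressing normally"

stuck = Event("retry_level", PlayerState(65, fails_in_a_row=7, moves_left=0))
cruising = Event("retry_level", PlayerState(65, fails_in_a_row=0, moves_left=12))
print(interpret(stuck))     # frustrated: at risk of churning
print(interpret(cruising))  # engaged: progressing normally
```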
Without that context, analytics stays shallow. It tells you what happened, but it cannot tell you why.
And once you understand that, you start to see the real limitation.
Because if what matters is the player’s state, not just their actions, then the system needs to know that state instantly.
But most systems are not built for that.
At this scale, even the way data is stored becomes a problem.
Most systems do not store the player’s full state directly. They store events — every coin earned, every coin spent, every level played. To understand a player’s situation at any moment, the system has to reconstruct that state by going through a long history of actions.
This works well for reporting.
But it breaks down when you need to make decisions because it is too slow.
By the time you rebuild the player’s state, the moment you were trying to act on has already passed. In a live game, where behavior shifts constantly, that delay becomes expensive.
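A toy coin-balance example makes the trade-off visible: a reporting system replays the full event history to reconstruct state, while a decision system keeps the state materialised and updates it as each event arrives. The event shapes here are invented for illustration.

```python
# Toy illustration of event replay vs. a materialised state snapshot.
events = [
    {"type": "coin_earned", "amount": 50},
    {"type": "coin_spent",  "amount": 30},
    {"type": "coin_earned", "amount": 20},
    # ...in a live game, thousands of events per player
]

# Reporting-style: reconstruct state by replaying the whole history.
# Fine for a nightly report; too slow per player, per decision, at scale.
def replay_balance(history: list[dict]) -> int:
    balance = 0
    for e in history:
        balance += e["amount"] if e["type"] == "coin_earned" else -e["amount"]
    return balance

# Decision-style: keep state materialised and update it on each event,
# so "what is this player's state right now?" is a constant-time lookup.
state = {"coins": 0}

def apply_event(state: dict, e: dict) -> None:
    delta = e["amount"] if e["type"] == "coin_earned" else -e["amount"]
    state["coins"] += delta

for e in events:
    apply_event(state, e)

assert replay_balance(events) == state["coins"] == 40
```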
What matters is not just understanding what happened. It is knowing what is happening right now. And this challenge only grows with scale.
These games serve millions of players across different countries, age groups, skill levels, and playing styles. Preferences change quickly. Attention shifts. Competition is always one tap away.
In that environment, fixed rules and human guesswork are not enough. Designers cannot manually predict how every type of player will behave.
So the system leans more heavily on data.
But inside most studios, this creates a second layer of friction that is less visible, but just as damaging.
Analytics teams are busier than ever, but decision-making is not getting any faster. Every question turns into a process. A product manager has a hypothesis, a designer wants to understand a change, a request is raised, and an analyst has to pull data from multiple systems, stitch together events, and build a report.
By the time the answer comes back, the situation has often already changed.
Over time, teams quietly adapt to this delay. They stop asking deeper questions because each one takes too long, and start relying more on instinct than data. They may move faster, but with less confidence.
This is not a talent problem. It is a structural one.
The system was designed for reporting what already happened, not for helping teams act while things are still happening.
And that is where vertical AI becomes important.
Instead of trying to build intelligence that understands everything, vertical AI focuses on one narrow area. One type of problem. One set of decisions. And it becomes very good at that specific job.
In gaming, this means looking beyond the numbers to see what is really happening.
An “inactive” player might simply be stuck on a hard level, unwilling to spend money, more motivated by competition than by rewards, or only active during events.
You do not have to pre-define all these traits. The goal is to understand the real reason behind the behavior.
A focused AI system can learn these patterns directly from gameplay data. But learning patterns is only part of the problem.
For a system to act intelligently, it also needs to understand how the game itself works — how currency flows, how levels are designed, how rewards are given, and what progression looks like. This context is what gives meaning to the data.
In most studios, this understanding is fragmented. Some of it lives in dashboards, some in documentation, and a lot of it in the heads of analysts and designers. So even when the data exists, connecting it to the right insight takes time.
For an AI system to be useful, it cannot just read data. It needs to understand how events connect to outcomes, how player actions affect progression, and what different behaviours actually mean inside the game. Without that layer, data remains disconnected. It shows activity, but not intent.
A well-designed system changes that. It observes how players move, retry levels, spend money, pause, compete, and quit. Over time, it learns from these patterns, updates itself as behaviour shifts, and begins to act on what it sees. The goal is not to generate reports; it is to improve what happens next. And that is where most current systems fall short.
What most studios still rely on today is a command center of dashboards. For years, dashboards have been treated as the ultimate source of truth: a single place where attribution, product analytics, and revenue come together to give a clean view of the game. And for a long time, that worked.
But the ecosystem around them has quietly changed. LiveOps has become more complex. User acquisition has become more expensive while signal quality has dropped. Privacy changes have weakened deterministic attribution. And player behaviour itself has become faster, more volatile, and harder to predict.
The problem is not that dashboards are wrong.
It is that they are designed to summarise.
They aggregate information, smooth out variation, and highlight patterns. They are built to show what happened: a dip in retention, a spike in revenue, a drop in engagement. But they rarely explain why it happened.
And more importantly, they almost always show it too late.
Games today don’t really fail because of overall numbers. They fail in small, specific moments when something feels off for the player. It could be a level that’s too hard, a reward that feels unfair, or simply a moment where the game stops feeling fun.
An AI-based system helps react to these moments much faster. It continuously watches how players behave, learns from their actions, and tries to predict what they might do next. Instead of waiting for reports, it can take action instantly inside the game.
So if a player is struggling, the game can make things slightly easier to keep them engaged. If a player seems ready to spend, the game can show a relevant offer at the right time. These changes happen dynamically, based on what the system has learned from similar players.
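Nothing above specifies a particular implementation, but the kind of decision being described can be sketched as model scores feeding a simple rule. The probabilities here are hard-coded stand-ins for what a model trained on similar players would output, and the thresholds and action names are illustrative.

```python
# Hypothetical sketch of a real-time reaction rule. In practice the
# probabilities would come from models scoring the player's live state;
# here they are hard-coded, and thresholds/actions are illustrative.
def next_action(p_churn: float, p_purchase: float) -> str:
    if p_churn > 0.7:
        return "ease_difficulty"      # struggling: reduce friction now
    if p_purchase > 0.6:
        return "show_relevant_offer"  # ready to spend: right offer, right time
    return "no_change"                # progressing normally: leave alone

print(next_action(p_churn=0.82, p_purchase=0.10))  # ease_difficulty
print(next_action(p_churn=0.12, p_purchase=0.71))  # show_relevant_offer
```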
That’s where the real value of AI lies. Improving outcomes in real time.
This is also the lens through which Neon Fund evaluates AI investments. Neon is fundamentally bullish on vertical AI infrastructure.
The next strong AI companies will not try to do everything. They will focus deeply on one industry. They will understand how that industry actually works, what drives decisions, and where things usually break.
Game State Labs fits this idea perfectly.
It is a gaming-native AI infrastructure and products company. At a simple level, it offers one system where data collection, analysis, testing, and player personalization all happen in the same place, built around how live games actually run.
This framing matters because it makes the business model clear. It is easy to see what the company does and why studios would pay for it.
Running a successful live game is messy and expensive. If a studio earns 100 million dollars a year, 15 to 20% can go into tools, cloud bills, dashboards, and the people needed to manage them. None of this directly improves the game. It just keeps things running.
The bigger issue is time. When player spending slows down or engagement drops, teams usually try to figure out why. They look at numbers, discuss possible reasons, and plan changes. That process takes time. In live games, timing matters. By the time a fix is ready, many players may have already moved on.
If a system can notice in real time that a player is about to spend more, lose interest, or change habits, the game can react immediately. It can adjust the experience while the player is still active.
So instead of calculating how much a player was worth after they leave, the studio can increase that value while they are still playing.
Most studios handle this by using many separate tools. One company provides analytics. Another runs experiments. Internal systems try to connect everything. Engineers and analysts then go through the data and decide what to change.
It works, but as the game grows, this setup becomes slow and expensive.
Seen this way, the opportunity is simple. Studios are already spending a lot on these systems. If Game State Labs becomes the main system a studio relies on to understand and improve its game, even taking a small share of that existing spend can build a strong recurring business.
Reaching 10 million dollars in annual recurring revenue does not mean creating a brand new market. It simply means replacing a tool or system that studios are already paying for.
And for Neon, the team behind Game State Labs reinforced that conviction. What matters is not just what the market looks like, but who is capable of delivering it.
In gaming infrastructure, the real product is not a dashboard. It is trust.
A studio might try a new ad network. It might test a new creative tool or a pricing plugin. But it rarely shifts the core system that runs its live game unless it believes two things. First, that the team truly understands how live games operate in real conditions, with daily retention pressure, changing content plans, and monetisation limits. Second, that the system will remain stable as the game grows.
That is where Game State Labs appeared unusually complete.
Aashbir Bhatia had built a startup before, and it failed. Not because he could not execute, but because he did not care enough about the problem to push through the hard phases. So when he started again, he chose differently.
He picked gaming.
He left his consulting role at Kearney and spent six months interning at a game studio without pay.
That is where he saw the gap clearly. Studios were collecting huge amounts of player data, but they could not use it while the game was still live. By the time insights came in, many players had already left.
Ashwin Ramakrishnan saw this problem closely when he worked at Electronic Arts. Whenever players started leaving, a level stopped performing well, or spending dropped, the team could not immediately understand the reason. They had to open multiple dashboards, ask analysts to pull data, wait for reports, and then manually connect the dots. By the time they understood the issue, valuable time was already lost.
Ashwin did not want just another dashboard with more charts. He wanted clear and fast answers. Which players are about to quit? Who is likely to spend? Which group is struggling and needs an easier level? In big studios, these answers usually take days because they depend on data teams and reporting cycles.
Jagveer Gandhi came from fintech. He helped build and scale YC-backed Akudo and later worked on core systems at Uni Cards. He did not come from gaming, but he knew how to build strong, real-time systems that can handle millions of events without breaking. That kind of reliability is essential for live games where everything happens instantly.
Together, they built Game State Labs.
The platform watches how players behave, finds patterns, and responds in real time. It brings analytics, prediction, and personalization into one system. It is about changing the game experience while the player is still inside the game.
Studios were already spending 15% to 20% of their revenue on data tools, experiments, and personalization. The problem was that these systems were slow and made up of many separate tools. Game State Labs does not ask for extra money. It replaces the existing setup with one system that works faster and scales better.
GSL reacts to what a player is doing in real time. It learns who might leave, who might spend, and what change could help. Studios can adjust offers, tweak difficulty, or personalize bundles immediately.
The result is evident: players stay longer, and that leads to improved revenue. Decisions are based on real data, not guesswork.
That is why Neon Fund invested.
Siddhartha Ahluwalia
Siddhartha Ahluwalia is the Managing Partner at Neon Fund and host of The Neon Show, one of the top business podcasts focused on the India-US startup ecosystem. He previously founded Addodoc (a B2B SaaS CRM for pediatricians) and Babygogo (a healthtech startup acquired by Sheroes). He later worked at Prime Ventures and led the SaaS Ecosystem at AWS India before starting Neon Fund. With deep expertise in 0-1 startup building, he helps founders scale B2B companies in the US from $0 to $10M ARR.