Thank you! I am slightly obsessed with optimization, so hearing that means a lot.
You might be surprised — the game is actually deployed in just one region (US) on only two dedicated servers (Contabo).
Here is the breakdown of why it feels fast:
1. The Metal: I use one server for the Web App + Gameplay Backend (.NET), and a second server strictly for PostgreSQL and MongoDB. No virtualization overhead.
2. The Network: I use Cloudflare for static content, so the initial page load is served from edge caches close to the player instead of crossing the ocean to my servers.
3. Aggressive Prefetching: I rely heavily on service workers. When you land on the home page, the 'Play' page and game assets are already being prefetched in the background. When you click Play, it loads instantly from the local cache (see the first sketch after this list).
4. Single WebSocket: Once connected, there is zero HTTP overhead. Every interaction (gameplay, chat, UI updates) travels through a single persistent WebSocket connection (second sketch below).
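To make #3 concrete, the prefetch half of the service worker looks roughly like this. It's a simplified sketch; the cache name and asset paths are placeholders, not my actual file list:

```ts
// sw.ts — install-time prefetch with the Cache API.
// Assumes tsconfig lib includes "webworker"; cache name and asset paths are placeholders.
declare const self: ServiceWorkerGlobalScope;

const PREFETCH_CACHE = "game-prefetch-v1";
const PREFETCH_URLS = [
  "/play",              // the Play page itself
  "/assets/game.js",    // placeholder game bundle
  "/assets/sprites.png" // placeholder asset
];

self.addEventListener("install", (event) => {
  // Warm the cache while the visitor is still reading the home page.
  event.waitUntil(
    caches.open(PREFETCH_CACHE).then((cache) => cache.addAll(PREFETCH_URLS))
  );
});

self.addEventListener("fetch", (event) => {
  // Serve from the local cache when we have a hit, otherwise fall back to the network.
  event.respondWith(
    caches.match(event.request).then((hit) => hit ?? fetch(event.request))
  );
});
```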
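And #4 mostly boils down to tagging every message with a channel and routing it on the client. Again a simplified sketch, with a made-up endpoint and envelope shape rather than my real (binary) protocol:

```ts
// client.ts — one persistent socket carrying gameplay, chat, and UI traffic.
// The endpoint and message envelope are placeholders, not the real protocol.
type Envelope =
  | { channel: "gameplay"; payload: unknown }
  | { channel: "chat"; payload: { from: string; text: string } }
  | { channel: "ui"; payload: unknown };

const socket = new WebSocket("wss://example.com/ws"); // placeholder endpoint

socket.addEventListener("message", (event) => {
  const msg = JSON.parse(event.data) as Envelope;
  switch (msg.channel) {
    case "gameplay":
      // apply the game-state update
      break;
    case "chat":
      // append msg.payload.text to the chat box
      break;
    case "ui":
      // refresh whatever UI element the server pushed
      break;
  }
});

// Everything outbound reuses the same connection, so there is no
// per-request handshake or header overhead after the initial connect.
function send(msg: Envelope): void {
  socket.send(JSON.stringify(msg));
}

send({ channel: "chat", payload: { from: "me", text: "gg" } });
```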
Keeping the architecture simple (monolith-ish) rather than distributed keeps latency predictable and maintenance low.
Really surprised it's just one application machine; I thought it was some microservices setup. I also assumed one machine would crumble under load. Thanks for answering though.
Modern servers are absolute beasts if you don't bog them down with serialization overhead and network hops between services.
With efficient .NET code, a single machine can handle this kind of load without breaking a sweat. I actually sleep better knowing there are fewer moving parts to fail!