Massively multiplayer worlds are fascinating
I still remember the sense of vertigo I felt as a kid, playing World of Warcraft and taking the boat from Teldrassil to Stormwind for the first time. It suddenly struck me, surrounded by all these players, how vast and lively the world could feel. It almost felt real.
Ever since, I’ve been fascinated with how real and alive multiplayer open worlds can feel. I did some digging and found a bunch of screenshots I took back in the day, so that you can hopefully see what I mean:

And since then, the idea of making my own has slowly been growing in the back of my mind. But, as it turns out, building one is quite the challenge. From managing network authority to syncing state across many devices, here’s what I found out while building my own.
Picking my battles
Before diving into the code, I had to define the scope of the game. I knew there were some features I absolutely wanted:
- Real-time movement in 3D environments: essential to recreate the sense of awe and exploration I loved as a kid.
- Persistent world (meaning that stuff stays where it is even after a server restart): makes players feel like they have an impact on the map
- Realistic physics collision: not something that's present in WoW! I felt like it would be cool if players could collide not only with the map but also with moving objects and other players.
- No cheating: after all, nothing can have value in an MMO if some of the players are cheating.
- Mobile and web support: so that inviting your friends is as easy as sending them a link.
- Admin interface: nothing should be hardcoded, to allow admins to add zones or items easily through an interface.
Quite a lot of work, right? To balance it out, I made some trade-offs:
- Aesthetics: I felt like the environments did not need to be realistic at all. Low poly would be much faster to make and much more forgiving, since I was a modeling beginner at the time. It also makes for smaller assets (important when making a web game!) and requires less rendering power (smartphone friendly!).
- World Design: I skipped procedural generation in favor of hand-made environments, which are often more rewarding to explore anyway.
- World Loading: I decided to implement load barriers (instanced zones) between map sections. It was a tough call, but I figured I could always release a chunk-based loading system later on to allow for bigger zones.
I won't keep you waiting any longer; here's a quick demo of two players running around on the map!
Making sure the server doesn't melt down
I put quite a lot of thought into how I divided my codebase between server and client. On the server side especially, I set up a system that should let the game scale up easily in the future. The tech stack is divided into three parts:
- Django is the only layer that has direct access to the database. It offers secure access to the data through a REST API, and hosts the admin panel that lets the staff manage the data (more on this panel later). I chose Django because it has great REST support and an amazing automatic admin panel tool, and because I just love using it.
- One NodeJS server is deployed per map area. These servers receive player input, compute the world physics (using the Bullet physics engine), and update all the clients accordingly. Since they are in charge of the most computationally intensive work, they are designed to be easily distributed across different machines. They regularly call Django endpoints, authenticated with a secret token, to save the world state (a sketch of this call follows the list).
- The client is a web app that can either be loaded in a browser or installed on a phone. It is built with React and uses ThreeJS to display the 3D view. It only communicates with the NodeJS server in charge of the player's current zone, and is redirected to another server when moving between zones. It handles player input, receives state updates, and has some light prediction/reconciliation capabilities.
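To make that server-to-server flow concrete, here is a minimal sketch of the periodic state save, in TypeScript. The endpoint path, payload shape, and environment variables are illustrative assumptions, not the actual API:
// Minimal sketch: a zone server pushing its state back to Django.
// DJANGO_URL, SERVER_TOKEN and the endpoint path are assumptions.
const DJANGO_URL = process.env.DJANGO_URL ?? "http://localhost:8000";

async function saveWorldState(zoneId: string, state: unknown) {
  await fetch(`${DJANGO_URL}/api/zones/${zoneId}/state/`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      // Secret token so only trusted zone servers can write to the database
      Authorization: `Token ${process.env.SERVER_TOKEN}`,
    },
    body: JSON.stringify(state),
  });
}

// Illustrative usage: persist the zone roughly twice a minute
setInterval(() => saveWorldState("forest-01", { players: [], props: [] }), 30_000);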
The art of hiding the delay
In a perfect world, every player would receive a super fast stream of updates from everyone else. In reality though, broadcasting a player’s position 60 times per second to every other client quickly proved far too heavy. For users on slower connections, the sheer volume of data caused massive bottlenecks.
To keep the server from choking and the players from lagging, I had to be smart about what data actually traveled across the wire. I implemented a differential update system:
- Idle Filtering: If a player is standing still, the server stops broadcasting their coordinates entirely. There is no point in telling a client sixty times a second that a rock hasn't moved.
- Connection Awareness: I built the server to monitor the "health" of each client’s connection. If the latency spikes or the bandwidth drops, the server dynamically throttles the update frequency for that specific user, prioritizing essential game state over cosmetic updates (see the sketch just after this list).
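Put together, the broadcast loop looks roughly like the sketch below. All the names here (lastMovedMs, updateIntervalMs, and so on) are illustrative; this is the shape of the logic, not the exact code:
// Differential update sketch (TypeScript)
import type { WebSocket } from "ws";
import { Vector3 } from "three";

interface Player { id: string; position: Vector3; lastMovedMs: number; }
interface Client { socket: WebSocket; lastUpdateSentMs: number; updateIntervalMs: number; }

function broadcastTick(clients: Client[], players: Player[], nowMs: number) {
  for (const client of clients) {
    // Connection awareness: a laggy client gets a larger updateIntervalMs,
    // so it simply skips more ticks
    if (nowMs - client.lastUpdateSentMs < client.updateIntervalMs) continue;

    // Idle filtering: only include players that moved since the last send
    const moved = players.filter((p) => p.lastMovedMs > client.lastUpdateSentMs);
    if (moved.length > 0) {
      const payload = moved.map((p) => ({ id: p.id, pos: p.position.toArray() }));
      client.socket.send(JSON.stringify(payload));
    }
    client.lastUpdateSentMs = nowMs;
  }
}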
Even with an optimized stream, packets get lost or delayed. Without intervention, a player would appear to "teleport" or "jitter" across the screen as the client waits for the next update. To solve this, I implemented entity interpolation.
Instead of instantly snapping a character to the latest coordinates received from the server, the client renders slightly behind real time (typically on the order of 100 ms), using that buffer to smoothly transition the character's position from point A to point B. This creates a fluid experience where the player never notices the underlying network hiccups.
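Concretely, the client keeps a short history of received snapshots and renders each remote entity at a slightly older timestamp. Here's a minimal sketch of the idea; the Snapshot shape and the 100 ms delay are illustrative choices, not exact values from my implementation:
// Entity interpolation sketch (TypeScript)
import { Vector3 } from "three";

interface Snapshot { t: number; pos: Vector3; } // t = server timestamp in ms

const INTERP_DELAY_MS = 100; // render this far behind real time

function interpolatedPosition(snapshots: Snapshot[], nowMs: number): Vector3 {
  if (snapshots.length === 0) return new Vector3();
  const renderTime = nowMs - INTERP_DELAY_MS;
  // Find the pair of snapshots that surrounds the render time
  for (let i = snapshots.length - 1; i > 0; i--) {
    const a = snapshots[i - 1];
    const b = snapshots[i];
    if (a.t <= renderTime && renderTime <= b.t) {
      const alpha = (renderTime - a.t) / (b.t - a.t);
      return a.pos.clone().lerp(b.pos, alpha); // smooth blend between A and B
    }
  }
  // Gap in the stream: hold the last known position instead of extrapolating
  return snapshots[snapshots.length - 1].pos.clone();
}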
I planned to go further. My ultimate goal was to implement full client-side prediction and server reconciliation.
I actually chose a Javascript-compatible physics engine for this exact reason; the plan was to run the exact same physics code on both the client and the server. This would have allowed the client to "predict" its own movement instantly, with the server only stepping in to correct mistakes. While I didn't have the time to push the implementation to the finish line, the architectural foundation is there and I might continue it some day.
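To give an idea of where this was heading, here is a minimal sketch of the classic technique I was aiming for (none of this is from my actual codebase): the client applies inputs immediately, tags each one with a sequence number, and replays any inputs the server hasn't acknowledged yet whenever a correction arrives.
// Client-side prediction + server reconciliation sketch (TypeScript)
import { Vector3 } from "three";

interface PlayerInput { seq: number; direction: Vector3; dt: number; }
interface ServerState { lastProcessedSeq: number; position: Vector3; }

const WALK_SPEED = 5; // units per second; illustrative value
const pendingInputs: PlayerInput[] = [];
let predictedPosition = new Vector3();

// Called every frame: predict movement locally and remember the input
function applyInput(input: PlayerInput) {
  predictedPosition.add(input.direction.clone().multiplyScalar(WALK_SPEED * input.dt));
  pendingInputs.push(input);
}

// Called whenever an authoritative update arrives from the server
function reconcile(server: ServerState) {
  // Start over from the server's trusted position...
  predictedPosition = server.position.clone();
  // ...drop inputs the server has already simulated...
  while (pendingInputs.length && pendingInputs[0].seq <= server.lastProcessedSeq) {
    pendingInputs.shift();
  }
  // ...and replay the rest so the camera doesn't snap backwards
  for (const input of pendingInputs) {
    predictedPosition.add(input.direction.clone().multiplyScalar(WALK_SPEED * input.dt));
  }
}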
Big server is watching you
In a single-player game, the game engine trusts your inputs implicitly. If you tell the game you moved 100 meters in a frame, you did. In an MMO (or any online game, in fact), trusting the client is really risky. If the client has the power to decide its own position, speed, or health, you can be certain that someone will exploit it to fly, teleport, AND become invincible (all at the same time).
To combat this, I made the server the single source of truth. The client doesn't actually "move" the character. Instead, it sends a "request to move" containing a direction to the server. The server then performs a series of checks:
- Data validation: Is the player's move direction vector normalized? Imagine the player sends a direction vector with a magnitude greater than 1: since the player's speed is multiplied by this vector, some smart person could try to increase their speed. See this example:
// Server-side sketch (TypeScript), using ThreeJS's Vector3 for the math
import { Vector3 } from "three";

const inputMagnitude = clientInput.length();
if (inputMagnitude > 1.0) {
  // Log the suspicious activity and normalize the vector to cap the speed
  logWarning(`Player ${p.id} sent invalid input magnitude: ${inputMagnitude}`);
  clientInput.normalize(); // ThreeJS normalizes in place
}
// Apply speed: the server, not the client, decides how fast a player moves
const velocity = clientInput.clone().multiplyScalar(p.walkSpeed);
I also log the suspicious activity because I'm curious.
- State logic: Can the player actually perform this action? For example, if you ask to jump, the server first checks that your feet are touching the ground (a quick sketch follows). Seems obvious, but if you skip that check, you're gonna wake up some day with a bunch of people flying in the sky (could be fun though).
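As a sketch, the server-side handler for a jump request could look like this. The rayTest helper is a stand-in for whatever ground check the physics engine provides (Bullet exposes ray casts, but the exact API differs), and the constants are illustrative:
// State-logic sketch (TypeScript): validate a jump request on the server
import { Vector3 } from "three";

declare const physicsWorld: { rayTest(from: Vector3, to: Vector3): { hasHit: boolean } };
declare function logWarning(msg: string): void;

interface Player {
  id: string;
  position: Vector3;
  body: { applyImpulse(impulse: Vector3): void };
}

const GROUND_CHECK_DIST = 0.1; // short ray below the feet
const JUMP_IMPULSE = new Vector3(0, 8, 0);

function handleJumpRequest(player: Player) {
  // Refuse the jump unless the player is actually on the ground
  if (!isGrounded(player)) {
    logWarning(`Player ${player.id} tried to jump mid-air`);
    return;
  }
  player.body.applyImpulse(JUMP_IMPULSE);
}

function isGrounded(player: Player): boolean {
  // Cast a short ray from the player's feet downwards into the world
  const from = player.position;
  const to = from.clone().add(new Vector3(0, -GROUND_CHECK_DIST, 0));
  return physicsWorld.rayTest(from, to).hasHit;
}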
The catch? Server authority is the primary cause of "rubber-banding." If the server disagrees with the client’s local simulation, it will snap the player back to the "true" position.
This is where the Interpolation and Bandwidth Optimization I mentioned earlier become critical. By keeping the server's simulation tight and the data flow efficient, I could minimize the discrepancy between what the player sees and what the server knows to be true. It’s a constant balancing act between absolute security and a smooth player experience.
Key Takeaway: Designing for an MMO means assuming every client is potentially malicious. By moving all logic and physics to the server, I ensured that the game's integrity remains intact, regardless of what's happening on the user's end.
Making an admin panel for a data-driven world
A game is never truly finished, especially not an MMO. To ensure the world could grow without me constantly digging into the source code, I had to build a bridge between the raw database and the game world. I needed tools that would allow a "ghost staff"—even if that's just me for now—to manage the universe in real-time.
Hardcoding game zones or character skins is a recipe for disaster. Every minor tweak would require a full recompilation and a server restart. To avoid this, I moved all game definitions to a centralized database. Whether it’s the coordinates of a new town, the stats of a piece of gear, or the color palette of a new hairstyle, everything is data-driven.
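On the game-server side, this means zone and item definitions are just records fetched over the REST API. As an illustration, here is roughly what a zone record might look like once it reaches the NodeJS server; every field name here is invented for the example:
// Illustrative shape of a data-driven zone definition (field names are assumptions)
interface ZoneDefinition {
  id: number;
  name: string;
  bounds: { min: [number, number, number]; max: [number, number, number] };
  spawnPoint: [number, number, number];
  serverUrl: string; // which NodeJS instance is responsible for this zone
}

// Fetch the zone list from Django's REST API at startup (endpoint path assumed)
async function loadZones(apiBase: string): Promise<ZoneDefinition[]> {
  const res = await fetch(`${apiBase}/api/zones/`);
  if (!res.ok) throw new Error(`Failed to load zones: ${res.status}`);
  return res.json();
}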
To interact with this data efficiently, I developed a custom Administrative Interface. This is a secure, staff-only dashboard that allows for:
- Dynamic Zone Management: I can define new map regions, set their boundaries, and assign load-zone properties on the fly.
- Live Customization Updates: Adding new cosmetic options to the character creator is as simple as filling out a form and hitting "save."
- Real-time Monitoring: Keeping an eye on server health and connected players without having to tail log files in a terminal.
By building these internal tools, I shifted my role from "the person who codes the map" to "the person who manages the world." It turned a static codebase into a living platform ready for new content.
Key takeaways
- Never trust the client: keep physics and game logic on the server, and validate every input.
- Bandwidth is the real enemy: idle filtering, per-client throttling, and entity interpolation go a long way toward hiding network hiccups.
- Make the world data-driven from day one: an admin panel turns a static codebase into a living platform.
If you have anything to add or any opinion (even negative!) on this article, please add a comment below :)
