MMO games like EVE Online and DDO require players to connect to a central host. Players are the clients, and the MMO company's expensive computers with their expensive T1/T2/T3/T4 connections are the host. If there is significant lag, I think it's often due to limitations of the client's connection or hardware. That, or there are software issues or too many players.
But multiplayer games like Avorion are not MMOs. Hosting maybe two dozen players is not "massively multiplayer". The host is either the hardware and connection of a single player or a paid game server service. The problem is that such hardware and connections can't compete with those of profitable MMO companies. Yet, Avorion is a massive game with a lot of things going on under the hood...
From what I can tell, each Avorion client relies on the server or host to do most of the calculations and keep the simulations of all players updated and synchronized. But, with dozens of players and a lot going on, the host cannot keep up with such demand.
My Experience:
Despite a decent ping, we've witnessed a lot of lag on the server where I often play whenever the player count goes above a dozen or so. Sure, there are various reasons. For instance, very predictable, very extreme lag occurs whenever someone is fighting the last boss in the galactic core.
Whenever there is extreme lag, it becomes frustratingly tedious or impossible to do pretty much anything. Mining an asteroid or salvaging a wreck becomes either frustratingly slow or impossible. (The asteroid or wreck suddenly seems immune to our mining or salvage lasers...)
Worse, combat can become nearly impossible. At nearly point-blank range, I can shoot and shoot and shoot... and the enemy takes no damage. Then there's how enemies 'teleport' as the host suddenly updates their location to somewhere different from where our client predicted.
My Suggestion:
Instead of requiring the host to do all of the heavy lifting, why not distribute some of the workload to the clients' hardware? Depending on how this is done, that could relieve a huge amount of the strain on the host's hardware and connection.
Specifically: for each player who is currently alone (no other players) in a given sector, why not allow that player's hardware to be responsible for simulating that sector? As long as that player's client receives the server's copy of the sector when the player enters, and as long as it updates the server's copy whenever that player leaves or a new player enters the sector, I cannot imagine a situation where this would be a bad thing.
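To illustrate the idea, here's a rough sketch of how a server might track who is authoritative for each sector: a lone occupant gets authority, and the server takes it back the moment a second player arrives or the sector empties. This is purely hypothetical Python pseudocode of my own invention, not actual Avorion code; the class and method names are made up for illustration:

```python
# Hypothetical sketch (not real Avorion code): a server-side bookkeeper that
# decides which machine simulates each sector at any given moment.

class SectorAuthority:
    def __init__(self):
        self.occupants = {}   # sector -> set of player ids in that sector
        self.authority = {}   # sector -> "server" or the id of a lone player

    def player_enters(self, sector, player):
        players = self.occupants.setdefault(sector, set())
        players.add(player)
        if len(players) == 1:
            # Sole occupant: the server pushes its copy of the sector to
            # this client, which then simulates it locally.
            self.authority[sector] = player
        else:
            # A second player arrived: the previous owner uploads its state
            # and the server resumes simulating the now-shared sector.
            self.authority[sector] = "server"

    def player_leaves(self, sector, player):
        players = self.occupants.get(sector, set())
        players.discard(player)
        if len(players) == 1:
            # Back down to one occupant: hand authority to that player.
            self.authority[sector] = next(iter(players))
        elif not players:
            # Sector is empty: the departing client uploads its final state
            # and the server owns the idle sector again.
            self.authority[sector] = "server"
```

For example, if player A is mining alone, A's machine simulates the sector; the moment player B jumps in, the server takes over so both clients stay in sync.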
The benefits are obvious. Requiring clients to simulate their own sectors and keep the server updated (instead of requiring the server to simulate everything and keep clients updated) should eliminate the majority of lag for those clients - no matter how many players join.
A multiplayer game really boils down to a shared experience and chat. But as long as players are doing their own thing in their own sectors, there's not much actual sharing going on - besides chat, that is. A bunch of multiplayer players each in their own sector is really not much different than each of those players playing in singleplayer galaxies that happen to be the same galaxy.
Granted, when more than one player is occupying the same sector, the situation is different. That would require the host to do most of the calculations just to keep their simulations synchronized. But, in my experience, the majority of time in multiplayer (Avorion) games is spent with players off doing their own thing in separate sectors. As such, offloading the work of simulating a sector to the one player occupying it would be a huge boon to the host and, thus, to all players in the form of much less lag.
Suggestion
Thundercraft
8 answers to this suggestion