Here is a solution I just submitted: https://www.ietf.org/id/draft-add-ackfreq-to-tcp-00.txt
TL;DR The trick is not to make UDP like TCP, but instead make TCP like UDP.
Also, the reason many don't use TCP is that while it guarantees ordering and packet arrival, people completely forget that it's doing that by adding an indeterminate amount of latency. It's not some "packets arrive perfectly / use different pipes, because [magic]" solution.
- hybrid reliable and unreliable, ordered and unordered delivery semantics
- automatic MTU fragmenting, to avoid router fragmentation
- ordered multi-channel multiplexing
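The "automatic MTU fragmenting" item is easy to picture: the sender slices a message into datagram-sized pieces itself so routers never have to. A toy Python sketch, where the 1200-byte MTU is an assumed value for illustration (not SCTP's actual path-MTU discovery):

```python
# Toy illustration of sender-side MTU fragmenting: split a message into
# chunks that fit the (assumed) path MTU so routers never fragment.
MTU = 1200  # assumed safe payload size per datagram, for the example only

def fragment(message: bytes, mtu: int = MTU):
    """Split a message into mtu-sized chunks, in order."""
    return [message[i:i + mtu] for i in range(0, len(message), mtu)]

parts = fragment(b"x" * 3000)
assert [len(p) for p in parts] == [1200, 1200, 600]
assert b"".join(parts) == b"x" * 3000  # reassembly recovers the message
```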
Unfortunately SCTP never took off, but it is the data channel protocol for WebRTC, which uniquely positions WebRTC as a nice platform for game networking in the browser. (However, the stack must be implemented in user space since it is SCTP over DTLS over UDP.)
We are working on a WebRTC based game networking server for Social Mixed Reality at Mozilla which uses the Janus WebRTC gateway as the connection manager, our initial work can be found here:
Out of the box, our initial SFU plugin for Janus already makes it easy to set it up as a dumb game networking server that just relays messages between peers, but our plan is to turn it into a connection manager in front of an OTP service that provides full authoritative game networking services.
Seems WebRTC wasn't mature enough at the time. Would be a good showcase if it was brought back.
This is an extremely optimized, important aspect of game programming. I would be very careful to suggest everyone is spending thousands of hours re-inventing the wheel for no reason--especially in the ultra-time-constrained game industry.
>Unfortunately, [...] TCP still has serious problems for multiplayer games and it all stems from how TCP handles lost and out of order packets to present you with the “illusion” of a reliable, ordered stream of data. [goes on to explain issues]
When you implement your own error and ordering stack, you are in control of how and when to drop a packet or consider the packet too out-of-date. These "best settings" are very much related to the entire architecture of your physical simulation, game mechanics/rules, and distribution of players. Some games use TCP for control signals and UDP for faster, but less critically important data.
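As an illustration of what "in control of how and when to drop a packet" can mean, here's a minimal Python sketch (invented names, not a real library) of one common receiver-side policy: discard any state update older than the newest one already applied, rather than stalling the stream the way TCP would.

```python
# Minimal sketch of a receiver-side drop policy over an unreliable
# transport: only the newest state update per channel matters, so
# anything older than what we've already applied is dropped.

class LatestStateChannel:
    """Accepts (sequence, payload) packets; stale packets are dropped."""

    def __init__(self):
        self.last_seq = -1

    def receive(self, seq, payload):
        # A newer state has already arrived, so this packet is useless
        # for real-time display: drop it instead of waiting/reordering.
        # (A real stack would also handle sequence-number wraparound.)
        if seq <= self.last_seq:
            return None
        self.last_seq = seq
        return payload

channel = LatestStateChannel()
assert channel.receive(1, "pos=A") == "pos=A"
assert channel.receive(3, "pos=C") == "pos=C"  # packet 2 was lost; fine
assert channel.receive(2, "pos=B") is None     # arrived late: dropped
```

Whether "too out-of-date" means one tick or five is exactly the kind of setting that depends on the game's simulation and mechanics.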
Lastly, as for re-inventing the wheel, there are actually plenty of vetted libraries that implement error checking, quality of service, etc for UDP while still retaining the configurability and real-time/games focus that TCP doesn't allow.
The majority of the industry runs on UDP. That's not an accident--especially when the majority of college students / outsiders don't know anything about UDP and focus on TCP in their studies. Different application, different tool for the job. When you're sending banking information you need to know it got there or didn't. When you're updating someone's head direction in a FPS game, you can tolerate huge amounts of inconsistency between players when the connection starts degrading.
Assuming that's true, that doesn't change the fact that TCP is unusable for real-time gaming.
So the solution sounds like a formally specified, bug tested, and fast implementation of half of TCP, as long as it's not the half that breaks soft real-time gaming.
Only the twitchiest of twitch games really need lower latency than TCP provides. So many amateur/indie game devs waste huge amounts of time messing with UDP because "everyone knows TCP isn't realtime" when TCP would have been perfectly suitable for their game.
(Didn't downvote, but I did spend a good minute trying to parse the problem.)
Also this is a good paper (10 years old) on DOOM 3's networking:
I would love to learn more about this. How did the authentication system work? How difficult was it to brute force? Does it simply involve a master key creating individual subkeys? Is it RSA?
1. Using unit# + some known/random salt, compute a hash
2. Mix-in the unit# + salt to the resulting hash in some deterministic fashion (prefix, suffix, or maybe inserting/replacing into a known/random location)
3. Convert the hash result into a CD-key format
Generate one for each unit produced, print on a label and attach to the distributed product. When installing, the process to verify the key would be
1. Extract the unit# from the entered CD-key (or, if randomly inserted/replaced, just do this for each possible byte position, assuming it's the unit# + salt)
2. Using the unit# follow the same 3 steps above to get a CD-key
3. Compare the generated CD-key to the one entered for validity
Using a hash is fast, produces an output with a very large range (not easily guessed), and is unlikely to have collisions, while also limiting the possibly valid keys to (roughly) however many bits are necessary for a unit#. This method also allows for offline validation since the key to the hash is being shipped as part of the CD-key, and this was back in the days when a constant internet connection wasn't available or necessary for licensing.
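A Python sketch of the scheme described in the steps above; the salt, hash truncation, and key format here are made-up parameters for illustration, not any vendor's actual algorithm.

```python
import hashlib

# Illustrative only: invented salt, truncation, and key layout.
SECRET_SALT = b"example-salt"  # shipped inside the installer in this scheme

def make_cd_key(unit_no: int) -> str:
    # Step 1: hash the unit# plus a known salt.
    digest = hashlib.sha256(unit_no.to_bytes(4, "big") + SECRET_SALT).hexdigest()
    # Step 2: mix the unit# back in deterministically (prefix, here).
    mixed = f"{unit_no:08d}{digest[:12]}"
    # Step 3: format as a CD-key (groups of 5 characters).
    return "-".join(mixed[i:i + 5] for i in range(0, len(mixed), 5)).upper()

def verify_cd_key(key: str) -> bool:
    raw = key.replace("-", "")
    # Step 1: extract the unit# from its known (prefix) position.
    try:
        unit_no = int(raw[:8])
    except ValueError:
        return False
    # Steps 2-3: regenerate the key from the unit# and compare.
    return make_cd_key(unit_no) == key

key = make_cd_key(12345)
assert verify_cd_key(key)
assert not verify_cd_key("AAAAA-AAAAA-AAAAA-AAAAA")
```

Note the offline property: everything needed to validate is in the installer, which is also exactly why the keygen attack described below works.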
The downside, which was the bane of most producers, is that a reverse-engineer who acquires a copy of the software can use a debugger or disassembler to extract the first 3 steps above and build a keygen. Once the keygen was out, though, basically anyone could generate their own valid keys. To combat this, software makers started requiring an additional step: after validating the key, the installer contacts a central server to see if anyone else has used it or not...
> How did the authentication system work? How difficult was it to brute force? Does it simply involve a master key creating individual subkeys? Is it RSA?
Producing individual builds/images to burn onto CDs was unlikely, as it would make the CD production process impractical -- a single build/image for burning CDs was likely used until they ran out of stock and/or wanted to ship updated versions of the software. Using RSA would otherwise be a great solution, as the software producer could generate private keys for each sold license. Many newer software licensing schemes take this approach.
Edit: Related answer on StackOverflow:
>> I miss the days of playing CS-1.5 and having to use gametiger.com to find servers to play in. Lots of pool_day and de_dats.
People built communities by just being on the same server often. Friendships were made, guilds too. But now all you have is 12 other little kids all free-for-alling shotguns in a brief 5-minute instant-gratification fest.
edit: Maybe you mean that lag compensation makes it harder to avoid others' shots. But does that really reward less skilled players? If anything it rewards players with higher ping, but that's not necessarily the same as worse or less committed. I suppose that really uncommitted players are more likely to try to game over some crappy Wi-Fi or even cellular connection, whereas for committed players, latency is more likely to result from distance. But that's a rather low threshold of commitment. In any case, having a really high ping comes with serious disadvantages, including rubber-banding, a harder time avoiding shots, a harder time aiming with anything that isn't lag compensated (e.g. projectiles), etc...
> High-budget games these days locate servers around the world, so that is not an issue.
Even assuming there are enough players to fill the servers, it's only a non-issue if you're matchmaking alone. If you want to play with specific people - through either a party system or traditional named/player-operated servers - and they happen to live far away, you can't avoid latency. Of course, there's always going to be some ping value beyond which the game isn't worth playing, even with lag compensation - but it's a higher value than without. And at intermediate levels of latency - say, 100ms - lag compensation can be the difference between a passable experience and an effectively perfect one.
let them run their own servers? tough luck, most games in 2017 don't allow dedicated server setup
Yes, good connections are expensive for some people. If you want to do something about that, maybe you include amount of latency in a player’s ELO or something. I am just saying that I do not like Valve’s approach to the problem (which has been inherited by many other games) because I care about games and it makes games worse overall.
exactly what I'm talking about. "around the world" may not be enough. And since they don't allow dedicated servers hosted by players, they better have lag compensation or to hell with them.
P.S. Someone the other day asked why I don’t post much to HN (and don’t put much effort in when I do). This kind of junk is exactly why!
The problem with your original assertion is that you have ignored the virtual world player types as described by Richard Bartle and positioned the Achiever view as the sole one of import, subsequently question begging your way to a narrow conclusion about what quality is.
I will explain further but first I should point out that I am deliberately engaging you on your weaknesses. You have a lengthy track record of preaching your views as unqualified absolutes and then going out of your way to never show vulnerability under fire. You seem borderline incapable of saying "Oh, you got me on that one. I was wrong!" And you are far too visible to get away with that without being called out by someone who knows the topic better than you do and has the credentials to back it up. That is why whenever you show yourself online you are getting more and more hostility. Rather than engage in pedagogy and encourage further questioning you give speedy dismissals to all sorts of subjects. You are necessarily wrong on at least some of them, being human. You admit to none of it. When you become challenged on Twitter you just block them. It looks alternately arrogant and pitiful, and people are increasingly aware with time that this behavior is the "Jon Blow brand". It has already shut you out of opportunities to benefit from, at minimum, the interesting conversation you seem to crave. I know this because I am in the back room on some of it. You would be let in but for this personal stuff. There are other ways to conduct oneself that avoid this particular problem, but it is up to you to rise to the challenge.
Now, to Bartle. Fair play and a high skill ceiling is the secondary goal of the design in popular online games, even e-sports: the first goal is always a balance of accessibility for each of the player types so that nobody is left with nothing to do or turned towards toxicity. That includes features that make dramatic trade-offs throughout the design, like auto-aim functionality versus encouraging fine motor skills, global player ranking systems versus buddy systems versus community servers, team damage to encourage tactics versus potential for griefing and, yes, fine-grained netcode decisions. The idea of players "deserving" something for their great effort to develop skill, when made central to the premise as in one-on-one fighters with extreme depth and difficult input execution, limits the potential audience to a niche Achiever population. It leaves out players who just want to play with their friends and don't have the best machines or connections, players who are more interested in exploring side systems of the game than the central mechanic, and players who want to elicit strong, surprised reactions (whether through clever tactics or simple trolling) and will do it with an aimbot if you've deprived them of alternatives.
What compensation does is twofold: it lowers the skill ceiling by moving all the considerations around time from the adversary's shooting performance to the target's perception of when they took damage. It also improves accessibility and feedback for all aiming and shooting because it matches player perceptions more closely. Players experience a deeper sense of control with compensation for the same reason that improving frame rates improves the sense of control.
For nearly every game with "twitch" gameplay this is a no-brainer choice: lag compensation wheels the design back from a Byzantine technical artifact that inflates the skill ceiling and towards a tighter feedback loop that most players, even the less dedicated ones, will intuitively understand and are more likely to appreciate (hence popular and accessible). Compensated shooting can still be made a skillful process, as games like Counter-Strike have ably demonstrated, without being opaque or tied to ping. And the downside risk is low because a death that feels unearned is rare even in the cases where it comes with a substantial delay. Surely if you were playing well you shouldn't have been in a position to take damage, right? The player can rest easy feeling that their tactics were at fault for putting them in a risky position, not the game.
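For the curious, here's a minimal Python sketch (invented names and numbers, not any engine's real code) of the rewind idea behind lag compensation: the server keeps a short position history per target and tests a shot against where the target was when the shooter's frame was rendered, rather than where the target is now.

```python
# Sketch of server-side lag compensation via rewind. Positions are 1-D
# for brevity; a real server stores full hitboxes per tick.

HISTORY = {  # target_id -> list of (timestamp_ms, x) samples, oldest first
    "bob": [(0, 0.0), (50, 1.0), (100, 2.0), (150, 3.0)],
}

def position_at(target_id, t_ms):
    """Most recent recorded position at or before t_ms."""
    samples = [s for s in HISTORY[target_id] if s[0] <= t_ms]
    return samples[-1][1] if samples else HISTORY[target_id][0][1]

def process_shot(target_id, aim_x, server_time_ms, shooter_ping_ms,
                 tolerance=0.1):
    # Rewind by the shooter's latency: test the shot against where the
    # target was when the shooter saw it, not where it is right now.
    rewound_x = position_at(target_id, server_time_ms - shooter_ping_ms)
    return abs(aim_x - rewound_x) <= tolerance

# A shooter with 100ms ping aims at x=1.0, where bob was 100ms ago.
assert process_shot("bob", 1.0, 150, 100)    # hit, thanks to the rewind
assert not process_shot("bob", 1.0, 150, 0)  # uncompensated: a miss
```

This is the "moving the considerations around time to the target's perception" trade-off in code form: the shooter's aim matches what was on their screen, at the cost of the target occasionally dying behind cover.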
And there is in fact a specific reason why I brought up your game. It focused on slower-moving projectiles. Thus target leading is already built into the core feedback loop of aiming, and the resulting latency from uncompensated shooting is less relevant to the experience. This would give you the impression based on personal experience that it's not required for accessibility. All your original remark has to do is find the qualifying factors in your experiences and lay them out, and you'd get the quality reply you seek without a fight.