Half-Life and Team Fortress Networking (gamasutra.com)
226 points by setra 7 months ago | 54 comments



Here’s a nice white paper from Valve with diagrams on lag compensation, prediction, authoritative server, etc:

https://developer.valvesoftware.com/wiki/Source_Multiplayer_...


I've made an interactive demo that lets you tweak the different knobs and see what happens: http://www.gabrielgambetta.com/client-side-prediction-live-d...


I made my own simple multiplayer 2d game to understand this stuff and just want to let you know that your site was invaluable to understanding client side prediction and server side reconciliation! Thanks so much for that!
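For anyone who wants the mechanics in code form, here is a minimal sketch (a toy 1-D example with invented names, following the general approach those articles describe): predict locally, buffer unacknowledged inputs, and re-apply them on top of each authoritative server state.

```python
# Toy sketch of client-side prediction + server reconciliation.
# All names and the 1-D movement model are invented for illustration.

class PredictedClient:
    def __init__(self):
        self.position = 0.0
        self.seq = 0        # next input sequence number
        self.pending = []   # inputs the server hasn't acknowledged yet

    def apply_input(self, dx):
        """Predict locally and remember the input for reconciliation."""
        self.pending.append((self.seq, dx))
        self.position += dx        # immediate local prediction, no waiting
        self.seq += 1

    def on_server_state(self, authoritative_pos, last_acked_seq):
        """Server reconciliation: snap to the authoritative state, then
        re-apply every input the server hasn't processed yet."""
        self.position = authoritative_pos
        self.pending = [(s, dx) for (s, dx) in self.pending
                        if s > last_acked_seq]
        for _, dx in self.pending:
            self.position += dx
```

The key point is that a late server update doesn't yank the player backwards: the buffered, unacknowledged inputs get replayed on top of it.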


Once per year, before Christmas, someone posts an old article that triggers the classical UDP vs. TCP debate.

Here is a solution I just submitted: https://www.ietf.org/id/draft-add-ackfreq-to-tcp-00.txt

TL;DR The trick is not to make UDP like TCP, but instead make TCP like UDP.


PSA: Add print=1 to old gamasutra articles to get a single page.

https://www.gamasutra.com/view/feature/131577/halflife_and_t...

Also, the reason many don't use TCP is that while it guarantees ordering and packet arrival, people completely forget that it's doing that by adding an indeterminate amount of latency. It's not some "packets arrive perfectly / user different pipes, because [magic]" solution.


SCTP is a nice protocol that in many ways gets you the best of both TCP and UDP. It features:

- hybrid reliable and unreliable, ordered and unordered delivery semantics

- automatic MTU fragmenting, to avoid router fragmentation

- ordered multi-channel multiplexing

Unfortunately SCTP never took off, but it is the data channel protocol for WebRTC, which uniquely positions WebRTC as a nice platform for game networking in the browser. (However, the stack must be implemented in user space since it is SCTP over DTLS over UDP.)

We are working on a WebRTC-based game networking server for Social Mixed Reality at Mozilla which uses the Janus WebRTC gateway as the connection manager. Our initial work can be found here:

https://blog.mozvr.com/enabling-the-social-3d-web/

Out of the box, our initial SFU plugin for Janus makes it easy to set it up as a dumb game networking server that just relays messages between peers, but our plan is to turn it into a connection manager in front of an OTP service that provides full authoritative game networking services.


Whatever happened to the BananaBread engine demo that was supposed to show off WebRTC-based game networking? Now the only public demo is single-player. A twitch FPS pushes networking harder than a social chat room.

EDIT: https://github.com/kripken/BananaBread/issues/40

Seems WebRTC wasn't mature enough at the time. Would be a good showcase if it was brought back.


On the other hand, to misquote Philip Greenspun, any sufficiently complicated UDP-based protocol contains an ad-hoc, informally-specified, bug-ridden, slow implementation of half of TCP.


TCP has different goals than UDP gaming. You can (and 99% of games do) implement a real-time version of abstract TCP. But the similarities fall away when you stop looking at the "abstract" version of TCP.

This is an extremely optimized, important aspect of game programming. I would be very careful to suggest everyone is spending thousands of hours re-inventing the wheel for no reason--especially in the ultra-time-constrained game industry.

https://gafferongames.com/post/udp_vs_tcp/

>Unfortunately, [...] TCP still has serious problems for multiplayer games and it all stems from how TCP handles lost and out of order packets to present you with the “illusion” of a reliable, ordered stream of data. [goes on to explain issues]

When you implement your own error and ordering stack, you are in control of how and when to drop a packet or consider the packet too out-of-date. These "best settings" are very much related to the entire architecture of your physical simulation, game mechanics/rules, and distribution of players. Some games use TCP for control signals and UDP for faster, but less critically important data.
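As a minimal sketch of that control (invented names, assuming a simple newest-state-wins scheme): tag each datagram with a sequence number and just drop anything stale, instead of letting the transport stall the whole stream waiting for a retransmit the way TCP does.

```python
# Toy sketch of rolling your own staleness policy on top of UDP.
# The header layout and "latest state wins" rule are illustrative only;
# real game stacks layer acks, channels, and QoS on top of this idea.

import struct

HEADER = struct.Struct("!I")  # 32-bit big-endian sequence number

def pack(seq, payload):
    """Prefix a payload with its sequence number."""
    return HEADER.pack(seq) + payload

class StateReceiver:
    def __init__(self):
        self.latest_seq = -1
        self.latest_payload = None

    def on_datagram(self, data):
        """Keep only the newest state; stale/reordered packets are
        discarded immediately instead of blocking newer data."""
        (seq,) = HEADER.unpack_from(data)
        if seq <= self.latest_seq:
            return False          # out-of-date: drop it and move on
        self.latest_seq = seq
        self.latest_payload = data[HEADER.size:]
        return True
```

With TCP, the reordered packet would have forced the kernel to hold back everything newer until it arrived; here the game just ignores it.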

Lastly, as for re-inventing the wheel, there are actually plenty of vetted libraries that implement error checking, quality of service, etc for UDP while still retaining the configurability and real-time/games focus that TCP doesn't allow.

The majority of the industry runs on UDP. That's not an accident--especially when the majority of college students / outsiders don't know anything about UDP and focus on TCP in their studies. Different application, different tool for the job. When you're sending banking information you need to know it got there or didn't. When you're updating someone's head direction in a FPS game, you can tolerate huge amounts of inconsistency between players when the connection starts degrading.


> any sufficiently complicated UDP-based protocol contains an ad-hoc, informally-specified, bug-ridden, slow implementation of half of TCP

Assuming that's true, that doesn't change the fact that TCP is unusable for real-time gaming.

So the solution sounds like a formally specified, bug tested, and fast implementation of half of TCP, as long as it's not the half that breaks soft real-time gaming.


> TCP is unusable for real-time gaming.

Only the twitchiest of twitch games really need lower latency than TCP provides. So many amateur/indie game devs waste huge amounts of time messing with UDP because "everyone knows TCP isn't realtime" when TCP would have been perfectly suitable for their game.


Completely disagreed. If you're amateur/indie, use a premade engine that has networking. If the engine's networking isn't good enough for what you need, there are probably some options on the store that meet your needs.


That'd be applicable if TCP were a valid alternative. In many cases, TCP is not at all a valid alternative.


Experience doesn't agree with this however. Pretty much every game engine in existence has a well-tested UDP-based networking stack, without which low-latency game play simply wouldn't be possible. The other thing this statement ignores is that TCP currently has serious issues which I would call performance bugs, so it isn't even setting the bar particularly high.


I have a corollary to Greenspun's maxim: "TCP/IP can solve any given networking problem suboptimally."


do you mean 'uses'?


This comment has been downvoted, but I don't understand why. The phrase "user different pipes, because [magic]" is unclear to me as well.


The complaint was confusing to me because it didn't call out where the error was. The word "use" appears early on, so it looked like the complaint was incorrectly calling out a disagreement in plurality.

(Didn't downvote, but I did spend a good minute trying to parse the problem.)


For those interested in game networking, I found this to be a good book on the subject:

https://www.amazon.com/Multiplayer-Game-Programming-Architec...

Also this is a good paper (10 years old) on DOOM 3's networking:

http://mrelusive.com/publications/papers/The-DOOM-III-Networ...


> The CD keys we used were algorithmically generated so as to be very difficult to guess randomly. Because authenticating takes several seconds, and the odds of guessing a valid CD key are low, there is a large barrier to repetitive key guessing.

I would love to learn more about this. How did the authentication system work? How difficult was it to brute force? Does it simply involve a master key creating individual subkeys? Is it RSA?


I'm reminded of this blog post that uses a theorem prover/constraint solver for generating a serial key.

https://rolandsako.wordpress.com/2016/02/17/playing-with-z3-...


Somewhat related: some CD-key generators printed out valid keys for Steam back in the early days, around 2004.


It's possible this is better documented in old warez/cracking/serial # generation tutorials, if they're still around anywhere. This is only speculation, but I imagine the process would be something like breaking a CD-key into multiple parts (a unit#, salt, and hash of the unit#+salt):

1. Using unit# + some known/random salt, compute a hash

2. Mix-in the unit# + salt to the resulting hash in some deterministic fashion (prefix, suffix, or maybe inserting/replacing into a known/random location)

3. Convert the hash result into a CD-key format

Generate one for each unit produced, print on a label and attach to the distributed product. When installing, the process to verify the key would be

1. Extract the unit# from the entered CD-key (or, if randomly inserted/replaced, try each possible byte position, assuming it's the unit# + salt)

2. Using the unit# follow the same 3 steps above to get a CD-key

3. Compare the generated CD-key to the one entered for validity

Using a hash is fast, produces an output with a very large range (not easily guessed) that is unlikely to have collisions, and limits the possible valid keys to (roughly) however many bits are necessary for a unit#. This method also allows for offline validation, since the key to the hash is being shipped as part of the CD-key, and this was back in the days when a constant connection to the internet wasn't available or necessary for licensing.
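The steps above can be sketched roughly like this (the digest, field widths, and key format are all invented for illustration; real schemes varied and were usually far more obfuscated):

```python
# Toy sketch of the speculated CD-key scheme: hash(unit# + salt),
# mix the unit#/salt back in, encode as a key. Offline verification
# recovers the unit#/salt and recomputes the same hash.

import hashlib

def make_key(unit, salt):
    """Generation steps 1-3: hash unit#+salt, mix them back in as a
    prefix, and format the result as a CD-key."""
    digest = hashlib.sha256(f"{unit}:{salt}".encode()).hexdigest()[:10]
    return f"{unit:06d}-{salt:04d}-{digest.upper()}"

def verify_key(key):
    """Verification steps 1-3: extract unit#/salt from the entered key,
    regenerate the key, and compare."""
    unit_s, salt_s, _digest = key.split("-")
    return make_key(int(unit_s), int(salt_s)) == key
```

Note how verification needs no server: everything required to recompute the key ships inside the key itself, which is also exactly why a keygen built from the disassembled check defeats it.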

The downside, which was the bane of most producers, is that a reverse-engineer who acquires a copy of the software can use a debugger or disassembler to extract out the first 3 steps above to create a keygen. Once the keygen was out though basically anyone could generate their own valid keys. In order to combat this software makers started requiring an additional step that after validating the key it contacts a central server to see if anyone else has used it or not...

> How did the authentication system work? How difficult was it to brute force? Does it simply involve a master key creating individual subkeys? Is it RSA?

Producing individual builds/images to burn onto CDs was unlikely, as it would make the CD production process impractical -- a single build/image for burning CDs was likely used until they ran out of stock and/or wanted to ship updated versions of the software. Using RSA would otherwise be a great solution, as the software producer could generate private keys for each sold license. Many newer software licensing schemes take this approach.

Edit: Related answer on StackOverflow: https://stackoverflow.com/questions/3002067/how-are-software...


Another approach is to ship a key file: ~32 MB gives 2 million 128-bit hashes, and you can go up/down on key length and file length. This still takes more effort than a single master key, but generating 1-2 million keys from a master is perfectly reasonable.
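A quick sanity check on those figures (pure arithmetic, nothing assumed beyond 128 bits = 16 bytes):

```python
# Verify the key-file sizing claim: 2 million 128-bit hashes ~ 32 MB.
hashes = 2_000_000
bytes_per_hash = 128 // 8                     # 16 bytes per 128-bit hash
total_mb = hashes * bytes_per_hash / 1_000_000
# total_mb is 32.0, matching the ~32 MB key file mentioned above.
```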


This paper is what started a lot of things still in use today, the Tribes model:

http://www.pingz.com/wordpress/wp-content/uploads/2009/11/tr...


> "The days of needing to do such things as manually type in IP addresses to connect to remote servers are coming rapidly to a close. Therefore, it is important for you to provide a seamless user experience for your gamers as they go on-line."

>> I miss the days of playing CS-1.5 and having to use gametiger.com to find servers to play in. Lots of pool_day and de_dats.


Yup. I hate how online matchmaking these days has done away with the community feel that picking your own server provided.


Another downside of current gen matchmaking is just how SHORT games are. Back when I played Halo 1 PC's multiplayer, you could watch a single Blood Gulch CTF game go on for an entire WEEK before one team got the 7 captures required to win on that server. It created a really interesting dynamic where attempts at the flag were each unique events, requiring planning, preparation, camaraderie and teamwork in a game without voice chat. You could leave a server to go to bed, wake up the next day, hop back on, and see the same people on the same teams still going at it. It was much more IRC-like in feeling.

People built communities by just being on the same server often. Friendships were made, guilds too. But now all you have is 12 other little kids all free-for-alling shotguns for a brief 5 minute instant gratification fest


I can totally agree. There's no way to go on the same server a few times a week and form friendships. Now, in many games, you have to specifically add someone as a friend and wait for them to confirm it if you ever want to see that person again. But that's not organic, and that's not how friendships develop. I can't make friends on Overwatch, for example. I wish more people would talk about how this has been totally stripped from online games nowadays. People don't even bother talking during matches, mostly because, as you mention, they are so short.


That feel still exists in certain communities, such as Insurgency and Arma III, which are both great on Linux.


there are also still retro games being played, like Descent, with small and tight-knit communities.


Games like League of Legends support custom games, but I don't get the impression that they are used much and I don't know if there are persistent servers there, so you don't get to go hangout with the same gang/community at will.


I still play a short list of favorite servers in BF4. This is a game from just a couple of years ago. Are you saying some newer games only have random matchmaking and no joining a favorite server?


Most new games I've seen have only random matchmaking, unfortunately. Even Quake Champions.


Yes. Rainbow Six Siege is one of them.


It's close to the Quake 3 protocol as well. It's not uncommon to see fake server listings with fake players and fake server information; some of them even let you connect.


It is a 17-year-old article. Have the fundamentals stayed the same, or have there been any significant paradigm shifts in how game networking is done?


If anything, things have become less complex. Nowadays, you get higher update rates, so good prediction becomes less critical - and things get simplified a lot. A lot of the complexity of the network/game engine code in Q2/Q3 and derivative games (like Warsow) was dedicated to network prediction.


Still the same :)


Everybody keeps repeating that things are changing so fast in IT land that it's hard to keep up. This is a nice example of how that's only superficially true.


There are more resources like this on networking at http://gamedevs.org.



I don’t recommend lag compensation. It’s a bad idea because it degrades the experiences of more-committed / better players in order to cater to the less-committed / worse players.


How so? More committed / better players aim more precisely, and thus benefit the most from a system that judges their shots based on whether they hit the target on their screen - as opposed to a space some distance in front of their target, with the distance depending on their latency to the server (which of course varies depending on the server's location). On the other hand, worse players' aiming has a higher degree of luck/randomness, so it should be less affected by the target being off.

edit: Maybe you mean that lag compensation makes it harder to avoid others' shots. But does that really reward less skilled players? If anything it rewards players with higher ping, but that's not necessarily the same as worse or less committed. I suppose that really uncommitted players are more likely to try to game over some crappy Wi-Fi or even cellular connection, whereas for committed players, latency is more likely to result from distance. But that's a rather low threshold of commitment. In any case, having a really high ping comes with serious disadvantages, including rubber-banding, a harder time avoiding shots, a harder time aiming with anything that isn't lag compensated (e.g. projectiles), etc...

Also,

> High-budget games these days locate servers around the world, so that is not an issue.

Even assuming there are enough players to fill the servers, it's only a non-issue if you're matchmaking alone. If you want to play with specific people - through either a party system or traditional named/player-operated servers - and they happen to live far away, you can't avoid latency. Of course, there's always going to be some ping value beyond which the game isn't worth playing, even with lag compensation - but it's a higher value than without. And at intermediate levels of latency - say, 100ms - lag compensation can be the difference between a passable experience and an effectively perfect one.
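For reference, the compensation scheme being debated works roughly like this (a toy 1-D sketch with invented names, loosely following the description in the Valve wiki article linked upthread): the server keeps a short history of player positions and rewinds targets by the shooter's latency before hit-testing.

```python
# Toy sketch of server-side lag compensation: hit-test against where
# the target was on the shooter's screen, not where it is "now".
# The 1-D positions, tolerance, and API are all invented here.

import bisect

class LagCompensator:
    def __init__(self):
        # player_id -> list of (timestamp, position), appended in
        # server-time order so the list stays sorted.
        self.history = {}

    def record(self, player_id, timestamp, position):
        self.history.setdefault(player_id, []).append((timestamp, position))

    def position_at(self, player_id, t):
        """Position of the player at server time t (nearest older sample)."""
        samples = self.history[player_id]
        i = bisect.bisect_right(samples, (t, float("inf"))) - 1
        return samples[max(i, 0)][1]

    def test_shot(self, target_id, aim_pos, shot_time, shooter_latency,
                  tol=0.5):
        """Rewind the target by the shooter's latency, then hit-test."""
        rewound = self.position_at(target_id, shot_time - shooter_latency)
        return abs(rewound - aim_pos) <= tol
```

Without the rewind, a 100ms-ping shooter has to aim at where the target will be, which is exactly the "space some distance in front of their target" problem described above.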


High ping definitely makes hitscan aim easier in Valve FPS multiplayer. (Projectiles, OTOH, aren’t lag-compensated.) There was that one time a top-tier North American Team Fortress 2 team brought in a Brazilian Sniper as a substitute in an official match, apparently in an attempt to game the system. But that was an unusual event.


how are they less committed? is it their fault that they live in a country that doesn't exist on the server installation roadmap of a stupid, greedy corporation? that suggestion was really American, for lack of a better word.

let them run their own servers? tough luck, most games in 2017 don't allow dedicated server setup


I am talking about peoples’ home internet connections (or the PC bang where they play, etc). High-budget games these days locate servers around the world, so that is not an issue.

Yes, good connections are expensive for some people. If you want to do something about that, maybe you include amount of latency in a player’s ELO or something. I am just saying that I do not like Valve’s approach to the problem (which has been inherited by many other games) because I care about games and it makes games worse overall.


> High-budget games these days locate servers around the world, so that is not an issue.

exactly what I'm talking about. "around the world" may not be enough. And since they don't allow dedicated servers hosted by players, they better have lag compensation or to hell with them.


Fast-paced FPS games become unplayable without lag compensation, so what do you mean exactly?


[flagged]


How popular is your online game?

P.S. Someone the other day asked why I don’t post much to HN (and don’t put much effort in when I do). This kind of junk is exactly why!


It was reasonably popular and got picked up by a publisher and perhaps you have played it, but I won't talk about that.

The problem with your original assertion is that you have ignored the virtual world player types as described by Richard Bartle and positioned the Achiever view as the sole one of import, subsequently question begging your way to a narrow conclusion about what quality is.

I will explain further but first I should point out that I am deliberately engaging you on your weaknesses. You have a lengthy track record of preaching your views as unqualified absolutes and then going out of your way to never show vulnerability under fire. You seem borderline incapable of saying "Oh, you got me on that one. I was wrong!" And you are far too visible to get away with that without being called out by someone who knows the topic better than you do and has the credentials to back it up. That is why whenever you show yourself online you are getting more and more hostility. Rather than engage in pedagogy and encourage further questioning you give speedy dismissals to all sorts of subjects. You are necessarily wrong on at least some of them, being human. You admit to none of it. When you become challenged on Twitter you just block them. It looks alternately arrogant and pitiful, and people are increasingly aware with time that this behavior is the "Jon Blow brand". It has already shut you out of opportunities to benefit from, at minimum, the interesting conversation you seem to crave. I know this because I am in the back room on some of it. You would be let in but for this personal stuff. There are other ways to conduct oneself that avoid this particular problem, but it is up to you to rise to the challenge.

Now, to Bartle. Fair play and a high skill ceiling is the secondary goal of the design in popular online games, even e-sports: the first goal is always a balance of accessibility for each of the player types so that nobody is left with nothing to do or turned towards toxicity. That includes features that make dramatic trade-offs throughout the design, like auto-aim functionality versus encouraging fine motor skills, global player ranking systems versus buddy systems versus community servers, team damage to encourage tactics versus potential for griefing and yes, fine-grained netcode decisions. The idea of players "deserving" something for their great effort to develop skill, when made central to the premise as in one-on-one fighters with extreme depth and difficult input execution, limits the potential audience to a niche Achiever population. It leaves out players who just want to play with their friends and don't have the best machines or connections, players who are more interested in exploring side systems of the game than the central mechanic, and players who want to elicit strong, surprised reactions (whether through clever tactics or simple trolling) and will do it with an aimbot if you've deprived them of alternatives.

What compensation does is twofold: it lowers the skill ceiling by moving all the considerations around time from the adversary's shooting performance to the target's perception of when they took damage. It also improves accessibility and feedback for all aiming and shooting because it matches player perceptions more closely. Players experience a deeper sense of control with compensation for the same reason that improving frame rates improves the sense of control.

For nearly every game with "twitch" gameplay this is a no-brainer choice: Lag compensation wheels the design back from a Byzantine technical artifact that inflates the skill ceiling and towards a tighter feedback loop that most players, even the less dedicated ones, will intuitively understand and are more likely to appreciate (hence: popular, accessible). Compensated shooting can still be made a skillful process, as games like Counter-Strike have ably demonstrated, without being opaque or tied to ping. And the downside risk is low because a death that feels unearned is rare even in the cases where it comes with a substantial delay. Surely if you were playing well you shouldn't have been in a position to take damage, right? The player can rest easy feeling that their tactics were at fault for putting them in a risky position, not the game.

And there is in fact a specific reason why I brought up your game. It focused on slower-moving projectiles. Thus target leading is already built into the core feedback loop of aiming, and the resulting latency from uncompensated shooting is less relevant to the experience. This would give you the impression based on personal experience that it's not required for accessibility. All your original remark has to do is find the qualifying factors in your experiences and lay them out, and you'd get the quality reply you seek without a fight.


I don’t think the Bartle stuff is useful in terms of concrete game design, the main weapon in my online game (from the 1990s!) was hitscan and not slow-moving projectiles as you assert (so I don’t think you know what you are talking about there), and the whole tone of your reply is personally hostile while asserting really weird things, so I don’t see anything else meaningful worth replying to, sorry.


I have had a suspicion for a while that "Flat Earthers" are trolling everyone into giving them a free ride into space in order to prove them wrong.




