I'm going to guess it's because the ABI says they are passed in integer and floating-point registers, so the actual order they appear in doesn't matter?
Easy bugs hiding in plain sight, you say?
> The fix was a few lines of code to stop traversing after twenty navigation nodes, presumably saving a few million dollars in server and power costs
I wonder if this is Stadia?
EDIT: Whoops, I don't think so:
> needed to move on to a different company
Is this a joke, half a joke, or serious?
I have worked on codebases where 8,000 compiler warnings just meant "So no errors, then? That means the build is fine."
But 7,990 of those warnings had been there for at least five years. Adding the 8,001st warning in your build still meant that you had probably done something ill-advised or poorly thought out.
It's just very hard to allow/forbid the right warnings per compilation unit (this was in the C#/.NET ecosystem), i.e. to tell the compiler what you did or did not care about.
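For what it's worth, in the modern .NET SDK this kind of per-project policy can be expressed with the MSBuild `NoWarn` and `WarningsAsErrors` properties. A minimal sketch; the specific warning IDs below are just illustrative examples, not a recommendation:

```xml
<!-- Illustrative .csproj fragment; the warning IDs are examples only -->
<PropertyGroup>
  <!-- Suppress warning types with a poor signal-to-noise ratio -->
  <NoWarn>$(NoWarn);CS1591;CS0618</NoWarn>
  <!-- Promote selected warning types to build-breaking errors -->
  <WarningsAsErrors>CS8602;CS8604</WarningsAsErrors>
</PropertyGroup>
```

Each project compiles with its own settings, which is consistent with the point below that a library's suppressions apply when the library is compiled, not when its consumers are.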
I've generally found it quite practical to get warning counts to zero with some combination of fixing the issues and disabling warnings that have a poor signal-to-noise ratio. The one exception was /analyze builds. There I added filtering so that I could treat some warning types as fatal, and others would be reported on only when they were new.
Well, you can do a clean checkout of the master branch, note N = the number of warnings, then make your changes and compare against N. At least that is what I did. It was not very popular, since no one ever tracked N on master at all.
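The comparison itself is easy to script. A minimal sketch, assuming MSBuild/C#-style diagnostic lines ("warning CS1234: ..."); `count_warnings` and `new_warnings` are hypothetical helper names:

```python
import re

# Matches MSBuild/C#-style diagnostics such as:
#   Foo.cs(12,5): warning CS0168: The variable 'x' is declared but never used
WARNING_RE = re.compile(r"\bwarning\s+\w+\d+", re.IGNORECASE)

def count_warnings(build_log: str) -> int:
    """Count warning lines in a build log (duplicates included)."""
    return sum(1 for line in build_log.splitlines() if WARNING_RE.search(line))

def new_warnings(baseline_log: str, branch_log: str) -> int:
    """N comparison: warnings the branch adds relative to master's baseline."""
    return count_warnings(branch_log) - count_warnings(baseline_log)
```

Feed it the captured output of a clean build of master and of your branch; a positive result flags the change for a closer look. Counting alone can miss a change that adds one warning and removes another, so diffing the actual warning lines is a natural refinement.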
> I've generally found it quite practical to get warning counts to zero with some combination of fixing the issues and disabling warnings that have a poor signal-to-noise ratio. The one exception was /analyze builds. There I added filtering so that I could treat some warning types as fatal, and others would be reported on only when they were new.
I completely agree that this is how it should be done.
I vaguely remember that this is, however, quite complicated to achieve in the .NET ecosystem. Something about a library A being consumed by another project B: A's disabled warnings are respected when compiling A, and B's when compiling B, so as a client you could never suppress warnings coming from A yourself (?).
This was quite a while ago, and since I didn't work much on A-type projects (only B-type ones), investigating it was never my responsibility. I just got told "It's not that simple" by the A-type employees.
Long-term I find them annoying to work with because they seem to chase progressively higher-cost, lower-return changes. Especially in a startup context, they're obsessed with being technically correct over building something people actually want. Greenfield work takes 10x as long as it needs to because they refuse to release something quickly and iterate.
I took the time to reduce warnings from 10,000s to 10s slowly over the course of months in a legacy codebase that no one else thought could be done within reasonable time and effort.
I've worked with many people who overestimate the effort and underestimate the value of cleaning up code, all for the sake of immediate velocity.
If your software will be irrelevant after one or two years, though, you can freely hack together whatever you want, I would say.
> And yes, that discounts languages like js or python for long-lived software projects.
That's the kind of dogmatic statement that would be a red flag to me. The last company I was at was wildly successful using Ruby, JS and Python.