He asks how it's possible, but avoids the obvious?
PORT_Memcpy(cx->u.buffer, sig->data, sigLen);
break;
Fig 3. The signature size must match the size of the key, but there are no other limitations. cx->u is a fixed-size buffer, and sig is an arbitrary-length, attacker-controlled blob.
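Conceptually, the missing defense is a single comparison: reject any signature longer than the destination before copying. A hedged sketch of that check — the struct and field names below are illustrative stand-ins, not NSS's actual definitions:

```c
#include <string.h>

/* Illustrative stand-ins for the NSS types; not the real definitions. */
#define MAX_SIG_LEN 2048

struct verify_ctx {
    unsigned char buffer[MAX_SIG_LEN]; /* fixed-size, like cx->u */
};

struct sec_item {
    unsigned char *data;
    unsigned int len; /* arbitrary, attacker-controlled */
};

/* Returns 0 on success, -1 if the signature would overflow the buffer. */
int copy_signature(struct verify_ctx *cx, const struct sec_item *sig)
{
    if (sig->len > sizeof(cx->buffer))
        return -1; /* the bounds check the vulnerable code was missing */
    memcpy(cx->buffer, sig->data, sig->len);
    return 0;
}
```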
Dude. This function just copied into an object blindly. It didn't interrogate the object to determine its maximum size (functionality you can add to a custom class). There's no preprocessor, postprocessor, compiler, etc. that enforces object boundaries (afaik). This copy probably didn't even require similar object types to move memory from one place to another.
The failure of C/C++ coding like this is bad design, bad process, and bad practice. This could have been prevented by merely writing simple object-oriented code that enforces boundaries when manipulating data. Sure, C programmers love to play fast and loose, but this is no justification for writing code that refuses to enforce correct behavior.
It's not even hard. It's just more lines of code. My theory is that this kind of thing persists because C/C++ programmers tend to lean on packaged shared libraries rather than a "programming language package manager", and so the ecosystem for layers upon layers upon layers of abstractions never became fashionable the way it did for higher level languages. That mess of abstractions could have added some really basic safety frameworks. So really, I think the root cause is culture.
It's not like C is so primitive that it's impossible to create a program that checks bounds when copying data. You definitely can - you just have to actually do it. But there was no culture for rigor. Even the Linux kernel is rife with gotos, almost a middle finger to the general consensus on good practice. "But we're so good, it's perfectly fine for us!" Sure buddy. Tell me that in a month when the next Linux 0day comes out.
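Enforcing boundaries in the type doesn't even require C++ classes: plain C can pair a buffer with its capacity and route all writes through a checked helper. A minimal sketch under that assumption (the names here are my own, not from any library):

```c
#include <stddef.h>
#include <string.h>

/* A buffer that knows its own capacity. */
struct bounded_buf {
    unsigned char *data;
    size_t cap; /* capacity of data */
    size_t len; /* bytes currently used */
};

/* Checked copy: refuses anything that doesn't fit. Returns 0 on success. */
int bounded_buf_copy(struct bounded_buf *dst, const void *src, size_t n)
{
    if (n > dst->cap)
        return -1; /* would overflow; reject instead of corrupting memory */
    memcpy(dst->data, src, n);
    dst->len = n;
    return 0;
}
```

Callers that ignore the return value still exist, of course, but at least the unchecked `memcpy` is no longer the path of least resistance.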
> He asks how it's possible, but avoids the obvious?
He’s not asking “how does this cause corruption”, he’s asking “how is it possible that a bug like this can occur in a code base like this, and not be caught earlier”.
He then enumerates all the myriad “correct” things that Mozilla do (did?), including code reviews, fuzzing, static analysis, bug bounties, etc., and yet something as trivial as copying an arbitrarily large amount of data into a buffer without verifying it fit went unnoticed.
Personally I think it’s a good example of how overvalued static analysis is when something this trivial goes unreported (I suspect the issue is that SA tools have to avoid too many false positives, and reporting every memcpy that only checks one size could be too “noisy”).
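The noise problem is easy to illustrate: to a local checker, the two copies below look nearly alike, yet only the first ties its length visibly to the destination's size. A simple "flag every memcpy not bounded by sizeof(dest)" rule would report the second even though it is (here) safe. This is illustrative code of my own, not from NSS:

```c
#include <string.h>

#define WIRE_MAX 48 /* hypothetical protocol limit, enforced by a caller */

int parse(const unsigned char *in, unsigned int in_len)
{
    unsigned char a[64], b[64];

    if (in_len > sizeof(a))
        return -1;
    memcpy(a, in, in_len); /* length visibly bounded by the destination */

    if (in_len > WIRE_MAX)
        return -1;
    memcpy(b, in, in_len); /* bounded only by a distant invariant:
                              safe because WIRE_MAX < sizeof(b), but a
                              local pattern match can't know that */
    return 0;
}
```

Suppressing reports like the second one is exactly how a genuinely unchecked copy can slip through alongside them.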
What’s the significance of posting this here 3 years later? What did user “fanf2” or the people upvoting the submission find interesting about the blog post?
Their 'about' suggests that hostmaster@ and postmaster@ at cam.ac.uk are two of their email addresses, which seems doubtful (that's an understatement).
I checked a few stories they submitted and some were new, some were reposts of relatively highly scoring stories from a few years ago.
Shenanigans?
Looks like it might be a slightly obfuscated spamming to boost an account so it can be used for promotion/advertising. Might be wrong.
Tony worked for years on email and DNS things at Cambridge, and now works for ISC on DNS stuff.
He doesn’t post here (or anywhere else) for promotion or advertising, but he has run a link blog of anything he thinks looks interesting for decades and has a workflow for it that posts to mastodon and other places.