Hacker News

Can you imagine that 20 years ago (too early for most of us to have been programming at this level) you missed this, thinking it couldn't possibly be anything but 32 bits for the purpose it was written for?

That said, never assume, because "dict.cpp" stuff happens all the time, except its modern web-incantation is usually called "copy-pasta."




I love cperciva's solution. OK, not solution, but at least paranoia. Back when I worked at a place called Morada in the early 90s, we had to port a C runtime library for the product to everything from 16-bit x86 "real mode" (DOS) to the 64-bit DEC Alpha, so word sizes were all over the map.

Trying to define portable size definitions in C can make you crazy (e.g., trying to save a tokenized binary file from source code such that the binary ports between platforms).

Kudos to the author for finding this bug.


Only if you'd been raised on Java.

There had been the shift from 16->32 bit not long before that, so it's not that much of a stretch, and 8->16 before that (well after I got my first computer).

We found an almost identical bit-width mismatch bug in a programming language runtime: it caused a bitmap of TCP connections to appear to go crazy on a customer install of a core piece of software. These bugs cost.


Yeah, in that time period you couldn't get away from writing very defensively. It was even worse earlier. When I wrote C code for x86 there were all sorts of memory models you could choose from at compile time, and we were producing both 16-bit and 32-bit (for the daring) versions of our product. You couldn't assume anything.

I write Java now, and this is not something I miss.





