

The byte order fallacy - mike_esspe
http://commandcenter.blogspot.se/2012/04/byte-order-fallacy.html

======
csense
I do programming in Java. I don't like serialization because I'm old-fashioned
and I want to have control over the exact format of my binary files (not least
to ensure interoperability with other languages).

Coming from a C background, I noticed the first time I did binary file I/O in
Java that the lack of support for casting arbitrary pointers to char* forces
you to use the portable solution.

As for why a lot of software has to worry about endianness, I assume that
people start out with:

    
    
      typedef struct whatever
      {
         int a;
         int b;
         char c[MAX_C_LENGTH];
      } S;
    

And then they can get quick-and-dirty file-saving code like this:

    
    
      S inst;
      inst.a = 1;
      inst.b = 2;
      strncpy(inst.c, "Hacker News", MAX_C_LENGTH);
      fwrite(&inst, sizeof(S), 1, fp);
    

It's robust in that the same version of the code, running on the same machine,
will be able to read and write files, even when more fields are added to the
structure or MAX_C_LENGTH is changed.

Of course, when it comes to interop between different versions of the code, or
running the code on radically different machines (32- vs 64-bit, or different
endianness CPU), it will break. But for many applications this doesn't occur
until after release. And then nobody wants to rewrite the saving and loading
code from scratch, so a translation layer is patched in to allow the non-
portable saving/loading code to give the correct result on problematic
machines.

