Do you know if compilers are smart enough to return 04 even on big-endian architectures nowadays? For some reason I'm under the impression that compilers (at least clang and gcc) are able to change this from "first byte in x" to "least significant byte in x", but I don't actually know why I think that. Maybe embedded compilers typically don't?
No, and it would be wrong for it to do so, because you've given it a very explicit set of instructions about what to do: "Give me the value of the byte of memory at the start of x."
To do what you're asking, you'd do something like this:
    unsigned char c = (unsigned char)x;  /* low-order byte, regardless of endianness */
That will give you the low-order byte of x. But returning that when you've told it to fetch the byte at the base address of x would, on a big-endian system, simply be wrong behavior. At least in C. I can't speak to higher-level languages since I don't work in them.
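A minimal sketch of the difference, assuming x is a 32-bit unsigned int holding 0x01020304 (that value is my assumption, standing in for the example upthread):

    #include <stdio.h>

    int main(void) {
        unsigned int x = 0x01020304;
        unsigned char first = *(unsigned char *)&x; /* byte at the base address of x */
        unsigned char low = (unsigned char)x;       /* low-order byte of the value */

        /* Little-endian prints 04 / 04; big-endian prints 01 / 04. */
        printf("first byte in memory: %02x\n", first);
        printf("low-order byte:       %02x\n", low);
        return 0;
    }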
To expand slightly on what Syonyk said: the compiler cannot do it, because the object is stored between addresses c and c + sizeof(unsigned int). You can use this information to, for example, copy the object to another place with memcpy, and that of course wouldn't work if c weren't pointing to the "leftmost" byte in memory.
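For instance (same assumed value as above), this round-trip works on any endianness exactly because c addresses the first byte of x's object representation:

    #include <assert.h>
    #include <string.h>

    void copy_example(void) {
        unsigned int x = 0x01020304;
        unsigned int y;
        unsigned char *c = (unsigned char *)&x;

        /* Copies all sizeof(unsigned int) bytes starting at the base
           address of x; endianness never enters into it. */
        memcpy(&y, c, sizeof x);
        assert(y == x);
    }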
If you wanted to return 04 on big-endian architectures, you can use a bitmask instead: x & 0xFF.
Since the & operates on the value of x rather than on its bytes in memory (the constant 0xFF is stored as 00 00 00 FF on a big-endian machine and FF 00 00 00 on a little-endian one), it works on both platforms.
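Incidentally, comparing the two approaches gives you a cheap runtime endianness test (just a sketch; a compile-time check is usually preferable):

    #include <stdio.h>

    int main(void) {
        unsigned int x = 1;
        /* On little-endian the low-order byte sits at the base address,
           so the first byte in memory equals x & 0xFF; on big-endian
           that byte is 0. */
        if (*(unsigned char *)&x == (x & 0xFF))
            puts("little-endian");
        else
            puts("big-endian");
        return 0;
    }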
If you're reading a file in binary format from disk, though, you always have to know whether the data you're reading was written little-endian or big-endian on disk.
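That's why the usual portable trick is to assemble the value from bytes with shifts rather than fread-ing straight into an int. A sketch for a u32 stored little-endian on disk (the on-disk layout here is assumed):

    #include <stdint.h>
    #include <stdio.h>

    /* Reads a 32-bit value stored little-endian on disk, on any host.
       Returns 0 on success, -1 on a short read. */
    int read_u32_le(FILE *f, uint32_t *out) {
        unsigned char b[4];
        if (fread(b, 1, sizeof b, f) != sizeof b)
            return -1;
        /* Shifts operate on values, not bytes in memory, so this is
           endian-independent. */
        *out = (uint32_t)b[0]
             | (uint32_t)b[1] << 8
             | (uint32_t)b[2] << 16
             | (uint32_t)b[3] << 24;
        return 0;
    }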