  // we don't even pretend to work on anything but i386 and LE arm
  const unsigned char c[] = { 0x78, 0x56, 0x34, 0x12 };
  assert(sizeof(int) == 4 && *((int*)c) == 0x12345678);



Could you explain what this does to those of us who are C/C++ challenged? :)


The software will assert out (bomb, hard) if it's running on a platform which isn't 32-bit (that's the sizeof check) and little-endian (checked by putting the individual bytes of 0x12345678 into memory in little-endian order, and making sure that, when read back as an integer, the value comes out right).
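
For anyone who wants to see it spelled out, here's a minimal sketch of the same check as a standalone program (the memcpy and the variable names are my own additions; memcpy avoids the strict-aliasing cast in the original):

  #include <assert.h>
  #include <string.h>

  int main(void) {
      /* bytes of 0x12345678 laid out least-significant-byte first */
      const unsigned char c[] = { 0x78, 0x56, 0x34, 0x12 };
      int value;

      assert(sizeof(int) == 4);        /* "int" must be exactly 4 bytes */
      memcpy(&value, c, sizeof value); /* reinterpret the bytes as an int */
      assert(value == 0x12345678);     /* holds only if the lowest address
                                          is the least significant byte,
                                          i.e. the machine is little-endian */
      return 0;
  }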

Except it doesn't even do that well. Traditionally the C "int" type was meant to be the machine's natural word size. But most 64-bit platforms have adopted the "LP64" or "LLP64" conventions, under which "int" remains a 32-bit type. One reason for this is that most values fit comfortably into 32 bits, so a 64-bit "int" wastes memory. Another is to keep shoddy code like this running!
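
To make that concrete, here's a rough sketch (the sizes in the comments are the usual LP64/LLP64 values, and the pointer-size check is just one way the original could have tested what it apparently meant to test):

  #include <stdio.h>

  int main(void) {
      /* Typical 64-bit ABIs:
         x86-64 Linux/macOS (LP64):  int = 4, long = 8, void* = 8
         64-bit Windows    (LLP64):  int = 4, long = 4, void* = 8
         Either way sizeof(int) == 4, so the original assert
         passes happily on a 64-bit machine. */
      printf("int: %zu  long: %zu  void*: %zu\n",
             sizeof(int), sizeof(long), sizeof(void *));

      /* A check on the pointer model would have to look at the
         pointer size directly, not at "int": */
      if (sizeof(void *) != 4)
          printf("not a 32-bit pointer model\n");
      return 0;
  }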



