
Is that standard behaviour, or just what one compiler does? Does char have a different meaning outside the preprocessor? Thanks anyway.

Outside of the preprocessor, the char type is a one-byte integer (whether it's signed or not is implementation-defined).

So a good solution is to write a test that detects this, for example in a configure script, have it define an appropriate preprocessor constant, and then test that constant instead.
