
Every problem has a solution which is simple, elegant, and doesn't work. This is it.

Section 6.10.1, paragraph 4: "... For the purposes of this token conversion and evaluation, all signed integer types and all unsigned integer types act as if they have the same representation as, respectively, the types intmax_t and uintmax_t..."

Inside preprocessor directives, your chars aren't chars any more.

So what are they, then? Why doesn't it work? Your quote doesn't clarify.

Your "char" in a preprocessor directive is either a uintmax_t or an intmax_t. Either way, it's going to end up as #if 128 > 127 or #if 256 > 255 -- so the first case will always end up being included.

Is that standard behaviour, or just something one compiler does? And does char mean something different outside the preprocessor? Thanks anyway.

Outside of the preprocessor, the char type is a one-byte integer (whether it's signed or not is implementation-defined).

So the good solution is to write a test that finds out, for example in a configure script, set the appropriate preprocessor constant, and test that constant instead.
