
"That’s a lot of trouble being caused by a two-line function. C may be a small language, but it’s not a simple one." I disagree. C is very simple. You need to know C is not suppoused to work the same on platform and I guess behaviour you have tested is not defined by standard. I think every good C programmer know when he have to be causious because behaviour may be platform dependent. In matter of fact you did as well. I guess case you've studdied is not common to be used in real code. If so, there is very easy solution:

  #if ((char)CHAR_MAX) + 1 > ((char)CHAR_MAX)
   /* some code here */
  #else
   /* some code here */
  #endif
There is no problem here, as far as I can see.



Every problem has a solution which is simple, elegant, and doesn't work. This is it.

Section 6.10.1, paragraph 4: "... For the purposes of this token conversion and evaluation, all signed integer types and all unsigned integer types act as if they have the same representation as, respectively, the types intmax_t and uintmax_t..."

Inside preprocessor directives, your chars aren't chars any more.


So what are they? Why doesn't it work? Your quote doesn't clarify that.


Your "char" in a preprocessor directive is either a uintmax_t or an intmax_t. Either way, it's going to end up as #if 128 > 127 or #if 256 > 255 -- so the first case will always end up being included.


Is that standard behaviour, or just what one compiler does? Does char have a different meaning outside the preprocessor? Thanks anyway.


Outside of the preprocessor, the char type is a one-byte integer (whether it's signed or not is implementation-defined).
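
For what it's worth, here's a quick sketch of checking that at run time (purely illustrative, not from the article):

  #include <limits.h>
  #include <stdio.h>

  int main(void)
  {
      /* CHAR_MIN is negative if plain char is signed and 0 if it is
         unsigned; which one you get is fixed by the implementation. */
      printf("char is %s\n", CHAR_MIN < 0 ? "signed" : "unsigned");
      printf("sizeof(char) is always %zu\n", sizeof(char));
      return 0;
  }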


So the good solution is to make a test that finds it out, for example in a configure script, set a proper preprocessor constant, and test that constant instead.
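
Something like this, perhaps -- a minimal sketch of such a configure-time probe (the file and macro names are just made up for illustration):

  /* char_probe.c: the configure step compiles and runs this and redirects
     the output into a generated header, e.g. config.h. */
  #include <stdio.h>

  int main(void)
  {
      /* Evaluated at run time, where char really behaves like char. */
      printf("#define MY_CHAR_IS_SIGNED %d\n", (char)-1 < 0);
      return 0;
  }

Then the real code includes the generated header and tests #if MY_CHAR_IS_SIGNED, which works fine in the preprocessor because by that point it's just a plain 0 or 1.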



