
So what are they? Why doesn't it work? Your quote doesn't clarify.



Your "char" in a preprocessor directive is either a uintmax_t or an intmax_t. Either way, it's going to end up as #if 128 > 127 or #if 256 > 255 -- so the first case will always end up being included.


Is that standard behaviour, or just what one compiler does? Does char mean something else outside the preprocessor? Thanks anyway.


Outside of the preprocessor, the char type is a one-byte integer (whether it's signed or not is implementation-defined).
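As a small illustration (not from the thread), outside the preprocessor you can observe both properties directly at run time:

    #include <limits.h>
    #include <stdio.h>

    int main(void) {
        /* sizeof(char) is 1 by definition; whether plain char is signed
           is implementation-defined, which CHAR_MIN/CHAR_MAX reflect. */
        printf("sizeof(char) = %zu\n", sizeof(char));
        printf("char is %s\n", (char)-1 < 0 ? "signed" : "unsigned");
        printf("CHAR_MIN = %d, CHAR_MAX = %d\n", CHAR_MIN, CHAR_MAX);
        return 0;
    }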


So the good solution is to write a test that finds this out -- for example in a configure script -- set a suitable preprocessor constant, and test that constant instead. A rough sketch of that idea is below.
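This is a sketch under assumptions: the probe program and the macro name CHAR_IS_SIGNED are made up for illustration, and the configure step is assumed to compile, run the probe, and pass the result to the build as a -D flag:

    /* probe.c -- compiled and run by the configure step;
       prints 1 if plain char is signed, 0 otherwise. */
    #include <stdio.h>
    int main(void) {
        printf("%d\n", (char)-1 < 0);
        return 0;
    }

    /* The build then passes e.g. -DCHAR_IS_SIGNED=1 (hypothetical name),
       and code can branch on it in the preprocessor: */
    #if CHAR_IS_SIGNED
      /* ... signed-char handling ... */
    #else
      /* ... unsigned-char handling ... */
    #endif

(For signedness specifically, CHAR_MIN from <limits.h> expands to an ordinary integer constant, so it can also be tested directly in #if.)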



