barrkel 670 days ago | link | parent

I've seen more problems from unsigned ints than from signed ints (in particular, people doing things like comparing with n-1 in loop boundary conditions). There's a reason Java, C#, etc. default to signed integers. With unsigned chars I have no quibble (and Java, C# use an unsigned byte here).
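
A hypothetical sketch of the kind of thing I mean (the function and names are made up):

    #include <stddef.h>
    #include <stdio.h>

    /* Iterate over adjacent pairs in a buffer.
       If len == 0, the unsigned len - 1 wraps around to SIZE_MAX and the
       loop reads far past the end. With signed types, len - 1 would just
       be -1 and the loop body would never run. */
    void print_pairs(const int *buf, size_t len)
    {
        for (size_t i = 0; i < len - 1; i++)   /* bug when len == 0 */
            printf("%d\n", buf[i] + buf[i + 1]);
    }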


rcfox 670 days ago | link

Unsigned integer overflow has defined behaviour in C, while signed overflow doesn't. Is it really better to protect people from a simple logical error by exposing them to possible undefined behaviour?

With signed integers, you'll run into the same problem with comparing to n+1 at INT_MAX or n-1 at INT_MIN.
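
A minimal sketch of the difference (the variables here are made up):

    #include <limits.h>

    void overflow_demo(void)
    {
        unsigned int u = UINT_MAX;
        u = u + 1;          /* defined: wraps around to 0 (arithmetic mod 2^N) */

        int s = INT_MAX;
        /* s = s + 1; */    /* undefined behaviour: the compiler may assume it never happens */
        (void)u; (void)s;
    }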

-----

barrkel 670 days ago | link

0 is a really common value for an integer variable in programs. INT_MAX and INT_MIN are not.

It's just my experience. Don't get too wound up about it ;)

-----

raverbashing 670 days ago | link

"(in particular, people doing things like comparing with n-1 in loop boundary conditions"

There's your problem: that's like saying cars are more dangerous than motorcycles because your finger can get caught in the door.

"There's a reason Java, C# etc. default to signed integer"

Legacy? And in Java/C# you usually use ints, not so much chars, shorts, etc., and the casts are probably pickier.

I stand by my point: you should only use signed if you know what you're doing, and only for a specific use (like math).

-----

barrkel 670 days ago | link

Signed arithmetic is generally only problematic when you hit the upper or lower bound. The right answer is almost never to use unsigned; instead, it's to use a wider signed type.

It's far too easy to get things wrong when you add unsigned integers into the mix; ever compare a size_t with a ptrdiff_t? Comes up all the time when you're working with resizable buffers, arrays, etc.
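
For instance, something along these lines (a hypothetical buffer check, not code from anywhere in particular):

    #include <stddef.h>

    /* Do 'needed' more bytes fit between pos and end? */
    int fits(const char *pos, const char *end, size_t needed)
    {
        ptrdiff_t remaining = end - pos;   /* signed */
        /* The "obvious" comparison is wrong: in
               return remaining >= needed;
           a negative 'remaining' is converted to size_t and compares as huge. */
        return remaining >= 0 && (size_t)remaining >= needed;
    }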

And no, Java did not choose signed by default because of legacy. http://www.gotw.ca/publications/c_family_interview.htm:

"Gosling: For me as a language designer, which I don't really count myself as these days, what "simple" really ended up meaning was could I expect J. Random Developer to hold the spec in his head. That definition says that, for instance, Java isn't -- and in fact a lot of these languages end up with a lot of corner cases, things that nobody really understands. Quiz any C developer about unsigned, and pretty soon you discover that almost no C developers actually understand what goes on with unsigned, what unsigned arithmetic is. Things like that made C complex. The language part of Java is, I think, pretty simple. The libraries you have to look up."

Unsigned is useful in a handful of situations: dealing with >2GB of address space on a 32-bit machine; bit twiddling where you don't want any sign-extension interference; and hardware/network/file protocols and formats where things are defined as unsigned quantities. But most of the time it's more trouble than it's worth.
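
For the bit-twiddling case, e.g. (a small sketch, assuming big-endian bytes from some protocol):

    #include <stdint.h>

    /* Assemble a 32-bit value from four big-endian bytes. If p were a
       (signed) char pointer, a byte like 0x80 would sign-extend to
       0xFFFFFF80 on promotion and clobber the high bits of the result. */
    uint32_t read_be32(const uint8_t *p)
    {
        return ((uint32_t)p[0] << 24) | ((uint32_t)p[1] << 16)
             | ((uint32_t)p[2] << 8)  |  (uint32_t)p[3];
    }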

-----

raverbashing 670 days ago | link

"The right answer is almost never to use unsigned; instead, it's to use a wider signed type.""

Depends on the case, of course, but yes, you'll rarely hit the limits of an int (with a char, all the time).

"It's far too easy to get things wrong when you add unsigned integers"

I disagree: it's very easy to get things wrong when dealing with signed.

Why? For example, (x+1) < x is never true for unsigned ints. Now suppose x is a user-provided value. See where this is going? Integer overflow exploit.

Edit: stupid me, of course x+1 < x can be true for unsigned. But unsigned makes the check easier (because you don't also need to test for x < 0).
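
The pattern I have in mind (hypothetical; the allocation helper is made up):

    #include <stdint.h>
    #include <stdlib.h>

    /* Allocate a header plus 'count' fixed-size records.
       With unsigned size_t, one wraparound check covers it; a signed
       count would additionally need a count < 0 check before this. */
    void *alloc_records(size_t count, size_t record_size, size_t header)
    {
        if (record_size != 0 && count > (SIZE_MAX - header) / record_size)
            return NULL;    /* header + count * record_size would wrap */
        return malloc(header + count * record_size);
    }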

"what unsigned arithmetic is"

This is computing 101, really (well, so is signed arithmetic). Then you have people who don't know what signed or unsigned means writing code. Sure, signed is more natural, but the limits are still there, and you end up with people who don't get why the sum of two positive numbers comes out negative.

-----

barrkel 670 days ago | link

As you admit, you just made a mistake in an assertion about unsigned arithmetic. You're not very convincing! ;)

-----

raverbashing 670 days ago | link

As I said, if I'm not convincing, please go ahead and use signed numbers while I get the popcorn ;)

Here's something you can try: resize a picture (a pure bitmap), with antialiasing, on a very slow machine (think a 300MHz VIA x86). Difficulty: without libraries.

-----

jbooth 670 days ago | link

Java actually uses a signed byte type, which I guess was to keep a common theme with all the other types, but in practice it leads to a lot of b & 0xFF and upcasting to int when dealing with bytes coming from streams.

-----



