
"The right answer is almost never to use unsigned; instead, it's to use a wider signed type.""

Depends on the case, of course, but yes, you'll rarely hit the limits of an int (with a char, all the time).

"It's far too easy to get things wrong when you add unsigned integers"

I disagree; it's also very easy to get things wrong when dealing with signed.

Why? For example, (x + 1) < x is never true on unsigned ints. Now, suppose x is a user-provided value. See where this is going? Integer overflow exploit.

Edit: stupid me, of course x + 1 < x can be true on unsigned. But unsigned still makes the check easier (because you don't need to test for x < 0, and the wraparound is well-defined).
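
For what it's worth, here is a minimal C sketch of both overflow checks (my own illustration, not from the thread): with unsigned, wraparound is well-defined, so you can test after the addition; with signed, the overflow itself is undefined behavior, so the test has to come before it.

    #include <limits.h>
    #include <stdio.h>

    /* Unsigned: wraparound is well-defined, so a post-hoc test works. */
    int add_overflows_unsigned(unsigned x, unsigned y)
    {
        return x + y < x;  /* true iff the sum wrapped around */
    }

    /* Signed: overflow in x + y is undefined behavior, so check first. */
    int add_overflows_signed(int x, int y)
    {
        if (y > 0 && x > INT_MAX - y) return 1;  /* would exceed INT_MAX */
        if (y < 0 && x < INT_MIN - y) return 1;  /* would go below INT_MIN */
        return 0;
    }

    int main(void)
    {
        printf("%d\n", add_overflows_unsigned(UINT_MAX, 1u));  /* prints 1 */
        printf("%d\n", add_overflows_signed(INT_MAX, 1));      /* prints 1 */
        return 0;
    }

This is also why the naive (x + 1) < x test only does what you expect for unsigned operands: with a signed x the compiler is allowed to assume no overflow and fold the comparison to false.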

"what unsigned arithmetic is"

This is computing 101, really (well, signed arithmetic as well). Yet you have people developing code who don't know what signed or unsigned is. Sure, signed is more natural, but the limits are there, and then you end up with people who don't understand why the sum of two positive numbers comes out negative.
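
To make that last point concrete, a minimal sketch (again mine, not from the thread; it assumes a typical two's-complement machine, since signed overflow is formally undefined behavior in C):

    #include <limits.h>
    #include <stdio.h>

    int main(void)
    {
        int a = INT_MAX;  /* largest positive int, 2147483647 with 32-bit int */
        int b = 1;
        /* Signed overflow is undefined behavior in C; on common
           two's-complement hardware the sum wraps around to INT_MIN. */
        printf("%d + %d = %d\n", a, b, a + b);  /* typically -2147483648 */
        return 0;
    }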




As you admit, you just made a mistake in an assertion about unsigned arithmetic. You're not very convincing! ;)


As I said, if I'm not convincing, please go ahead and use signed numbers while I get the popcorn ;)

Here's something you can try: resize a picture (a pure bitmap), with antialiasing, on a very slow machine (think a 300 MHz VIA x86). Difficulty: without libraries.



