Hacker News new | past | comments | ask | show | jobs | submit login

There was a book “C Interfaces and Implementations” by David R. Hanson, and he put it this way: there are user errors (input/data errors), program bugs (things we 'assert'), and everything else is an exception. So a non-existent file is a user error, uninitialized memory is a program bug, and an arithmetic overflow is an exception. Not sure if this is useful.

I myself think errors and exceptions are misleading terms. There is an instruction; it does this and that and can produce the following results. We call some of these results errors because usually we imply a goal. But instructions do not really have goals. We search for a key in a dictionary and either find it or not. Either result can be an error or not an error depending on whether we expect the key to be there or want to make sure it does not exist. Yet the searching instruction is the same. There is no reason to prefer one result over another. It is more important to make all the necessary distinctions.
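A minimal sketch of the point, in Rust (my choice of language, not the thread's): the lookup instruction is identical in both cases, and only the caller's goal decides which result counts as the "error". The `users` map and both paths are hypothetical.

```rust
use std::collections::HashMap;

fn main() {
    let mut users: HashMap<&str, u32> = HashMap::new();
    users.insert("alice", 1);

    // The same instruction, `get`, run twice. Neither result is
    // inherently an error; the caller's intent supplies that label.
    let login_ok = users.get("alice").is_some(); // login path: absence would be the failure
    let name_free = users.get("bob").is_none();  // registration path: absence is the goal

    println!("alice can log in: {}", login_ok);
    println!("bob is free to register: {}", name_free);
}
```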




> There was a book “C Interfaces and Implementations” by David R. Hanson, and he put it this way: there are user errors (input/data errors), program bugs (things we 'assert'), and everything else is an exception. So a non-existent file is a user error, uninitialized memory is a program bug, and an arithmetic overflow is an exception.

I don't agree with this definition, in the same way I don't agree with many others, whether they come from books or not.

These definitions always reflect the personal view of their authors, but are fundamentally just an interpretation of their own thoughts. There is nothing objective about it. It's just a philosophical or semantic debate.

For example, why is uninitialized memory a bug, but an arithmetic overflow is not? In both cases it would have been the developer's responsibility to initialize the memory and to prevent an arithmetic operation that overflows.
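To illustrate the symmetry being claimed here, a small Rust sketch (again my own example, not the commenter's): overflow can be treated exactly like any other precondition the developer owns, as an ordinary value to inspect rather than a separate "exception" category.

```rust
fn main() {
    let a: u8 = 250;
    let b: u8 = 10;

    // checked_add turns overflow into an ordinary result the caller
    // must inspect, putting it on the same footing as, say, forgetting
    // to initialize memory: both are the developer's responsibility.
    match a.checked_add(b) {
        Some(sum) => println!("sum = {}", sum),
        None => println!("overflow: {} + {} does not fit in a u8", a, b),
    }
}
```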


In his interpretation, exceptions were just short of fatal errors: almost fatal, but with an optional escape hatch. But I agree all these distinctions are vague and do not seem to be fundamental differences.

Exceptions are a side effect of nice syntax. When we write

    a = b + c
in a modern language quite a lot can happen behind the scenes. Yet the form implies 'a' will always be the sum of 'b' and 'c', whatever that means. There is no notion of 'a' not being a sum. So it seems there are two ways: either we provide a separate error-handling mechanism (exceptions) that stops the execution and leaves 'a' undefined, or we somehow turn 'a' into either the sum of 'b' and 'c' or an error (sum types).
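The second way can be sketched in Rust, which happens to make the sum-type route explicit (the `add` helper is hypothetical): 'a' is no longer a number but "a number or an error", and the type system forces the caller to look before using it.

```rust
// Sum-type route: the result of `b + c` is either the sum or an error value.
fn add(b: i32, c: i32) -> Result<i32, &'static str> {
    b.checked_add(c).ok_or("arithmetic overflow")
}

fn main() {
    let a = add(i32::MAX, 1);

    // 'a' here is not a sum; it is a value of a type that admits
    // "not being a sum", so no separate exception mechanism is needed.
    match a {
        Ok(v) => println!("a = {}", v),
        Err(e) => println!("error: {}", e),
    }
}
```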




