

Mythology in C++: Exceptions are Expensive - someone13
http://www.flounder.com/exceptions.htm

======
kstenerud
Isolated, raw timing numbers are meaningless when talking about
performance. What you should be talking about is how much time is spent in
that code RELATIVE to all the other code in the critical path. This talk of
nanoseconds and how they don't matter over X operations is terribly
irresponsible and deceptive.

What we see from the performance tests at the end of the article is that
running an empty code block with exception setup/teardown takes twice as long
as it would without exceptions. That's bad if this code turns out to be the
bottleneck, since you could effectively halve the bottleneck cost by taking
exceptions out.

But even then, this is a flawed experiment, because you need to factor in the
cost of what you're actually DOING inside the exception handler (after all,
nobody runs an exception handler with nothing in it!). If the exception
setup/teardown code accounts for only 1% of the total execution time of that
block, removing exceptions isn't going to help your performance at all.

Not only that, but you really need to be checking times when exceptions ARE
triggered, as well as not. In short, performance testing needs to be done in
an environment as close to the real runtime environment as possible, with
conditions as similar as possible. Otherwise you could very well be wasting
your time.

So are exceptions expensive? Based on the author's own calculations, they can
be if you have very little code within your exception handler. And if the
exception can be expected to trigger with sufficient frequency, the exception
catching code needs to be taken into account as well. So no, the myth is NOT
busted, because the fact remains that exceptions can have a very real cost,
depending on your usage (and platform!).

As well, the rant on code style at the top of the article is flawed:

"Why is this for(;;) idiom so popular? Not because it makes sense as a
syntactic structure."

It's popular because it has been used for so long that it has become an
accepted idiom in C (and other C-like languages). It makes no difference how
it originated. I've never met anyone who actually believed that for(;;) is
faster or slower than while(true).

Same goes for embedding assignments within if statements, the author's
personal views on code aesthetics notwithstanding.

~~~
greyman
>> And if the exception can be expected to trigger with sufficient frequency

But if this is the case, it can be argued that the exception handler code
should be part of the normal code flow - it's generally wrong to use the
exception mechanism when the code is expected to execute regularly.
Basically, in that case it is not an "exception" but a standard branch of
execution.

>> you need to factor in the cost of what you're actually DOING inside of the
exception handler

Again, the same issue. An exception is not supposed to be triggered that
often, so this cost will not be significant.

~~~
kstenerud
And yet we see this all the time, even in common libraries. I'd agree that it
_shouldn't_ happen, but the cold hard reality is that it _does_, and with
shocking frequency.

And when you're the poor sod who has to figure out why this legacy code you've
inherited is so slow, it's common practice to look for bottlenecks such as
these and find some way to tame the chaos.

------
BudVVeezer
I think he missed the point as to why exceptions are expensive. It's not
(only) the setup and teardown. It's _mostly_ the throwing of exceptions.
Throwing an exception is expensive when compared to returning a status code
and checking its value.

