
Why should I always enable compiler warnings? - azhenley
https://stackoverflow.com/q/57842756/938695
======
WalterBright
The trouble with warnings is they inherently balkanize the language into
multiple incompatible and even contradictory languages.

Warnings arise from being unable to modify the language semantics. But even if
you can modify the semantics, warnings are spawned as a compromise when people
cannot agree on what the language semantics should be.

I've tried pretty hard with the D programming language to not have any
warnings - that the semantics should be firmly decided upon. Some warnings
still have crept into the compiler, but fortunately just a handful.

~~~
throwaway_bad
No matter what your language semantics are, there will still be different
styles of coding. It's extremely valuable to shepherd your codebase into a
particular paradigm for readability reasons.

That said, style warnings are incredibly hard to detect and enforce (e.g.,
-Weffc++ is so noisy I've never been in any codebase where I could leave it
on)

~~~
WalterBright
Style warnings are something else entirely. The warnings I'm talking about are
for code that looks suspiciously like a bug.

------
neilv
When I was doing lots of cross-platform development in C and C++, using
various vendor-specific compilers (which had a lot of variation in warnings),
I had a practice of being warnings&lint-free on all of them. I used various
portable warnings-silencing conventions for things like unreachable code,
assignment in expression context, unused return values, etc. (Besides all the
other conventions for potential pitfalls that static analysis can't detect.)

A colleague, who was new to C back then, asked about it, so I shared some of
my masochistic practices, including the `lint`-necessitated mantra, "Always
cast the return value of printf to void."

Years later, after moving to another US state, I'm cutting through the
courtyard of a large block, where people are eating lunch outside, and someone
shouts, "Oh my god, it's ____! Always cast the return value of printf to
void!" She then introduced me to her coworkers at the table, "This is ___. ...
He's the reason I code C like an [bad word]." Which I took as a compliment.

Programming in C reminds me of advice I've heard about helicopter piloting:
from the moment you start the helicopter, that thing is trying to kill you.

Perhaps it's helpful to think about coding C like a cool-headed, meticulous
pilot, who's constantly aware of the danger.

~~~
Sohcahtoa82
> "Always cast the return value of printf to void."

Why?

~~~
neilv
It was part of being "lint-free". The `lint` program of the time would
otherwise complain that you're not doing anything with the return value of
`printf`. Not doing anything with the return value of a function was
considered possibly poor code. One of the likely problems was not checking an
error return value, and handling the error cases. However, in the case of
`printf`, normally you would not check and handle errors there, so you wrote
the code as `(void)printf(...);`, to tell `lint` that you're ignoring the
return value intentionally.

~~~
Sohcahtoa82
I'd think the linter would special-case the printf function since it's so rare
to care about its return value.

~~~
neilv
But it didn't. And there were other bits of language like that, including ones
for which it's less clear-cut (and even some code checked printf), and then
the question becomes where do you draw the line in the defaults.

So the idea was to pass every compiler and linter. Whenever a warning appeared
-- such as due to code changes, moving to a new platform, updating some
library, etc., -- that was something to look at, not to assume you knew the
warning was innocuous and get in the habit of disabling warnings.

The environment is a bit different now, with everyone using a small number of
standards-compliant and featureful compilers. When you had situations like,
e.g., your cpp on one new platform not even being quite K&R compliant, and
potentially mangling your layers of macro expansion subtly, you really needed
every hint you could get that something wasn't parsing or behaving like it
looked like it would.

It was also perhaps faster just to do all your pedantic practices, than to try
and reason about when you needed to do them.

Today, of course, especially if I'm using only one compiler, I might not
bother with void on printf. That's where I might redraw the line, like you
suggested. (Speaking of personal code for which I have stylistic discretion;
for other code, I'd defer to the current conventions of the project and/or
discuss with team.)

Though, a side benefit to being conspicuously pedantic is that, when you see a
chunk of code that is less-pedantic in some way, it stands out. When we have
so many critical defects due to insufficiently perfect C coding, spotting a
chunk of code (especially one's own code) that's a little less-pedantic
_might_ mean that someone who worked on that was maybe being a little cavalier
at the time, and maybe that chunk is a place to focus some attention.

Being warnings&lint-free isn't the most important thing in C, and my bigger
concern is that, as a practice, overall, we don't agonize enough over C code
correctness and manageability.

------
wyldfire
I have found that codebases that have not been built with warnings accrue
bugs. Static analysis techniques from newer compilers etc reveal these bugs.
Sometimes it's pretty astonishing -- you'll find bugs that make you wonder how
anything works at all.

Even if you have some mild disagreement with the warning or some critical
context that indicates "that's not a problem for us" -- fix it anyways. This
goes for both static analysis and compiler warnings (ok, yes -- they're both
static analysis). Keeping the codebase green means that you can build with
`-Wall -Werror` and prevent new bugs from creeping in. If there's code that
absolutely can't be rewritten to avoid the warning, you can mask the warning
locally with things like pragmas.

~~~
greglindahl
That can work great if you only use one compiler, and can get messy if you use
several.

~~~
llukas
You fix warnings on all of them.

All major compilers come with a way to suppress the particular warning in a
particular file if all else fails.

~~~
girvo
While I completely agree in principle and do exactly that on my own personal
codebases, that’s a hard sell at work at times

------
pornel
I think this is controversial only because C doesn't have a way to explicitly
and reliably silence an instance of a warning. Without the ability to negotiate
with warnings, they are effectively errors.

Rust has a ton of lints/warnings in the style of "this looks fishy, did you
_really_ mean that?", and users can answer "yes" by adding
`#[allow(fishy_situation)]` to a particular scope.

Users also have an option of setting `#[deny(fishy_situation)]` on entire
modules or programs to ensure the particular thing never happens.

Because it's per scope per warning, it doesn't have the dilemmas of all-or-
nothing `-Werror` and other global warning flags.

------
slacka
Many answers suggest passing -Wall -Wextra to the compiler. In my experience,
this misses many useful warnings. Here's a good list of additional warnings
that usually produce useful results:

[https://kristerw.blogspot.com/2017/09/useful-gcc-warning-options-not-enabled.html](https://kristerw.blogspot.com/2017/09/useful-gcc-warning-options-not-enabled.html)

~~~
ohazi
I never really understood why "all" wasn't actually a catch-all. If it's not a
catch-all, don't call it all. Or at least give us a -Wallyesreally, which
doesn't currently exist afaik

~~~
mcpherrinm
Clang has -Weverything:

[https://clang.llvm.org/docs/UsersManual.html#diagnostics-enable-everything](https://clang.llvm.org/docs/UsersManual.html#diagnostics-enable-everything)

The problem with -Wall is that people use it with -Werror, thus compilers
would have to be very conservative about adding new warnings to it. Hence the
situation we are in with a few flags "above and beyond". Most of -Wpedantic
for example is not something I'd want to break my build on.

The following Stack Overflow answer has some good examples of other warnings
you probably don't want, but which would be included in -Weverything:
[https://stackoverflow.com/questions/11714827/how-to-turn-on-literally-all-of-gccs-warnings](https://stackoverflow.com/questions/11714827/how-to-turn-on-literally-all-of-gccs-warnings)

~~~
zik
I'd argue that if you used -Wall and you got all the warnings you can't really
complain too much because you got exactly what you asked for.

~~~
greglindahl
That's fine for an argument in a bar, and not so helpful for people who are
doing software engineering with code that is developed under multiple compiler
versions. Or who build old code versions with a new compiler version.

~~~
monadgonad
It's only not so helpful because -Wall is badly named and now we're forced to
deal with that legacy. Ideally, -Wall would be the same as clang's
-Weverything, and there'd be a -Wstandard (or something) flag for what -Wall
currently does.

------
lilyball
I'm disappointed that the answer does not have any discussion about the
problems with mixing `-Weverything` and `-Werror`.

In case you're not aware, the issue with this is that upgrading your compiler
almost certainly causes your project to stop compiling altogether, requiring
immediate fixes to stuff that otherwise could have waited. What's worse, it
also means you can't go back and compile older versions of your code with the
newer compiler.

~~~
antpls
If you want your code to compile with the next versions of the compiler, you
wouldn't wait the latest official stable release of the compiler to start
looking at your future work. You would have a step in your CI that
periodically checks out the latest compiler's revision/nightly and build your
codebase with it. It wil surely fail, but you know what is coming months in
advance before the stable release is out

~~~
ikiris
At that point just flag the warnings instead of having it break the build...

------
ilaksh
Because C++ compilers have _major_ flaws in them that "can't be fixed" because
they would break some people's builds from 1977 or something.

Such as giving "warnings" and happily compiling a program that is practically
guaranteed to start doing random things at some point.

~~~
hoseja
Yep. I wish backwards compatibility were a less religious concept in C++. Just
use old tooling if you need to compile your rats' nest of old code.

~~~
wyldfire
The new language standards do forfeit elements of backwards compatibility. And
newer compilers use newer language standards by default.

It's true that there is a great deal of hand-wringing over backwards
compatibility when new C++ features are discussed, though.

~~~
ilaksh
GCC 9 defaults to a warning for a missing return statement, which is undefined behavior.

------
JohnFen
I always enable all warnings globally for a very simple reason -- they can be
useful (particularly the more obscure ones). Enabling them all globally means
that I have to actually investigate the cause. If it turns out that the
warning does not actually indicate a real problem, then I can disable that
warning locally.

That way, they act as a kind of "checklist" for potential trouble that, once
investigated, are removed from the checklist.

This practice has, on a number of occasions, allowed me to avoid some problems
that would have been very hard, if not impossible, to nail down during testing
or after release.

------
dclusin
I think that code quality largely follows from discipline. We write unit tests
even though it won't catch all of the potential defects, especially at service
boundaries. We write integration tests even though they may not catch all of
the defects of, for example, joe random giving garbage data to your public
facing API. Tools like compiler warnings and static analysis alert us to the
presence of potential issues. Heeding them at the outset is no more onerous in
my mind than the previous two approaches. For me it's also partly imitation. If
you look at the high quality open source libraries available to you in
languages that are statically typed, most of them have the compiler configured
with warnings as errors. I don't think it's a coincidence. However, as stated
previously, it also isn't enough on its own.

------
ionforce
Is this a real question? Or someone just whoring for upvotes?

"Why should I eat my vegetables? Innernet, tell me why!"

~~~
davidnc
Note that the answer was written by the asker, who then commented on the
answer:

> I have posted this Q&A because I'm sick and tired of telling people to
> enable warnings. Now I can just point them here (or, if I'm in a
> particularly evil mood, close their question as a dupe). You are welcome to
> improve this answer or add your own.

------
splittingTimes
I have yet to work on a code base where this is actually true. We have some
3.5 million LOC and about 30k warnings... We have many bugs and have tried to
address this, but the business does not warrant the resources, as the product
is "good enough"

What is your real world experience in this regard?

~~~
yitchelle
Yep, happens in many places I have worked in. When used with lint tools and
metric measuring tools, the "problem" compounds exponentially because it
trigger other warnings or risks that are not picked up the compiler.

------
divan
I know the post is about C/C++ compilers, but in general I find it useful to
treat compiler messages just as the normal log of a program – which, in this
case, happens to be a parser/compiler/linker.

Logging, in a nutshell, is a computer program seeking the attention of a human
being. When everything goes smoothly and as expected, no log output is needed.

We usually have, though, four commonly used levels of logging:

\- [DEBUG] – the human explicitly asks the program to tell a lot, for debugging

\- [INFO] - the program says "hey, I'm fine here", mostly to reassure a worried
human that the program works at all. As confidence grows, the need for this
log level falls. (Historically, this log level is often abused for
metrics/tracing purposes, but that's another story)

\- [WARNING] - the program says "hey, human, I don't know if this is good or
bad; it's up to you to decide"

\- [ERROR] - human attention is required; the program doesn't know how to
handle the situation automatically

Now, from this perspective, the [WARNING] level of a compiler is basically the
compiler/spec giving up on telling the human what's right and what's wrong.
Given that compilers are written by people who know the language spec better
than anyone, it would be safe to assume that the compiler knows better than
the average developer what's good and what's not.

So when the compiler tells the developer "warning: I don't know if this is OK,
deal with it yourself", the majority of developers can only say "meh, I don't
know either" and "it doesn't crash, so we're good, ignore it".

That's why I strongly believe there should be no warnings in compilers at all.
INFO and ERROR levels are sufficient for this kind of program.

~~~
divan
Any particular reason for downvoting without explaining what's wrong with my
point of view?

This approach (no warnings from the compiler) is successfully used in Go, for
example, and I believe the reasoning is similar.

