

Building C Projects - rsaarelm
http://nethack4.org/blog/building-c.html

======
tujv
By contrast, the Plan 9 C compiler does use the header when linking. Header
files contain a pragma statement with the library name, removing the need for
linker flags.

See: [http://plan9.bell-labs.com/sys/doc/comp.html](http://plan9.bell-labs.com/sys/doc/comp.html)
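For example, a Plan 9 system header carries a pragma naming its archive, so merely including the header is enough for the loader to pull the right library in. A fragment sketched from the linked paper (declarations abbreviated, not a verbatim copy of the real header):

```c
/* /sys/include/bio.h (abridged sketch) */
#pragma lib "libbio.a"          /* loader picks up libbio.a automatically */

typedef struct Biobuf Biobuf;
Biobuf* Bopen(char*, int);      /* buffered I/O entry points declared here */
int     Bterm(Biobuf*);
```

No `-lbio` flag is needed at link time; the pragma in the header carries that information.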

~~~
melling
And when Rob Pike decided to create Go from that compiler, he just got rid of
headers.

~~~
pjmlp
Because the Go team did the right thing by adopting modules, which are as old
as C in the computing world.

~~~
swah
What do you mean by "modules" here?

~~~
pjmlp
Go actually calls them packages.

I just prefer the term modules, as it was introduced by Mesa and CLU in the
70's.

Only languages based on C's primitive toolchain rely on plain separate
compilation of translation units, with textual includes for symbol
definitions.

Modular languages, built on better toolchains, couple separate compilation
with strong type checking across compilation boundaries and compiler-managed
metadata for the exported symbols.
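For contrast, the C model being described looks like this (a minimal sketch with hypothetical file names): the header is pasted textually into every consumer, and the linker later matches symbols purely by name, with no compiler-managed metadata crossing the file boundary.

```c
/* point.h -- the "interface" is just text, pasted into every include site */
int norm(int x, int y);

/* point.c -- one translation unit, compiled on its own */
int norm(int x, int y) { return x * x + y * y; }

/* main.c -- another unit; the compiler only ever sees the pasted
   declaration above, and the linker resolves the symbol `norm` by name */
#include "point.h"
```

If point.c's definition drifts out of sync with the header, nothing in the toolchain is required to notice until link time, or at all; that is the gap the module systems close.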

~~~
swah
Thanks for the explanation. Do you think the idea of implementation/interface
separation, that C and Java allow easily, is also better in Go?

~~~
pjmlp
I didn't get your question.

Java packages are not that different from Go packages, in terms of CS
concepts.

Except for the set of issues that are debated to death about Go, the language
is quite modular in Mesa tradition.

Given Oberon's influence in Go's design[1], maybe you will find these books
interesting,

From
[http://www.inf.ethz.ch/personal/wirth/](http://www.inf.ethz.ch/personal/wirth/)
check "Algorithms and Data Structures", "Project Oberon", "Compiler
Construction".

From [http://ssw.jku.at/Research/Books/](http://ssw.jku.at/Research/Books/)
check Object-Oriented Programming in Oberon-2.

------
e12e
> instead of writing yet another wrapper around yet another set of poorly
> standardised existing tools, I'm using aimake, a build system I wrote
> originally for NetHack 4

Not sure I'll be able to use aimake, as it's not invented here ;-)

On a more serious note, it sounds a bit strange to claim this isn't built on
other tools -- he uses various compilers and linkers, and other utilities for
installation etc. E.g. CMake might not be perfect, but I couldn't find any
kind of rationale for why CMake or tup wouldn't work perfectly fine for the
purpose. Maybe I skimmed too quickly?

~~~
gcv
The author wanted a build system which does not ultimately generate a bunch of
Makefiles. CMake and GNU Autotools both rely on Make to do real work.

~~~
GFK_of_xmaspast
Was there any reason for that beyond ideology?

~~~
jleader
I think he said that make solved only a small, relatively easy part of the
problem, and he didn't want to have to work around the limitations of make and
the syntax of makefiles. He also mentions that different make implementations
have different idiosyncrasies that he'd have to work around. Also, I think he
wanted to preserve more information about the dependency tree between the
various steps, instead of having multiple tools each making their own half-
assed (or completely manual) attempts at deducing that information.

------
AceJohnny2
Considering the glut of C build systems (often built on top of GNU make) as
proof that this is a complicated field, it's surprising how rarely this
aspect of programming gets discussed. Thanks for this link!

------
parados
For the preprocessing part of the problem this might be useful:
[https://news.ycombinator.com/item?id=8356100](https://news.ycombinator.com/item?id=8356100)

------
sysk
This is probably the first time I really get what is happening during a build
(despite having written a few C programs and Makefiles). Thanks for writing!

------
Rapzid
If anyone ever finds a Linux C program that doesn't quite do what they need,
my advice is this, as I've had great success with it in the past modifying
ntfs-3g and the like:

Download the source package and open it in NetBeans! For what I needed,
clicking build JUST WORKED. For my purposes it was a short trip to adding
some new command-line arguments and modifying functionality, then building
and packaging it back into a .deb, ready to go. I'm not sure if this is still
the case, but NetBeans was pretty fantastic for this just two years ago.

~~~
panzi
That won't always work, because often there are OS-specific files, e.g.
different implementations of "mmap" for Unix and Windows, or strlcpy and
snprintf implementations for platforms that don't provide them themselves.

Also, certain headers often need to be generated by the build system
("config.h.in" for version information, "export.h.in" for export macros).

------
caf
An interesting treatise; however, the entire "Standard directory detection"
part gives me the willies.

------
las_cases
For goodness' sake, keep writing these kinds of articles! I am now digging
into "Memory management in C programs" and I hope the rest of the articles
are as good as these two.

------
Someone
_" The algorithm used by the vast majority of preprocessors is very simple:
the preprocessor has a hardcoded list of directories, which the user can add
to, and the preprocessor will scan them in sequence until it finds the file
it's looking for. In the case of include "file.h", it will also scan the
directory containing the source file."_

At least for gcc, that is not quite correct. It also has a _list_ of user
directories.

The user list starts as a list containing only the current directory, but can
be extended with command-line options. See
[https://gcc.gnu.org/onlinedocs/cpp/Search-Path.html](https://gcc.gnu.org/onlinedocs/cpp/Search-Path.html)

------
gnuvince
> C is a compiled language[...]

Not off to a great start :/

~~~
angersock
Eh?

Um...that's true?

~~~
brandonbloom
Compiled vs interpreted is not a property of the language. It's a (vague)
property of a particular implementation.

~~~
sjolsen
> Compiled vs interpreted is not a property of the compiler.

I assume you meant "language." You're of course correct, but if ever there
were a language that deserved to be called "compiled," C is it. It truly was
designed to be compiled, and who in their right mind would bother interpreting
it?

And before you ask, no, people who debug C are rarely in their right minds. :)

~~~
zik
> ...who in their right mind would bother interpreting it?

Um... me?

[https://code.google.com/p/picoc/](https://code.google.com/p/picoc/)

~~~
e12e
That's really an interpreter (as opposed to using a bytecode vm or some such)?
How does it deal with memory allocation, alignment etc? And (I don't have a
proper dev setup on hand) -- the obvious question -- can it run itself, in
itself in itself? (And itself in tcc in itself?) ;-)

~~~
zik
It's a real interpreter. It runs directly from the source code - it doesn't
even store a parse tree or bytecode. Memory allocation and alignment are done
in the standard C ways. It's designed as a scripting variant of C, so it
doesn't implement 100% of the C standard (e.g. bitfields); so no, it's not
self-hosting.

~~~
e12e
Fascinating, I'll have to look at it more closely when I've got a usable
command line available. I would not have thought it was feasible for a useful
subset of C -- but then I tend to forget that it is a rather simple language
at heart (e.g. after running the preprocessor).

