#include <rules> (2010) (zeux.io)
68 points by davikr 9 months ago | 89 comments



Good stuff, if slightly self-contradictory in places. The header file named after the source should be included first (to catch errors in the header), and the header file defining various macros should also be included first (as otherwise it doesn't work).

Detecting missing includes in a header is better handled by a separate compilation job which compiles the header by itself as if it were C++, `clang -xc++ foo.h`, solely for the purpose of catching that error. Then the "include your own header first" rule no longer matters.
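
A minimal sketch of such a job as a makefile target (hypothetical layout, assuming headers under include/ and a clang toolchain; `-fsyntax-only` skips codegen since only the diagnostics matter):

  # compile each header as a standalone C++ translation unit
  HDRS := $(wildcard include/*.h)

  check-headers:
  	for h in $(HDRS); do \
  		clang++ -fsyntax-only -x c++ $$h || exit 1; \
  	done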

Forward declarations have a cost which the article misses. When the thing in question changes, your compiler no longer warns you when the forward declaration is out of sync with the real thing, and the linker diagnostics are usually less comprehensible. Also it's a nuisance to update all the forward declarations. An <iosfwd> style header, included at the top of the <ios> header, gives ~99% of the compile time advantage of forward declarations with none of the failure modes and usually less typing in the caller.
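
A minimal sketch of the pattern, with hypothetical names - the trick is that the full header includes the forward header first, so any drift between the two is a compile error:

  // widget_fwd.h - iosfwd-style forward header
  #pragma once
  class Widget;
  class WidgetList;

  // widget.h - full definitions; including the forward header first
  // means a mismatch shows up here, not in some distant caller
  #pragma once
  #include "widget_fwd.h"

  class Widget { /* ... */ };
  class WidgetList { /* ... */ };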


It basically consists of:

1) The basic, ubiquitous, and uncontroversial rule that headers should be self-contained, i.e. guarded and include their direct dependencies, and

2) nonsensical rules resulting from failure to follow the first rule, or failure to follow the other basic, ubiquitous and uncontroversial rule that source files should also include their direct dependencies.

Manual forward declarations are a non-starter. Any header that contains a class declaration that it might ever make sense to refer to as an incomplete type (so, like, 100% of all classes, rounded to 3 decimal places) should have a corresponding iosfwd-style forward header.

The common header is just another header. You include it if you need it and you don't if you don't.


Exactly. If the order of your includes matters, you screwed up. Run "Include What You Use", fix the errors.


There's nothing self-contradictory. The two rules you're speaking of are:

    make sure that your header file is the first #include
    in the corresponding source file, except the common header,
    if your codebase has one.
and:

    make sure that each source file includes the common header
    before everything else
This defines a pretty clear order: 1) common header, if it exists; 2) the corresponding header for the source file, if it exists; 3) other headers needed by the source file.
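
In a hypothetical foo.cpp, that order looks like:

  #include "common.h"  // 1) the common header, if the codebase has one
  #include "foo.h"     // 2) the header corresponding to this source file
  #include <string>    // 3) everything else foo.cpp needs
  #include "bar.h"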


Ah, you are right, I missed the "except the common header" clause. Probably because the preceding part was in bold, and possibly because "common header" is defined further down the article, so the clause didn't mean much on first read.


>We’re stuck with C++, at least for another console generation.

That was 14 years ago. C++ is not going anywhere, whether 1, 2, or 10 "console generations" ahead.


yeah, it is. the percentage of new C++ projects has plummeted, compared to modern options such as Rust or Go.

developers are sick and tired of the awful tooling and decades old issues with C++, that will likely never be fixed. I for one will never touch a C++ code base again if I can avoid it.


But what about the percentage of new C++ *game* projects? I don't see any AAA or even commercial Indie games written in Go, and with Rust, it looks like there's always someone making a new toy game _engine_ but no actual games.

Unity is pretty much the only other game in town that isn't C++, with the occasional outlier (e.g., CrossCode is written in HTML5/JS). And of course, engines like GameMaker that are themselves written in C++ but whose users don't write in C++. (Well, that's Unity as well)


> Unity is pretty much the only other game in town that isn't C++

Even Unity compiles .NET IL down to C++, and then compiles that with a regular compiler to target platforms which don't have a .NET runtime or don't allow JIT compilation. That's more of an implementation detail but it shows how engrained C++ is that the path of least resistance was to use C++ as a compilation target. I think much of the private engine internals are also still straight C++.


Unity is a victim of Apple in this regard. If Apple said it has to be Swift code, it would be called il2swift.

The insistence on AoT compilation has, as of yet, not made iOS any better of a platform for games, only worse. If you really sincerely cared about games, you would at least advocate that Unity be allowed to do whatever the fuck it wants on Apple platforms, such as either JIT or AoT compilation.

The people who fussed over the real but meaningless details like the 10-20% performance gains are ultimately just co-opted by Apple execs extracting rents. Hacker News doesn't have a good way to deal with this co-opting; people will downvote comments like these as irrelevant to a piece about headers. You guys are not getting it: C++ being shitty and having high inertia is in console makers' and Apple's interests, and the amount of energy wasted in maintaining it is entirely the fault of the platform owners, not developers.


You know that AAA game developers live and breathe C++, and many outright love it, right?

And that Unity is used for convenience, as opposed to performance or code quality, right?

And that Unity C# is (or at least was) quite looked down upon by AAA game devs, not to mention even by .NET standards it was tied to an older C# version for ages (I believe Mono was involved too)?


> But what about the percentage of new C++ game projects? I don't see any AAA or even commercial Indie games written in Go, and with Rust, it looks like there's always someone making a new toy game _engine_ but no actual games.

You wouldn't necessarily know, unless they talked about it. Some of the biggest games of the last decade were written in Java or C#, but without necessarily making a big deal about it.

> And of course, Engines like GameMaker that are themselves written in C++ but whose users don't write in C++. (Well, that's Unity as well)

You forget that 30 years ago this was virtually unheard of, at least for "real" games. Writing a new game used to mean writing a new game engine. Now it mostly means writing something for an existing engine.

I'm sure game development toolchains will have C++ somewhere in their bowels for decades to come. But for day-to-day game development work it will continue to slide into irrelevancy.


> with Rust, it looks like there's always someone making a new toy game _engine_ but no actual games.

I know it may be somewhat underwhelming, but Tiny Glade[0] is being made in Rust. It is still basically a tech demo, but it has a real-time GI renderer and stuff.

[0]: https://store.steampowered.com/app/2198150/Tiny_Glade/


Jai is a good effort at an alternative


It seemed like it 10 years or so ago when he first announced it, but it seems more like vapourware by the year, unfortunately.


I check into his twitch stream every now and again and I’d disagree. His next game is clearly his proving ground for the language and it is looking pretty far along these days. I expect his sokoban game and the language will be released pretty close to one another. I can’t/won’t make any actual guesses as to when that will be though.


This is like deja vu! I've definitely seen this conversation before:

-Jai is promising!

-Jai has been in development for years and there's no examples, no compiler, no github, no website, let alone package management and all the associated documentation

-Yea but he's still working on it!

If he does manage to release it and live up to the hype that'll be great, but until then I will respectfully be in the "I'll believe it when I see it" camp :)


My opinion is that it's just too big of a project for him to handle alone. So it will be in this status forever as other languages catch up. It's just humanly impossible to finish AND be better than other choices AND be done without help


Good news! It doesn't seem to be a project handled by him alone. I don't pay much attention to Blow or his company, but the few times I've watched videos where he does go over progress of the language, he often talks about compiler work other people in the company are doing, and even about hiring people specifically to work only on some part of the compiler.


I hope he can succeed. Language diversity is generally good, but it does look like an uphill battle


I mean both developing a complete language + compiler toolchain + standard library and developing a game are projects which often take many many years. Rust was started in 2006 and didn't receive a 1.0 release until nearly a decade later, in 2015. Go was a bit quicker, starting out in 2007 and releasing a 1.0 in 2012, but I don't think anyone outside of Google really used it until years later.

It really does seem like the people who complain about how long it's taking have no context for how long this stuff takes. You can knock out a toy language in a month, sure, but growing an actually production-ready serious systems language, with a solid compiler which produces optimized code across platforms, and a complete standard library and solid core abstractions, and a good module system, and incremental parallel compilation which scales to millions of LoC, and with innovative language features which aren't in comparable mainstream languages... that takes time.


> Rust was started in 2006 and didn't receive a 1.0 release until a decade later in 2015.

It wasn't widely announced until IIRC 2008, and it was public and in popular use by 2011 or so (long before 1.0 - just as e.g. Zig is definitely pre-1.0, but also very clearly a real language that people are using).

How many years are you planning to wait for Jai? Serious question - if it's still not publicly available in e.g. 2034, are you still going to be saying that's just how long these things take?


Well I'm not waiting for Jai, I'm happily using C++. If Jai eventually comes out and turns out to be a cool language and the tooling becomes available under sane licenses I might pick it up, or I might not. I'm in no rush either way.

If it's still not available in 2034, I guess you can fairly say it's taking a while. But then again, so what; if it comes out in 2036 it comes out in 2036.


>But then again, so what; if it comes out in 2036 it comes out in 2036.

There's this whole 'passing of time', and "we'll all eventually die", thing though, right?

If it comes out in 2036, it's totally irrelevant for a working dev in 2024 to be concerned with it. So that he'll get to use it in a decade? At this point it's already promise-ware, and it's not like it has the community adoption to be promising.


Until at least a roadmap is released, Jai is totally irrelevant for a working developer. Even if it's released next month, you can't plan for that.


Happy to hear that he's still making progress on it! I have huge respect for his work as a game designer at least, so I'm rooting for Jai to succeed as well. I had just lost track of it, and lost hope, after such a long time without it in a more public spotlight.


>yeah, it is. the percentage of new C++ projects has plummeted

Citation needed. C++ domains don't go to Go, and have only marginal adoption of Rust.


Yeah I agree.

I suspect rust will gain in popularity in systems software - like databases and browsers. But if we're counting in "console generations" - well, I personally doubt rust will ever really take off amongst game developers. Rust's borrow checker has a reputation for making it hard to write entity-component systems. And ECS are core to modern game dev.


> Rust's borrow checker has a reputation for making it hard to write entity-component systems. And ECS are core to modern game dev.

Isn't it the opposite? That rust's borrow checker makes it hard for non-ECS paradigms due to ownership and XOR mutability rules. While ECS is best for rust, as it "owns" things and schedules the mutating functions without any conflicting race conditions.

The most popular engine in rust right now is bevy, which embraces ECS for everything (including UI).


Are the statistics based on intuition or actual data?

Let’s not count the GitHub repo that is just a readme.


source?


This is the best measure I've found:

https://madnight.github.io/githut/#/pushes/2023/4

Unfortunately it doesn't have new projects, but it does seem like C++ peaked a couple of years ago and is starting to trend down. "Plummeting" is clearly an exaggeration though.


Based on the graphs here, Go and Python are also stagnant/slightly declining. Rust seems to be stagnant too since 2022.


The best I can make of the graph is that those declines are a scaling adjustment for a recent meteoric Java growth.

(WTF is behind the meteoric Java growth is yet to be explained.)


Recent releases of Java have dragged the language kicking and screaming into the late '70s, and so people are starting to appreciate how much the JVM ecosystem gets right (instrumentation, dependency management, IDEs etc.).

Either that or the institutional memory of Java EE has faded, and Spring Boot is here to traumatise a new generation of developers with the magic of write-only COME FROM code.


To keep it in perspective, we are talking about a single month, and it wasn't enough to go back to the share it had half a decade ago.

Those things you cited do not cause single-month rapid growth. What causes that is publicity, or some large ecosystem adopting it by fiat.


I'm surprised to see Javascript in decline since 2017. Most of that can be explained by substitution to Typescript but it's still down a couple percent with that factored in.


That just seems to show proportions on GitHub - with the growth in the total sector size, the absolute number may still be increasing.

But yeah, a 0.234% yoy decrease isn't "plummeting". If so, then things like TypeScript or Python are in even steeper decline, and Java would be the next big thing, showing massive growth by comparison. And Rust itself is decreasing according to those stats, so I guess it's declining too.


So the grandparent is not even close.

In most metrics there it's steadily rising even, though the metrics are cross-correlated with "GitHub adoption".


If you sort by Issues, C++ is #3 and trending up.


Sounds about right! :D


Rust is just C++ with lipstick. Same pig.

With Go, I believe people will eventually mean-revert to C, appreciating the value of simplicity.


A great way to handle context-specific headers: a common path rooted in context-specific directories, coupled with include path management.

Useful for platform- and architecture-specific code but can be used for anything really. Anything you want to parameterize and compile conditionally.

Instead of this mess:

  #if APP_LINUX
  #include <app/linux.h>
  #elif APP_BSD
  #include <app/bsd.h>
  #elif APP_MACOS
  #include <app/macos.h>
  #else
  #error "Unsupported platform"
  #endif
Organize things like this instead:

  linux/include/app/platform.h
  bsd/include/app/platform.h
  macos/include/app/platform.h
Then in the makefile:

  # Detect it somehow
  # or have user provide it
  PLATFORM ?= $(shell uname -s)

  cc -I include -I $(PLATFORM)/include
Then in the source code:

  #include <app/platform.h>


Moving logic from the source code into the compiler invocation is a mixed blessing. In general as the list of -I directories needed to build the thing gets more complicated the ease of building it from something other than the provided build scripts goes down.

Much like how working things out in the preprocessor is ugly but works with other build systems, while over-elaborate configure-script-style stuff makes for prettier source by moving the complexity elsewhere.


+1. I think platform-specific software architecture should be lifted into the build system rather than handled in the code. This is straightforward with CMake, Meson, Build2, etc. `#ifdef` is a very bad idea and tends to make for leaky abstractions. It litters the code with platform selection and platform-specific behaviour that should ideally be abstracted away, e.g. things like `CreateProcess` vs `fork(); exec();`.

With C++20 named modules, this is made even easier, and there is no need to even have an `app/platform.h`. Programmers can literally copy-paste the function declarations into different files, select the correct module interface at configure time, and just write

  import app;
Again, most new-ish build systems support this too.
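
For illustration, one of those per-platform interface units might look like this (hypothetical names; each platform directory carries a copy with identical exported declarations):

  // linux/app.cppm - selected at configure time for Linux builds
  export module app;

  export namespace app {
      // backed by fork()/exec() here; the other platforms' copies
      // declare the same signature over their native primitives
      int spawn(const char* command);
  }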


Something similar in spirit, but in Meson, is what I have settled on. It's all trade-offs all the way down, but it's what works for me.


Amen. That's the way.


So many things to remember and potentially screw up. I am glad that modern languages come with fewer footguns.


The article is from 2010. C++20 solved the include file issue with modules. The ecosystem is almost there with CMake 3.28 (released) and GCC 14 (some time in H1 this year) providing out of the box support.

For most practical purposes, modern C++ should be seen as a modern language. You need to make some effort to understand what language features not to use, but in my experience that effort is not too bad in the context of the overall development process.

There are also some initiatives (cppfront is my favorite) to create an updated syntax where certain unsafe historic behaviors are not allowed.


> C++20 solved the include file issue with modules. The ecosystem is almost there with CMake 3.28 (released) and GCC 14 (some time in H1 this year) providing out of the box support.

According to CPPReference [0], there isn't a single compiler release available today that fully supports C++20 modules. So perhaps "it's fixed" is a bit early. Especially since C++ projects tend to lag the latest release of their compiler by a few years, so even if GCC14 will indeed be the first C++ compiler to support modules fully, you still need to learn to use include guards today if you're going to be working in C++.

This is one of the major problems of "modern C++", by the way. Major features always exist in principle long before they actually exist in practice (concepts being the other feature that took years to be fully supported).

[0] https://en.cppreference.com/w/cpp/compiler_support


> According to CPPReference [0], there isn't a single compiler release available today that fully supports C++20 modules

Seems to me like MSVC does at least? What do you see it missing?


I don't know the details, but the table is saying "partial" for MSVC as well, right?

Edit: at a deeper look, I think I misread the table a bit, I guess it's saying that the feature is partially supported in MSVC 19.00 and 19.10, but full support in 19.28. Sorry about that misunderstanding.


It surprises and saddens me how dismissed C++ is nowadays. It's rarely my first choice (unless writing a gui, then Qt is my go-to), but it is a far more modern and less dangerous language than most people think. With most technology, people don't assume it hasn't changed since the 90s, but for some reason, that's what people do with C++.


>more modern and less dangerous language than most people think

Compared to what? Itself from 20 years ago? That's great, but not nearly enough. Cpp just has SO MUCH FRICTION that it's simply not worth it to deal with it. Maybe if you have 20yrs of cpp experience and carved out a specific way for yourself to avoid all of it while still being productive... But that's not really an argument.

And sure, there are still some cases where the alternatives aren't ideal either (i.e. Rust also having friction, especially in domains like games, or higher-level languages being too slow). But cpp has so many downsides I don't see why one should ever default to choosing cpp, unless you REALLY care about the benefits it brings. But those upsides are usually external (i.e. ecosystem), rather than features of the language itself.

https://bitbashing.io/std-visit.html


> and carved out a specific way for yourself to avoid all of it while still being productive

I'll believe such a person exists when I see one. All that I see around are people claiming they did that, while ignoring some huge issue with their style that inserts really nasty bugs into their software.

Instead, the only C++ code that works out there is created by committee and peer-reviewed, so that different styles compensate for each other.


> I'll believe such a person exists when I see one. All that I see around are people claiming they did that, while ignoring some huge issue with their style that inserts really nasty bugs into their software.

The real question is, of 100 bugs in a real-world application, how many were caused or at least influenced by the language, and how many are application level errors. In my practical experience bugs due to C++ behavior have just not been a thing. Maybe your world is different.


> but it is a far more modern and less dangerous language than most people think.

I suspect a better answer here is that people have wildly different tolerances for safety and don't discuss the subjective aspects of interactions with programming languages.


Modern C++ is great and I would choose it today if I were starting a project from scratch. I think people’s problem with C++ is the existing projects/code. If I were to join a random company today working with a C++ code base, it is very likely I’d be dealing with legacy code complete with all the 1990s footguns C++ dismissers dismiss. That code base would also likely have problems like requiring Visual C++ 6.0 to compile, emitting thousands of warnings that have been ignored for decades, and the company shipping the DEBUG executable because nobody can figure out how to get the RELEASE build to not crash. Often existing C++ projects come with 1990s era problems, so detractors can still legitimately use 30 year old arguments.


Modern C++ is mostly syntactic sugar over older C++. If you manage to walk the happy path entirely within said syntax the code looks quite pretty. Lots of the language design is about having a nicer way to express common ideas.

If you stray from the happy path, the demons come for you. Or if your use case isn't met by the existing sugar. Or if you have a dependency which was written a couple of years ago that you need to reach into.

And naturally the project started from scratch today, only using the very nicest C++ features, will be legacy code written wrong in three or six years from now as the language moves on.

I wouldn't want to start a new project in C++ today but I'm not totally confident there's a better choice available.


> it is a far more modern and less dangerous language than most people think.

I think the opposite tends to be true. C++ is a more dangerous language than most people think, and many who see some nice new constructs end up with a false sense of security. In any major C++ project you will sooner rather than later find pieces that don't strictly respect all of the rules of modern C++, that end up accidentally invoking UB, and that will happily pass code review and testing before blowing up when something else changes.

People tend to believe they can write safe C++, but outside constrained environments (similar to MISRA C), this has not been proven true in practice.


I've never written C++ in a professional capacity; what turns me off from learning it is what I've heard and read about its vast complexity, e.g. you could meet 20 expert C++ developers and they'd all be using a different subset of the language. I don't want to spend more energy learning a language than I do actually solving the problem. The same reasoning turns me off from Rust, too.


As a pure C++ dev, this is true and a good reason. The committee is borderline sabotaging the language. On top of the anti-cpp crowd slinging hit pieces.


> With most technology, people don't assume it hasn't changed since the 90s, but for some reason, that's what people do with C++

Disagree, every technology is like this. Talk to people about Maven and they think the 2005-era reasons not to use it are valid. Mention MongoDB and people think it's 2015. The industry is seemingly only ever willing to progress by adopting new things, not by fixing existing things.


> C++20 solved the include file issue with modules. The ecosystem is almost there with CMake 3.28 (released) and GCC 14 (some time in H1 this year) providing out of the box support.

So it's not widely supported yet in the newest released versions of the compilers, and the article probably is still good advice until your project knows it no longer needs to compile using older versions?


C++20 may provide a theoretical solution but we're a long way from "it is practical and everyone easily replaces their headers with modules." In 2024 I am still manually writing C++ headers for my Fortune 100 employer.


> GCC 14

Do you have a reference for significant module changes in GCC 14? I see nothing on the changes page for 14



In modern C++ there will be modules (C++20). Even std will be available as a module (C++23). This will likely bring down compile times massively. But so far the only compiler with support for this today is MSVC.
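
For a taste, a minimal sketch with a toolchain that ships the std module (as noted, MSVC today):

  import std;  // replaces every standard-library #include

  int main() {
      std::println("hello, modules");  // std::println is also C++23
  }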


Did someone get around the modules-linearise-the-build-graph problem?

In the before times, people were enraged with C++ compile times and wanted a solution. Modules could have fixed that, thus people assumed modules would fix that.

Then modules were designed with the primary motive of removing preprocessor macros from the language, with reference to fortran's binary module description model. The one that turns the trivially parallel compilation model into a directed graph.

I'm under the impression that C++ modules as shipped require you to build dependencies before uses and the inevitable linearisation of the build graph is considered "probably fine, whatever". Compiling individual translation units gets somewhat faster as you don't need to parse the headers each time, but you can no longer build everything simultaneously.

Do modules actually make compilation appreciably faster in msvc?


> modules-linearise-the-build-graph problem

Modules don't linearise the build graph. Any independent modules are compiled in parallel, but dependents will wait for these modules to be compiled.

In fact the problem is the other way round: the header/translation-unit compilation model does a ton of unnecessary repetitive copy-pasting and parsing in the name of achieving embarrassingly parallel compilation. IMO modules help express build and API dependencies more clearly, and even despite the so-called loss of parallelisation, build times with modules are generally an order of magnitude faster.

An extreme example is Vulkan-Hpp, which has a module interface file[1], and header files that exceed 150K lines of code, cumulatively. Using these headers means even a simple example takes something like 20+ seconds to compile every time. This is even worse when using complicated standard headers like `<algorithm>`, `<functional>`, `<ranges>`, etc.

On the other hand, using the module, the compile only takes that long the first time, and every subsequent compile is lightning-quick.

[1]: https://github.com/KhronosGroup/Vulkan-Hpp/blob/main/vulkan/...


It seems we're in agreement on the mechanism of modules. Thank you for the example - one really large header-only library would seem to be the ideal use case for precompiled headers. Sorry, modules.

There are of course alternatives to header only libraries so there's a sense in which this is a solution to a problem that didn't need to exist in the first place.

But yes, header only libraries are an important solution to the problem of cmake, and modules can patch around the compile time implications of header only libraries, so that's sort of all good.


>Do modules actually make compilation appreciably faster in msvc?

For a small toy example, about 10x faster. Don't know how that looks in real-life projects.


Are you saying that with C++ modules, parallel make is nearly impossible?


With headers, every translation unit is some (probably large) amount of text spliced together then fed into the compiler front end. You parse <vector> and <string> over and over again. If headers are big or templates many, the time per translation unit gets rather high.

With modules, as of the last time I looked, in order to compile some file, you must first compile the files it depends on, at least far enough to create a module file which is then depended on. Compiling a single translation unit with a bunch of already-existing module files should be faster than the equivalent with a lot of header files as you don't need to repeat the parsing.

This makes the individual compilations cheaper (win) but replaces independent work with a dependency graph traversal (loss).

It's totally obvious from the outside that the right solution is a compiler daemon. Integrate the build system with the compiler, persist state between separate file compilations. The C/C++ world really likes the independent batch compiler scheduled by cmake approach though so that arbitrarily constrains their design space.

It's fascinating that tooling choices from the early days (notably separate compilation is forced by insufficient memory) combined with the division of responsibility between compiler tooling vendors and the standards committee forces this sort of design.


It's not impossible to compile independent modules in parallel, but if module X depends on module Y, module Y needs to be compiled before module X.
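
A minimal sketch of that constraint, with hypothetical module names:

  // y.cppm - must be compiled first, producing Y's module file
  export module Y;
  export int answer() { return 42; }

  // x.cppm - cannot start compiling until Y's module file exists
  export module X;
  import Y;
  export int doubled() { return 2 * answer(); }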


> Even std will be available as a module (C++23)

All 3 major compilers offer the std module in C++20 as an extension; at least, I remember an issue where it was discussed and agreed upon by all. I don't know if it was actually implemented (yet?).


> I remember an issue where it was discussed and agreed upon by all

https://github.com/microsoft/STL/issues/3945


I no longer use header files when I can help it. They haven't made sense since 1980. I have a Python script that reads my C++ source and writes all the headers for me. C++ is so much nicer to write when you don't have to do the compiler's job for it.

Modules are supposed to be coming soon, and then I won't even need the script anymore.


This is interesting. Care to elaborate or share your script?



This is still lacking a lot of important rules (group by nature of dependency, sort alphabetically within a group, sort groups per dependency level, use angle brackets for third-party only) and is still only barely scratching the surface of C++ file organization.


>Oh, did I mention that good header dependencies decrease the linking time?

How?


Function definitions in headers are compiled and included in the object file, then all but one copy is discarded during linking. Reading and discarding all this stuff adds some time to the linking process, which is a noticeable fraction of total build time if you're only recompiling one source file.
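
A minimal sketch of why (hypothetical header) - the member function below is implicitly inline, so every object file that uses it carries a compiled copy for the linker to deduplicate:

  // widget.h
  struct Widget {
      // defined in the class body, hence implicitly inline: compiled
      // into every translation unit that calls it, then all but one
      // copy is discarded at link time
      int value() const { return v; }
      int v = 0;
  };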


Moderately interesting that compile-N-times-keep-one is a C++ thing. C's inline makes you specify which translation unit exports the symbol, so the compiler knows whether it is working with the canonical symbol or one that it can discard at will. Thus it can inline functions, but if it chooses not to, it doesn't have to optimise & codegen that function just to have the linker probably throw it away.


You can do the same for C++ with extern templates: just instantiate them with the needed types in one compilation unit and you only end up with a single set of symbols, and codegen etc. only runs once.

And C inline functions are instantiated and generated for every compilation unit that uses them - they just don't generate exported symbols in the resulting object file. Exactly the same as the "inline" keyword in C++.
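
A sketch of the extern template pattern from the first paragraph (hypothetical names):

  // twice.h
  template <class T> T twice(T x) { return 2 * x; }
  extern template int twice<int>(int);  // suppress instantiation in users

  // twice.cpp - the one translation unit that does the codegen
  #include "twice.h"
  template int twice<int>(int);  // explicit instantiation definition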


Declarations, not definitions. Declarations go into object files as symbol names for the purpose of linking. You can have multiple declarations as long as they are the same (or, if they are not, they define different things in C++ due to overloading), but you cannot have multiple definitions (even in different object files) even if they are identical (the compiler doesn't bother checking they are; it sees it as fishy and just drops a "multiple definitions" error).


No, C++ header files contain definitions too, which are emitted into every object file where they're used. If it's something allowed to be multiply defined it's emitted as a "weak" symbol, and the linker eliminates duplicates.

This includes every inline function that's used, and many default-generated functions, such as default constructors & destructors and virtual function tables.

For a simple example, see https://godbolt.org/z/qhvaP8Eed and observe the Foo::Foo() constructor, which would be emitted in every .o file with the 'struct Foo' definition.


Ok, thanks for the correction.


Just don't put include files inside other include files.





