Why Aren't There C Conferences? (nullprogram.com)
466 points by ingve 18 days ago | 372 comments

Because though there might be building trades conferences, there are no hammer conferences. Oh, there will be tool vendors at the building trades conferences, there might be sessions on "Efficient Hammering Techniques Using Machine Learning", but there are no conferences about hammers. Don't forget about the blacksmith conference next week, they'll have hammer vendors, too.

I dunno, a Ruby conference kinda makes sense because one will mostly build web sites with it, so it's really going to be a "Building $FOO with $TOOL" kind of thing. But C is so widely used, it's like having that HammerFest 2019 with everyone from people splitting wood with a sledgehammer to jewelers with their teensy little tapping tools. I just don't see the diamond workers drinking with the lumberjacks at the hotel bar after hours. :-) I jest, but it's mostly about networking anyway, otherwise just watch the videos online.

To find the answer, I would ask, Why are there conferences for other technologies? What are they for?

And I think the answer is probably 1) get useful news about recent and upcoming changes and 2) to socialize. I'm going to guess--possibly just projecting--that most C developers are older and not terribly interested in meeting new "C friends" or jumping into some hot, new "C startup".

That leaves #1. Web/AI-ML/cloud/mobile tech conferences feature speakers from the big tech companies that might reveal, in a talk or a hallway chat, info about which of your problems they are about to solve or to cause. Independent speakers serve a sort of journalist role of gathering intel from the big companies, organizing it, and revealing their findings. It's all about "tell me what I need to know so I can decide what to do".

But there isn't much news about C that most C programmers will want to know in order to make decisions. It's almost like news about algebra or, yes, hammers: mature, slow-changing technologies whose changes seldom impact more than a small percentage of users.

So without reasons 1 or 2, the answer to "why aren't there C conferences?" is basic and boring: what important purpose would they serve?

I think there's a third one: To generate hype. C is already established - it's been around long enough that it doesn't need advertising.

Yet many still haven't learned to use lint-like tooling, initially developed in 1979.

Sounds like a good topic for a conference talk ;)

It is not for lack of trying; check out the Linux Kernel Summit 2018 talks.

The problem is on the receiving end.

> It’s almost like news about algebra

There are plenty of algebra conferences, though. I’ve been to several. :-)

I agree with your general point, but algebra isn’t quite as slow-moving as you seem to believe.

Perhaps they were thinking of the ordinary equation solving that most people know as algebra, rather than the abstract algebra (group theory etc.) covered in university mathematics courses.

C is the tool, so probably a better analogy would have been to pencils, rather than algebra, which is more in line with OP's sentiment regarding hammers.

> I'm going to guess--possibly just projecting--that most C developers are older and not terribly interested in meeting new "C friends" or jumping into some hot, new "C startup".

Unfortunately that doesn't explain why there aren't C conferences today, nor why there weren't C conferences 15, 20, or 30 years ago.

I was a C programmer 30 years ago, but we didn't attend C conferences for two reasons: 1) programming languages weren't newsworthy enough to generate much buzz. Hardware was the focus of attention, and things like programming languages tended to be features of products, not products themselves. And 2) C was already about 20 years old by that point in any case, so monthly magazines were sufficient to keep up with the changes. We didn't tend to go to Fortran or COBOL conferences, either.

We DID go to conferences or expos for things that were buzzy and newsworthy, which meant the exciting new world of microcomputers: Comdex, MacWorld, etc. But the emphasis was on exposition, not conferencing with others, because we got to see exciting, new things with our own eyes, when that was the only way to see them (couldn't "see" things online). We wanted to see them (and they wanted to show them) to inform our decisions. The makers of C tools filled plenty of booths at the "computer expos", though.

Twenty years ago, we had started having big language conferences. JavaOne was like a religious revival or political rally, but C was the old devil being vanquished, not the Hope of the Future. And Visual-J++ and C# conferences were counter-political rallies, where C was going away as both sides agreed, but to be replaced by what? The makers of C tools still filled plenty of booths at the hardware expos, but there wasn't much demand for a C conference where we could hear, in person, about how C wasn't actually dead yet.

And by 15 years ago, all the hotness that inspires a crowd to gather for a conference was based on changes relevant to the WWW. If C had reinvented itself with shiny, new async networking, web service & e-commerce building blocks, concurrency, and whatever in its standard libraries or some such approach, there definitely WOULD have been C conferences. That would have been big news (and probably a bad choice, but that's a different question.) Without something like that, there is still plenty of demand for C, but little demand for a C conference.

> what important purpose would they serve?

To spread the knowledge of how to actually write C code that meets a certain quality level regarding security and safety.

And to make clear that writing C89 without any sort of static analyzer is no longer acceptable practice.

There should be one more reason: businesses that want to sell to that specific category.

There will be Hammer conferences if there are companies that make "handle grips" or "hand hammer protectors" who are willing to sponsor such events.

The reason that there is a conference for X is because somebody makes money running a conference for X. Be it directly or indirectly.

Yes but the reason there’s someone making money hosting a conference for X is that there’s a demand for a conference on that topic. Why is there a demand for Ruby conferences but not C ones?

Exactly. C programmers don't define themselves as C programmers mainly. They are firmware developers, game developers or something else. That's where their real interest is.

I've been to C++ conferences. If Scott Meyers, Herb Sutter, or Andrei Alexandrescu were on the ticket, I know it would easily be worth it.

It's like how they have hamburger restaurants, where you can ask for a hamburger without the pickle, instead of having hamburger without pickle restaurants. C is the burger, bacon, bun, lettuce, tomato, onions and sauces, and ++ is the pickle.

I think you swapped the complexity that each language provides. If C is anything, it's the patty, and C++ literally everything else that makes up the burger.

> C++ literally everything else that makes up the burger.

"Oh, you wanted mayo on your burger? We can do that, but you also have to have mustard. Don't worry, you won't have to pay for it.. but it's going to be in there."

On the other hand, with C you get the mustard on your burger because there was already some on the plate and no one bothered to clean it.

And ketchup and ranch and oyster sauce and a pizza and a slice of cheese cake and a jackhammer

So much mustard. Like the best games of Clue.

As if C didn't come with features nobody wanted. At least C11 made VLAs optional again.

I dunno, I just got back from RubyConf, where I both gave and watched talks on metaprogramming with no web focus at all. There was a great illustrated talk featuring lemurs that was just a walkthrough of rebuilding various Enumerable methods with reduce. Other ones that I went to:

- a "how ruby's GC works" talk

- A deep dive into how a core dev found + fixed memory usage in ruby

- The unexpected pitfalls of subclassing certain std lib classes

- A new desktop graphics library

- Using ruby to operate an automatic wafflemaker

And these are just the ones I attended. I imagine railsconf is more web focused, but rubyconf doesn't seem like that at all.

A couple members of my team attended, and we made a friendly wager beforehand: how many conference attendees would be primarily Rails developers?

We guessed respectively 25%, 50%, 75% (my guess).

Based on some very, very, very fuzzy sampling, they estimated it was somewhere around 50%.

Any guess yourself? Are you a Rails dev?

Ruby is used in a lot of places where Rails isn't, by a few large companies. The one that comes to mind is Stripe, which has at least a couple hundred engineers writing Ruby code that has nothing to do with web development (much less Rails).

While Rails certainly did a huge amount to make the language popular, there's lots of things Ruby is good at aside from concatenating snippets of HTML.

Actually, that being said, while maybe 90% of personal Ruby projects don't use Rails, a good half use ActiveRecord in some capacity. I wonder how much that makes it a Rails project, by some reasonable definition.

>Ruby is used in a lot of places where Rails isn't, by a few large companies.

And indirectly by many more who use chef

And many people use Homebrew on OSX, or Linuxbrew (homebrew for Linux -- useful especially for users of beowulf clusters who often don't have rights to install anything except in their home directories), and brew and its formulas are written in Ruby.

I'm surprised it's so low, actually. I'd have guessed 75% as well. And yes, I am a web developer, spending about half my time in rails and the other half split between JS and misc ops stuff.

Thankfully rubyconf explicitly says that it's not the place for rails-focused talks, which results in a lot of great discussion on other aspects + usages of the language. :)

Offtopic: The waffle talk is listed at https://rubyconf.org/program#session-697 and it may have been recorded. These are my favourite kinds of talks at conferences.

Iirc all the talks were recorded by confreaks. They said they'll be up within a month of the conference. That _was_ an excellent talk - the presenter was very engaging. :)

I get your analogy, but I think there are tons of hammer conferences.

There are conferences for most any language with wide adoption. There's also a Slack conference (not "TeamChatConf," but "Slack Frontiers"). There's a GitHub conf (not "OpenSourceDevCon," but "GitHub Universe"). There is a Datadog conference, and so on and so on.

Maybe C just doesn't have one big company, or an ecosystem of smaller ones, that stand to benefit from gathering a bunch of C devs in one place.

I think the core distinction is business-centric conferences vs developer-centric conferences. You've entirely named the former. Those are geared towards using platforms, and while developers are welcome, they aren't the entirety of the target audience; business professionals (e.g. MBAs) are just as first-class at those. But Ruby and Python and Swift conferences are going to be geared entirely towards developers.

I think the hammer analogy works better as conferences for building houses (business conferences) vs power tools (developer conferences) vs hand tools (no conferences), the distinction being how quickly you can get to a finished house. A power tool is something that can make houses quickly. A hand tool can be used quickly and efficiently by someone very experienced with it, but power tools can be used by almost anyone.

Unlike everything you mentioned, C is not a product. Even most programming languages are products these days.

C's product is called UNIX.

Languages were always a product: they either came with the OS one bought, or had to be bought as extra tooling.

I agree with your point but HammerFest 2019 sounds like a freaking blast. I think there is an underserved market there.

Sëë yä thërë düdë!


Haha, I would definitely go there. I went to Donington festival and Wacken while I lived in Europe. Too bad there are no good metal festivals on this side of the world (American continent).

Wow so if you visited that site with no contextual understanding, you would arrive and leave with literally no additional knowledge of what the heck it was! How fantastic!

Seems obvious it's a metal festival in the U.K. As for who HRH is, no clue.

Really? I was already aware of Hammerfest but the text on the homepage makes it sound like something out of Lord Of The Rings!

Sure, C u there!

What a wasted opportunity to not arrange the festival in the actual town of Hammerfest:


There's http://www.hamerfest.nl/

It was a blast! And C actually came up in one conversation...

I agree. I'm mostly a C coder but I've been on the Rust train since pretty early and I remember being pretty confused when they started putting emphasis on the Rust branding with logos, the whole "rustacean" thing, conferences, forums etc... In hindsight I get it but originally there was a bit of a culture shock. I mean I don't wear C T-shirts and as far as I can tell there's not really a standard C logo either.

Maybe we should find a cool demonym like "c-lions" or "c-weeds" or whatever. Then we'll be able to ask the real questions like "is C webscale?".

Now I'm being a bit snarky but I genuinely understand why newer languages need branding nowadays, but in the case of C that ship has sailed long ago, you don't really need to introduce it, it's just C.

In order to avoid becoming a niche language nobody uses due to the network effects of C's ubiquity, and to achieve its goal of completely replacing C as much as is practically feasible, Rust needs buy-in and what economists call pre-committing. That is to say, it needs thought leaders to accept that Rust is the future and commit to building things in Rust, even when it doesn't make sense to do so in the current climate. Hence the heavy emphasis on marketing and RESF psyops.

> "c-lions"

C-lioning - n. Derailing a discussion by "politely" demanding evidence that obviously unsafe programming practices are unsafe and should never be used. E.g., "Can you tell me, exactly, why aliased pointers are bad?"

I'm really sorry, but I'd kind of want to know exactly why aliased pointers are bad.

People often think that book's title is "The C Programming Language" but if you look closely it's really "The Programming Language" and the blue-and-white logo is a linked list diagram. There's a blue circle around the base of the list's "next" pointer, which unfortunately is white on a white background.

By convention, C programmers know it's unsafe to guess where the arrow points, so this has never been a problem.

There's also a theme song: https://i.imgur.com/Yd00GWf.gif

There's a youtube version with singing around somewhere but I can't link from work

JetBrains got there already! https://www.jetbrains.com/clion/



I want a C t-shirt.

I wish people would stop comparing programming languages to hammers. It's a meaningless meme. Yes, they are tools, but comparing them to hammers is 100% ridiculous.

A language is not a hammer. It is a tool. You know what else are tools? Chainsaws, databases, lasers, and tunnel boring machines that cost tens of millions of dollars.

There are plenty of conferences about tools.

You need to go further and accept that programming languages are also not tools.

If they're not tools, then what are they? Where do you draw the line?

They're languages, or notations. Like https://en.wikipedia.org/wiki/Notation_for_differentiation there's different schools of thought on what's easiest to read, while the fundamentals are essentially the same.

Calculus is a tool, but how we write it is just the notation. Similarly, computer programming (or perhaps a subfield like functional programming) is the tool, and a particular language is just the notation for writing it down.

Mental problem solving paradigms.

I don't know; there are Golang, Python, JavaScript, Ruby, and PHP conferences, and probably a lot more, so I'm not sure that argument is valid.

Of the ones you list, only one has a wide focus, IMO. Three of those will essentially be web development conferences, and I don't know what the hell people do with Go these days, but I'll bet it's mostly a handful of things. Python probably has broad enough usage and scope to counter what I said. The exception proves the rule. :-)

> I don't know what the hell people do with Go these days, but I'll bet it's mostly a handful of things

The 2018 Go user survey analysis should be released soon, but I'm guessing you're right that it's only used for a few things; however, those few things are pretty significant in terms of marketshare (distributed systems, backend applications, and general purpose tooling--this hypothesis could probably be checked against the StackOverflow survey). Not much on the front-end or mobile application side nor on the embedded side, but probably making some headway on the AI/ML/data-science side.

In general, I think of Go as Python but with 15 years of hindsight but also 15 years of market catchup. Go does a lot of things better than Python (performance, concurrency, distribution, deployment, etc), but the reverse isn't true. The principal advantage of Python is that it was the lingua franca for the data science niche before the niche became a mainstream competency--this means more people were exposed to Python rapidly, but it also means that people are beginning to build high-quality data science libraries in other languages, including Go, which closes the gap between Python and other languages.

A bunch of us are trying to change that and make Go the lingua franca for data science and machine learning! The libraries du jour are Gonum (https://gonum.org), a numpy/scipy equiv for Go, and Gorgonia (https://gorgonia.org), a PyTorch/Tensorflow equiv. Do help out!

I’m a beginner in Go and not an expert in Python. However Python is a lot easier to develop with than Go: generics, no pointers, standard collections have many convenience functions, package handling, etc.

Saying that Python has "no pointers" is a bit of an oversimplification, somewhere between "technically true, but misleading" and "actually incorrect", depending on how you define your terms and frame the question. Remember that Java has "no pointers" too, it works basically the same way Python does (except for some minor quibbles about primitive types), except if you go to 4.3.1 of the Java SE 11 spec and look for "pointer" you'll find the flat admission that Java has pointers.

The part about pointers that makes programming hard is that you can create multiple references to the same object, and if you don't realize this, you can modify something that you didn't realize you were modifying. In Python this can happen easily enough:

    a = []
    b = a
    a.append(1)
Question: what is the value of b? You might think that this is pretty obvious, but once you can answer this question correctly, you basically understand pointers. In fact, it's not really relevant whether you say that "a" and "b" are pointers or whether you say that they are object references, conceptually, it does not matter what you call them. The fact that Golang has different syntax for pointers and the fact that you can get the address of local variables or member fields is no big deal, since understanding how pointers work is the hard part, and Python already makes you do that.

Package handling in practice I would say is a wash, more or less. Both Python and Golang have multiple different ways of managing packages, depending on your preferences.

It's flat-out incorrect to say that Python has generics. What Python has is dynamic typing. Golang has interface{}, which is equivalent to what Python has, except it requires casting.

Out of curiosity, is b [1]? Or if b receives a copy of a, then b would still be an empty list, i.e. []. Not sure what answer you're expecting.

interface{} is completely different, in that it doesn't allow polymorphism. In Python, I can write add = lambda a, b: a + b, and then call add on ints, floats, strs, or some custom type of my own. In Go, I can pass an interface{} and check if its type is the same as some other type I know about. But no polymorphism.

This is factually incorrect, interface{} allows polymorphism. Maybe you are unaware of how interface{} works but if you want to call a method on it you just do this:

    func myFunc(a interface{}) {
        a.(interface{ f() }).f()
    }
This works for any type "a" which has a method "f" with the correct signature, and this is what I meant when I said “except it requires casting”. The fact that you can’t do this with + is merely a consequence of the fact that + is just the __add__ method in Python, and in Golang + is not a method. The fact that Golang matches method signatures whereas Python only matches method names is not really a substantial difference in my eyes.

I don’t think there’s a strong case to be made here that interface{} is substantially different from Any in Python, again, except for the required cast. And I think we all agree that neither Python nor Golang support generics. (Python supports metaprogramming, and you can implement something similar to generics with metaprogramming, but I don’t consider that to mean that Python supports generics. Type checkers retrofitted onto Python also support generics, but I don’t really consider those to be core Python yet.)

You don't have to check the types; you can say "unpack these variables as ints, and if they aren't ints, blow up," which is what Python does as well. The major difference is that it's implicit in Python and explicit in Go.

> no pointers

FWIW, this is one of the biggest selling points of Golang for me - I love the fact that I'm able to be explicit about reference vs. value, and that it prevents me from shooting myself in the foot when I inevitably screw it up.

It's enough of a differentiator for me that I almost can't put languages with and without pointers in the same mental bucket.

I mean, there's CppCon as mentioned in the article, and C++ has pretty much the same wide reach as C.

Please tell me that CppCon isn't a conference just for the C Preprocessor?

I hate it when you go to one of those Makerfests, and it actually turns out to be about Makefiles.


Actually it's a really good conference for learning about software development beyond C++.

C++ is a wildly unstable language compared to C. Every new standard brings a slew of new coding paradigms and features, often changing the way to write C++ code pretty fundamentally. On top of that, C++ is a hellish beast of a language of such tremendous complexity that you could probably host hundreds of conferences where you'd teach even experienced C++ coders things they don't know. They won't run out of material any time soon.

Meanwhile a C conference would have a hard time coming up with interesting content year after year IMO. You'd end up having to talk about how C is used more than C itself, but then you're no longer really doing C-con, you're doing "embedded-con" or "high performance trading con" etc...

I agree with the parent when they compare C to a hammer, it's a super useful tool and I'm glad that we have it but it's not like I could talk about it for hours (or hear somebody else talk about it for that matter).

This is completely false. C++11 introduced a lot of new paradigms, and that's it. That shift was for the better, and I haven't met a single person who can honestly say that the changes they made turned it into a worse language. Quite the opposite: it's more expressive, readable, and easier to use.

My comment wasn't meant to disparage C++, I'm not saying it's not improving, I'm just pointing out that it's a massively complex language and that it's still evolving in major ways so there's stuff to talk about. C hasn't really introduced significant "new paradigms" in a long while.

To be honest C is kinda boring. Which is fine, but makes it hard to do a C conference.

I think it's more that C programmers don't really consider themselves part of a C culture in the same way that python, go, ruby, php etc programmers do. C programmers usually have another interest like kernel dev, firmware dev, or 3d game engines that is more interesting to them than the language itself.

Fair point, but Go or PHP hardly sound more exciting.

I'm a heavy Go and Python user, and you're right that Go isn't particularly interesting--it doesn't change terribly rapidly, but it's also simple enough that you can do interesting and useful deep-dive sessions into the garbage collector or scheduler or other aspects of the runtime or the toolchain. You could also speculate or discuss proposals for Go2. There have also been significant recent changes to the dependency management scheme which affect pretty much everyone.

I don't think there's much of an analog for C. Most of the interesting stuff in C revolves around things you would think you can depend on but actually break in really weird and interesting ways (misc undefined behavior, sizes for integral types on different architectures, idiosyncrasies of the preprocessor, the way to trivially exploit that pattern you used to avoid buffer overflow exploits). But most of that is just depressing.

I guess, some languages have to fight for their place under the sun, some don't.

That's for different reasons. Go is not a general-purpose language (its creators specifically focus its use on networked apps). PHP is on the decline, and nothing kills excitement like decline.

Boring in what way? Some of the coolest stuff we work with is built with C. C is ace. C is cooler than C++

> I don't know what the hell people do with Go these days

It mostly falls under the rubric of "writing servers."

People are doing basically everything in Go right now! Front-end web development being the exception (though there is GopherJS...). Projects like Docker, Kubernetes, CockroachDB, Consul (and anything Hashicorp does), Mattermost, and etcd are all being written in Go!

Anecdotally, I've heard people using Go to rewrite some of their Ruby services in which more performance is needed.

Python is probably mostly data science, if I had to guess. Jupyter notebooks have made Python excessively used for data science.

That's absolutely false. Firstly, while the advent of Jupyter notebooks has certainly made using Python better for data science, why would you think that would cause people using Python for unrelated purposes to stop using it for those purposes?

Practically, while data science / ML / etc work is a big part of Python, I'd wager a much larger part is still web development work (Django et al.). And of course, just using it as a "scripting" language, or using it to create GUIs, etc.

To be fair, it is my impression that even in data science, Python is more or less a scripting language, and all the libraries which do the heavy lifting computing-wise are written in something else, because speed. Has this changed?

Sure, but for every line of code in such a library there's a hundred (or hundred thousand) lines of code that use that library. A small bunch of people maintain some CUDA code that's underneath a Python wrapper, but great many people write code that uses it.

The underlying libraries may be in C, but the vast majority of data scientists never need to alter the C so it's fair to say most data science code is written in python. Python/Numpy/pandas has enabled data scientists to avoid writing C entirely.

About as much Fortran as C, I believe.

> Python is probably mostly data science if I had to guess.

You think? I'll admit that my view of the Python community is super narrow, but I was thinking Django, me writing a bunch of test scripts, backend stuff. I know data science is a "thing" with Python, but I just didn't imagine that it had become its main use. I guess TIL there's a whole other world of how people use Python, and it might be most of them. :-)

I run an ML team whose primary language is Python, and I've been doing scientific computation in Python (e.g. finite elements, numerical optimisation) for a long time.

My personal impression is that Python is used in more fields than almost any other language, but that every community is only peripherally aware of other communities.

For example, Python is big in Earth Sciences (or so I am told). Almost nobody outside Earth Sciences knows that.

Until maybe 5 years ago, I had no idea that people used Python for web development. It had just never occurred to me. GvR famously had to google what wsgi was.

So I think your final sentence probably applies to most users of Python.

Yup. Python has some really good GIS libraries, so any field that needs to deal with map data will often find itself drawn to Python. And since it's a pretty newb-friendly language, you find that people who wouldn't otherwise program write simple scripts to manipulate the data and maybe output it to a CSV so they can import it into Excel.

At this point, Python is probably the second most used language for system programming (after C), one of the top 4 in web development (no idea on their ordering), around the top for other network protocols (I'd guess 3rd), the top language for data science, in the top 5 of scientific computing, in the top 5 for embedded computing (which sounds completely crazy, but there are few languages there), in the top 4 for game creation (probably 4th), the top one for electronics CAD and among the top for other kinds of CAD, the top one for GIS...

I can't think of other niches right now, but whatever one you think, it's probably on the top 5 there.

Python was popular (though smaller) before NumPy and everything dependent on it came about.

The first version of the project later renamed NumPy was released in 1995, so it's been around almost as long as Python.

Yeah, I started using Python around 2.0 and stopped using it because of 3, but from the beginning of my time using it there was always Numeric or NumPy.

Please don't project your narrow viewpoint onto PL communities.

I don't know about that; Django is arguably still the most productive web framework.

I mean, I like JS as much as everyone else and graphql is better with Apollo than it is with graphene, but I can make a full django app and deploy it before I’m even done looking over the changes to my JS packages that have occurred since I used them last.

Hell, even if you want to use vue or react, I still find django more productive than node, Apollo and prisma. And that’s not mentioning flask.

So I’d wager that python powers a fairly large part of the web, and with good reason.

For as much as this is getting downvoted, I've been to PyCon several years running, and it is correct. While the actual usage of Python is very broad, the PyCon convention itself skews very heavily towards data science applications.

Hammer conferences typically get rolled up with Discuss conferences.


I C what you did there very well.

I've been to JSConf EU a couple of times and it's certainly not a JavaScript conference "about building web sites".

Aside from the non-technical talks there were talks like

* how concurrency works in modern JS engines

* a talk about the upcoming BigInt type

* how fingerprinting attacks can circumvent privacy measures

* HTTP/2 Push

* a look at upcoming language proposals

* a retrospective on the JS build tool for React Native

* a talk about versioning

* experimental time travel debugging support in Firefox

* a talk about error handling

* a talk about module systems in JS

Etc etc.

Sure, many talks are either about general web technologies or about new APIs of the web platform, but there is usually also significant overlap with backend or native mobile and IoT.

When I worked with Python I also went to meetups and followed conferences which always had a wide range of talks from web frameworks to automation, data science and system administration.

Likewise at a Ruby conference I'd also expect talks about tools like Chef or Puppet, which are only tangentially related to web development specifically.

So I don't think your portrayal of conferences is fair at all. "Why aren't there ${language} conferences?" seems like a very sound question, even if ${language} is used in a wide range of industries for very different purposes.

I very much like your answer. And I now realize why I disliked PyCon so much. There are so many things that can be done in Python that I don't necessarily care about. The language is not the most important thing; what I do with it is.

I'd much prefer going to a conference about, let's say, web dev or machine learning. Once at the conference, I might prefer going to a talk that uses Python because I know that language better.

C also isn't new. Did there used to be C conferences?

I tried going to one once, but the address I had for the conference center pointed past the end of the street?!?!?

This is honestly a really, really good joke, I have to say.

Not undefined behavior unless it's more than one element past the end of the array. ;)

Unless dereferenced, of course.

The address is legal, just don't try to go there.

and off a bridge? sounds like a JS conference

I can't resist.

C conference: Address is past the end of the street.

C++ conference: Reading the address requires working out an elaborate series of compile-time templates that finally resolves to the actual address. You go there to find that it's past the end of the street.

JS conference: The address is "the last place we held it". You attempt to figure out what that means, but you didn't attend last year. After some research, you find a copy of last year's invitation. Its address reads "the place where we always hold the conference". After more research and a bit of guesswork, you finally arrive, but nobody else is there. You have a sinking feeling the conference wasn't in Uzbekistan after all.

TS conference: As JS, but the invitation includes helpful explanations of how to find where "the last place we held the conference" is. Unfortunately, one of the explanations is wrong, and you arrive in Uzbekistan.

Java conference: To calculate this year's address, you must first go to a JavaConferenceFactory on the outskirts of town. Talk to the JavaConferenceFactoryManagerSingleton, and give him your JavaConferenceConfiguration. He can give you a JavaConference, which you can cast to a JavaConferenceImpl. The JavaConferenceImpl contains a JavaConferenceAddressLocator, which when passed a WorldMappingSystem can give you a JavaConferenceAddress.

Haskell conference: There is no conference. Holding one could cause side effects.

Go conference: The address is given in a simple, predictable format: interface{}.

There was going to be a Rust conference, but the venue was double-booked.

Python conference: the conference was abruptly cancelled after the social event went awry when a couple of attendees got into a fight over mixed tabs.

The PHP conference was free, but held in that part of town that made you lose all dignity and sense of pride.

Wasn't that the Apple conference?

Apparently there's not enough interest among those who write C. Back in 2012, Brandon Philips (CoreOS CTO & kernel contributor) tried valiantly to organize a C conference, but there weren't enough speakers and presentations.[1]

1. https://web.archive.org/web/20160304013703/http://www.cconf....

Good question, but a majority of this post is links to cppcon videos. (I do appreciate the shoutout to LLVM).

I think the C community:

* Should have a C conference.

* Actually be more aggressive in adding features to the language. For example, there are some really great GNU C extensions that are long overdue to be added to the standard. Compiler vendors should work with the standards bodies more, rather than shipping language extensions, compiler builtins, and various compiler plugins in isolation outside of standards bodies (because then codebases become tightly coupled to the compiler; e.g. the Linux kernel).

I strongly disagree. C is adding language features at a pace other languages could stand to emulate. C is stable, trusted, and supported everywhere for this very reason. Codebases should not be using extensions, and Linux is wrong to do so.

> Codebases should not be using extensions, and Linux is wrong to do so.

Poppycock. Codebases should use any and all tools available to them to accomplish their goals, given the tradeoffs. Using non-standard language extensions is a tradeoff, and if it suits the Linux community and their goals, there's nothing wrong with that.

That is one way non-standard extensions become standard.

For example, the [[maybe_unused]] and similar annotations in C++17 grew out of gcc builtins.

Of course, C++ is much more aggressive in stealing popular features from the community, especially Boost.

Boost came about from a few members of the ISO C++ committee as a way to iterate on library features. It would be a shame if none of that work made its way back into the standard.

Boost was created as a place to beta-test future C++ features. Its whole point is to move stuff from boost:: to std::

The problem with not allowing language extensions to be used is that it promotes the staleness of the language.

Just as human languages evolve as people use and adjust them, programming languages evolve as people try to change them to better suit their needs.

The difference between human languages and programming ones is that for programming languages there's a relatively easy process to get everyone in sync on how they change. :)

I also agree with the grandparent comment that more features should be specified and incorporated into the C standard. The fact they exist and are in wide use says a lot all by itself.

This is the fundamental problem with languages that don’t have a good story for user-written compiler extensions: either they don’t adapt to new use cases or the standard gets horribly bloated or else, like Haskell, one implementation becomes the de facto standard.

C is used more as an API than a language.

> C is adding language features at a pace other languages could stand to emulate.

Remember the Vasa! ;)

> C is stable, trusted, and supported everywhere for this very reason.

See the sibling comment about C not being fully supported everywhere. As long as people continue to make proprietary compilers, there will be implementation deficiencies. That's orthogonal to the language itself.

> Codebases should not be using extensions,

Agree and disagree. Maybe "codebases should only use extensions when it's feasible to provide fallbacks" or something.

> and Linux is wrong to do so.

I'm not sure I agree. There are some worthwhile extensions IMO, which is why I made the point earlier that I think it's time to standardize them.

>> Codebases should not be using extensions, and Linux is wrong to do so.

Why? If you can improve the performance/security/stability of Linux by rewriting functionality without extensions, by all means do so (and send in a patch). Why is Linux "wrong" for this?

The main problem with that is there is no single C community.

Imagine trying to get ANSI C, Win32 C, GNU C and every flavor of embedded C together.

Early 2000s:

The main problem with that is there is no single JavaScript/HTML community.

Imagine trying to get IE, Netscape, Opera and every flavor of embedded HTML viewer together.

2018: "This website only works optimally on Google Chrome"

Seriously, check out TI's C++ support for their DSPs. Last time I checked they were so behind the curve they didn't even support templates.

That might be because they are targeting Embedded C++, which explicitly eschews these features.

Full C++11 support has been on their roadmap for years. http://e2e.ti.com/support/tools/ccs/f/81/t/542711

And IMO, the final Embedded C++ spec is such a terrible idea that it doesn't absolve them. Having written a C++14 RTOS, if I had to give up multiple inheritance, namespaces, sane casts, and templates, I would have just written the damn thing in C to start off with. The value proposition of institutionalized C with Classes doesn't do it for me.

See https://www.youtube.com/watch?v=TYqbgvHfxjM for what you need to do to have C level perf while using the type system to still get you something.

You may want to take a look at Apple's Embedded C++ implementation for I/O Kit, which is grounded strongly in object-oriented programming and inheritance: https://developer.apple.com/library/archive/documentation/De.... Of course, it does drop the very features you seem to enjoy, namely multiple inheritance, namespaces, and templates, but I think there is more reason to this apart from "these things are complicated to implement/hard to use"–I/O Kit was based on NeXTSTEP's DriverKit, which was written in Objective-C and offered none of these features.

Template metaprogramming is very useful on embedded :) https://github.com/kvasir-io/kvasir

I've only glanced at this, but this caught my eye:

> The header only nature of Kvasir and very sophisticated use of the volatile keyword

This isn't a particularly encouraging sign…

Even the functional language communities arrange joint conferences to learn and exchange ideas, so why can't C?

Nonsense, that is the C community. They need to be put in a room together to sort out their differences. The differences exist due to communication breakdown which leads to divergence over time.

What about the various RTOS people or embedded firmware people? Or the mathematicians... or the...

It's like saying, "Let's get everyone who speaks English together to agree on our favorite tea."

Seems more like saying "Let's get a bunch of tea drinkers together, who all drink tea for different reasons, and different types of tea, to discuss tea", which sounds like a great thing for a bunch of people interested in tea.

Maybe the fact that there are many split communities is even evidence that a conference would be beneficial. Get some cross pollination of ideas going on.

Why does C++ not have that problem? It too has many communities but they manage to go to the same conferences without any issues.

Mostly due to the less diverse nature of the C++ communities, I guess. Because C is older and used a lot more at the low level, you get a lot more strange C versions and C users.

Most C developers/users are going to be interested in their own topic, which often already has a conference of its own. Think of the Linux Plumbers Conference, which is just about the kernel; since practically all of it is written in C, that's where you'd find all the people you'd want to meet on your C topic.

There would be no point in going to a conference about using Win32 and C when you are not using Win32, as virtually nothing (besides the basics, and those basics aren't why you'd attend a conference) ports to your own environment.

Because C++ has a lot of extras that can be used across platforms and APIs all the same, and because it has a different development and dispersion ecosystem, you can have a C++-oriented conference without the information being non-portable.

C is effectively frozen in time, thanks largely to Microsoft's decision to deprioritize C support in favor of C++.

C99 is almost 20 years old and MSVC still doesn't fully support it. Forget about C11. If you're a C programmer who cares about portability, you stick with C89.

There's not a lot of point to having a conference about something that is stuck on a 30 year old standard.

I think this is a very debatable claim. MSVC is an insignificant compiler in a lot of industries. In something like scientific computing, everything is written with gcc, clang, and Intel in mind. Sure, you'll find some companies still using MSVC, but it's not an industry standard any more, and it's very unclear how an insignificant, outdated compiler can stop the progress of the world's most fundamental programming language.

C's strength is being the lingua franca of the bottom of the stack. I can believe that there are some applications that know a priori that they will never need to be portable to MSVC. But for any foundational open-source library, closing the door to MSVC is a high price to pay. Someone, sooner or later, will want to use the code on Windows. If you are zlib, ffmpeg, freetype, Lua, LuaJIT, sqlite, libpng, c-ares, OpenSSL, PostgreSQL, glib, gtk, or anything that aspires to be as widely used as these, you stay C89-compatible to support MSVC.

"Use the code on windows" and "support msvc" are orthogonal properties. Gcc and clang both compile windows binaries.

These days you can just use clang on Windows. Latest standard supporting frontend + MSVC ABI compatibility.

MSVC is becoming less and less relevant. People are writing C99 and C11 and MSVC's insistence on breaking portability with modern codebases is incrementally marginalizing it as people switch to compilers that understand their standard compliant code.

Another one that doesn't code on Windows.

It is called MSVC++ for a reason.

Do people still care about MSVC outside whatever is left of the traditionalist Microsoft ecosystem?

But don't people who compile C on Windows usually compile it as C++? That means you in fact get access to some (most?) of C99.

No one cares about Microsoft except a dozen guys caught in the reservation^Wecosystem there who still haven't moved to .NET.

Why would you want to write windows applications in C to begin with?

Heck, if you have a C app and want to port it to Windows, you are probably better off just using WSL.

Related tweet from last week: https://twitter.com/krzyzanowskim/status/1062740353533513728

Observation: Before 2014, iOS conferences were iOS conferences and were mostly about how to use the frameworks. After 2014, iOS conferences renamed themselves Swift conferences and everyone tries to figure out how to use Swift - like this is the main pain point or what?

In my experience, they're still about frameworks mostly, they just brand themselves as Swift conferences to indicate there isn't going to be any Objective-C oriented content presented, so people don't attend with the wrong expectation.

Swift-oriented content at a conference is also going to be popular: unlike a framework, which only a percentage of attendees may use, the language affects every developer.

I wonder if there might still be new Objective-C content that could be presented for iOS development, or if everything is just Swift now.

A lot of major apps are still Objective-C codebases (Facebook, Spotify), though I would imagine everyone will migrate eventually, considering the momentum of open source libraries seems to be heading that way. As someone who works fulltime in Swift and previously in Objective-C, I'm slightly nostalgic for the days of long, descriptive method names, and faster compile times, but overall glad to have migrated.

Objective-C, as a language, is still improving slowly. But it’s pretty mature overall; the majority of Objective-C developers I know actually write Objective-C++ and hence tend to follow the development of new C++ features.

Swift is a complex language, and this allows a lot more people to talk about how their programming paradigm is the best way to write Swift. It’s the same reason we often see articles like “extending Optionals” or “AutoLayout with a DSL” on Hacker News: Swift is so overwhelmingly used for iOS development that people seem to find it useful to specialize it in places where it was designed to be general, and they love to tell other people about it.

Probably useful for justifying costs to managers if your company is currently migrating to Swift, which was pretty common in iOS shops at the time (and still is?).

It's interesting that many of the comments here (7 of 9 top-level comments currently) answer the question in the title, rather than respond to the blog post (which has some interesting points and a great list of relevant talks from conferences... starting to watch some of them right away).

I believe it's mentioned, but BSD conferences are often a great place to find presentations that typically gravitate towards low level systems C programming.

OpenBSD has an events page with a list of past talks given by developers, along with slides, and if it exists, any video.
I agree with his reasons for the lack of C conferences, and present one more: what I've noticed from C developers and the "community" (if there is even something to be called that) is a relative lack of dogmatic cargo-cult thinking and trend-chasing compared to other language communities, the very things I hypothesise are partly responsible for the "we must have a conference" attitude elsewhere. To repurpose a phrase, "the choir doesn't need to be preached to."

Also, as I allude to above, there's not much of a "C community" either. It's all just a bunch of people using the language for very different purposes and with very different styles, and IMHO this diversity is a good thing; it just doesn't make for a population that would want conferences.

All reasons why C has been around for a very long time and will be around for a very long time to come.

Why isn’t there a bash conference? Someone is missing a golden opportunity to organize Bash Bash.

Nobody's willing to shell out for it.

I always wanted to attend a "yes" conference.

I heard there were a lot of arguments at the GNU "cat" conference.

My computer has "true" version 8.26, and I wish there were some place I could go to discover what's new since version 8.25.

The history of GNU true: http://git.savannah.gnu.org/gitweb/?p=coreutils.git;a=histor...

For clarification, you are using coreutils 8.26 which includes true among other things.

Yes, it was in jest. The output of "true --version" is actually quite extensive, including licensing and authorship information and a short explanation of the GPL.


You created a new account just to say this?

$ yes no

This is a good example. C made the transition from "hot new platform" to "infrastructural millstone" before conferences on trendy programming topics were a thing.

The original Bourne Shell was written in an odd preprocessed dialect of C called "Bournegol". I'm sure this was analogous to the kind of innovative techniques people present at conferences today.


Maybe someone else can comment authoritatively on how ideas and techniques were disseminated in the early days of Unix and C. My guess is that it happened through listservs and Usenet, thus reaching even isolated Unix installations that wouldn't have been funded to attend a central conference, if that had been thought of. In Ireland I know that in the 70s there were local meetings of DECUS, for users of Digital Equipment Corp hardware and software. I don't know about Unix.

Maybe they started a dash conference and made a dash for it?

I could see it happening in the future.

There has been a mini resurgence in C from (normally high-level language) programmers who are getting sick of web development and are interested in data-oriented design.

Conference benefits could be:

Pushing the C standard. C11 is barely supported.

Discussing new standards in a forum other than mailing lists. Did you guys know there is already a C17 standard?

Letting old C devs pass on their wisdom to new C devs. C has a lot of pitfalls but also has a lot of neat tricks.

Show off new and existing tools.


The problem is that nobody new to the scene wants to write C anymore. It's not a secure language, it has no conventions and is full of hacks. On top of that it has undefined behavior and is supported differently by different systems.

Nowadays people want to write in Go, Rust, Swift, etc.

Fortunately not everyone makes technical decisions using the wisdom of crowds.

If people would switch to Rust and stop writing C the world would probably be a safer place yes.

Given Rust's spotty platform support[1], I don't see that happening anytime soon.

[1] https://forge.rust-lang.org/platform-support.html

It's possible to write unsafe code in any language.

Come on... C doesn't initialize memory, doesn't manage memory, doesn't verify bounds for buffers, etc.

It's especially unsafe to write C.

That's why we have Java / Rust / Python / C#.

My point.

So we agree that we have different tools with different tradeoffs?

Of course, for now. But I still disagree that writing C without bugs is as easy (or hard) as in other languages. Most people should not write C.

In the same vein - if you happen to be stuck in a problem domain where counting/avoiding allocations is important, the languages listed above (even C++, really) make things very hard. Rust has a decent take on building a safer C++ (with what appears to be C++ levels of complexity). I would like something similar to emerge for C.

C is so much fun though. Having so much direct control over everything, no layers of abstraction clouding your view, is incredibly enjoyable in the right context.

Which scene? You still learn C to get work done with hardware, like it or not.

As strange as it may seem to youngsters, that capability is not unique to C.


Some examples of non-C languages used in production code for doing stuff with hardware.

Not going to spend $250 on an ugly looking IDE that forces me to work on a Windows PC.

Then you'd better not work in the embedded industry, as that is the OS of choice for like 99% of the SDKs, even C-based ones targeted at Eclipse CDT.

Which I guess is also beautiful then.

And you'll struggle upstream if you go that route. Pretty much all vendors for embedded platforms only provide C libraries/drivers/examples/subsystems/documentation.

The compiler vendors I listed are vendors for embedded platforms, they also sell boards.

The problem, as expressed multiple times at C++ conferences, is the devs that are religiously against anything that isn't C89 + language extensions + assembly.

I'm 41 and work in embedded, but thanks. There's approximately zero people in the trade who don't know or never used C. And you certainly know this.

Sorry about that then.

Sure, I know that, but there is this image being sold that C is the only option, when the companies I have listed have been in business for at least 20 years now.

I also know there are a large number of people in the trade who have zero regard for writing secure C code, despite the availability of tooling to help in that regard.

Which is getting worse now that those systems are being plugged into the Internet.

Sure, I've seen embedded work done with anything from assembly to Lisp. But it's all marginal; trying to get into embedded development without learning C would be very career-limiting. For example, I'm not sure how many new projects were started in 2018 with Pascal on PIC, but I suspect not a lot.

Perhaps the most popular non-C development platform these days is Arduino on Atmel 8 bit chips, but it is still really a dumbed down version of C(++).

Actually Arduino is one way I hope that C++ gets more embedded love.

Either that, or certification requirements like AUTOSAR that have switched to C++14.

Looks like Rust is starting to become the hardware language

There is a lot of hype and no industry penetration so far. Things may change certainly, and it's not the worst thing to happen.

> Letting old C devs pass on their wisdom to new C devs. C has a lot of pitfalls but also has a lot of neat tricks.

On this topic, does anyone know any good blogs along these lines? I'd love to read some practical day to day stuff from people that have been working in C for decades, usually all I see from places like HN are weird compiler incompatibilities and rare exceptions that I never have to deal with.

Most of my experience comes from toy projects and horrible decades-old code I deal with at work. The former is great for trying out ideas, but I don't learn how they scale (human-wise) on larger, longer projects, and the latter is more a lesson in what not to do.

The blog this post belongs to has loads of interesting posts on C.

There absolutely are C conferences. They just aren't conferences of this type:

"We have speakers who made some cool stuff in a hyped up new language called C and are excited about it, and sharing the techniques!"

They are of this type:

"We have speakers who will share some approaches for preventing and discovering stability and security flaws in widely deployed low-level, middleware and embedded components and stacks."

You have to read between the lines that a lot of this is actually about C.

C doesn't need to be mentioned in the title because it's the 900 pound, white elephant in the room.

C conferences are also of this type:

"We have a gathering of experts in the domain area surrounding a very specialized software stack (that happens to be written in C, and carries a C API)."

> the 900 pound, white elephant in the room

Interesting phrase, a mixture of at least three phrases:

- "800-pound gorilla" (https://en.wikipedia.org/wiki/800-pound_gorilla) -- something big that can do whatever it wants

- "white elephant" (https://en.wikipedia.org/wiki/White_elephant) -- an expensive but useless gift or thing

- "elephant in the room" (https://en.wikipedia.org/wiki/Elephant_in_the_room) -- something obvious that no one wants to discuss.

It's not clear which meaning you're trying to convey. C is definitely not useless, so I imagine that the "white elephant" part at least is not intended. Note that an adult elephant weighs several thousand pounds, so a 900-pound elephant is probably a baby elephant.

I'd say if you want to attend a C conference, check one of the BSD gatherings. Lots of good C code, everything open source, and a great community

It is an interesting thought. I went to one of the early OSCON conferences in Portland and it was pretty cool to have the Ruby (Matz), Python (Guido), and Perl (Larry Wall) people all at the same conference. I was a budding Python programmer, but sat through sessions on the other languages to get a feel for what they offered.

That was a great conference - amazing attendees spending the evening hanging out around the conference center hacking code and an amazing diversity of presenters.

I go to OSCON often and that atmosphere has never been repeated as things have splintered off into language/application area specific conferences. Something's been lost - at least for the casual or beginner programmer.

Not sure about conferences, but what bothers me is that there is no single acknowledged discussion group about C after Google destroyed newsgroups, and comp.lang.c with them. There should be a very popular Reddit sub about it, yet Reddit is not standard enough among developers, unfortunately.

Right now there is:

* https://www.reddit.com/r/C_Programming/

* https://www.reddit.com/r/c_language/

I'm unsure what you mean by Google "destroying" newsgroups. Newsgroups were decentralized to begin with, the fact that Google purchased Deja News and then didn't... I don't know, turn it into a Reddit-equivalent or something doesn't surprise me. You can still use Google Groups as a Usenet gateway and there are still other Usenet gateways around, although the offerings are getting a bit more scarce and it's no longer a standard part of your ISP's package.

Whenever I look at places like comp.lang.c (which still exists) I remember the bad Usenet experiences I had in the 1990s. It's certainly not worse than it was, in my mind. If anything killed newsgroups, it was Slashdot / Reddit / Stack Overflow and the like.

Maybe "killed" is a strong word. My feeling is that when Google exposed that part of the internet to the web, it became much more interesting for spammers to target Usenet than in the past, which was a very strong contributing factor in Usenet's death. Moreover, once it was no longer a "hidden gem", many felt exposed posting to Usenet now that it was trivially web-accessible.

My feeling is that the web was becoming larger and more accessible every year, and that while Google probably did that more than anyone else, you could equally blame AOL for providing NNTP access to the general public in 1993, or Deja News for starting their archive work in 1995, or Google for their indexing work starting in 2001.

That's just assigning a name and face to economic forces, though. It's the underlying economics that killed Usenet—experts hang out there, so it’s valuable, so non-experts start hanging out there, until the experts leave because there are too many non-experts. This happens to every internet forum; Usenet was just older. Travel back in time and any randomly selected person on the internet was likely to be some kind of expert.

Yes, what you say surely makes a lot of sense. Maybe someone should start some form of forum that is very unpleasant to use, like, just having a command-line interface to access it... and vim as the editor :-D

All programming conferences are C conferences, if you think about it.

In the same sense that all book fairs are forestry expos.

And you know someone is gonna come along and explain that LISP books are written only on tinfoil, so you're completely wrong.

He is completely wrong because paper is sourced from tree farms, not actual forests. Also I love LISP, Scheme, and Emacs.

A tree farm (at least the type that provide pulp for paper) is just a type of forest. Anyway, where I'm from, paper most definitely comes from forests (mostly second-growth) not anything you would call a "tree farm".

I don't see how that follows at all.

If you're implying all programming languages are derived from C, that's simply false. Lisp is based on the lambda calculus and predates C, as one easy counter-example.

If you're implying that all languages run on top of C, that's also false. Theoretical Lisp Machines [0] have been envisioned which do not run C anywhere, and many languages (Rust, Go, etc.) can be built into unikernels that run on real hardware with no C anywhere in the mix.

As far as I can tell, your statement is both pointless and wrong in even a generous interpretation of it.

[0]: https://en.wikipedia.org/wiki/Lisp_machine

Show me a lisp which is used by enough people to fill a conference in 2018 that isn't implemented in C. Show me a lisp machine which was manufactured any time in the past 10 years.

Steel Bank Common Lisp and Clozure Common Lisp are both implemented mostly in Lisp with some C code thrown in. And the European Lisp Symposium always has a large share of Common Lisp programmers.

SBCL (a fork of cmucl) is implemented in common lisp and is the most popular common lisp implementation.

CLOC tells me there's 28391 lines of C and 426525 lines of Lisp in there. Also remember that Lisp has, on average, higher code density than C thanks to advanced macro usage.

Pointing to C code in that repository and saying SBCL is implemented in C is like pointing to https://github.com/sbcl/sbcl/blob/master/binary-distribution... and saying that SBCL is implemented in Bash.

Don't forget that much more of the code is #if'd for different architectures compared to lisp.

The runtime is written in C. That's the important bit. Just because the compiler, standard library, etc, isn't - doesn't mean it's not based on C. You cannot use SBCL without a C compiler.

The installation scripts are written in bash. That's the important bit. Just because the compiler, standard library, etc, isn't - doesn't mean it's not based on bash. You cannot use SBCL without a bash shell. /s

Of course you can use SBCL without a C compiler - you simply use a precompiled version. You can't build SBCL without a C compiler.

Shell scripts just slap commands together. You could just be a human interpreter reading the source and get the job done. C is a much more fundamental requirement of SBCL. Much more so than the way SBCL depends on Python, for example (i.e. it doesn't).

Listen, I get that in the ivory tower of lisp you don't like looking at the little C bricks at the bottom, but the fact is that they exist and C is a hugely important language upon which the entire modern world of computing is built.

Sure, all of us execute stuff on kernels that are written in either C or C++ or C# or Objective-C, utilizing runtimes that are written in the same languages and calling APIs that are defined by the C application binary interface. That family of languages isn't going anywhere, and C is often chosen over raw assembly for practical reasons, such as code maintainability.

Still, I find it hard to put an equality sign between "X has a runtime written in C" and "X is written in C".

I said "implemented" in C. I'd argue that the runtime == the implementation.

The runtime for SBCL is just the garbage collector and parts of the OS interface. It's not like the JVM (or e.g. GNU CLISP), where the runtime is interpreting code.

The OS interface for any program that is targeted at *nix is likely to be written at least partly in C, because POSIX is very hostile to non-C languages, which leaves us with the garbage collector.

If the GC's implementation language is the most important thing for an implementation, then every language implemented mostly in C++ but using bdwgc for garbage collection is actually implemented in C and assembly.

This doesn't really sound sane to me. In this particular case, it would mean that if you throw away SBCL's compiler, SBCL's standard library, SBCL's extensions, and replace them with something completely different, you still get the same Lisp implementation because all of the aforementioned elements aren't a part of what you call the implementation.

Funny you say that, because Windows libc is actually implemented in C++ with extern "C" {} for the ISO C entry points.

Other implementations have parts of the runtime written in Assembler, Lisp, Java, ...

A bunch of languages can be used to write a Lisp runtime.

The ones based on Java run on the JVM, which is always written in C :) The ones written in Lisp, well, at the bottom it's C again. I haven't seen a popular lisp yet with a meaningful amount of assembly in lieu of C.

> The ones written in Lisp

I have some of these exotic Lisp Machines from the 80s which run no C code by default and where the C compiler is optional and written in Lisp. ;-) Obviously exotic and outdated. An alternate world.

> I haven't seen a popular lisp yet with a meaningful amount of assembly in lieu of C.


And, yes, the assembler is written in Lisp, too.

You missed "popular". Very cool, though.

Relative to the Lisp world, Clozure CL is relatively 'popular'. ;-)

Which Java VM?

There are plenty to choose from: implemented fully in Java like Jikes RVM, partially implemented in Java and C++ like OpenJDK, implemented in Java and C like MicroEJ, and lots of others.

Then there is project Metropolis on the OpenJDK, which aims to replace HotSpot with Graal, thus reducing the amount of C++ code in the runtime.

HotSpot is written in C++. I don't know about the other popular JVMs (IBM's J9, Azul's, etc.), but I would guess C++ also.

Isn't oracle vm written in C++?

Ah, fair point.

I bet your C compiler is actually written in C++, unless you happen to use tcc.

Well there's Amber I suppose? Not popular, but not in C at least.


Uses m-expressions instead of s-expressions though.

> Theoretical Lisp Machines [0] have been envisioned which do not run C anywhere

I have one at home. It's not theoretical, but factual. About 10,000 Lisp Machines were sold between the late 70s and early 90s, which probably adds up to $500+ million in sales, in the dollars of the day.

They actually can run C, but that's usually optional.

Okay, I'll amend my position from "Everything is based on C", to "Everything is based on C except for a handful of LISP machines that nobody cares about but a handful of fundamentalists".

At which point someone who collects antique UNIVACs will tell me I'm wrong, too.

C is not the first programming language out there, you know. And for its first decade it wasn't even the most popular. There are still architectures in production and use that you'd struggle to find a conformant C compiler for.

It's really bizarre how you expect the world to revolve around it.

Modern C compilers are written in C++ actually.

IBM z, IBM i and Unisys ClearPath are still quite modern and aren't written in C, but rather in memory-safe systems programming languages.

Then Windows is largely C++ and .NET; even what used to be plain old C has nowadays been migrated to play nicely when compiled with C++ compilers.

Finally there are a couple of embedded and real time OSes that are based on C++, like ARM's mbed or even Arduino libraries.

What on earth are you talking about?

Assembly. Java. Ruby. Python. The list goes on.

Could you explain your rationale behind this statement?

Point me to a compiler or interpreter, any of them, that isn't either written in C or written in something that was written in C. And of course, whatever you play with, it's running on an operating system written in C.

If programming is turtles all the way down, C is the bottom turtle, standing on the solid ground of Von Neumann architecture.

The point is, C is utterly pervasive. Everything that we think is different from C is made from C.

> Point me to a compiler or interpreter, any of them, that isn't either written in C or written in something that was written in C.

Plenty of languages have self-hosting compilers or interpreters available, which are neither written in C nor written in something written in C.


I'm also pretty sure that some are written in C++, whose implementations are usually also written in C++, and which (while closely related to C) is not C.

> And of course, whatever you play with, it's running on an operating system written in C.

That's obviously not the case if what you are playing with is an OS written in not-C. It's actually not entirely true in plenty of cases, because operating systems that use C range from almost entirely C (with some assembly) to C-among-other-things (e.g., z/OS and its mix of PL/X, HLASM, and C/C++).

> If programming is turtles all the way down, C is the bottom turtle

I’m pretty sure the bottom turtle is native machine code and/or processor microcode.

Go is written in Go, which is why you need it installed in the first place before you can run the go compiler.

Bootstrapping problems are interesting/frustrating, especially if you're trying to get a new package into Debian.

If you wanted to build Go from source, what would you build first? What version(s)?

Edit: the documentation is outstanding - https://golang.org/doc/install/source#go14
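For the curious, the chain those docs describe can be sketched like this (paths are illustrative; Go 1.4 is the last release whose toolchain needs only a C compiler):

```shell
# Hedged sketch of bootstrapping Go from source, per golang.org/doc/install/source.
# Step 1: build the Go 1.4 branch, the last C-implemented toolchain,
# using the system C compiler.
git clone https://go.googlesource.com/go "$HOME/go1.4"
(cd "$HOME/go1.4" && git checkout release-branch.go1.4 && cd src && ./make.bash)

# Step 2: point the modern build at that bootstrap toolchain,
# and from here on it's Go compiling Go.
export GOROOT_BOOTSTRAP="$HOME/go1.4"
git clone https://go.googlesource.com/go "$HOME/go-latest"
(cd "$HOME/go-latest/src" && ./make.bash)
```

(Newer Go releases also accept a prebuilt binary toolchain as GOROOT_BOOTSTRAP, which is how most people sidestep the C stage in practice.)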

Except for versions before 1.5, which were written in C.

Spoiler: I hear the first C compilers were probably not written in C, until they were.

The very first C compiler wasn't written in C though.

In that case the first C compiler also didn't compile C.

How exactly is that?

clang, among the most popular C compilers, is itself not written in C, but rather C++.

I think the point that he is making is that most popular modern implementations of assemblers, Java compilers/JVMs, and Ruby and Python interpreters are written in C, or in a language which itself was bootstrapped from C, e.g. C++.

I guess the point is that the other languages are based on C. From your list CPython, CRuby and the GNU assembler are written in C. Not sure about Java.

Edit: not necessarily agreeing, there must be some languages that do not rely on C.

Java runs on the JVM, which is probably written in C++ (which is C, when you start digging). The widely-used Eclipse J9 JVM is written in C++.

> C++ (which is C, when you start digging).

C++ is not C (it was once implemented as a preprocessor in front of a C compiler, but that's not the same as being C, and it's now more likely to be implemented as a self-hosting compiler).

Ironically enough, the gcc suite itself is partially implemented in C++ these days.

