Cppfront, Herb Sutter's proposal for a new C++ syntax (github.com/hsutter)
555 points by pjmlp on Sept 17, 2022 | 545 comments



As someone who dropped C++ almost entirely a decade ago because I couldn't deal with both the immense complexity and the legacy baggage of C++, this is pretty exciting honestly.

What I find a bit strange is that he explains that he's been working on this since 2015 (with "most of the 'syntax 2' design work in 2015-16") and he doesn't want to give documentation because "that would imply this project is intended for others to use — if it someday becomes ready for that, I'll post more docs."

Why this reticence to open the project and have others play with it and, potentially, contribute to it? I can't imagine a new language becoming successful with this mindset.


I think it's more about losing control. As soon as the documentation is open, there is a lot more pressure to explain the decisions made in an accurate or accepted fashion. Add some good old bike-shedding, or contributions that miss the point, and a project that was once fun becomes tedious to work on.


It also adds a lot of drag to iteration to have to go back and rework documentation.

On this kind of project, it's more than syntax-deep. Documentation would have a lot of reasoning and justification in it, which can take deep work to keep up to date.


Sutter isn't saying there isn't documentation, but that he's not releasing it.


Documenting something for yourself and documentation that is ready for public release are two very different animals.


Yeah, I couldn't agree more. This is exactly what's happening in ISO C++, and Herb should be well aware of its problems, since many of his good proposals repeatedly failed to proceed thanks to a WG21 rife with bureaucracy and nitpicking. This kind of skunkworks project needs to be fully driven by a competent individual or small group until it can finally demonstrate a good value proposition and build trust through the project itself.


And a common cause of getting stuck in local maxima (specifically, in committee/consensus-driven projects) is that each individual decomposed feature doesn't obviously deliver value on its own merits, even though the sum of a group of such features can reach a higher local maximum.

Aka, why argue about feature A, when the goal and long game is how A+B+C+D works?


A lot of the decisions made, BTW, have been explained in proposals to the C++ committee that came out of this work.


> Why this reticence to open the project and have others play with it

I heard this recently: open projects are like Good Will Hunting but in reverse. You start as a respected genius, and end up as a janitor getting into fights.


> As someone who dropped C++ almost entirely a decade ago because I couldn't deal with both the immense complexity and the legacy baggage of C++, this is pretty exciting honestly.

This seems to be just a syntax frontend for C++. The underlying semantics stay the same.

BTW, if you dropped C++ a decade ago, you should now look into the modern improvements (C++20).


> Important disclaimer: This isn't about 'just a pretty syntax,' it's about fixing semantics


> This seems to be just a syntax frontend for C++. The underlying semantics stay the same.

That is very much not true. In fact, the syntax simplification is of less interest to me than the clarification/simplification of semantics. Most of the dangerous/confusing parts of C++ come from the necessary C compatibility.

So for example, you don’t have to worry about the bug-inducing C integer promotion rules.
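For illustration, a minimal sketch of the promotion rules in question (this shows today's C/C++ behavior, i.e. what cpp2 aims to sidestep):

    #include <cstdint>
    #include <iostream>

    int main() {
        // Both uint8_t operands promote to (signed) int before the
        // subtraction, so the result is -1, not the 255 you might expect.
        std::uint8_t a = 0, b = 1;
        auto c = a - b;               // c is int, value -1
        std::cout << c << '\n';

        // Mixing signed and unsigned silently converts the signed side.
        int x = -1;
        unsigned y = 1;
        std::cout << (x < y) << '\n'; // prints 0: x becomes a huge unsigned
    }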


> if you dropped C++ a decade ago …

Fair. Still, the more recent C++ improvements tend to roll out slowly across the ecosystem and work environments.


What does C++20 change in day-to-day work? I know they added "stuff" (as always), but is there anything that really benefits >80% of all C++ programmers?


Compared to a decade ago a bunch of stuff in no particular order:

1) malloc/new & free/delete are now solidly legacy territory of the "unless it's a placement `new`, you're likely doing it wrong". make_unique && make_shared all day long.

2) templates that are understandable by mortals thanks to `if constexpr` instead of SFINAE nightmares.

3) static_asserts

4) lambdas (which is going to get way more useful with https://en.cppreference.com/w/cpp/utility/functional/move_on... )

5) std::format

6) attributes like [[nodiscard]] being standard

7) std::move making passing std::vector & similar containers around no longer terrifying (this is also what really helps #1 be possible)

I'm sure I missed some stuff, but I reach for all of those regularly.
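For a feel of how a few of these combine, here's a small hedged sketch (assuming a C++20 compiler with <format> support; the names are made up):

    #include <cstdio>
    #include <format>
    #include <memory>
    #include <utility>
    #include <vector>

    struct Widget { int id; };

    // [[nodiscard]] turns a silently dropped return value into a warning.
    [[nodiscard]] std::unique_ptr<Widget> make_widget(int id) {
        return std::make_unique<Widget>(id);   // no raw new/delete
    }

    template <typename T>
    void describe(const T& value) {
        // `if constexpr` plus a requires-expression instead of SFINAE.
        if constexpr (requires { value.id; })
            std::puts(std::format("widget #{}", value.id).c_str());
        else
            std::puts("not a widget");
    }

    int main() {
        auto w = make_widget(42);
        describe(*w);

        std::vector<std::unique_ptr<Widget>> owners;
        owners.push_back(std::move(w));        // moved, not copied
    }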


1) I'm always wary of something mature, with documented and extensive historical issues, being handwaved away with "it's all fixed in this or the next release".

2) In light of that, your comment is unintentionally hilarious. C++ became a syntax swamp 15 years ago and it is getting worse with every release. I anticipate it getting worse still as Rust, a hilarious syntax soup in its own right, continues to march forward.


> 1) malloc/new & free/delete are now solidly legacy territory of the "unless it's a placement `new`, you're likely doing it wrong". make_unique && make_shared all day long.

This is awfully wrong. Smart pointers might be a convenient way to express ownership and manage object lifetimes, but they are far from the only way.

Take for instance Qt, which relies heavily on new-ing up objects to this day, as it has its own ownership and object-lifetime management system.


Swapping out unique_ptr/shared_ptr for some other smart pointer container doesn't negate what I said.

New/delete are still basically deprecated territory. Qt isn't any different here, other than that they seem to be behind the curve with make() variants of their pointer containers. So you'd want to make your own of those, and then you're back in the world of "new/delete are deprecated".


You can always create your own “algorithms”. Our codebase has one that creates a new QObject that is a child of an existing one, returning a (raw) pointer to the new child object. That’s a case of not “no raw `new`”, but the next-best thing: isolating raw `new` to the one algorithm that does just that, and letting all other code depend on that.
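(For concreteness, a hypothetical sketch of such a helper, not their actual code, assuming Qt:)

    #include <QObject>
    #include <utility>

    // The one place raw `new` is allowed to live; Qt's parent-child
    // ownership deletes the child when the parent is destroyed.
    template <typename T, typename... Args>
    T* makeChild(QObject* parent, Args&&... args) {
        T* child = new T(std::forward<Args>(args)...);
        child->setParent(parent);
        return child;
    }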


What makes move_only_functions useful?


std::function is literally an object that can be invoked like a function. As an object, it can contain multiple values.

Move-only values are commonly used for unique resources that should not be duplicated.

But, what if you want to enclose a move-only value in a std::function? Are you simply out of luck and must give up on your dreams? A move-only function lets you get that work done.
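A minimal sketch, assuming a C++23 standard library that ships std::move_only_function:

    #include <functional>
    #include <memory>

    int main() {
        auto p = std::make_unique<int>(42);

        // std::function must be copyable, so a move-only capture won't compile:
        // std::function<int()> broken = [p = std::move(p)] { return *p; };

        // C++23 std::move_only_function accepts it:
        std::move_only_function<int()> f = [p = std::move(p)] { return *p; };
        return f();
    }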


Most of my lambda usage is for work queues (think Java executors). With the std library alone, the easiest way to build that is a vector of std::function. But then your lambdas can't capture unique_ptrs, even though lambdas themselves have supported move captures for a long, long time now. You can do this with std::packaged_task, but that has internal heap allocations for the future, which, if you don't need it, is just overhead, and not cheap overhead at that.
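Concretely, a toy version of that queue as a hedged sketch (std::move_only_function is C++23):

    #include <functional>
    #include <memory>
    #include <vector>

    int main() {
        // With std::vector<std::function<void()>> the push below would not
        // compile: the unique_ptr capture makes the lambda move-only, and
        // std::function requires copyable callables.
        std::vector<std::move_only_function<void()>> queue;

        auto work = std::make_unique<int>(7);
        queue.emplace_back([w = std::move(work)] { /* ...use *w... */ });

        for (auto& task : queue) task();   // drain the queue
    }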


auto


To me, concepts and coroutines really reduce the amount of boilerplate needed.

- just the ability of doing

     if constexpr(requires { T::some_member; }) { ... }
to check if a type has a member variable / function / whatever makes code infinitely clearer than the previous mess requiring overloads.

- coroutines finally make it possible to properly abstract the data structures used by a type's implementation, e.g. you no longer need to spill out to client code that your type stores stuff in std::vector or std::array or boost::small_vector or std::list etc., and they simplify async code very well. For instance with Qt: https://qcoro.dvratil.cz/reference/core/qprocess/#examples

- three-way comparison and automatic generation of comparison/equality functions is really great for removing needless boilerplate

- void f(auto arg) { ... } instead of template<typename T> void f(T arg) { ... }

- foo{.designated = 123, .init = "456" }; (although it was already more or less supported on every relevant compiler for years)
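A short sketch pulling a few of those together (C++20; the names are made up):

    #include <compare>
    #include <string>

    struct Config {
        int port = 0;
        std::string host;
        // Three-way comparison: one defaulted operator generates the lot.
        auto operator<=>(const Config&) const = default;
    };

    // Abbreviated function template, same as template<typename T> void log(const T&).
    void log(const auto& cfg) { /* ... */ }

    int main() {
        // Designated initializers name the fields at the call site.
        Config a{.port = 8080, .host = "example.org"};
        Config b{.port = 8081, .host = "example.org"};
        log(a);
        return a < b ? 0 : 1;
    }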


Sadly `if constexpr (requires {})` was not implemented in MSVC until VS2022, which was not even released when I tried using it (I backed out because it wouldn't build on MSVC), and which is still a newer and less-adopted compiler than 2019: https://developercommunity.visualstudio.com/t/requires-claus...


Designated initializers are a trap: you can't require certain fields are set... this has been a big pain point for my team. Guess we should have used the builder pattern or something.


> you can't require certain fields are set

There is a way, but it's cursed:

https://godbolt.org/z/fzc6WEz5e
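(One possible shape for such a trick, not necessarily what the link shows: give the field a type that cannot be default-constructed.)

    struct Required {
        int value;
        Required() = delete;           // omitting the field won't compile
        Required(int v) : value(v) {}
    };

    struct Widget {
        Required id;                   // must be named in the initializer
        int flags = 0;                 // genuinely optional
    };

    int main() {
        Widget ok{.id = 42};           // fine
        // Widget bad{.flags = 1};     // error: Required() is deleted
        return ok.id.value;
    }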


Clever. Wrap it in a concept, and it might have legs.


I would say constructors are what should be used for such kind of compound initializations.

C# is going the route of allowing required/optional fields in designated initializers, and from my point of view it is just a mess for what should be a constructor call.


In principle Modules would be huge, but in practice you can't use them as the compiler you have doesn't implement them yet.

C++ 20 gets a format feature that's basically a weaker {fmt} library but as part of the standard library. A string formatting feature with reasonable performance and safety as you might be used to from other modern languages.

Concepts are nice: they're basically a way to express duck typing. They are often described as if you're getting more than duck typing, but that's all you actually get. Still, duck typing is useful, and Concepts should have decent error messages, whereas doing the equivalent with SFINAE is a recipe for awful diagnostic output.
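For readers who haven't used them, a minimal sketch of what that looks like (C++20):

    #include <concepts>
    #include <cstdio>

    // A concept names a compile-time predicate on a type's "shape":
    // structural (duck) typing, with far better diagnostics than SFINAE.
    template <typename T>
    concept Drawable = requires(T t) {
        t.draw();                                      // must have draw()
        { t.area() } -> std::convertible_to<double>;   // and a numeric area()
    };

    struct Circle {
        void draw() const { std::puts("circle"); }
        double area() const { return 3.14159; }
    };

    void render(const Drawable auto& shape) { shape.draw(); }

    int main() { render(Circle{}); }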


Please do not post falsehoods seeking to mislead readers. You have been corrected on this point several times before.

In fact, C++ Concepts can be used for early enforcement of compile-time duck typing, just with better error messages. In the Standard library they are commonly used that way, for backward compatibility.

But Concepts can also implement named-property matching. It is entirely up to the designer of a library how much of each to present.


> You have been corrected on this point several times before.

You have repeatedly (across numerous HN threads) insisted that Concepts aren't just duck typing but that doesn't make it so.

> But Concepts can also implement named-property matching.

That's still just duck typing (and it's awkward to do properly). Contrast with the (eventually abandoned) C++ 0x Concepts, which actually had semantic weight to them. Concept Maps allowed C++ 0x Concepts to have some sort of chance in a language already in widespread use, because you could adapt an existing type to the Concept using a Map.

But C++ 20 Concepts doesn't offer any of that, after about a decade it's just duck typing.


Templates are duck typing.

Concepts are just the equivalent of if instanceof.


Concepts can be used for instanceof. But they can be used in other ways, too.


Not really a C++ expert, but two things I saw in the documentation that came in useful were the "contains" method of maps and the std::span class for passing around contiguous sequences of objects without worrying about creating copies of vectors.
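Both are small but handy; a quick sketch (C++20):

    #include <map>
    #include <span>
    #include <string>
    #include <vector>

    // std::span views any contiguous sequence without copying it.
    double sum(std::span<const double> values) {
        double total = 0;
        for (double v : values) total += v;
        return total;
    }

    int main() {
        std::map<std::string, int> m{{"a", 1}};
        bool has = m.contains("a");    // replaces m.find("a") != m.end()

        std::vector<double> v{1.0, 2.0, 3.0};
        return has && sum(v) == 6.0 ? 0 : 1;
    }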


Because this isn't a new language which is attempting to become successful. This is a playground to experiment with ways to evolve C++ into a better language.


All the more reason to have people play with it, no?


Other people playing with and using it decrease his ability to make breaking changes freely.


It seems like other people should come up with their own "playground" for experimenting?


Becoming a successful language does not seem to be a goal of the project:

"Cppfront is a personal experimental compiler from an experimental C++ 'syntax 2' to today's 'syntax 1,' to learn some things, prove out some concepts, and share some ideas."


And this work has resulted in things that were discussed or even got into subsequent C++ standards.


Pros:

- Transpiling to C++ means it will work wherever C++ works. On any toolchain. With relatively little effort. It shouldn't be hard to add CMake and bazel support, for instance. Even mostly dumb makefiles should be workable.

- Approaching the community this way could avoid fracturing the community more than it already is.

- Targeting specific success metrics (like CVE causes) could provide focus around design and requirements

- Seems simpler and easier to learn and teach.

Cons:

- The pitch assumes C++ modules are implemented, ready to use, and reasonably accessible.

- There will be a lot of tooling to reinvent: static analyzers, syntax highlighters, etc.

- At least for a while, debugging cpp2 problems will involve juggling multiple parallel realities: cpp2 code, C++ code, and actual program behavior (i.e. compiled binaries).

- Doesn't address other major C++ ecosystem problems like build times, packaging, build systems, etc. cpp2 could make them slightly worse.


This is all true, but I feel it sort of misses the point a bit. The goal seems to be mainly to use cppfront as a test bed for new language syntax and semantics, where most of the ideas will eventually make their way into the C++ standard or into existing compilers.

The main reason not to use it in production is that it's not meant to be used in production and makes no attempt to be suitable for production use or to avoid breaking changes, so the stuff in your cons list doesn't really apply. It's kind of like looking at someone's scale model of a proposed city layout and saying that a pro is that tiny plastic houses are cheap to manufacture at scale but a con is that the houses may be too small to live in.

But all of your points will become relevant if cppfront eventually gets to a state where it's a serious transpiler meant to be used in production.


I don't think this analogy quite works. If all you wanted to do was evolve the existing C++, then sure, you wouldn't need any of this to be remotely production-ready. But to me, part or most of what makes this interesting is the possibility of backwards-incompatible, radical simplifications to the C++ syntax and/or feature set. Those really are only meaningful if they get to a production-ready state. And by being backwards-incompatible, they inherently cannot just be bolted onto the existing C++ language spec (as if we wanted to anyway---the spec is far too complex as it is).

I will say that it's impressive how much of this has already made it into C++ language specs. But given that Sutter has chosen to release this now, I get the sense that of what remains, large portions cannot continue to simply be done that way.


I agree, and would like to note that there isn't a need to simplify C++ imo. It's folly from a business and productivity standpoint to use one language, especially C++, to do everything from writing web apps to doing exploratory data analysis.

Besides, the C++ community seems to overlap with academia by way of its feature updates, reveling in complexity (e.g. "C++20 ranges are totally awesome: memory-safe and performant iterators!").


I've been using my own little Go (subset / my own extensions) -> C++ compiler -- https://github.com/nikki93/gx -- and found it to be a fun way to add some guardrails and nicer syntax over C++ usage. You get Go's package system and the syntax analyzers / syntax highlighters etc. just work.


In his talk he demos mixing syntax 1 and 2 (his terms) in the same file. Preprocessor macros work fine unless you enable "pure" mode.

He also demos error messages, and was able to get errors from a C++ compiler (MSVC) that point to his cpp2 file with a line number and a readable error.


> Transpiling to C++ means it will work wherever C++ works.

This approach is similar to Rust's editions. IIUC, nearly all of the differences between editions are compartmentalized in the frontend, deprecating syntax and rewriting old-edition code to rustc's current IR.


Rust's Editions specifically only change the concrete syntax, although there have been hacks to make some stuff that would otherwise be impossible become possible by retrospectively making how things used to work a special case, and of course if some feature needs syntax then you can't use it from an Edition which doesn't have that syntax (e.g. async doesn't exist in Rust 2015 edition)

Here's the guide explaining the most significant hack:

https://doc.rust-lang.org/edition-guide/rust-2021/IntoIterat...

There was a proposal to do Editions for C++, Epochs, https://www.open-std.org/jtc1/sc22/wg21/docs/papers/2019/p18... but it was not accepted.


Rust editions can technically change some aspects of the language semantics, but this ability is used sparingly. An example is how the `iter` method on fixed-size arrays changed in the most recent edition. As long as code from all editions can still be used together, a semantic change is possible.


> An example is how the `iter` method on fixed-size arrays changed in the most recent edition.

Can you explain?

The array type doesn't have an iter() method. It will coerce into a slice, which does have an iter() method, but doesn't care whether it's an array we're slicing.


I was misremembering, it was `into_iter` rather than `iter` that was changed. The documentation for the change can be found here: https://doc.rust-lang.org/edition-guide/rust-2021/IntoIterat...


That's not what they're talking about here. They're saying that any system with a C++ compiler is a target for this new language, as opposed to Rust, which only runs on systems with a LLVM backend.


Yes. This. Assuming some niche embedded cross compilation toolchain uses C++23, it would support cpp2.


Transpilation has historically made source-level debugging more difficult. It's a huge advantage to have a language that you can actually debug, and this will clearly make that even harder. I feel this is very developer-hostile.


The static analysis should be able to remain because a lot of the concepts he has implemented here are proposed/implemented for "syntax 1" as well, but they require attributes or specific styles. Therefore, the static analyzer should be written for regular C++23 and will support the defaults of "syntax 2".

As far as support in compiled libraries is concerned, that would depend on how those features are implemented in C++23 anyway.


Well, existing static analysis would break or need to add support for cpp2. In the case of pure cpp2, simpler tools could be written, but they would be new code.


> Doesn't address other major C++ ecosystem problems like build times, packaging, build systems, etc. cpp2 could make them slightly worse.

I assume that as long as the cpp2 syntax is clean, you could improve on compile times. In theory, cpp2 files could compile much more quickly than C++, so as you migrate code to cpp2 your builds speed up?

It all depends on how well Sutter has thought through the cpp2 language design I suppose.


I said that because the user surveys conducted by ISO clearly identify dependency management and build times as the biggest pain points for C++ engineers. Not every proposal has to work on those problems, but other than parsing (i.e., language-designer tech), tooling seems like an afterthought in this case. Just like all of the big ideas coming from ISO leadership.

Big reasons people look for non-C++ solutions include better build times, dependency management, etc.


Modules are a huge benefit for compile times


I've heard discussions suggesting that C++ modules don't really deliver the performance benefits initially claimed when the dependency graph is huge and you have a large number of threads. Some links:

- https://www.open-std.org/jtc1/sc22/wg21/docs/papers/2019/p14...

- https://vector-of-bool.github.io/2019/01/27/modules-doa.html

Are there any notable new updates related to this controversy?


Yes. Herb mentions in his talk that `import std;` on VC++ is faster than just doing `#include <iostream>`, even though the module version brings in the complete standard library.
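For reference, the whole usage is one line (C++23 standard library modules; needs a toolchain that ships and pre-builds the `std` module, e.g. recent MSVC):

    // One import instead of many #includes.
    import std;

    int main() {
        std::cout << "hello, modules\n";
    }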


But it's adding an extra compilation step. The only way it will improve compile times is if it avoids the need to recompile all source files referring to a header that's only been modified to add/remove/modify private methods/data in a class (even data is an issue if objects of that class are instantiated on the stack). In principle it might be possible for an extra transpilation step to help there if you also take over the "make"-style dependency management. I'm guessing C++ modules are supposed to help address this too?


The idea is not to add a compilation step. This is a proof of concept and a playground. If it works out, it's not going to stay a transpiler to Cpp1 forever.


Sure, but unless you exclusively use the new syntax (which should be slightly faster to parse, though I'm not sure what % of total compilation time parsing constitutes), it's still going to have to do at least the same amount of work, if not more (given there's now more that has to be inferred by the compiler rather than specified explicitly). But from experience, the main issue with C++ compilation time is modifying header files.


Extra con: how do you write in the new syntax when using existing C or C++ libraries?

Imagine writing something using OpenSSL; it might look kind of weird or break the flow of the new syntax, just because of its legacy.


> Transpiling to C++ means it will work wherever C++ works.

Only if you use this Cpp2 thing exclusively, because the C++ you write may not play nice with the C++ this transpiles into.


It supports the big 3 (GCC, Clang, MSVC).


I didn't mean compiler support; I meant that the transpilation result might not follow the coding conventions of your C++ codebase.


Who cares if it does? It's just an intermediate product of compiling the code with cppfront. GCC and Clang both generate assembly files and then call the assembler on them -- I doubt that assembly code matches your coding conventions either.


> Transpiling to C++ means it will work wherever C++ works.

It also means you don't really benefit from any of the new semantics. Why do all of that static analysis work to establish safety, and then throw the results away by writing out clunky old C++? It all makes very little sense.


This doesn't make sense at all. If you prove that a program is safe in language A, it doesn't magically become unsafe when transpiled to language B. Every language works by clunking out instruction sets mostly devised in the 1970s/80s; this doesn't remove the safety properties of the higher-level language.


For a whole program, yes. But a library that's written for language A may well rely on arbitrarily complex preconditions for safety that you have no hope of establishing within language B. Rust devs get bitten by this all the time when trying to reuse "safe" library code within an unsafe context. (That's one key reason for wanting e.g. Carbon as something that isn't just plain Rust.)


Off the top of your head, do you have simple examples of this? I know a bit of Rust but I assumed (apparently wrongly!) that safe-within-unsafe should just work.


I'm not sure what the grandparent post is referring to, but:

- In general, "unsafe" code inside a module may depend on invariants maintained by "safe" code in the same module. For example, the "length" field in a Vec can be changed by the safe internals of Vec. But if the safe code in Vec sets an invalid "length" value, then unsafe code relying on "length" might fail. So once you find an unsafe block, you may need to audit some of the surrounding safe code in the same module.

- Unsafe Rust is actually slightly less forgiving than C or C++, partly because Rust is allowed to set "noalias" on lots of immutable references, IIRC. The Rustonomicon talks a lot about issues like this.


It will work if your unsafe code does not actually opt in to any unsafe features, in which case you would not actually need an unsafe block. If it does use any of those, however, it's very easy to break the expected preconditions of safe code and create unsoundness, e.g. around references (since code written in safe Rust has little use for raw pointers) or properly initialized data (Rust expects you to use MaybeUninit<> whenever data may not be properly initialized, but that's a mere wrapper, which can only be removed by unsafe code).


So if your unsafe code hits an UB then your safe code can be broken…


Of course, undefined behavior anywhere in the program automatically makes the entire program invalid (this is how undefined behavior works in C and C++ compilers as well). But that's not what the parent commenters are talking about.

An `unsafe` block in Rust represents a place where the ordinary invariants of the language may be violated. This is immensely useful for auditing, documentation, and manually verifying that the program acts as you expect. But it's a common mistake to assume that this also means that future changes to non-unsafe blocks cannot invalidate existing unsafe blocks. While it is always necessary for an unsafe block to exist as a sort of "root cause" of unsafety, safety invariants can rely on things that are merely reachable from unsafe blocks.

In practice, this means that the boundary for safety in Rust is not the unsafe block itself, but rather the boundary of the module that contains the unsafe block. This also means that, if you're writing unsafe blocks in Rust, it behooves you to make their containing modules as small as possible in order to reduce the amount of things the unsafe block can reach, and therefore reduce the number of changes that might accidentally change an assumption that an unsafe block is relying upon.


I know about that (even though it's a worthwhile addition to this thread for other readers), but I don't think it's what they had in mind, since they were talking about “Rust devs get bitten by this all the time when trying to reuse "safe" library code within an unsafe context” in their original comment[1], and I really can't see what they are talking about except some variation around “from C++, I passed some uninitialized memory to a Rust library and Rust went boom” but maybe I'm just misunderstanding.

[1]: https://news.ycombinator.com/item?id=32878775


Isn't it easy to call safe functions from unsafe Rust in ways that, if attempted from safe Rust, would lead to a compilation error? For example, accidentally passing a safe function two mut pointers to the same object, which normal Rust ownership wouldn't allow?


Technically speaking it is OK to pass two aliasing mut _pointers_ to a (safe) Rust function because safe Rust can't do anything dangerous with raw pointers; both dereferencing and writing through a raw pointer require unsafe. If we are talking about references instead, creating two mutable references to the same object in Rust (unsafe or not) immediately causes UB.


I can't think of any such thing, no. Unsafe Rust doesn't turn off the borrow checker, it just lets you do a handful of operations that you can't do otherwise. The only way I could see that happening would be if you somehow violated ownership invariants in the unsafe block, which is already forbidden.


That's exactly how Typescript works. The point is that the type-safety checks are done at transpilation time, and that step fails if you've violated them. So I'm assuming something similar for cppfront - it will fail to transpile if your code doesn't pass the additional safety checks.


Because you don't actually work with the C++ output I guess? Same with Kotlin/Java — there's no difference in nullable vs non-nullable types for example. But as long as your codebase is 100% Kotlin, you get proper null safety, and there's no need to fall back to what Java has


How are any of the guarantees lost as long as the resulting C++ code isn't touched?


Transpiling and compiling are turtles all the way down.

It's a bit more complicated than, say, typescript to JavaScript, but in spirit is not so very different, right?


I've been engineering video games in C++ for almost 20 years, working on AAA games, and I can tell you that modern C++ is not very appealing to the vast majority of our industry. Modern C++ is overcomplicated, and from what I can see, all the best software engineers I've met in my career write very simple C++98 code, maybe a bit of C++11, but that's it. They keep it simple. They don't need a move constructor because they've already been clever enough to solve the problem from a different angle. Modern C++ is for experienced C++ developers, and it's solving problems that experienced developers already know how to solve without all this extra complexity. And because of that, it's keeping away young developers... This new C++ syntax is just a gimmick that we don't need. The C++ community seems to be feeding itself by creating a new standard every couple of years. It's just disappointing...


C++98 is a miserable language; I have no idea who would actually be resistant to improving it. I fucking hated it. C++14 and beyond are such a relief. Move semantics aren't fun, but they are necessary.

This kind of claim is extremely common on HN but utterly foreign to someone who actually writes C++ for a living.


I don't like replying to this type of comment, but I'd like to point out that you're replying to someone who likely grew up writing 6502 assembly (or the like). Furthermore, even mediocre engine programmers have an extremely good grasp of memory management et al.

I also don't think they're arguing cxx98 is a good language, especially by today's standards.

The argument they're making is that the tack C++ has taken in more modern revisions is actually _worse_ than writing code the shitty 1998 way. At least for the axes games care about.

You mentioned move semantics, which is a great example. For a lot of game/engine code, move semantics just add a pile of complexity for negligible gain. I.e., why not just use a pointer?


> The argument they're making is that the tack C++ has taken in more modern revisions is actually _worse_ than writing code the shitty 1998 way.

They didn’t have any argument. Just old man yelling at cloud.

> You mentioned move semantics, which is a great example. For a lot of game/engine code, move semantics just add a pile of complexity for negligible gain. I.e., why not just use a pointer?

Because it is error prone.


Once you're passing pointers around, you have to manually track the lifetimes of the things they point to.


I too have written C++ for years---I worked on online game server frameworks---and you are partially right: C++98 is not for mere mortals. But we still tended to use only the parts of C++11 and C++14 we saw fit. This is the point: not every new C++ feature is relevant to at least some of us, and the relevant fraction is ever diminishing.


> we still tended to use only the parts of C++11 and C++14 we saw fit.

I think everybody does this and it's totally ok. Nobody needs all of C++.


C++98 is the reason Java came to power.


No; Java's success is entirely due to Marketing.


> Java's success is entirely due to Marketing.

Congratulations, this is the stupidest thing I have read this week.

Of course the 20 years of enormous popularity and huge success in multiple industries must be due to marketing. I mean, what else.


Congratulations, you just proved that you know nothing about why a product succeeds/fails in the market.

It was the initial push (with gobs of money) given by Sun that gave Java the momentum. No language was ever pushed so hard in the marketplace by any other organization. Without that push, the language would never have had the popularity it enjoys today. There is nothing inherently "superior" about the Java language. Far better languages have fallen into obscurity because they did not get the publicity that Java did (e.g. Eiffel).


That is not how logic works. "Better languages with less marketing are less popular than Java" means that "marketing matters" and that "quality isn't everything", not that "only marketing matters".


And step 1 in marketing a new language is a cute mascot, obviously.


Oh no. I was there. Cross platform C++ was hard. Cross platform Java was easy.


That wasn't it; there was nothing revolutionary about it. It was the most hyped/marketed language in history[1]. Sun threw ungodly amounts of money at marketing it and making it what it is today. Invented as an "embedded systems" language, pushed as a browser "applet" language, moved to a "server app" language, and settled as an "enterprise app" language.

[1] https://www.theregister.com/2003/06/09/sun_preps_500m_java_b...


I think the fact that it eventually ended up as a server app language took everybody by surprise - Sun never saw it as anything like that at inception I'm sure.

Adding JDBC and later nio is probably what got it there the most.


Yes. It was originally meant to be used for interactive television. It then transitioned to the web and to cross platform GUI's. It then transitioned to servers.


And rewriting Distributed Objects Anywhere from OpenSTEP/Objective-C into Java EE.


Java was a reaction to C++. Go was too. C++ feels like it was created by the smartest person in the room without regard for the other 80% that would use it on a day-to-day basis.

I guess that's why I like simpler languages more, where I don't have to think about how the language or compiler is going to treat my code.


Rust was also a reaction to C++, and is a move in the opposite direction from Go.

Go is OK. It's a little faster to get a first version working, but the result tends to be slower and more likely to crash on memory/concurrency issues than a rust version would be.

I use both; go is fine for spaghetti ~architecting~ devops-ing piles of microservices together. Rust is much better for the data path though.


How do you like Rust for say SaaS type services? JSON marshalling/unmarshalling, socket behavior, etc? HTTP request/response processing?

I spent the last two months coding in Go and pretty much did not enjoy it. Nil is not always equal to nil in third-party libraries through an interface -- and the compiler wouldn't warn about it.


Not OP, but if you have gripes about type safety, you're likely to enjoy rust. The compiler will absolutely let you know if you are not handling optional or result types. As a bonus, they are built using regular generic and enum constructs with a little bit of syntax sugar to make them more ergonomic.

serde is also world class for serializing/deserializing, and can generate implementations for you based on struct definitions independent of data format (json/bson/yaml/toml/etc).

The only sharp edge you may bump into for a web service is in your choice to go either sync or async, and if you go async you must then choose a runtime (usually tokio) as these are libraries and not integrated into the std lib beyond just the `async` keyword and a few traits.


My hope for Rust is that they take their async coloured functions back to the drawing board, because they are really not fun to deal with, and improve the FFI story with C. An easy interface with C and its weird memory rules is of utmost importance if we want to replace C with something a little more solid.

But yeah, serialization in Rust is a breeze compared to Go. Go's was probably one of the things I hated the most, along with the `if err != nil` boilerplate.


I've been working in the async space for a few years now and it may just be survivor bias, but while I think there are definitely some issues, I'm still largely happy with it. It's still evolving after all. If they can get the cancellation issues sorted and async in traits that would be a good place.

What were your issues with the C FFI? That usually gets praise from people.


> Not OP, but if you have gripes about type safety, you're likely to enjoy rust.

Not type safety so much as program correctness. Type safety is just one aspect of it. Golang is "type safe" but still panics on null pointer accesses.


It was a reaction to SunView failing in the marketplace. Sun wanted to give Gosling a new project to keep him around, so they had him write Oak, a language for set-top devices. That eventually became Java. Andreessen was so jealous of the marketing money Sun was throwing at it that he named his language JavaScript to get a free ride on it.


> This kind of claim is extremely common on HN but utterly foreign to someone who actually writes C++ for a living.

OP said:

>> I've been engineering video games in C++ for almost 20 years, working on AAA games

It seems like you're implying that every single business domain uses C++ the same way that yours does. Instead of insinuating that OP is a common HN commenter who doesn't write C++ for a living, which seems to be an incorrect assumption, maybe you should recognize that not every business domain uses the same subset of C++ as yours?

Your comment just reads as an arrogant opinion hiding behind a false sense of authority. It sounds like you're saying, "anyone who disagrees with my opinion must be some hobbyist programmer who doesn't know what a real programmer that does this for a living actually codes like". As if coding in a language for your job automatically implies that the code will be of higher quality than hobbyist code. It's gatekeepy and gross, imo.

I've worked professionally with disgusting C++11 code that was an amalgamation of code from some people who clearly knew what they were doing and a lot of code that seemed to be from people who had no clue what was really happening. They had no clue because of all the hidden complexities in modern C++ that Herb Sutter is trying to reduce.


The fact that these standards (?) or versions (?) of C++ are so wildly different as to summon these types of opinions is mind-blowing to me. I like languages like F#, Elixir, Erlang, etc. that have cores that basically never change and just add quality-of-life improvements, so I have never experienced this. C++ seems like such a minefield that C++ developers have little in common with how other C++ developers work, experienced or not.


C++ people have a large propensity to complain, though. We're more than a decade after type inference, one of the largest QoL improvements possible, was introduced, and some people still claim it was a bad thing and that all types should be written down in their entirety. Do you imagine this in your language's communities? I have had discussions about this with dozens of C++ programmers in real life.


> Do you imagine this in your language's communities?

Absolutely! C# introduced "var" back in 2008, and to this day there are people arguing that you shouldn't use it except when it's outright impossible to spell out the type.

From what I hear, it's the same story with "var" in Java 10.


FWIW as a professional C++ programmer since I graduated university 7 years ago I haven't really had that experience.

There's certainly a large learning gap, both for "classic" C++ and for the popular modern features. But once you get over that, I haven't found other people's C++ any harder to read than other people's C or JavaScript or Python.

No one's reaching for esoteric features like "..." or atomics unless they're some low-level library wizard (in which case all their friends are also low-level library wizards) or the situation actually calls for it.


Java was like that for so many years; it was fundamentally unchanged for decades, really. And now (for the last 5 years or so) it's changing rapidly; new code just doesn't look like old code ("var" type inference, lambdas rather than inner classes in many cases, multi-line strings with """, new switch statement syntax, etc.). I mean, I like it so far, but it's kind of weird, to be honest!


This comment is classic "not even wrong".

>This kind of claim is extremely common on HN but utterly foreign to someone who actually writes C++ for a living.

The above comment is equally applicable to their comment.


> This kind of claim is extremely common on HN but utterly foreign to someone who actually writes C++ for a living.

This should be a huge klaxon for the C++ community. Whether or not it's true, it's a massive problem for C++ one way or the other.


In database engines, where C++ is pervasively used, modern C++ is a vast improvement over legacy C++. It is much simpler and safer. Writing a current C++20 database kernel in the equivalent legacy C++ would require several times more lines of code, and that code would be much more difficult to maintain aside from the much greater volume. A database engine implementation makes very good use of modern C++ features, the idiomatic legacy C++ equivalents were often very ugly and very brittle. I've done both.

I have never worked on games -- maybe that domain has simpler internals -- but there are important categories of C++ software outside of games that unambiguously benefit immensely from the modern C++ features. In databases it is common to ride as close to the bleeding edge of modern C++ as is feasible because the utility of the new features is so high.


> Modern C++ is overcomplicated, and from what I can see, all the best software engineers I've met in my career write very simple C++98 code, maybe a bit of C++11, but that's it.

I'm sorry but this assertion does not pass the smell test.

Sticking with C++11 means you do not have std::make_unique, and any claim that these "best software engineers" not only fail to use smart pointers but also refuse to even consider instantiating a std::unique_ptr in a safe, standard way, for no reason at all, lacks any credibility.

> Modern C++ is for experienced C++ developers

It really isn't. "Modern" C++ is just the same old C++ with useful features that improve the developer experience (see aggregate initialization, for starters, nested namespace definitions, UTF-8 character literals, structured bindings, etc.) and that don't require developers to resort to in-house trickery passed around as tribal knowledge to implement basic features (move semantics, constexpr, etc.).

> It's just disappointing...

Speak for yourself. It's fantastic that people continue to improve the language and make everyone's life easier, instead of being stuck in the C++98 mud. Each and every major standard release since C++98 has brought huge productivity and safety improvements that everyone stands to benefit from. Even standardizing stuff from Boost and the like is a major step forward.

It's totally fine that you personally prefer not to benefit from any of the improvements that have sprung up in the past two decades, but don't presume for a minute that you represent anyone beyond yourself when making Luddite-like claims.


> Sticking with C++11 means you do not have std::make_unique, and any claim that these "best software engineers" not only fail to use smart pointers

Pretty much every non-trivial C++ engine I've seen has its own equivalents for memory management. Even a game engine development book I bought in ~2001 (meaning it was written before then) had a chapter dedicated to implementing smart pointers.


Which doesn't mean that they are any better than the provided standard implementations.

They are there because it makes sense to have them. Now you don't have to build and maintain your own.


> Which doesn't mean that they are any better than the provided standard implementations.

That standard implementation is a thin wrapper over malloc(). That standard malloc() is not necessarily good enough. The performance is not great, but worst of all, when you have many small objects, they are scattered all over the address space. Chasing many random pointers is expensive.

While it’s technically possible to use custom memory management with std::unique_ptr via its second template argument and a custom deleter, it complicates things. The code often becomes both simpler and faster when using custom smart pointers instead.

That’s not specific to video games; it applies to all performance-critical C++. These standard smart pointers are good for use cases with a small count of large, long-lived objects. For a large count of small things, the overhead is too large, and people usually do something else instead.
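A minimal sketch of the custom-deleter route being described (the pool calls are stand-ins for a real allocator):

    #include <cstdlib>
    #include <memory>
    #include <new>

    // The deleter becomes part of the pointer's type, which is the
    // "complicates things" part: PoolPtr is not a std::unique_ptr<int>.
    struct PoolDeleter {
        void operator()(int* p) const noexcept { std::free(p); } // pool_free(p)
    };

    using PoolPtr = std::unique_ptr<int, PoolDeleter>;

    PoolPtr make_pooled(int v) {
        void* raw = std::malloc(sizeof(int));  // pool_alloc(sizeof(int))
        if (!raw) throw std::bad_alloc{};
        return PoolPtr(new (raw) int(v));      // placement-new into the slot
    }

    int main() {
        PoolPtr p = make_pooled(7);
        return *p;
    }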


> That standard malloc() is not necessarily good enough.

The operative words are "good enough".

Let's not fool ourselves by claiming that all memory allocations take place in hot paths, and that all conceivable applications require being prematurely optimized to the extreme because they absolutely need to shave off that cycle from an allocation.

Meanwhile, people allocate memory in preparation for an HTTP request, or just before the application sits idle waiting for the user to click the button that closes the dialog box.

It makes absolutely zero sense to proselytize about premature optimization when no real world measurements are on the table.


Let’s not pretend all conceivable applications are, or should be, written in C++.

People mostly stopped using C++ to develop web servers which handle web requests, because they moved to Java, C#, PHP, Ruby, Python, etc. People mostly stopped using C++ to develop GUI apps which handle these buttons, because they moved to Java, C#, and now JavaScript/TypeScript.

What’s left for C++ is software (or sometimes individual DLLs consumed from other languages) which actually needs it to achieve the required performance, even though the language is unsafe, low-level, and relatively hard to use, which directly affects software development costs.


But at the same time, straightforward C++ code with no tricks is still orders of magnitude faster than Python or PHP, and usually faster than Java and C#. So you still don't need custom allocators etc to be "good enough" most of the time.


I agree about Python or PHP.

However, for Java or modern C#, in my experience the performance is often fairly close. When using either of them, very often one doesn’t need C++ for the result to be good enough.

Here’s an example, a video player library for Raspberry Pi 4: https://github.com/Const-me/Vrmac/tree/master/VrmacVideo As written on that page, just a few things are in C++ (GLES integration, audio decoders, and a couple of SIMD utility functions); the majority is in C#. Still, compared to the VLC player running on the same hardware, the code uses the same CPU time and less memory.


> However, for Java or modern C#, in my experience the performance is often fairly close.

Aren't you contradicting yourself? You started off complaining that malloc isn't good enough, but now it's suddenly OK to tolerate Java and C#'s performance drop compared to C++?

Which one is it?


> Let’s not pretend all conceivable applications are, or should be, written in C++.

This is a discussion on C++.

> People mostly stopped using C++ to develop web servers which handle web requests, because they moved to Java, C#, PHP, Ruby, Python, etc.

I'm not sure you understood what I said, or thought things through.

By the way, the top-performing web framework in the TechEmpower benchmark is a C++ framework which uses C++'s standard smart pointers.

https://github.com/drogonframework/drogon

Also, one of the most popular web frameworks for Python started off as an April Fools joke. I'm not sure what's your point.

Lastly, the main reason why C++ ceased to be the most popular choice in some domains is that for a very long time it was the most popular choice in those domains, and it still remains one of the most popular choices. Some of the reasons why C++ dropped in popularity include vendors deciding to roll their own alternatives while removing support for C++. Take for instance Microsoft, which was once responsible for making C++ the only tool in town for professional software development. Since it started pushing C# for all sorts of web applications, multi-platform applications, and even desktop applications, and also pushing the adoption of those technologies as a basic requirement to distribute apps in its app store, developers can only use the technologies that exist. But does that say anything about the merits of C++?


> I'm not sure what's your point.

Over time, it became less important for C++ to be a good general-purpose language. When performance of idiomatic C++ is good enough, using C++ is often a bad idea: it delivers comparable performance to C# or Java, but it’s more expensive to use. While technically C++ has desktop GUI frameworks, web frameworks and others, they aren’t hugely popular: due to development costs, people typically prefer higher level memory safe languages for that stuff.

For use cases like videogames, HPC and similar, C++ has very little competition, because that level of performance is borderline impossible to achieve in other languages. It’s for these use cases people care about costs of malloc, cache-friendly RAM access patterns, and other things which are less than ideal in idiomatic C++.


Ironically, Microsoft is the only OS vendor among the mainstream platforms that still ships a GUI SDK giving tier-1 treatment to C++, with WinUI, and even then the tooling is really clunky (back to VC++ 6.0 COM days).

On the Apple and Google side that ship has long sailed, with C++ used only on the lower OS levels, and as basis for MSL.

Naturally there are still the GUIs for game consoles (although the Xbox dashboard uses React Native, the previous generation was UWP, and the PS4 famously used WebGL), and special-purpose embedded devices.


For the longest time I've written my own smart pointers to manage the lifetime of pretty much anything that needs cleanup: database connections, query objects, file handles, threads, mutexes, and yes, raw pointers.

All of this code built just fine on pre-C++11 compilers, ran reliably, was performant, and was easy to maintain.

Rolling your own smart pointers is not something I'd discourage.
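For flavor, a sketch of the kind of generic RAII handle being described; it compiles in the spirit of plain C++98 (the names are made up):

    #include <cstdio>

    // One template covers connections, files, mutexes... anything with a
    // release function.
    template <typename T, void (*Release)(T*)>
    class Handle {
        T* p_;
        Handle(const Handle&);            // non-copyable, C++98 style
        Handle& operator=(const Handle&);
    public:
        explicit Handle(T* p) : p_(p) {}
        ~Handle() { if (p_) Release(p_); }
        T* get() const { return p_; }
    };

    void closeFile(std::FILE* f) { std::fclose(f); }

    int main() {
        Handle<std::FILE, &closeFile> log(std::fopen("log.txt", "w"));
        if (log.get()) std::fputs("hello\n", log.get());
    }   // file closed here, even on early exit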


It might be easy for you to maintain. Somebody who has to pick up that codebase later would have to spend time and effort figuring out all those custom smart pointers and their idiosyncrasies. If all they do in the end is the same as unique_ptr & shared_ptr, it's all wasted time.


> Somebody who has to pick up that codebase later would have to spend time and effort figuring out all those custom smart pointers and their idiosyncrasies.

If we're still on the topic of game engines, someone who cannot easily figure out a smart pointer implementation has no place working with the game engine's code in the first place.

Unreal Engine has its own implementation of that stuff, and it took me literally minutes to get to grips with it - same with the custom engines at the companies I worked at before. This has absolutely never been a real problem in practice.


To be fair, video games are different from a lot of other types of software, in the sense that performance is absolutely critical but (certain types of) bugs can often be excused.

One could see why it's more important for a game dev to have a language that forces your mental models to more closely match what the hardware is actually doing, rather than using abstractions that make it less likely that you'll introduce bugs.


I am not the person you're replying to, but I have 10+ years in the area.

You are voicing something that was true a very long time ago. In a world where you must be cross-play and cross-platform to cover enough players, there is no room for "what the hardware is actually doing". Only a few games, made by platform holders with exclusivity in mind, are still like that.


Games that use RAD game tools tend to ship a bunch of platform-specific SIMD code.


So do most games using any middleware. Unreal's math, string and core libraries use platform specific instructions, and do so in a c++ friendly way. That's not a justification for writing a bunch of hyper low level dangerous code to run a raycast!


Yep, middleware authors have more time to write SIMD code because they have fewer moving targets, like platform generations. Nobody needs to read that code anyway ^^


Agreed. If I want to write maximally performant code I'll write my own abstractions and use C or very little of C++.

If I want to write fairly performant code quickly I'll reach for C++ and use more of the toolbox.


I've been programming professionally in C++ for 15 years and can say I wholeheartedly disagree. I'm looking forward to whenever our engineering team adopts a newer compiler version in our legacy codebase (sadly we're stuck at C++17 for now).

Whenever I move from the old-style C++98 code units to newer, modern C++ ones I breathe a sigh of relief. It's beautiful, safe, and modern; it reads like poetry.

If I get a code review where someone manually calls new/delete I will fail that code review.

That said, I agree this proposed "new syntax" is ugly and unreadable and unnecessary.


Well put!

Much of the C++ code out there is still C++98, and a lot of the "so-called problems" had already been solved by using established code patterns (not GoF). "Modern C++" has added a lot of complexity but not much benefit; its fanboys always use hand-wavy phrases like "much nicer", "makes programming simple"(!), etc., which mean nothing. After the introduction of the STL and the explosion in the usage of "generic programming", the only thing the language needed was concurrency support for synchronous/asynchronous programming. Instead, the ISO committee went off the rails into "modernizing" the language to try to make it like Python/whatever. The result is that experienced C++ developers are very cautious about adopting "Modern C++" while the larger "know little/nothing" crowd runs around raucously making all sorts of unwarranted claims.

IMO, today the greatest enemy of the language is the ISO committee itself.


I am not sure you have used C++ that much.

Claiming that move semantics, structured bindings, lambdas, designated initializers, non-implicit this, the override marker for virtual functions, [[nodiscard]], coroutines, modules, smart pointers, safer bit-casting, non-ambiguous bytes (std::byte), scoped enums, delegated and inherited constructors, constexpr, consteval, and much, much more are not improvements for day-to-day work... well, that shows me that you must not have used it that much.


Nice list... but that doesn't mean you understand what you have listed. Your list reads like what a noob "Modern C++" programmer would focus on (keywords and features to get through an interview) rather than any long-term real-work experience (I have been programming in C++ since the early nineties).

Lots of good C++-based systems were written before any of the above existed. That proves that they are not "needs" but merely "wants". Only some items in the above list are worth adding to the language, while others are just noise (as an example, keywords like "override"/"default"/"nodiscard" just add syntactic noise rather than any actual benefit). The members of the ISO C++ committee simply pushed through their pet "wants" for brownie points.


When override was introduced, it immediately solved a few hidden bugs here, so it definitely has value. I'd say that when doing development with inheritance, it catches at least a bug or two a month at compile time. Same for nodiscard: I started using it recently and immediately found bugs. So for me their value is infinite.


As a counter-anecdote, I have not found any uses for the above and yet my code runs perfectly fine. That just reinforces my point that they are "wants" and not "needs".


Well, that is ok. It depends a lot on your codebase.

The fact is that you would have to check by hand, or by inspection, what the compiler can tell you, right? That costs time and effort, especially when refactoring.

Go refactor the signatures of 2 or 3 virtual functions with, let us say, a couple of parameters each, in two classes and 4 or 5 derived classes, one two levels deep (this is real code I am talking about), and do it with and without override.

Try to measure the time it takes you with and without the override keyword. You could use a refactoring tool, fair. But you do not have that available in every context.


>Go refactor the signatures of 2 or 3 virtual functions with, let us say, a couple of parameters each, in two classes and 4 or 5 derived classes, one two levels deep (this is real code I am talking about), and do it with and without override.

This is just trivial and does not really support your argument.

The way it was done before was to use the "virtual" prefix keyword for virtual functions across the entire class hierarchy. Merely a discipline which was enforced religiously via coding guidelines; it gave the programmer the needed cue while the compiler doesn't care. You then did a Search and Replace as needed.


How can you claim you do C++ and not even know that virtual did NOT check whether it was an override of what you wanted or whether you were introducing an entirely new function? Your understanding of C++ is quite poor. You have a wrong mental model even of how virtual functions worked. There was no way for the compiler to say whether you were overriding or not, hence no way to emit an error! No religions here, just objectively better features. I bet you did not refactor hierarchies pre- and post-C++11, otherwise you would not say that. It is not that trivial, and marking overrides catches every intended override that is not actually one. Before, that was not possible.

It seems suspicious to me that you claim to have used C++ for so long, yet you do not understand even how virtual functions worked in the language.


You are making inflammatory statements without understanding what has been written; it is not appreciated.

I said:

>to use the "virtual" prefix keyword for virtual functions across the entire class hierarchy.

What is meant is that the entire signature of the virtual function including the "virtual" keyword is reproduced in all derived classes thus ensuring that no mistakes are made.

I also said:

>it gave the programmer the needed cue while the compiler doesn't care

It was just good programming discipline rather than depending on compiler crutches.

Just to drive it home; here is an example: https://stackoverflow.com/questions/4895294/c-virtual-keywor...


> The way it was done before was to use the "virtual" prefix keyword for virtual functions across the entire class hierarchy. Merely a discipline which was enforced religiously via coding guidelines

No, that did not enforce safety or solve the refactoring problem I described. It seems you do not want to hear that override fulfills the case where you assert that you are overriding (and it is a compile error if you are not). You have a misunderstanding and a wrong mental model of how virtual worked. It did not enforce anything; it just declared a virtual function, whether new or an override. virtual can introduce a new virtual function by accident.

Example:

    class MyClass {
    public:
       virtual void f(int) {}
    };

    class MyDerived : public MyClass {
    public:
       virtual void f(int) {}
    };
Refactor:

    class MyClass {
    public:
       // NOTE: signature changed to double
       virtual void f(double) = 0;
    };

    class MyDerived : public MyClass {
    public:
       // FORGOT TO REFACTOR!!! STILL COMPILES!!!
       virtual void f(int) {}

       //void f(int) override {} // COMPILE TIME ERROR!
    };
> You are making inflammatory statements without understanding what has been written; it is not appreciated

No, I was not. I was just showing, with facts, that you do not know how the mechanism works.

Above you have an example of why what you say does not work. I would recommend talking more concretely and not making overly broad statements about tools you do not seem to know in detail, but it is up to you; I would not stop you. Just a friendly recommendation ;)


Have you really not understood what was written, or are you just arguing for the sake of it? I cannot spell out every step of such trivialities.

BTW, anybody who has been following this chain of responses can see who is the one using inflammatory language to hide their lack of comprehension and knowledge.

For the last time:

1) We did not depend on compiler crutches to help us.

2) We enforced coding discipline religiously so that all virtual functions are unambiguously identified with full signatures across the entire class hierarchy.

3) When you need to refactor, you do a simple search/replace on the full signature using grep/ctags/cscope/whatever across the entire codebase.

That is all there is to it.

This might be the most valueless discussion I have had on HN, and that too over an utter triviality... sigh.


> 2) We enforced coding discipline religiously so that all virtual functions are unambiguously identified with full signatures across the entire class hierarchy.

> 3) When you need to refactor, you do a simple search/replace on the full signature using grep/ctags/cscope/whatever across the entire codebase.

That can still break in a ton of ways. For example, if the grep pattern is not correct. And ctags/cscope do not fully understand C++ either, AFAIK. Then you are shifting from debugging your code to debugging your greps, etc.

Not sure how you did it exactly, but I see it as a fragile practice, because it relies on absolute human discipline.

> This might be the most valueless discussion I have had on HN, and that too over an utter triviality... sigh.

Sorry for that. Feel free not to reply. But do not take it to emotional terrain so fast. I have just argued since the start that C++ improvements are not an accumulation of "pet features" for the sake of it. It is you who presented that as facts without any kind of evidence in the first place.

To be clear, I do not care about C++ that much or anything particularly. But I have used it enough to identify one of your top posts as emotional and non-factual.

Greetings.


Your comment comes across as insincere. It reads like you have realized the triviality involved but are not admitting it, and are still arguing that somehow programming discipline is not enough in this case (do we really need to talk about how to specify grep patterns?).

I should probably have said something like "override is just a compiler crutch for lazy programmers who can't be bothered to be careful in their job", but I thought it might get me banned from HN :-)

>It is you who presented that as facts without any kind of evidence in the first place.

My comments in this thread alone refute your above statement.

>But I have used it enough to identify one of your top posts as emotional and non-factual.

I don't think you have understood the motivations behind the features you claim to have used, how the same problems were solved earlier without those features, or how to properly judge whether the change was worth it.


> Lots of good C++ based systems have been written before any of the above existed. That proves that they are not "needs" but merely "wants".

By the same argument, plenty of good code was written in C before C++ existed, so the entirety of C++ is "wants" rather than "needs".


Yes, in a very strict sense it is true for many experienced programmers and their domains of expertise. So when an expert embedded-systems guy tells me that he will not touch C++ for his project, I understand, and only ask him to look at C++ as a "better C" and nothing more. Not all features are equal, and some are obviously more beneficial than others, e.g. simple user-defined types (i.e. class value types) vs. class hierarchies.

A good example is Linus Torvalds' opposition to the use of C++ in the kernel.


I do understand what I listed, because I have basically been doing C++ for a living for 13 years and started 20 years back at uni.

Of course good systems have been designed, but try to use the STL without lambdas. Or return big values by pointer because you don't have value semantics, with all the associated usability problems. Try to write SFINAE vs concepts, or write a std::optional type without C++23 explicit self. Or go and write boilerplate because you did not have delegating constructors. Or try to build a table inside C++ at compile time pre-constexpr. Do generic programming without if constexpr. I have done all of that myself before C++11 and it was way more difficult to write a lot of code. Of course, under your view anything is "nice to have". Then just grab assembly. But for the people like me who use it, I'd rather see it evolve with useful features.
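To make the SFINAE-vs-concepts point concrete, here is the same constraint written both ways; twice_old and twice_new are made-up names and this is only a sketch:

    #include <concepts>
    #include <type_traits>

    // Pre-C++20: the constraint is buried in enable_if machinery
    template <typename T,
              typename std::enable_if<std::is_integral<T>::value, int>::type = 0>
    T twice_old(T x) { return x + x; }

    // C++20: the same constraint, stated directly as a concept
    template <std::integral T>
    T twice_new(T x) { return x + x; }

    int main() {
        return twice_old(21) == twice_new(21) ? 0 : 1;
    }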

Those keywords do NOT add noise. They add safety, since defaults cannot be changed. C++ is as good as it can get within its constraints. You can think it is bad, but C++ is an industrial, real-world language with compatibility as a feature.

I can accept that you might disagree with some of the decisions, but mostly it fulfills its purpose well. Herb is just trying to create a C++ 2.0 that is simpler and 100% compatible. It is good, very good, not to start from scratch if you are in industrial environments. You just do not throw away 40 years of code that has stood the test of time pretty well, whether it was written in C or C++.


I am not sure that you have understood my comment, hence let me explain.

I am a longtime programmer firmly in the C++ camp (as an example, see one of my earlier comments here: https://news.ycombinator.com/item?id=27854560). I am also not against the evolution of the language. But what I (and many other C++ programmers) are against is the messianic zeal of the "Modern C++" proponents hell-bent on changing the language to be more "user-friendly" like Python/whatever, in the mistaken belief that "C++ programming is now made simpler". By adding more and more features, the interaction between them is now more complicated than ever (i.e. when and how do you use them correctly and efficiently? How do you combine them into a clean design?), making the programmer's job that much more difficult (as an aside, this is also the reason beginning/new-to-C++ programmers give up on the language).

The above problem is compounded because C++ is already a multi-paradigm and multi-usage language, i.e. situated in a coordinate plane with the following two axes.

The paradigm axis:

- a) Procedural Programming, i.e. "better C".
- b) Object-Oriented Programming.
- c) Generic Programming.
- d) Compile-time Programming.

The Usage axis:

- a) Low-level Interface Programming, e.g. MCU programming.
- b) Library Programming.
- c) Application Programming.

Every feature ideally sits at an intersection of the above two axes. Thus, for example, the "auto" keyword from C++11 is best suited to the "Generic Programming" paradigm and "Library implementation" usage (eg. in the implementation/use of the STL). But what is happening is that the "Modern C++" proponents are using it willy-nilly everywhere (because "hey I never declared any stinking types in <scripting language>"), making the code that much harder to comprehend. Similar arguments can be raised against a lot of other features. This is the reason many of us are cautious w.r.t. the new features; we know the existing potholes, have worked around them and have our system under control and in hand. The ramifications of introducing new features just for their own sake are unknown, and that makes us very nervous.


I do understand it, but you put yourself in a false dichotomy at the same time. You say you have to master every corner just because features are added. This is not true most of the time.

Of course you layer features on top of existing things. It is the only way to "simplify" the language and keep it compatible. For example, you still have raw pointers: no one recommends managing raw memory anymore; the recommended thing is to return a smart pointer or do RAII inside your class (the latter even pre-C++11).
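A minimal sketch of that layering, using nothing beyond the standard library (Widget is a made-up type):

    #include <memory>

    struct Widget { int value = 0; };

    // Legacy style: manual lifetime management, leak-prone on early returns
    void old_style() {
        Widget* w = new Widget;
        w->value = 1;
        // any early return or exception between new and delete leaks w
        delete w;
    }

    // Recommended layering: ownership lives in the type, cleanup is automatic
    void new_style() {
        auto w = std::make_unique<Widget>();
        w->value = 1;
    }   // w is released here, on every path out of the function

    int main() { old_style(); new_style(); }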

How about virtual functions? Oh, now you have to know more (and this actually does not really hold true in every context, sometimes it is easier): of course you might need to know more! Recommendation is to mark with override! How about template metaprogramming with SFINAE! No! Hell, NO if you can avoid it! That is why constexpr, if constexpr and concepts exist!

It is not more difficult to program in the last standards, it is way easier to program. What is more difficult is that you need to know more. But you do not need to know absolutely every aspect of the language. They layered features that are more general so that you stop using the legacy ones, and that is an improvement inside the constraints of what C++ is allowed to do right now to keep it useful (compatibility).

If what you want is to learn to program in permutations of all styles, including the old ones, in C++ and in all kind of devices... I do not know people that are experts at all domains and levels of the stack, no matter the language you use. So again, you can use your subset.

> But what is happening is that the "Modern C++" proponents are using it willy-nilly everywhere (because "hey I never declared any stinking types in <scripting language>"). Similar arguments can be raised against a lot of other features.

You basically said nothing concrete above.

BTW, no one prevents you from using many of the inferior styles. And by inferior I do not mean non-modern; I do not buy "modern" for the sake of buying it. I use the features that make my life easier, and the same feature or library facility that makes my life easier in one context is the one I discard in another (for example, dynamic memory in embedded, or fancy ranges in videogames where I might need to debug and understand down to the vectorization of a loop if I do not have the parallel algorithms).

You can still, as I told you in the comment above, ignore override, for example, and try the refactoring exercise I described, making your life more difficult along the way.

Or ignore structured bindings in for loops and declare the variables for pairs yourself. Or you can do some fancy metaprogramming with SFINAE and spend 3 hours figuring out what is going on, or why you cannot use it in the parameter type of an overloaded operator and have to use it in the return type instead, or guessing which overload you are selecting; because if there are two difficult things in C++, those are initialization and overloading.

In the meantime, some of us will be using concepts to constrain the overload set or, even better, presenting as the interface the weakest concept that fulfills a function, and using if constexpr inside to choose the optimized paths for each type as needed.

Those are improvements, BIG improvements, in how I write everyday C++ code. The language is big because there is no choice. But good training and taste for what to use and what not to use in your area of work are necessary.
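A sketch of that last pattern - weakest concept as the interface, if constexpr for the fast path. count_steps here is a made-up stand-in for what std::distance already does internally:

    #include <concepts>
    #include <cstddef>
    #include <iterator>
    #include <list>
    #include <vector>

    // The caller-facing constraint is the weakest sufficient concept...
    template <std::input_iterator It>
    std::size_t count_steps(It first, It last) {
        // ...and the optimized path is chosen internally via if constexpr
        if constexpr (std::random_access_iterator<It>) {
            return static_cast<std::size_t>(last - first);   // O(1)
        } else {
            std::size_t n = 0;
            for (; first != last; ++first) ++n;              // O(n) fallback
            return n;
        }
    }

    int main() {
        std::vector<int> v{1, 2, 3};
        std::list<int> l{1, 2, 3};
        return count_steps(v.begin(), v.end()) == count_steps(l.begin(), l.end()) ? 0 : 1;
    }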


My other response is also applicable here: https://news.ycombinator.com/item?id=32894154


> "modernizing" the language to try to make it like Python/whatever.

I've been writing C++ code professionally for almost 20 years now, and Python for 10 years. I tried to think of any C++ features post C++98 that would make it "like Python", but I can't think of any. Can you give some specific examples?


We will not go down this path since everything is highly debatable but here is an article: https://preshing.com/20141202/cpp-has-become-more-pythonic/

It even states: "C++ added tuples to the standard library in C++11. The proposal even mentions Python as an inspiration."


You say a lot of words, but without substance. Which “so-called problems”? Which “established code patterns”? Why “it means nothing” and to whom?

I can continue.


This is the old trope of asking for proof from one side in a forum where things are discussed freely. I could turn it around and ask you to justify all the new "features" added in, say, C++11..20. I am here for the rest of time.

But to give a few concrete examples;

>Which “so-called problems”?

Apparently "variants" were introduced to solve problems with POD "unions". Mere complexity for not much benefit. Nobody I have worked with ever said "unions" were difficult to use.

>Which “established code patterns”?

RAII as a solution to people harping about memory problems. I have written entire protocol message libraries just using RAII to avoid memory leaks.

>Why “it means nothing” and to whom?

When "Modern C++" proponents say everything should be written with the new features using hand-wavy phrases, "it means nothing" to experienced programmers. If we find a feature useful to model a concept and/or express something we will use it; but always weighed against how complex it is to read and maintain. A good example is template meta-programming taken too far.

The key reason C++ took off was because it added minimal "Zero-Overhead abstraction" constructs over the baseline "low-level and simple" C core. Suddenly programmers could have their cake and eat it too. The evolution of C++ should have continued in the same spirit but instead in a misguided effort to compete with later and differently designed languages a lot of complexity has been added for not much apparent benefit.


Variants are meant to introduce a tagged-union-like data structure into C++.

You can do this without variants by manually defining your own unions and managing the tag yourself, but this is not at all type safe and requires extra boilerplate code to manage!

Maybe you have never used and don't care about this feature but it's actually pretty useful! Tagged unions make representing certain types of data very elegant, for example nodes of an AST or objects in a dynamic programming language. You can use OOP patterns and dynamic dispatch to represent these things as well, but I think tagged unions are a better fit, and you get to eliminate a virtual method call by using std::visit or switching on the tag.

I suspect that maybe you have never been introduced to sum types in general which is not uncommon! I am curious if you have experience with using them or not?

https://en.wikipedia.org/wiki/Tagged_union
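A minimal sketch of the ergonomics in question; Value is a made-up toy type:

    #include <iostream>
    #include <string>
    #include <variant>

    // A toy sum type: a value is either a number or a string
    using Value = std::variant<double, std::string>;

    void print(const Value& v) {
        // std::visit dispatches on the active alternative - no manual tag checks
        std::visit([](const auto& x) { std::cout << x << '\n'; }, v);
    }

    int main() {
        print(Value{3.14});
        print(Value{std::string{"hello"}});
    }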


Variants/sum types are not some earth-shattering concept; they have been implemented using tagged unions in C from the beginning of time. At one point in my career I had occasion to write a "spreadsheet-like data structure" (basically a linked list of linked lists) in three different ways, one of which used a tagged union for the data nodes. The point is that people trivially rolled their own when needed and did not clamour for language/library additions.

I have already pointed out in some of my other replies why it is wrong to consider a variant as a replacement for POD union.


I do realize people have been using tagged unions for a long time. Having some library helpers goes a long way in making them more usable and more expressive. Having them built into the language as a first class feature is even better yet, but std::variant is a nice middle ground.

Technically you can implement it manually, but the same thing could be said about all language features, even in C. We don't actually need structs built into the language; we can just allocate blocks of data and handle the offsets by hand. We don't need functions; we can just push stuff to the stack and use the call opcode. The same goes for various loop constructs: goto can do everything "for" and "while" can do!

I don't think "we used to roll our own X back then" is a strong argument for something being bad or unneeded. Abstractions are all about allowing the computer to do work for us, and making things less error prone and more expressive. This is why we have programming languages to begin with and don't write everything in assembly!


What you are expressing is a Sentiment and not an Argument. First see my other relevant comment here: https://news.ycombinator.com/item?id=32893171

Language design is a fine balance; the addition of Abstraction Features has to be balanced against the cognitive load it imposes on the Programmer. These abstractions also need to be tailored to a computation model supported by the language. For example, the Niklaus Wirth school of language design was famous for insisting on minimalist languages; You only added features if they were "needed" to support the model of computation and all extraneous/trivial features were omitted. C++ took the opposite route from the beginning which was ok to a certain extent since it increased the expressive power of the language (eg. multi-paradigm). But over time the balance is being lost and the cost of cognitive load is outstripping the benefit of an added abstraction. This is bad and not welcome. So looked at in this light how many of the features in Modern C++ are essential and how many are extraneous (eg. Concurrency features = Essential, variants/syntactic annotations/etc. = Extraneous)? That is the fundamental issue.


> Nobody I have worked with ever said "unions" were difficult to use.

I wouldn't trust anyone to write the correct boilerplate for

    union { std::string s; int v; }; 
unless they'd do just this all day long
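To make the comparison concrete, here is a hand-rolled version next to the one-liner. StrOrInt is a made-up name, and copy/move support is deliberately omitted, which is exactly the kind of thing that gets forgotten:

    #include <new>
    #include <string>
    #include <utility>
    #include <variant>

    // By hand: construction and destruction of the non-POD member
    // must be managed manually, forever, on every change
    struct StrOrInt {
        enum class Tag { Str, Int } tag;
        union {
            std::string s;
            int v;
        };
        explicit StrOrInt(int x) : tag(Tag::Int), v(x) {}
        explicit StrOrInt(std::string x) : tag(Tag::Str) {
            new (&s) std::string(std::move(x));       // placement-new the string
        }
        ~StrOrInt() {
            if (tag == Tag::Str) s.~basic_string();   // destroy only if active
        }
        StrOrInt(const StrOrInt&) = delete;           // copy/move omitted for brevity
        StrOrInt& operator=(const StrOrInt&) = delete;
    };

    // With the library: all of the above, generated correctly
    using StrOrInt2 = std::variant<std::string, int>;

    int main() {
        StrOrInt a(42);
        StrOrInt b(std::string{"hello"});
        StrOrInt2 c = std::string{"hello"};
        return (a.v == 42 && c.index() == 0) ? 0 : 1;
    }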


std::string is not a POD.


... yes? One still needs to put it in sum types sometimes. This is what std::variant solves.


But that is the point; "union" is supposed to be a POD, and if not, all language guarantees are off.


> But that is the point; "union" is supposed to be a POD

No, it's not. It's fine to have non-POD unions - but you have to be careful to call constructors and destructors of non-PODs explicitly. Thus variant, which automates that.

Also, I didn't talk specifically about unions, but about sum types. There is a need for saying that an object X can be of type A OR type B, no matter the properties of these types.


>it's fine to have non-POD unions

Only from C++11 (or is it later?). So a problem was created by relaxing the existing requirements for a "union", to which a solution was then proposed by adding a "variant"? Something which had no runtime overhead (but UB) now has runtime overhead.

Regarding your point about "sum types", agreed.


You mistakenly assume I am pro new "features".

I'm just pointing out that you didn't have anything concrete in your messages.

> Apparently "variants" were introduced to solve problems with POD "unions". Mere complexity for not much benefit. Nobody I have worked with ever said "unions" were difficult to use.

https://stackoverflow.com/a/42082456

First link on Google: how is moving away from undefined behavior, and not needing to manually declare the type of the union (sic!), "mere complexity for not much benefit"?

> Nobody I have worked with ever said "unions" were difficult to use.

Sure, enough people say that about for and while loops while avoiding functional patterns like the plague.

> RAII as a solution to People harping about memory problems. I have written entire protocol message libraries just using RAII to avoid memory leaks.

Which "modern, overhead features" are solved by RAII?

> When "Modern C++" proponents say everything should be written with the new features using hand-wavy phrases, "it means nothing" to experienced programmers. If we find a feature useful to model a concept and/or express something we will use it; but always weighed against how complex it is to read and maintain. A good example is template meta-programming taken too far.

Who are these experienced programmers and how do you define those? Your circle of people?

> If we find a feature useful to model a concept and/or express something we will use it; but always weighed against how complex it is to read and maintain.

Modern features are literally easier and less complex than old way. Like in the variant example.

> A good example is template meta-programming taken too far.

And how often do "modern C++ proponents" suggest metaprogramming to solve ordinary problems? Because everywhere I've encountered C++ discussions, it was either reserved for internal usage or suggested to be avoided altogether.

> The key reason C++ took off was because it added minimal "Zero-Overhead abstraction" constructs over the baseline "low-level and simple" C core.

There's no key reason C++ took off. It took off, because it took off.

> The evolution of C++ should have continued in the same spirit but instead in a misguided effort to compete with later and differently designed languages a lot of complexity has been added for not much apparent benefit.

There's a reason why those differently designed languages were defined differently. Maybe instead of blindly hating evolution, try understanding the reasons behind it.


And you have mistakenly assumed that I am "blindly hating evolution". The distinction I make is between "needs" and "wants"; much of what has been added in Modern C++ is "wants".

>First link on Google: how is moving away from undefined behavior, and not needing to manually declare the type of the union (sic!), "mere complexity for not much benefit"?

Because you have not understood the definition and guarantees of a "union"; and UB is not always a bad thing. "union" is explicitly defined to be a POD, with all that entails. The stackoverflow answer does not provide anything new. If you want something more, you code it explicitly when needed. No need to burden the language; in fact it creates more problems, because "variant" does not guarantee layout compatibility.

>Sure, enough people say that about for and while loops while avoiding functional patterns like the plague.

Of course familiarity and clarity always trump "new patterns".

>Which "modern, overhead features" are solved by RAII?

The harping on "never use naked pointers" in your code.

>Who are these experienced programmers and how do you define those? Your circle of people?

Of course; it should be the same for you and everybody else too!

>Modern features are literally easier and less complex than old way. Like in the variant example.

This is what we are debating; it is not a fact that you seem to assume.

>There's no key reason C++ took off. It took off, because it took off.

There is always a tipping point. In C++'s case it was compatibility with C and new language constructs for higher level abstractions.

>There's a reason why those differently designed languages were defined differently.

Exactly; each starts with a computation model and evolves a syntax for that model. The evolution should not be willy-nilly dumping everything and the kitchen sink into a language. C++98 was complicated enough but still manageable; what has happened from C++11 onwards is just too much complexity, requiring even more effort from experienced programmers. You cannot hand-wave it away by saying "Modern C++ is a whole new language so forget the baseline C/C++ cores", which is quite silly.


>They don't need a move constructor because they've been already clever enough to solve a prolbem from a different angle.

How would you implement something like unique_ptr without move semantics?

Can you give an example of angles being used to avoid needing move semantics?
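As a reference point for the first question, a stripped-down sketch of why unique_ptr-style ownership leans on a move constructor; OwnPtr is a made-up toy, not std::unique_ptr:

    #include <utility>

    // Exactly one owner at a time; copying is forbidden
    template <typename T>
    class OwnPtr {
        T* p_ = nullptr;
    public:
        explicit OwnPtr(T* p) : p_(p) {}
        ~OwnPtr() { delete p_; }

        OwnPtr(const OwnPtr&) = delete;               // copying would double-free
        OwnPtr& operator=(const OwnPtr&) = delete;

        OwnPtr(OwnPtr&& other) noexcept               // the move constructor IS the
            : p_(std::exchange(other.p_, nullptr)) {} // ownership transfer

        T* get() const { return p_; }
    };

    int main() {
        OwnPtr<int> a(new int(42));
        OwnPtr<int> b = std::move(a);   // a relinquishes, b now owns
        return *b.get() == 42 ? 0 : 1;
    }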


Exceptional engineers I've come across universally don't use smart/unique/whatever_ptr. A common theme is that they employ allocation strategies that avoid the need for "micro" memory management in favor of "macro" management strategies. Allocating large blocks of memory at a time and freeing it all at once is one specific example.


Having worked on a few decently large C++ codebases, I can confidently say that this is not how I view things. Shared pointers might be pretty drastically overused by some programmers but have use cases, and I think unique pointers are pretty invaluable.

I can't imagine writing a long running, memory conscious, and fast C++ program that uses whatever 'macro' management strategy you envision.


This is pretty simple actually! At startup, create some object pools, arenas, or bump allocators. Then, never heap allocate anything ever.

We also happen to do 0 other steady-state syscalls (not just 0 mmap, munmap, mremap, etc.), so we can just run the program under valgrind, and any time valgrind prints something, that is a bug.

We didn't use C++, but some other software that I've heard uses a similar approach is written in C++.
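A minimal sketch of that startup-pool style; Pool and Message are made-up names, and error handling is reduced to returning nullptr:

    #include <array>
    #include <cstddef>

    // A fixed-capacity pool created up front; no heap traffic afterwards
    template <typename T, std::size_t N>
    class Pool {
        std::array<T, N> slots_{};
        std::size_t next_ = 0;
    public:
        T* acquire() { return next_ < N ? &slots_[next_++] : nullptr; }
        void release_all() { next_ = 0; }   // "free it all at once"
    };

    struct Message { int id = 0; };

    int main() {
        static Pool<Message, 1024> pool;    // static storage, never the heap
        if (Message* m = pool.acquire()) m->id = 1;
        pool.release_all();                 // bulk release, no per-object frees
    }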


Arena allocation and smart pointers tackle fairly different problems. Not sure why you're conflating them? I'm using both of them extensively on a fairly large code base (>10M) on a daily basis. Without smart pointers, I'm confident that engineers would need to spend 2x more time figuring out the actual ownership of pointers.


In principle, with arena allocation you don't need to care about ownership at all, or at least you only care about it at the arena boundary.


If you're working on a nice code base that has very clean boundaries across teams and infrastructures, where the arena boundaries follow that principle, and the system has relatively straightforward lifetime and ownership, then yes. Obviously this is not true for so-called "large scale software systems". Have you tried to bounce objects across arenas thanks to all the teams that wanted to "optimize" and "simplify" their memory allocation? Good luck debugging that.


Why should you track ownership of the single objects if you're not going to free them one by one?


You can also allocate an arena of memory, and hand out unique_ptr<std::span> of CHUNK_SIZE slices of it

So I also don't see how these are related


The point of arenas is usually to not care about ownership for object graphs that are allocated in the same arena, and to avoid immediate destruction for said objects.

The typical pattern is to create an arena whose lifetime is the lifetime of some important operation in your program (say, serving a single request or processing a single frame), do all of the logic of that operation via simple allocation in the arena (ideally bump-pointer allocation), don't free/delete anything at all, then when the operation is finished, just free the arena itself (or don't, and reuse it for the next operation by just resetting the pointer).

This implies, or at least allows, a very different style of writing C++ than usual - no need for tracking ownership, so no RAII, so no constructors or destructors, so no smart pointers.

Of course, you can also write this kind of code with RAII and smart pointers (with nop destructors for non-resource objects), using placement new to construct objects inside the arena and so on. But it's not necessary, and some find it easier to avoid it.
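A bump-pointer sketch of the pattern described above. Arena is a made-up toy; it assumes power-of-two alignment and a buffer sized generously enough that allocation never fails:

    #include <cstddef>
    #include <new>
    #include <vector>

    // Allocation is a pointer increment; "freeing" is resetting the pointer
    class Arena {
        std::vector<std::byte> buf_;
        std::size_t used_ = 0;
    public:
        explicit Arena(std::size_t bytes) : buf_(bytes) {}
        void* alloc(std::size_t n, std::size_t align = alignof(std::max_align_t)) {
            std::size_t base = (used_ + align - 1) & ~(align - 1);
            if (base + n > buf_.size()) return nullptr;   // sketch: no growth
            used_ = base + n;
            return buf_.data() + base;
        }
        void reset() { used_ = 0; }   // end of frame: everything reclaimed at once
    };

    struct Particle { float x = 0, y = 0; };

    int main() {
        Arena frame_arena(1 << 20);
        for (int frame = 0; frame < 3; ++frame) {
            auto* p = new (frame_arena.alloc(sizeof(Particle))) Particle{};
            p->x = 1.0f;              // use freely during the frame...
            frame_arena.reset();      // ...then reclaim in O(1); no destructors run
        }
    }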


I think arenas are more of an alternative to shared_ptr than to unique_ptr.

A std::string is an example we can use; it acts fairly similarly to unique_ptr. Sometimes you just need to heap allocate something, work with it and/or store it as a member, maybe pass it to a function or whatever, and then have it cleaned up at the end of the scope or when the object that it's a member of is destroyed. I don't think an arena can replace this use case.

If we need something with multiple handles, that doesn't have a trivial and predictable lifetime (such as objects / variables in a programming language interpreter, or entities in a video game), you can reach for shared_ptr or an arena. In the specific example I gave I would certainly prefer the arena, and would implement a garbage collector to deallocate stuff when it became unreachable.

The arena pattern is fairly common in Rust, because in Rust even a reference counted smart pointer doesn't let you bypass the "shared ^ mutable" enforcement from the borrow checker! Having everything owned by the arena greatly simplifies issues with the borrowck, and you can just allow callers to take temporary references and drop them when they are finished with them. There is a crate called "SlotMap" that provides a data structure to assist with the pattern (there is a C++ SlotMap library as well somewhere).

Anyways I have rambled a bit, but I think unique_ptr solves a different problem than what you describe, which instead is more of an alternative to reference counted pointers (shared_ptr).


Most typically when using arenas, you don't want to pay the cost of de-allocation at all while the arena is still alive. So, you actually have to "fight" unique_ptr, since you don't want it to call any kind of delete() or destructor, and you essentially get nothing in return.

If you use a bare pointer instead, not only do you get the same guarantees, but now if you need to modify the code to actually share that object, there's no need to modify any references from unique_ptr to shared_ptr.


Yeah! That's what I meant when I said arenas solve a different problem! I don't think I mentioned using unique_ptr with the arena and I didn't mean to suggest that if I did. My point was that arenas are not a replacement for unique_ptr at all, and instead solve a different problem where allocations don't have a simple and deterministic lifetime.

With an arena ideally you can just pass out actual references (T&) to callers instead of pointers!


I believe that there are certain situations where one may choose between scope-based memory management and arena-based. Even for the example of the rendering code of a frame, instead of allocating inside an arena, you could choose to allocate an object graph owned by some frame-level object, that is de-allocated once the frame is rendered.


As someone who got scolded by Jeff Dean once for using smart pointers, I can attest to this.


Thanks for the +1 Justine <3 You're my flippin hero


unique_ptr does not require move semantics to be added to the language.

Boost has a full implementation, boost::movelib::unique_ptr, that is identical to std::unique_ptr but works with C++98 [1].

https://www.boost.org/doc/libs/1_80_0/doc/html/boost/movelib...


According to the docs, this is actually emulating move semantics on compilers that don't support them.

I guess this means that you don't have to have it at the language level to support unique_ptr, but you do need some form of move semantics at least. I think having it built into the language as a proper feature is preferable personally.

https://www.boost.org/doc/libs/1_80_0/doc/html/move/what_is_...

>Rvalue references are a major C++0x feature, enabling move semantics for C++ values. However, we don't need C++0x compilers to take advantage of move semanatics. Boost.Move emulates C++0x move semantics in C++03 compilers and allows writing portable code that works optimally in C++03 and C++0x compilers.


C is overcomplicated and all really hardcore engineers that I've met either write assembler or carve binary machine code into stone blocks with their bare teeth at moonlight.


The problem in the videogame niche seems to be that the debuggability of the code suffered, as well as its speed, in debug mode.

There are lately some improvements to that, like making move/forward an "intrinsic" from the point of view of the debugger. The SIMD interactions also seemed to be a problem in certain loop vectorizations.

As for "you are clever because you do not use move constructors"... I can be clever and use assembly instead of C. But I am not sure it would be the right choice in all scenarios. If you have a tool and can take advantage of it, just make use of it.


That's an interesting perspective. I started writing C++ after C++11 came out and personally I can't even imagine writing C++ without auto, move semantics, lambdas, variadic templates, std::unique_ptr, std::atomic, etc.


I've been working on AAA games for a decade writing C++, and I couldn't disagree more.

An enormous amount of "smart" old school C++ is at best equivalent to a well written modern equivalent, and at worst more dangerous, error prone and slower. Anything that uses references for writable out parameters combined with a bool return value (or no return value) for success is a great example. On my last project, I rewrote a bunch of fundamental Unreal Engine path handling methods, replacing the old school

    bool GetXXX(TCHAR* In, TCHAR* Out);

with

    FStringView GetXXX(const FString& In);

In the process of doing so I found multiple engine and game bugs that in the best case were logic bugs and in the worst case were memory safety issues. I also left the original function call in place and made it wrap the new version, with no perf overhead for the callsites I didn't change and a 30% increase for the cases that I did fix. Almost all of that is because I stopped requiring null checks, I was able to lean on RVO and move semantics, and I was able to avoid unnecessary initialisations in many places in the code. Most of the sites I replaced were calling

    FString ThingStr;
    TCHAR* Thing = <...>
    if (GetXXX(FooStr.c_str(), Thing))
        ThingStr = FString(Thing);

And most were replaced with

    FStringView Thing = GetXXX(Foo);

> write very simple C++98 code, maybe a bit of C++11 but that's it. They keep it simple.

Assuming of course there's already a smart pointer library that you're using, modern C++ allows for simpler, more correct and faster code in many, many places. Some examples are auto:

    for( map<string, int>::iterator it = map.begin(); it != map.end(); ++it)
becomes

    for (auto& it : map)
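and with structured bindings (C++17) the pair's parts even get real names; a small self-contained example:

    #include <iostream>
    #include <map>
    #include <string>

    int main() {
        std::map<std::string, int> m{{"a", 1}, {"b", 2}};
        // structured bindings: no .first/.second at the use site
        for (const auto& [name, count] : m)
            std::cout << name << '=' << count << '\n';
    }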

nullptr is now a thing over NULL, move semantics make it feasible to write logical code without needing to jump between logic and type manipulation for basic operations, static assertions exist, enum class, attributes (nodiscard, noreturn, deprecated), constexpr and all the evolutions that came with it, structured bindings, concepts, if/for initializers...

All of the above are just off the top of my head features that I interact with daily, even if I don't write them daily, all of which make my code and project safer faster and easier to understand.

> This new C++ syntax is just a gimmick that we don't need.

I disagree. It has many useful properties. It gave us the spaceship operator, which massively simplifies the code needed to write correct objects. A single well-written spaceship operator can mean numerous algorithms just work on containers of your type, rather than needing custom filter/search operations. The syntax of left-to-right definitions for functions *seems* superfluous, but it drastically reduces the amount of work needed to be done by the compiler, which should translate into real-world compilation speedups. The import/include swaps could allow new code to interact with well-behaved old code in a safer manner. As someone else said on this thread, it's not always about one feature in isolation; it's about combining multiple features and the result being worthwhile.
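A minimal sketch of the spaceship point (Version is a made-up type): one defaulted operator<=> and the ordinary algorithms work on your type:

    #include <algorithm>
    #include <compare>
    #include <vector>

    struct Version {
        int major_v = 0, minor_v = 0, patch_v = 0;
        // One defaulted spaceship: ==, !=, <, <=, >, >= all fall out of it,
        // so sorting and searching containers of Version "just works"
        auto operator<=>(const Version&) const = default;
    };

    int main() {
        std::vector<Version> v{{1, 2, 0}, {1, 0, 9}};
        std::sort(v.begin(), v.end());
        return v.front() < v.back() ? 0 : 1;
    }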


And that is why Rust is bad. It solves problems for you, just like modern C++ tries to do, whereas these problems need to first be intimately understood by the engineer before they can be solved. Otherwise you will be faced with lots of unexpected surprises down the line.


I hope you don't forget to prove the whole theory of integration whenever you are doing an FFT.


This sounds very similar to how JetBrains tried to improve on Java but in the end decided to invent Kotlin: a new language with more concise syntax, better defaults, some new features, and great interop back and forth with Java.


> great interop back and forth with Java.

That's something all JVM languages share. Not that easy with compiled languages unfortunately.


No, Kotlin's goal was great interop from the start. Clojure and Scala have better interop with Java than, say, Rust with C++, but it's often painful to consume Java libs from them, and consuming Clojure or Scala libs from Java is not a thing in practice.


Great interop only when calling Java from Kotlin; good luck doing it the other way, especially with coroutines.


C++ needs a new frontend. Something close to C# / Kotlin.


Google is trying to do something like that with their new Carbon language.

https://github.com/carbon-language/carbon-lang


C++ doesn't really have a "backend" like the JVM does; its ABI is already too weak to be used across shared libraries. So there's not as much need to mix a new language with existing C++.

Worse, because C++ libraries are often gigantic header files, you have to support 100% of the language to use them.


It's still possible, and it has been done before. One of the most amazing examples is Clasp [0] - a Common Lisp implementation that is able to fully interop with C++, including template code and exceptions. And it's led by a chemistry PhD - I'm sure someone like Herb Sutter could run circles around him in programming expertise.

[0] https://www.cliki.net/Clasp


Looking forward to finding out what kind of feedback and discussion this generates. Some of the changes seem to make the syntax pointlessly different.

    /* Proposed "new" c++ syntax */
    main: () -> int = {
      cout << "hello world\n"; 
    }

    // vs.

    /* Classic C++ syntax */
    int main() {
        cout << "hello world\n";
    }
The "new" way looks a lot like a copy of the newer Java syntax for closures.


It’s actually close to the new C++ syntax for functions. Take a look at the output C++ in the regression-tests/test-results folder:

    auto main() -> int {
        cout << "hello world\n";
    }
The difference between the two is mostly the : and the =. The colon is for declarations being written:

    i : int = 0;
Instead of

    int i = 0;
main is the variable/function above and its type is ()->int and it’s being assigned, i.e. defined, hence the =. This syntax removes a ton of parsing complexity and makes the language more regular. Functions are defined and assigned similarly to variables.


It's been a bit since I've touched C++, but is there any reason to use trailing return types in that manner aside from solving the scope issue with decltype on templates? I assumed this was a way to write functions to deal with that case and not for "regular" functions.

On another note, anything with "->" is just painful to write; same thing with the ":_" (in his proposal) instead of "=" for function arguments. Just annoying.


:_ appears to mark it as a generic type parameter; the `hello` function in the example ought to be equivalent to this straight C++:

  template<typename T>
  void hello(T msg) {
    std::cout << "Hello " << msg << "\n";
  }

  hello: (msg: _) = {
    std::cout << "Hello " << msg << "\n";
  }
I kind of like this, actually, because so many people I've worked with are afraid of templates in C++, but they seem to have no trouble with essentially equivalent code using generics (in other statically typed languages) or dynamically typed languages where it "just works" (until it doesn't). Not that we'll be adopting this any time soon, but I suspect they'd be more amenable to this form when `template` seems to make them quake in their boots.


The syntax:

   void hello(auto msg){
      ...
   }
is legal today (since C++20 IIRC) and is equivalent to your first example.


Ah, actually missed that, good catch. It does seem to be a step forward with regard to dealing with the syntactic clunkiness of templates. Now if we could have keyword arguments in function calls without having to deal with some kludge...


I actually prefer trailing return type for the simple reason that they often get very verbose with templates, and then it can be difficult to find the function name in the declaration. With the -> syntax, the function name is always in the same spot visually.

As for -> being painful to write - that's obviously subjective, but more importantly, given that it's the operator you use to access members via pointer-like things (including smart pointers and iterators), it's already pervasive in C++.
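A small illustration of the "name in the same spot" point; tally_old and tally_new are made-up names:

    #include <string>
    #include <utility>
    #include <vector>

    // Leading return type: the name is buried after a long type
    std::vector<std::pair<std::string, int>> tally_old() { return {}; }

    // Trailing return type: function names line up at a fixed column
    auto tally_new() -> std::vector<std::pair<std::string, int>> { return {}; }

    int main() {
        return static_cast<int>(tally_old().size() + tally_new().size());
    }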


You have to use it for lambdas when you spell out their return type, so using it everywhere makes some code more consistent and readable.


Why does it need both the keywords "auto" and "int" when declaring main()? Just seems extra confusing..


This is cruft. Without auto, it wouldn't be parsed as a function declaration.


Allow me to hijack a little. I'm working on a new language.

Would you people object to ':' for assign and single '=' for equality ?

  int i:42
  if i=42 print "ok"


Maxima does something like this. It's weird to use. I think := for assignment makes more sense if you want = for equality (i.e. like Pascal etc.).


Golang uses : to assign in struct initialization and := for new variable assignments, so there is some prior art.

Practically speaking, I find : challenging to quickly scan out of source visually. In Golang this objection is muted slightly because struct initialization usually ends up being clearly indented by the formatter.


Ouch, is it just me that feels

int i = 0; // pretty

i : int = 0; // ugly

I will stick to C++03, thank you very much.


I have been using C and related languages since 1980. Switched to C++, later Java. Then about 7 years ago, I went all in on Scala, and have never looked back.

Something like "i: int" is superior (at least in Scala), because the type annotation is optional. You can drop it, and just say "val i = 42". I believe it simplifies the parser as well.

There is a reason many modern languages are adopting the style.


Most new languages use postfix types. I grew up using C and C++, but now that I've learned and use Python with types and TypeScript, I'm used to and prefer the postfix syntax.


Shouldn't it then be?

i: int 0 = ;


No, it should read grammatically as English.

int x = 0 "There is an integer x with value 0"

x: int = 0 "x is an integer with value 0"

x: int 0 = "x is an integer that is zero valued"

The last one is awkward, the first one is harder to read.


They said postfix types, not postfix syntax.


Now consider how this works out in C++:

   int i;     // i is a local int
   int i();   // i is a function declaration returning an int - not a variable!
   int i(0);  // i is a local int initialized to 0.
Yes, there's uniform initialization now. But this whole thing was a problem in the first place because of poorly designed declarator syntax that doesn't extend well and is difficult to parse unambiguously.

It can actually get worse, to the point where it's impossible to tell whether a given line of code is a declaration or not. E.g. "a<b>c" can be a declaration "a<b> c", or it can be an expression "((a < b) > c)", depending on what "a" is, exactly. This can be taken to 11 with templates:

   template<size_t N = sizeof(void*)> struct a {
       template<int> struct b {};
   };

   template<> struct a<sizeof(int)> {
       enum { b };
   };

   enum { c, d };

   a::b<c>d;
The meaning of this code now depends on whether sizeof(int)==sizeof(void*) for a given implementation on which it runs. Now, granted, this isn't the kind of thing you'd find in real world code outside of IOCCC and the likes; but it's legal, so the tooling has to be able to process it anyway. And by "tooling" I mean anything that has to parse C++ at some point - not just the compilers, but also debuggers, IDEs etc.


The ugly way is easier for the parser in some edge cases: https://en.m.wikipedia.org/wiki/Most_vexing_parse

Syntax "beauty" is just a matter of how used to it you are (excluding Brainfuck, of course). Admitting this is quite a big present one can give oneself.


That colon is going to be quite burdensome to type, due to the need to hold the Shift key on a line where you otherwise might not need to. It might sound silly but I do think little things like this can really hinder adoption without people necessarily realizing it.


No more burdensome than using capital letters (camel case), underscores (snake case), curly braces or quote marks.

Been using Scala for 7 years, and it has never even occurred to me this was a "burden".


Meaning no offense but if you aren't exaggerating you might want to invest some time in typing lessons. I'm not even remotely close to being an expert typist and I spend most of the day going back and forth between a C style language (C#) and modern style that uses the colon (Typescript). The difference needed to type a colon doesn't even register.


I write both C++ and Python all the time and I type enough underscores and caps all day just fine, thank you. The colon doesn't register for me when I'm writing those either. When it did register for me was precisely when I started designing my own language, attempted to design declarations with this exact syntax, played around with it, and realized this is a burden I'd never noticed before, and that this was one thing that made languages more pleasant to write in.

Think of it like font kerning. Some people notice it, some don't, but most people prefer good kerning even if they never think about it.


You must not have used Python or classes in C++, where the colon is ubiquitous?

Does `{` not require shift on your keyboard?


> You must not have used Python or classes in C++, where the colon is ubiquitous?

You misread my comment. I'm talking about cases where existing prevalent syntax for that construct does NOT already require holding Shift. Python is better in this respect since ":" requires fewer presses of Shift than "{" or "}". This is not the case when you go from "int x = 1;" to "x: int = 1;".

Moreover, the inconvenience of going from 1 Shift to 2 is not the same as the burden of going from 0 to 1. Especially because you can often combine the 2 into 1 without releasing the modifier key in between.

And furthermore, variable declarations are far more common than class or function declarations, so you can't weight them equally.


You could just change your keymap if you really need to. I used to remap / in my Spanish keyboards because normally it is shift+7.


In many non-US keyboard layouts typing the ";" requires holding shift as well, so not sure how bad this issue is in actuality.


Most non-US keyboards are terrible for programming in general. I only use my native Norwegian keyboard layout when I need to type a significant amount of text in Norwegian. Whenever I need to type some code, I switch back to US layout. I believe most other programmers do the same.


Yes, programming with anything other than the US layout is simply unbearable. I simply switched to the US layout for everything. Although to be fair my native language is Italian, which doesn't need many special characters.

Most programmers I know use native+US layouts (like you), a minority always uses US (like me), almost nobody uses a native layout to code.


I'm an English speaker, so I'm curious: Italian technically requires some accent marks, right? Do you have a quick way to do those from a US layout, or do you just ignore them in casual conversation the way people tend to ignore capitals and (some) punctuation?


Yeah, technically it requires diacritical marks, but unlike French they are only written explicitly when they fall on the final letter of a word. Italian only has seven vowels, so these are the only common ones: è é ì ó ò ù à.

What I would do is this:

- Replace them with a single quote -> E' instead of È or É and so on.

- Ignore all uncommon marks.

In most contexts, it is perfectly acceptable to do that. A native speaker would understand without giving it a second thought or considering it impolite. I write emails to my boss like that.

If I had to make something meant for the public (say a UI, or a document), then I'd switch to the Italian layout for that purpose only and then go back to the US one.


Not for Italian, but I use a US keyboard layout with the addition of German umlauts behind right alt + letters (as well as right alt + ' and right alt + ` being dead keys to put accents on letters for foreign words). I find this much better than the standard German layout, which puts all of @{[]}\~ behind right alt - and most of them on the right side of the keyboard too, so you have to awkwardly hit the right alt + combo with a single hand.


Tell this to people using Python. Every function, if, and loop has a colon at the end.


You misread my comment. I'm not saying "colons are bad", I'm saying "excessive holding of keys is bad".

if (cond) { } requires 4 presses of Shift (or 2, if you combine them).

if cond: foo requires 1 press of Shift.

I'm not sure what part of my comment makes you believe that I think the latter comes out worse than the former. If anything, the fewer keypresses probably strongly helped Python's success without people necessarily realizing it. Moreover, this might very well be one thing hindering more widespread adoption of Python's type annotations.

You also need to consider how common each construct is. A class declaration requiring 20 excessive keystrokes is not necessarily worse than a variable declaration requiring 1 excessive keystroke.


The number of excessive keystrokes is largely irrelevant (as long as it isn't an order of magnitude too much), because typing the code on the keyboard is generally far from the limiting factor in the speed of writing code.

Most of the time you spend thinking about what you should be typing.

So optimizing for the number of keystrokes is optimizing for the wrong metric.


But Scala and lots of other programming languages use ":" with no hindrance to adoption.

Also consider this: lots of hugely frequent characters and symbols that don't require SHIFT in EN-US keyboards do in other keyboards layouts, for example Spanish. So this cannot be a consideration, at least not a major one.


> But Scala and lots of other programming languages use ":" with no hindrance to adoption.

It's kind of weird to see you assert that with no attempt to establish its truth when Scala is something like... the ~20th most popular language? Whereas Python, C, C++, Java, VB, C#, etc. are right at the top. So either this is one of the (obviously many) factors hindering wider adoption of Scala, in which case the relative popularity is entirely consistent with this, or it isn't a factor, in which case it would make sense to present at least some evidence to the contrary, given that your assertion goes against the popularity rankings.

> Also consider this: lots of hugely frequent characters and symbols that don't require SHIFT in EN-US keyboards do in other keyboards layouts, for example Spanish. So this cannot be a consideration, at least not a major one.

I'm pretty sure the popularity of {most software technologies} outside the US and inside the US are very far from being independent variables.


> It's kind of weird to see you assert that with no attempt to establish its truth when Scala is something like... the ~20th most popular language? Whereas Python, C, C++, Java, VB, C#, etc. are right at the top

The barrier to adoption of Scala has to do with its functional programming side, as well as its complex libraries and types.

Kotlin also uses ": type" syntax, by the way. As does Python!

> I'm pretty sure the popularity of {most software technologies} outside the US and inside the US are very far from being independent variables.

It's hard to understand your point. Are you arguing that programming languages are less popular outside the US? Or "most software technologies"?

So let's sum it up: SHIFT concerns are very minor in comparison to other programming language ergonomics (in which, need I remind you, C++ never fared very well), and the ": type" syntax is very common and mainstream in programming languages, with no hindrance to adoption.


No emacs user would even notice.


Or Vim and Python users. They type ":" a lot.


I actually have <space> mapped to : in Vim precisely because it's easier to press.


Some of the syntax changes are to support simple LR parsing. Ideally cpp2 would have boring grammar descriptions available for popular off-the-shelf parser generators. No need to link libclang into everything.


The "type first" declaration syntax is a source of much wailing and gnashing of teeth when it comes to trying to parse C++. See the "most vexing parse" [1].

[1] https://en.wikipedia.org/wiki/Most_vexing_parse


The most vexing parse was a huge problem in C++ and a very niche problem in C, not because of "type first" declaration syntax, but because C++ constructor calls in variable initialization are essentially identical to function declarations. This is a big reason why Java or C#, also type-first syntax languages, have no such ambiguity.

The two examples on Wikipedia are both caused by this, even though it's not explicitly spelled out:

  int i(int(my_dbl)); // intended: two calls to int's "constructor"; parsed as a function declaration

  // in Java/C# style syntax:

  int i = new int(new int(my_dbl)); //no ambiguity

  TimeKeeper time_keeper(Timer()); // parsed as a function declaration, not a variable

  // in Java/C# style syntax:

  TimeKeeper time_keeper = new TimeKeeper(new Timer());
C++ could have also chosen to reserve a keyword for initialization even for in-place values, just like they did with `new`, but they chose to re-use function call syntax instead.

I would also say that just moving the type after the variable name doesn't fully solve the issue - it's the : syntax for initialization that does this. Here is a hypothetical C++ where the syntax remained the same as today, but the (return) type was moved to the end of a declaration:

  typedef double my_dbl;
  typedef int i;
  foo (my_dbl double) void {
    i int (int(my_dbl));
    //unambiguous: i is an int initialized with an int constructed from my_dbl

    f (double) int; //unambiguous: f is a function from double to int
    g int (my_dbl); //unambiguous: g is an int constructed from my_dbl

    bar((my_dbl)i);
    //ambiguous: calling bar with i cast to double, or calling it with a function from my_dbl to i (double to int)?
  }


Similar problems happen even in C. Suppose you have this line inside a function:

    X *y;
Are you declaring a pointer, or are you performing multiplication? It depends on what “X” is.


True, though in this case as well, just replacing type-first with type-last would presumably lead to `y *X`, which doesn't help much in resolving the ambiguity.


This is explained in the talk he gave. One of the benefits is that with the new syntax the semantics can change without breaking backward compatibility; e.g. [[nodiscard]] can become the default for all functions written in the new style.


I had the same reaction initially but then realized that this makes the variable declaration / initialization syntax very consistent.

  variable: Type [= initial_value];

  // initializes foo to an int with an initial value of 43 
  foo: int = 43;

  // initializes main to a (presumably, as I haven't read the
  // spec yet, const) pointer to a function that takes no
  // arguments and returns an int.
  main: () -> int = {
     ...
  }


The idea is to have a consistent way across many items, functions, lambdas, etc…


The "new" way is basically updated Pascal, which was the right way to do it, from back in the 80's.

Similar story with C strings vs Pascal strings, another can of worms.


And the "old" way is basically Algol-60, which was what Pascal improved on.


True, but C missed the memo, unfortunately.


From a C/C++ programmer's POV those may look like GOTO labels, I agree. The proposed syntax is probably a consequence of a modern trend in using := for declaration and assignment.

    x := 5
Where you can optionally insert the type between the colon and equals, otherwise it is deduced

    x : int = 5


It's not modern at all. Type system notation has been using x : T to say "x has type T" for at least 50 years. It's the standard in academia, and has existed in real-world programming languages since 1971. [0]

[0] https://cstheory.stackexchange.com/questions/43971/why-colon...


It is an old idea and also a modern trend. Trends come and go. It is currently trendy.

90s fashion is also trendy right now. Which is great, because a few years ago it was 80s fashion. The cycle continues!


"Currently trendy" how? Scala and Kotlin, to name just a few established programming languages, use the ": type" suffix style. They are neither "recent" nor from the original Algol era.


It depends on how you define "current", I suppose, but personally I'd consider Kotlin to be fairly recent.


Well, Kotlin is from 2011 and Scala from 2004! 11 and 18 years ago, respectively. I'd say neither is recent by any reasonable sense of the word that would imply their syntax or conventions are "a current trend".


11 years is not all that much for a PL, especially if you count from the date it was first announced rather than from the first stable release (which for Kotlin was in 2016). For example, Rust started in 2006, and Rust 1.0 was in 2014, so it's actually older than Kotlin - but I think it still qualifies as "new", and many still consider it not fully battle-tested.

For another example, Python goes all the way back to 1991, with v1.0 in 1994. But it didn't really start to trend until early 00s.


> "11 years is not all that much for a PL"

Sure, if we redefine "recent" or "current" we can twist words to reach whatever conclusion. In reality, this ": type" syntax is neither new nor recent, nor is it "a current trend". Kotlin is also not recent. A decade is a lot.

You conveniently left Scala out, I suppose because it didn't fit your argument.

> "For another example, Python goes all the way back to 1991"

Well, since we are discussing Python, the syntax for its type hints looks like what exactly? And it was discussed as far back as 2014. Is that "recent"?


If you look at the thread, I'm not the same person who originally made the claim.

Nevertheless, it is definitely the case that the "name: type" syntax is noticeably more trendy now than it was, say, 15 years ago. Sure, there were plenty of languages that had it already back then, including Scala (and ML, and Delphi, and ...). But how many of them were mainstream? Whereas now, this syntax is embraced by Python and TypeScript.

And not even OP claimed that the syntax itself is new or recent; I mean, everybody who knows about Pascal knows how old it really is! They literally said "it's an old idea" that is "currently trendy".


I used to code for a language called "Spad" — originally shipped in 1972(?). The syntax:

    x : t = e
Was called "headform" and was the only syntactic form for the whole language. (Expressions used a shunting yard algorithm to desugar into a sequence of headform.)

The language included modules, classes, parameterics, functions, constraints, ..., whatever.


Well, everything looks like a pile of dirt when the only tool you have is a Spad.

ducks


> The proposed syntax is probably a consequence of a modern trend...

ML uses `name: type = value`. It came out over a decade before C++ and just one year after C.


In English, the colon is used to separate a term from its definition, so it looks perfectly apt for marking declarations.

Besides, it makes it trivial for text editors to implement "go to definition", and many other niceties without a real semantic parser.


I don't think this has anything to do with modern trends - := is a formal way of writing assignment (the widely used = is actually an equality sign), and has already been present in Pascal for decades (and other languages as well).


https://en.wikipedia.org/wiki/ALGOL_58

ALGOL - the := used for assignment in programming languages comes from IAL, which became ALGOL 58. It's definitely not a modern development. Go reintroduced it to popular mainstream languages, sort of - except Go still uses = for assignment, and := is used only when it is also declaring a new variable.


:= for assignment ultimately comes from math, as do most operators in early Algol: https://www.masswerk.at/algol60/report.htm#2_3


All the replies are missing something: does cpp2 have goto and labels? What happened to them?

Unless modern C++ now has named loops, removing goto is problematic.


I'm not sure I'd call := a "modern trend." Smalltalk had it circa 1972, but none of the modern-day hip languages I can think of do. Your latter example is a common syntax, but in most languages (TypeScript, Rust, etc.) the colon is omitted along with the type when the type is inferred.


X being a modern trend does not mean it was not done in the past. It doesn't mean that it hasn't trended in the past either.


So what does it mean then?

It was done in the very distant past. It was done in the intervening time. And it's done now. "Modern" trend?


Both versions are supposed to return an int but neither has a return statement. Is that not an error?


The main function returns zero implicitly if control flow reaches the end without encountering a return statement.
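
That is, this is well-defined:

  int main() {
      // flowing off the end of main returns 0; main is special-cased,
      // whereas for any other non-void function this would be
      // undefined behaviour
  }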


It wouldn't be C++ without undefined behaviour.


It's disappointing they would keep that odd << syntax, which has always looked weird to me.

Why not take this opportunity to streamline the calling syntax? Streams are no different than regular function calls, e.g.

cout.output("hello").output("world\n")


You are talking about library API design, not language design. A new C++ syntax will not change how functions in the standard library are named (e.g. operator<<).


That is an operator overload from a library, not built-in language syntax.

Even so, there is now std::format:

https://en.cppreference.com/w/cpp/utility/format/format
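
For example (C++20):

  #include <format>
  #include <iostream>

  int main() {
      std::cout << std::format("{} {}\n", "hello", "world");
  }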


Doh, you're absolutely correct.


In that particular case, it looks like he declared and assigned a thunk called main. It looks different to me than the original-style main.


> main: () -> int = {

Y U NO fn??!? So close, and yet so obviously wrong.


Why add a keyword when you don't need it?


One answer would be: So that when you hit the ( you know it's a list of parameters and not an expression, even before you spot the ->

I still haven't read enough about syntax 2 to understand whether it allows any other use of ( after a :, but if so, a keyword like fn or fun would make the parser much more regular.


For easier grepping


Use an IDE


I'm not sure that any IDEs support cppfront yet!


fn is no fun


From a pragmatic point of view, this is the best-engineered solution for C++ I have seen so far, because it provides improvements with 100% compatibility.

Sorry for kind of x-posting (I posted the questions below in r/cpp), since I did not get any replies there - but Herb Sutter, or anyone close enough: if you are around and have some time for questions, I would love to hear more about this project.

I just saw the whole talk. Most promising C++ replacement so far for me: it is immediately usable, it does NOT duplicate the standard library, and it is 100% compatible.

That said, here is my feedback/questions:

    About pass by copy: Dave Abrahams, in his value semantics talk, says copy is too eager in C++. Swift, for example, uses something similar to COW, avoiding eager copies when passing parameters. This also makes all copies de facto noexcept. Is this the idea with Cpp2?

    About bounds checking: is it possible to call a non-bounds-checked function and detect it?

    About bounds checking: why always checked access? There are contexts, for example in a range-for loop, where checks can be elided safely. Also, if you do checking in a span subspan, that slice traversal should not need checks.

    About bounds checking also: why is the 3 subscript a run-time error? I think it could be made a compile-time error for constants and constant folding.

    About overloading: one thing not mentioned in the talk - how are overload sets handled when you introduce the same name in the same context, given that parameter passing is now a bit different?

    About classes (no design yet, I know): will there be member functions vs. free functions? How would overload resolution work in case there are both member and non-member funcs, associated namespaces, and so on? Also, it is interesting to look at Swift protocols and Val lang views here. Polymorphism comes from use, not from the type itself.

    About exceptions: is it compatible to have both lightweight exceptions and exceptions? My concerns here are two: should I annotate calls with try, and is the result type of the function affected (returning some kind of outcome, result, etc.)? This is really viral compared to throwing from deep stacks, and less usable as a default IMHO.
Sorry for the typos. Typing from a phone.


Drive-by, low-quality comment:

Herb is great. Glad he's thinking about this. C++ syntax is garbage and needs help, but why deviate so much from existing syntax? Why drop curly braces? The decision to allow mixing new/old C++ syntax in the same file seems like a gross mistake that prevents a more incremental fix of the language. Why the heck doesn't it support defining classes yet? Is it because of the aforementioned requirement to intermix new/old syntax?

I hope efforts like this continue, as we all know something is broken in C++ syntax. I think it can be fixed with tweaks and some breaking of backwards compatibility in the same source module.


> Why drop curly braces?

What do you mean? It uses curly braces for compound statements. But it doesn't require curly braces to define a one-line function.


Yeah, that just strikes me as weird.


It makes it more consistent with statements, and more convenient for one-liner functions (e.g. those that just delegate to another function).

It's also something that has been adopted by other languages in the curly-brace family - e.g. C# originally required curly braces for function bodies, but now allows you to write:

   int Sum(int x, int y) => x + y;
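
Presumably the cpp2 one-liner analogue would look something like this (a guess based on the "no braces required for one-line functions" description, not checked against the actual grammar):

   sum: (x: int, y: int) -> int = x + y;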


why no classes? I assume that it's meant just as an initial sketch/demo. Doing 100% is at least twice as hard as doing 80%.


C++ was one of the first languages I learnt as an undergrad, maybe 20 years ago. I used it for a few hobby projects but not much else (professionally these days I mainly use Typescript & Go). I'd love to pick up C++ again but I've found it really challenging to discover what I should read up on to know what modern C++ should look like. What's a great resource for someone who's been out of C++ for a couple of decades to pick up and learn today's idiomatic C++? I don't want Rust btw - it's a great language but I want to revitalise my C++ knowledge, not jump ship entirely.


I think Jonathan Boccara's talk, "105 STL Algorithms in Less Than an Hour" ( https://www.youtube.com/watch?v=2olsGf6JIkU ), gives a nice overview of the <algorithm> library. Also, the examples on the reference pages (usually near the bottom) at cppreference.com are generally pretty good. For example: here's how to create a pseudo-random number generator using the Mersenne Twister engine.

  std::mt19937 e; // Mersenne Twister engine
  std::uniform_int_distribution<int> u(0,99999); // uniform ints 0 to 99999, inclusive
  auto rng = std::bind(u,e); // so rng() returns the next random number
And filling a vector with random numbers could look like this:

  std::vector<int> nums(1000);
  std::generate(nums.begin(), nums.end(), rng);
This last line can be written in C++20 more simply like this:

  std::ranges::generate(nums,rng);


Why should I do this, when I can just write:

    for (int i = 0; i < nums.size(); i++) {
        nums[i] = rng();
    }
Simple, easy to understand, and it actually compiles to efficient code even in Debug mode. There's also no need to include the <algorithm> header, which pulls in 10,000+ lines of STL code and bloats compile times, etc.


Even better, so you can't accidentally move out of bounds:

  for (int i : nums) {
      nums[i] = rng();
  }
Edit: Ooops, I'm an idiot, this doesn't do anything useful (i is every value in nums, not the indices of nums...)


Doesn't that not work because `nums` is all initialized to 0 though?

Like, wouldn't it be

   for (int num : nums) {
      num = rng();
   }
(I don't know any C++.) I kind of expect the above not to work, though, because it's doing an assignment to a copy. I suppose this is a good argument to just use a `fill` or `generate` function...
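
For the record, taking each element by reference does make the range-for version work (reusing the `nums` and `rng` from the earlier snippets):

  for (int& num : nums) {
      num = rng();
  }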


You're right, I completely forgot what it meant... The loop I wrote was basically equivalent to :

  for (int i = 0; i < nums.size(); ++i) {
    nums[nums[i]] = rng();
  }
Which happens not to go out of bounds simply because nums[i] is initially 0 for every i in the range - so it sets nums[0] to a different random number 1000 times.


The standard recommendations for coming up to speed on ~C++14 are Scott Meyers' books, Effective C++ and Effective Modern C++, but I'm not sure if they work well alone when you've not used C++ in a long time. And I don't have a good recommendation for something covering the stuff newer than that.


* Discovering Modern C++ by Peter Gottschling - Good fast coverage of the features.

* The C++ Programming Language, 4th edition by Bjarne Stroustrup - The biggest change in the language was the introduction of C++11 and this book covers it well.

* A Tour of C++, 3rd edition by Bjarne Stroustrup - Summary of all changes upto C++20.

* Software Architecture with C++: Design modern systems using effective architecture concepts, design patterns, and techniques with C++20 by Adrian Ostrowski et al. - More like a broad survey than an in-depth treatment, but covers development in a "modern ecosystem".

* Modern C++ Programming Cookbook: Master C++ core language and standard library features, with over 100 recipes, updated to C++20, 2nd Edition by Marius Bancila - A series of articles, one for each language feature.


Generally, cppreference.com has the most authoritative information on what C++ is now. It is somewhat opinionated about modernity, but not enough so to be intrusive.

Usage examples have often been updated to current best practice.


I seriously wonder why no one did this before (put a context-free language in front of C++). Presumably because of the need for modules to get anywhere beyond trivial code (without modules, one would have to use the preprocessor, which would intermix old and new syntax).

Now, please do it. Context-free syntax is so obviously better that it's really an embarrassment that people are still defending hacks from decades ago.

And please, for the love of Knuth, someone do it with LaTeX, too.


https://users.monash.edu/~damian/papers/

Scroll down to "A Modest Proposal: C++ Resyntaxed".


I loved this paper when it came out, but as far as I know, the author never actually implemented his proposal.


> I seriously wonder why no one did that before (put a context-free language in front of C++)

My guess is either yes, macros, or if you're going to write a new language, you can skip C++, output object files, and make them compatible at link time.


Compiler explorer supports cppfront https://godbolt.org/z/shYes883a


Due to a last minute change (and a rarity, we really try hard not to do this), the URL for cppfront changed: https://godbolt.org/z/jEqEqWsbW works now (as does manually selecting cpp2-cppfront as a language)


In The Design and Evolution of C++ by Bjarne Stroustrup there is some interesting discussion on alternate declaration syntax that was rejected because it wasn't compatible with C. The book also explains a lot of design decisions that would be interesting to revisit in light of what has gone on.


> Can we make C++ 10x safer, simpler, and more toolable if C++ had an alternative "syntax #2," within which we could be completely free to improve semantics by applying 30 years' worth of C++ language experience without any backward source compatibility constraints?

Yes please - I really like this approach. C++ as a bare-bones on-its-own language is not going to find adoption in the new world - there is way too much baggage and legacy, too many syntactic and functional complications, and adding more complexity every 3 years is just counterproductive. If this can become something like what Scala or Kotlin are to Java, it would be good for C++.


"Fixing semantics" while keeping 100% ABI compatibility with old C++ seems really hard. It looks like this will have to allow for some kind of opt-in ABI breakage, at which point you're not far away from what you could achieve more easily by using e.g. Carbon, or just better Rust/C++ interop (see autocxx, crubit, creusot).


> far away from what you could achieve more easily by using e.g. Carbon, or just better Rust/C++ interop (see creusot).

At that point, why not just upgrade to C++20? Carbon, in particular, is left behind in the C++17 world, so it's outdated by C++20 (not to mention future developments).


No one has ever said that Carbon will forever be frozen at C++17 compatibility. That's just the initial goal.


Google is basically stuck in the C++17 world, so I wouldn't expect Carbon to move forward if they end up investing heavily in it.

In particular Google hasn't even started to adopt C++ coroutines, threading, ranges, or modules (they have their own module implementation which is actually better in most ways, but different)


C++ doesn't define an ABI, even if particular tool chains try to preserve one.


In practice it has been stable on GCC, Clang, and MSVC with their respective stdlibs since 2015. There are kids today learning programming with Scratch who weren't born the last time the ABI changed.


Interesting, thanks. I stopped using C++ before 2010.

It remains my all time favourite language to do crazy stuff in, but I try to avoid crazy stuff these days!


I found this language recently that also compiles to CPPv1:

https://github.com/SerenityOS/jakt

Really nice feature set.


I don't get it. The code sample is way uglier and more difficult to read than modern C++. Just move your code base to modern C++, it's a pleasure to work in (not joking, I'm loving it).


The point is that you also get the right defaults and the C++ Core Guidelines semantics without any additional tooling - something that even static analysis on C++20 doesn't fully deliver.


I've written C++ and understand its appeal: high-level abstractions with very little runtime penalty. But I also understand its legacy issues that make it complicated. Instead of adding to C++, maybe we should be subtracting from it - perhaps a "C--": the good parts of C++.


I find it disappointing that the enhancements aren't explained one by one, or even listed. How am I supposed to decide if it's worth a deep dive?


Each time (e.g., Rust, Go, Carbon, Cppfront, ...), before clicking, I hope that people will have remembered that Pascal's pointer syntax is better than C's convoluted '*' and '->' and '.'. So far, I've been disappointed. I mean:

    'p.x'   just like 'p.x' in C, but
    'p^'    instead of '*p' and 
    'p^.x'  instead of 'p->x' and 
    'p^^.x' instead of '(*p)->x'.
Just admit it: Pascal is nice, clean, compositional, and less surprising.

Don't you remember? Or do you honestly think that C syntax is better?


Since you mentioned Go: Go does a very programmer-friendly thing where v.x and p->x are both spelled with a plain dot. The compiler knows whether the left side is a pointer or not, and v->x and p.x would always be errors anyway, so they eliminated the invalid syntax and simplified to always using the dot.


Is that tiny, illegible-in-dark-mode hello world the only example or am I missing something? For as much content as there is in the README I would hope that a syntax change would, well, show some more syntax comparisons.


The README points here for a bunch of examples.

    https://github.com/hsutter/cppfront/tree/main/regression-tests/test-results


Same with me... are we both tired, or is this whole thread about just the idea of a new syntax?

I scrolled the whole README. Are we supposed to parse the parser in /source?


The README points to the regression tests as the best examples of the new syntax.


> import std; as default

Personally, I think he should make `using namespace std;` the default as well. The compiler can warn if you alias the names, and all the std:: in C++ code is just noise that's annoying to read and write.


I think you're in a minority here. Off the top of my head, std::get should never be simply `get`. Here's a complete list of symbols that may or may not be defined depending on what you've included.

https://en.cppreference.com/w/cpp/symbol_index


Ok, maybe bringing everything into the global namespace is a bit of a stretch, but you could at least do the most common ones. Who's going to alias cout or string?

As for std::get, while it's a separate issue, I think the problem it solves should be solved in a more elegant way if you're trying to reform the language anyway. This is not elegant:

  std::get<0>(arr) = 1;


There are proposals for a language level version of this - P1061 I think
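
For reference, C++17 structured bindings already give a language-level alternative for the fixed-arity case (and P1061, if I recall correctly, extends them to packs); a minimal sketch:

  #include <array>

  int main() {
      std::array<int, 2> arr{10, 20};
      auto& [first, second] = arr;   // references to arr[0] and arr[1]
      first = 1;                     // same effect as std::get<0>(arr) = 1;
      return first + second;         // 1 + 20
  }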


But for backward compatibility you should be able to undo it, then. (Problem is, I don’t think you can.)


I've been feeling sad for hours now... well, since the time I saw the original post on reddit and then here.

For the past few days, all I can see on HN is announcements of new languages and research projects related to C++:

  * Carbon (https://github.com/carbon-language/carbon-lang)
  * Val (https://www.val-lang.dev/)
  * Jakt (https://github.com/serenityos/jakt)
  * now Cppfront...
Is everything alright within the ANSI/ISO committee?


There is no voice of the committee.

A breakaway group, including Google and Microsoft, has sought to break backward link-compatibility, and has failed. So they are pursuing other avenues.

Probably the single most valuable improvement possible while retaining C++'s strengths would be to fix initialization syntax and semantics. Second would be to simplify and limit implicit conversion and promotion for base types.


Sutter might be a bigger name, but I think this has about the same chances as Damian Conway's SPECS[1].

[1]:https://users.monash.edu/~damian/papers/HTML/ModestProposal....


From what I gather, the projects have quite different goals.

CppFront:

> My specific goal is to explore the question: Can we make C++ 10x safer, simpler, and more toolable if C++ had an alternative "syntax #2," within which we could be completely free to improve semantics by applying 30 years' worth of C++ language experience without any backward source compatibility constraints?

SPECS:

> We propose an alternative text-based syntactic binding (called SPECS) for the existing semantics of the C++ language.

So while SPECS is semantically equivalent to C++ as it is today, CppFront isn't bound by that constraint.

Sutter makes this very clear later in the README:

> Important disclaimer: This isn't about 'just a pretty syntax,' it's about fixing semantics.

Just to be clear, my intent isn't to diminish Conway's work in any way. Both are interesting projects, just with different goals.


> So while SPECS is semantically equivalent to C++ as it is today, CppFront isn't bound by that constraint.

This can actually be seen as cppfront having less of a chance than SPECS.


CppFront is supposed to be 100% link-compatible with the existing C++ ABI. Even if that isn't "semantically equivalent" in a strict sense, it's so close that it makes no real difference.


Perhaps. But Sutter sits on the committee, and therefore has more ability to get these things into the standard.


Looks like there is a talk about it https://www.youtube.com/watch?v=ELeZAKCN4tY


For anyone else looking for a link to Herb Sutter's accompanying talk at cppcon:

https://youtu.be/CzuR0Spm0nA?t=14532


I remain convinced that Herb is an amazing force of nature!

Simply incredible from my mortal programmer's point of view.

Wow!

Mark


Right - as if there were no language projects that aren't C-based and haven't already resolved all of this multiple times over. C beer goggles.


While I very much welcome new ideas and proposals for radically changing C++, I'm not a fan of this one. Removing support for unsafe `union` and pointer arithmetic makes the language largely pointless.

And, just to also quibble on the syntax: using `:` for identifier types, lambdas, ranges, and scope resolution is deeply unfortunate.


Why can't they just improve tooling and dependency management first? After using Cargo, I now find Python's tooling unbearable. Once you have seen the other side, there's no coming back.


Should've named it ++C++


What is this obsession with the funcname() -> type syntax? I just hate it.


Much easier to parse and to search for.


Looks like an unholy love child of Python, Javascript and Hare.


What does cppfront provide that Rust does not?

I'm assuming Cppfront is breaking backward compatibility with old C++ codebases.


How is this called cpp2 and not C+++?


Why would you name a safer C++ by an instruction that is UB in both C and C++?


It would need to be C++++


Already taken. Notice the # in C# can be made with four + signs!


EDIT: I am dumb, my brain automatically thought import std was "using namespace std" for some reason.


That default import doesn't pollute the namespace like you seem to think it will; look at the hello world example in the README and note that the `std::` prefix is still present.
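
That is, per the README's description, something like this (a sketch, not the exact README example):

  main: () -> int = {
      std::cout << "hello, world\n";   // std:: prefix still required
  }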


Programming languages are serialization formats for source code.


Carbonfront....


Yeah. I think this is a response to Carbon. See what Google did with Go - it affected Python and Java heavily. If Carbon becomes successful, that means less influence for C++.


The project dates back to 2015, and the compiler work started in 2021, well before Carbon was announced.


While I would like to see C++ improve, I would argue that this really is reinventing the wheel when Carbon is being worked on.

I guess the only thing cppfront would offer is mixed syntax within one file, but I could even see Carbon implementing something like that.


I'm sorry, but Herb Sutter is too smart for his own good. He should not be involved in language design meant for mortals. I mean, who can honestly say both "For me, ISO C++ is the best tool in the world today to write the programs I want and need", but then also "I want to make C++ 10x simpler/safer/more toolable"?


For a class of problems, there are basically two language options that fit the requirements: C++ and Rust. While Rust is promising and getting some traction, saying it's the clear winner in all aspects would be a stretch. So there are a lot of people who end up at your first statement. And given the kinds of problems where it's the best tool, a lot of those people would actually be interested in the second statement - though of course most wouldn't have the opportunity to pursue it. We're lucky that there are people like Herb Sutter who end up with all three, IMO, or we'd be stuck with '98 semantics... (I guess it would have helped Rust get popular faster, though...)


What makes you think that these two statements are exclusive?


I don't mean to say they are mutually exclusive, merely that it takes a very smart person to be able to say both. For most people, a language that is 10x more complex than one with ideal simplicity and safety is too difficult to understand - or at least too difficult without putting in years of study and practice, something which is not reasonable for most programmers.


My impression is that many people use C++ because they need to, but not because they want to.


Java and C# fucked up in that they thought people really loved the objects, but missed that people actually hated the pointers - the OO part was mostly a crutch to lessen the memory-safety burden.

And their development over the last 15 years has just been trying to escape the OO cage they were put in.

Something that fixed C++ would have been a godsend in 2004. Nowadays... just a nice thing to have.


They were following C++ "best practices" of the era.

Gang of Four patterns, Turbo Vision, OWL, MFC, Motif++, PowerPlant, AppToolbox, VCL, Qt, wxWidgets, Taligent, CORBA, DCOM, ...


Well, you did get a whole bunch of fix for C++ in 2011...


teachers, leave our c++ alone!


From TFA, the goal is to "explore whether there's a way we can evolve C++ itself to become 10x simpler, safer, and more toolable".

First let's look at "toolable":

My big issue with Modern C++ these days is not just that it's complex to write, but that it also takes a long time to compile, what with all the template/compiler magic going on in the background. Long compile times break concentration, which I don't like. C++2 is not going to improve compilation times, because it's going to be compiled to C++, and then undergo another compilation step. If I combine C++'s slow compile times, two compilers, and all the dumb bugs that I can create and have to debug+fix, I get ....... (pls wait compiling) ..........

Let's switch context and look at "simpler":

Yay functions look more like JS lambdas or whatever. Fine. Let's look for the pointer-y stuff. Like [1] -- in C or "normal" aka "pre-modern" "C++1" you'd have:

  int x = 42;
  int *p;
  p = &x;
  func(*p);
In the "simpler" "C++2" you have:

  x: int = 42;
  p: *int;
  p = x&;
  func(p*);
I sure hope the changed syntax helps with parsing complexity or whatever, because it's not any different for me. Seems like C++2 got jealous of Rust and wanted to remind everyone "hello fellow kids, C++ is also cool pls". Note that even Rust and Go follow the C-style "*p" pointer deref syntax instead of "p*". What happens if I combine "C++1" and "C++2" syntax in the same file? I'm not as smart as Herb Sutter -- Is "p**x" a pointer deref then a mul? Which variable is being dereferenced? Perhaps it is a double deref, or ...? So much for simpler. (also note I haven't talked about template magicks here)

What do we have left ... "safer"? sighs

Okay, C is unsafe. Yes, C++ has to carry that baggage. Yes, memory corruption is horrible. Yes, buffer overflow errors lost lots of money, and have caused lots of wailing and gnashing of teeth. Yes, Modern C++ alleviates some of that unsafety with templates and a smart compiler. Yes, Rust has a smarter compiler with better static analysis that helps write safer code. But all this statically analyzed safe code still has to provide C wrappers if it's a shared object, or use the C wrappers provided by whichever kernel it's running on. So somewhere underneath all this safety lurk the familiar monsters, and someone's going to reach for them in the name of performance (remember Actix?). What about all the other issues you can have unrelated to buffer overflows (log4j)? Will C++2 protect me from my own foolishness? Perhaps that's why it's so complicated - all my C++2 code is safe, because I will write no C++2 code.

And another thing: why not just do the Raku or Kotlin thing and call it by a different name, and have it be a separate language compiling to C++? Why "C++2"? Calling it C++2 vs C++ makes me think of Python 3 vs Python 2, and oh man that is not a good connection to hint at when claiming full backwards compatibility. Python 3 also got quite complicated compared to Python 2, so that doesn't give me hope either.

Now that I think about it, shouldn't this be called C+=2?

TL;DR:

- C++2 is not going to be more toolable because ultimately you're still compiling template-heavy C++

- C++2 is not going to be simpler, because of the unusual pointer syntax, the confusion when mixing with C++1, and the template magicks that still have to exist

- C++2 is not safer, because you still can (and sometimes have to) write or interface with C/C++ and the footguns therein.

- C++2 should not be called C++2; let's find a Raku/Kotlin-style name.

Let C++ be the complexity monster it is. That's the tradeoff we make when we have to write it. If safety is the issue, can't we just use a sandbox or something? Is C++2 really the better option?

[1]: https://github.com/hsutter/cppfront/blob/main/regression-tes...


Man, the negative sentiment on this site towards C++ is pretty alien to me.

C++ is (I'm guessing) currently holding up an order of magnitude more applications than whatever you think is better than it. Clearly it has upsides, so it's baffling to me when people greet attempts to reduce the downsides with either "This doesn't make sense, just scrap it for something else" or endless nitpicks about the approach chosen.

A smart guy has decided to put his time towards improving a hugely influential language. That seems like an uncontroversial positive to me, and I welcome any useful results.


I think not many people ever got properly introduced to C++.

Some probably only saw it during their formal education, where it's most likely been taught as C with classes, and/or they had to use it for stupid things.

There are also many forms of C++: there is your "old and regular" C++, game-dev C++, modern C++, and then there is "I only know C, but sure I can write C++" C++.

Pulling dependencies is not as simple as `npm i boost`.

Communities are small, segmented and not welcoming to newbies.

Absolute madness with build tools. I had never worked with CMake myself until last year, and I have never been so frustrated.

As for how many applications use C++ - that's not really an argument; C++ was literally the only choice for many of them at the time.


Not really an argument? The C/C++ ecosystem is so well established that even today you don't have a choice in a lot of contexts.

If that doesn't show the power of a language, I don't know what does. I don't really think the community is toxic or unwelcoming; it's more that people are too used to ctrl+c/ctrl+v-ing their way out of problems. Nothing against that, I'm the same way, but it really is a matter of taking your time with the language. I've honestly learned to applaud most Rust fanatics because of this - at least they are passionate about learning the language inside and out.


Where is C++ the only choice? Gamedev seems like the closest - as far as I know, most AAA games aren't using C, but I think many may be using C# via Unity (I'm not familiar with the gamedev space). In the embedded world, C still reigns supreme. Maybe in the world of high-frequency trading? But these niches seem to be getting fewer and fewer, and Rust seems increasingly likely to be a contender in these domains.


Game engines are one example, there isn't a robust-enough language other than C++ that has the ability to build some powerful abstractions while also giving you the tools to manage hardware more directly. Note that Unity uses C# as a scripting language, but under the hood the core engine is all in C++.


Under the hood some of that C++ code is now the C# subset used by the Burst compiler.


Drivers, data-driven systems with high performance standards, medical imagery, and military components heavily employ Ada mixed with C++ interfaces.


I agree that these things all use C++, but the parent’s claim was that there are areas (industries?) where C++ has no alternatives. Drivers and high performance systems can all make use of C, Rust, Ada, etc as well as C++ depending on requirements.


I'm sorry, but Rust still has a ways to go before established industries believe in it enough to drop C/C++. I use C/C++ interchangeably, since extending an existing C codebase won't be a problem. Most aren't willing to go as far as Ada either, which is why even in military endeavors C/C++ is used as the more familiar "face".

I think it's pretty easy to forget the foothold these languages have, and their flexibility. We all have seen the perseverance of COBOL.


My original comment specifically said that older languages have staying power, and that Rust will have to mature a bit to become the language of choice for new projects in these niches. But even still, I wasn’t talking about dropping C++ for Rust—I only mentioned Rust as an alternate option.


No problem, I just don't consider it an alternative given the aforementioned concerns. Nonetheless, I wasn't trying to point fingers at Rust; it's more of a pragmatic approach to modernizing a code infrastructure.

I hope Rust does well.


Horses and carriages had a well-established ecosystem. We don't use them for moving people and goods anymore.

Being old isn't a superpower. That said, I don't see C++ replaced in certain areas for loooong time.


Yeah, it'll surely fade out given time. It's certainly not a perfect language.


To be properly introduced to C++ is to be properly introduced to the madness called UB.

Of course not many people ever got properly introduced to C++.


Something that C++ inherited from C. ISO C has about 200 documented cases of UB.


I echo the same sentiment here, but mainly for the C language; if my memory serves me right, I didn't have any encounter with C++.

My experience was negative overall and traumatic to some extent and I wouldn't rule out that some who were first exposed to the language in college or even HS share the same aversion and feel scarred and haunted forever by that Blue Ghost.


My theory is that the Internet generally hates multi-paradigm languages (Scala!). Key phrase there is “the Internet,” which is a euphemism for “people who write in the comment section.” They aren’t a representative sample, just a vocal one.

There are plenty of good reasons to hate on C++, but that’s conflated with a lot of “real hacker” signaling.

Most takes on C++ lament that it isn’t C, or is too complex. Rust is a quality replacement for C++, but it doesn’t succeed on either of those counts either.

All new code I write is in Rust or Haskell. If I need to do something weird, like with DynamoRIO, C++ it is.


Python and Ruby were considered multiparadigm languages once, and the internet didn't hate them at all.

I think "the internet" tends to complain about languages which have accrued complexity over time, and contain parts that are now considered bad but still lurk in the shadows tripping you up.

Scala, C++, perl and others seem to fall into this category.


I think the Internet doesn't like languages that don't have good, strong opinions that will help you find a successful path. C++ just says "here's every feature we've ever heard of; good luck!" whereas Go has strong opinions about how you should build software, and while they might chafe some people, they really do put you on the right track. Rust is closer to C++ in that it has so many features that it still takes some time to figure out the happy path - fortunately, the Rust community makes up for a lot of that with excellent documentation (e.g., the Rust Book) and helpful, friendly forums.


Yes, this is what I'm getting at.

There's a very real desire for things to just be laid out for you, and much less regard for whether the workflow/tradeoffs associated with that beget quality software in the process.

It is hard to not see this as an army of advanced beginners unconsciously trying to avoid doing the work required to level up.


> It is hard to not see this as an army of advanced beginners unconsciously trying to avoid doing the work required to level up.

I don't think it's hard at all. "I don't want to have to manually manage dependencies" isn't "avoiding leveling up", it's just not doing pointless work that computers are good at doing. Similarly, there's negative value in requiring every/most project to script its own bespoke build system (CMake) when 99% of projects can fit a mold (and then there are efficiencies when 99% of those projects fit that mold--e.g., trivial cross compilation). None of this stuff is meaningfully related to "leveling up". Similarly, bombarding people with language primitives that are almost always footguns (e.g., inheritance) isn't really helping anyone "level up" except to know not to use those features.

In the C++ world, you have an army of people who think they're experts because they've navigated all of these problems to find something that sort of works, but in practice they're far less productive and often don't know what they're missing out on from the rest of the industry. I would rather have people who are productive but don't pretend they're experts.


My problem with C++ is that, with all its complexity, it's still unsafe. Efforts to add safety to it as a library (unique_ptr, etc.) are commendable and useful, but they cannot be comprehensive, because the language's design, especially the early decisions, resists them.

The lack of modularity, and the fact that everything is treated like a single source file (via #include), adds interesting avenues for unexpected interactions.


“Holding up an order of magnitude more applications … clearly it has upsides”.

It’s holding up lots of applications because better languages didn’t exist at the time those projects were started, and it’s rarely feasible to switch languages. Specifically, a lot of people look to Rust to unseat C++ for new applications, but it will take a while for Rust to mature with respect to libraries (e.g., game engines) and mindshare in those industries. But even then, old languages have tons of staying power by virtue of age.


I'm not super familiar with Rust - do you have SSE/AVX intrinsics (and others)? Can you write assembly embedded in Rust code? How does Rust stack up in terms of performance?

https://benchmarksgame-team.pages.debian.net/benchmarksgame/...

It looks quite decent, but overall Rust is still behind? So if I'm in an area where performance (or cost/energy) is king, I'd still pick C++.


Rust's std::simd is the portable abstraction, but so far it is only available in nightly Rust; in principle you would be able to write code that does SIMD on whatever hardware (ARM, x86-64, whatever) is targeted, including AVX.

Yes, Rust has inline assembly in roughly the same way you'd be used to with C or C++

https://doc.rust-lang.org/reference/inline-assembly.html

The Benchmarks Game has a bunch of benchmarks, and while it's probably significant that the Python programs are routinely orders of magnitude slower than say C, we likely shouldn't read too much into whether somebody scraped a few more milliseconds off the best time for one of the benchmarks listed.


Portable abstractions for SIMD aren't very useful, because if you're writing SIMD you want performance, and the things it abstracts over (specific SIMD capabilities and weird performance quirks of different instructions) mean the results of using it aren't predictable.


SIMD intrinsics exist, although currently require nightly compiler on non-x86 targets: https://doc.rust-lang.org/1.29.0/core/arch/x86/index.html

Inline assembly: https://rust-lang.github.io/rfcs/2873-inline-asm.html

Performance is almost identical, even the linked benchmark is quite even.


> Can you write assembly embedded in Rust code?

Yes... unlike in x64 MSVC, one of several C++ compilers I frequently must support, where inline assembly is verboten. Does that put Rust ahead? :P


I don't know the answers to all of your questions, but

>Can you write assembly embedded in Rust code?

Yes, I've done this several times. It looks like this: https://doc.rust-lang.org/rust-by-example/unsafe/asm.html


> do you have SSE/AVX intrinsics (others?)?

Yes. https://docs.rs/simd/latest/simd/

> Can you write assembly embedded in Rust code?

Also yes. https://doc.rust-lang.org/reference/inline-assembly.html


It doesn't feel much different from any other language that people have been burned by (e.g. JavaScript). I have had some bad C++ experiences, and I know enough programmers I respect who stick to C over C++.

Of course, it's difficult to equate that kind of advice/feedback with negative comments on HN. But often there's some kernel of truth there. So while there's some merit to C++, the criticism can be equally valid. And keeping the underlying complexity of C++ (as Carbon will, similarly) might not meaningfully simplify development.


> I have some bad C++ experienced, and I know enough programmers I respect who stick to C over C++.

Do you know such people who work on large software systems, as opposed to, say, micro-controller firmware, or kernel drivers and such?

(Asking as a person who maintains an important(ish) C library for embedded coders: https://github.com/eyalroz/printf)


I used to work for a company whose product was a fairly large satellite control center software system (and we could supply hardware if needed), written in C++. It's used for a lot of commercial fleets. For example, back in the 2000s, when I worked on it, it was used for CDRADIO/Sirius's fleet. (I don't know if it's still used for SiriusXM's Sirius satellites, if any.)

I liked C++ in some ways, but as a whole, I think, C++ didn't reduce -- and may have increased -- the complexity of our software compared to the complexity of an equivalent C implementation. (I'm talking about the complexity of the software itself, not the complexities of the tasks it was doing.) The distribution of the complexity in the code would just have been different between the two implementations. IMHO.

So I mostly stick with C or other non-C++ languages now. Of course, C++ has expanded and changed greatly since then.


You would be very pleasantly surprised at how pleasant programming in C++ is today vs. pre-C++11. With C++20, the language has got even nicer to use.


I remember really enjoying C++ circa 2012, but I really hated scripting my own build tooling via CMake or manually wrangling dependencies. This was the stuff that pushed me out of C++. Fortunately, Go had just hit 1.0 and it fixed many of the problems out of the gate, and I didn’t really need the performance that C++ offered.


Even CMake has got less annoying.


It had nowhere to go but up! (: Jokes aside, I’m glad to hear it.


> Do you know such people who work on large software systems, as opposed to, say, micro-controller firmware, or kernel drivers and such?

Redis is written entirely in C.


I really don't think Redis is a large software system. Things like Firefox, Chrome, AAA games, etc. are what I'd call large C++ systems.


I don't think that Redis qualifies as a large software system by modern standards. In fact, it's popular because it's simple, small, and understandable compared to its competitors, which comprise many millions of lines of code.


Redis is ~200K lines of C code, so it's relevant project-size-wise - but Redis was started in C, right? It's not like its developers now have the option to "go C++" without a company-wide decision.

Still, if you could quote Redis developers making the GP's claim, that would count.


Some software for storage products is C. I imagine other conservative industries could be similar.

But I have also heard that although the senior devs would prefer to stick to C for technical reasons, they have started to use/embrace C++ for hiring reasons (since many grads are taught C++). So I wouldn't be surprised if finding "pure" C codebases - ones without C++ - is getting harder. The ecosystem around C++ (e.g. test frameworks) certainly seemed healthier. It's been a few years though, so I might not have a good picture any more.


Off the top of my head, for large open-source C codebases that are not system/kernel/controller code, I guess Python, Ruby, and Redis would be good examples.

I myself fall into the category of people sticking to C over C++. I used to love C++ up until C++03, then completely hated the whole "new generation C++" direction that started with C++11, and pretty much decided not to touch C++ anymore.


I think that "large" really needs to be defined here. CPython is 350kloc of C. I have seen C++ codebases with more source files than there are lines of code in that. Just the "base" module of Qt is ~4million loc


I came up with the following guess many years ago:

There are people who are totally unproductive at C++ and find it scary, mainly because they have had little exposure to it. They assume everybody is as uncomfortable and unproductive with it as they are - nay, that it is impossible for anyone to be productive with it, simply because they aren't. They will attack evidence that somebody has done well with it, because it is a defense of ego for them.

There are other complaints about C++ beyond this, of course, with validity - many of them from old-timers who have bitterly complained about it for decades from a place of knowledge and experience. People have been complaining about the ugliness of C++ for longer than my own career. However, as more and more people come up in the post-C++-as-fashionable era, I think the theory above accounts for more and more of the bulk of the complaints.


C++ is a great language; the subculture that keeps unsafe C patterns alive in the language, not so much.

Example: back in the heyday of C++ frameworks bundled with compilers, bounds-checked strings and collection arrays were the default.

Nowadays you either call at(), have your own bounds-checked types, or enable the debug library - though naturally without influence on 3rd-party dependencies.
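
For example, with the standard library's defaults:

  #include <iostream>
  #include <stdexcept>
  #include <vector>

  int main() {
      std::vector<int> v{1, 2, 3};
      // v[3] = 0;     // operator[]: unchecked, out of bounds is UB
      try {
          v.at(3) = 0; // at(): bounds-checked
      } catch (const std::out_of_range& e) {
          std::cout << "caught: " << e.what() << "\n";
      }
  }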


The textual-include compilation model is broken - larger projects pay dearly for this - and thanks to C++'s very long history and very long feature list, there is a veritable Babel of feature subsets and compiler options to choose from. Adding new dialects and features is not an improvement here.


This is what the "modules" feature in C++20 addresses. People are complaining about modules a lot, I think because the spec is a little complicated... but the spec is a little complicated because the problem is a little complicated, and you can't pretend that complexity doesn't exist when you're writing the spec. (There are some other complaints about the modules system. It wasn't going to please everyone.)

New dialects and features were getting added to JS all the time, but what happened is that people writing JS libraries or tooling would watch how far these features spread across their users' installed browser base; many of those features never even made it into popular JS runtimes, while others are everywhere now.

I think it's a reasonable model for development--lots of people trying to improve things, the community slowly sifts it out, and the standards are the most conservative of all.


People complain about modules because they only exist on paper.

Not a single compiler supports the standard library as a module. No compiler has full support for modules.

Modules are a C++20 feature that isn't usable in real projects in 2022. And there's no signs that modules will be ready anytime soon.


Is there something I'm missing? People are complaining because modules aren't supported yet? Isn't it reasonable to address this complaint by adding module support to compilers, and isn't this what's already happening?


As I understand it, modules support goes beyond just compilers - yes, you need support there, but also in libraries (the standard library is still not available as a module, though apparently that's in progress) as well as build tools (CMake, Bazel, etc.).

People are complaining because it’s 2022 and support for modules is seemingly not there or incomplete in all these places, and modules are talked about in some C++ communities as if they’re a thing that is actually usable (for example Bjarne’s talk at Cppcon a few days ago).


> People are complaining because it’s 2022 and support for modules is seemingly not there or incomplete in all these places...

So they're complaining on the internet about not getting free stuff fast enough.

They could work with the maintainers of their toolchains instead. If everyone that used xcode complained to apple, C++ modules would be done there already. MS likewise would probably put more resources on it. Open source is a little more complicated, but Red Hat and Canonical do have paid products.


> So they're complaining on the internet about not getting free stuff fast enough.

This is a very weird take.

Visual Studio isn't free for professional use. Xcode is free, but publishing for macOS/iOS is not. Both tools exists to serve platforms owned by the first and third largest companies in the world by market cap. Microsoft and Apple don't spend tens of millions of dollars a year on employee salaries for these tools out of the goodness of their heart.

In any case, the issue isn't with the toolchains. The issue is the C++ committee created a specification that has turned out to be problematic. Your next response may be to join the committee and fix it from the inside! Google tried that and rage quit to create Carbon. Sutter hasn't quit, but I think Carbon and Cppfront getting announced the same summer is not an accident.


Yeah, fair.

Very Online engineers complain about slow progress on free stuff in the open-source toolchains, and they don't bother to light up the ticketing systems of paid products. Microsoft and Apple don't spend as much on these things as you would expect, because the relevant management thinks we don't really care that much. But the common thread is that sitting back and expecting the world to come to you isn't reasonable.

Incidentally, Google is a big place. Some Googlers are still involved in the ISO committees. Certain Googlers, admittedly influential ones, lost patience and started betting their reputations that certain dramatic moves would be a better choice.

I personally don't think ISO is the presidency of C++. C++ culture focuses way too much on language design and not nearly enough on engineering (good third-party libraries, supporting implementations, empirical evidence, etc.).


People are still feeling the pain, and are still complaining.

I suppose when GCC and MSVC will both support modules in an interoperable way, these same people will enjoy and praise this development.


The fact that it took until 2020 for WG21 to admit this is a problem worth solving is also not a good look.


> because the spec is a little complicated

Patching over old work - hopefully without breaking anything - always is. I'm just saying that much of the hostility toward C++ comes from the fact that we now have so many superior options to choose from - options which profited from the lessons C++ learned the hard way and incorporated them into v1, instead of patching them in as options at v23.

And this is not meant to ambush anyone with a "Rewrite it in Rust!", but to suggest that C++ is maybe not the best choice for new work in 2022.


> This is what the "modules" feature in C++20 addresses.

This is the perennial "If you only used modern C++, you'd be fine."

Sorry. People have been repeating this for 15 years, and it hasn't gotten any more true.

For example, C++ still doesn't have a useful string type--everybody rolls their own. How do you interoperate when 2 different C++ libraries can't even agree on something as basic as what a "string" is? The existence of "header-only" libraries and the contortions they go through is a tacit admission that compiling and linking C++ is still a disaster. C++ still doesn't have a stable, documented ABI--so everybody defaults to C (thus throwing away any theoretical usefulness C++ might have). Embedded shuts down 90+% of C++ beyond "C with classes" because they can't use exceptions and can't allocate memory dynamically. The preprocessor is a brain parasite that has grown with C/C++ and can't be removed anymore without killing the host. etc.

In fact, I would argue the lack of stable ABI is the only thing propping C++ up. Because the C++ ABI is so shitty, you have to make the top-level code C++ which then calls into the other libraries and subsystems which actually have useful, stable and documented ABIs and interfaces.

If C++ had an ABI that didn't suck, you could invert that, drive the C++ from the better language, and everybody would relegate C++ to increasingly tinier libraries until they could excise the remaining code and throw it into the trash.

I feel for the people who put in their entire lifetime trying to "improve" C++. However, it's time to admit that C++ can't be fixed and move on.

How amazing would it be to have people like Stroustrup and Sutter being paid to work on a language that doesn't start with unfixable suckage?


To be clear, I’m in the “C++ sucks” camp. But I don’t agree that C++ “can’t be fixed”. I’d rather say that some problems with C++ can’t be fixed, but others can be.

People will still use it, and that is entirely sensible and rational, and it makes sense to improve it. Long-term, maybe C++ will fade into the same kind of sunset that Fortran and Cobol currently occupy.

I’m not gonna make fun of Fortran users, or tell them to move on. There’s even a Fortran 2018.

Trying to allocate people to only the “best” possible work—say, replacing C++—is just poor allocation of resources.


> C++ is (I'm guessing) currently holding up an order of magnitude more applications than whatever you think is better than it.

So what? No one's going to rewrite all of those millions of lines of code in New C++, or whatever they call this incompatible syntax. It's just a distraction from more relevant efforts.


New code can be written in the new syntax, with full access to existing libraries.

I could easily see a company like Meta adopting this. They have both a huge amount of C++ code as well as actively developed guidelines and internal libraries that make use of cutting-edge features.


From the link:

"I'm sharing this work because I hope to start a conversation about what could be possible within C++’s own evolution to rejuvenate C++, now that we have C++20 and soon C++23 to build upon."

Clearly this is relevant for c++ itself (coming from Sutter), so I'd say it's quite unfair/misguided to call it "just a distraction from more relevant efforts."


If you've got 10^6 LoC of a product that will be developed and maintained for 10+ more years then you can slowly replace it with this.


Of course they will. What they are less likely to do is to rewrite it in a completely different language (e.g. Rust.) C++ became popular in the first place because it could easily go along with C. No need to rewrite in bulk.


The thing is that C is still the lingua franca ABI, so there's still no need to rewrite in bulk; you can still just use the same C libraries in another language, and Rust is particularly well suited to doing that, in roughly the same ways C++ is.

What's getting harder and harder to see now is why, if you need to write new code or rewrite now, you'd choose C++ over Rust. In the long run that's a recipe for only the gnarliest old codebases being written in C++ and no one wanting to touch them.


> The thing is that C is still the lingua franca ABI, so there's still no need to rewrite in bulk; you can still just use the same C libraries in another language

No. C is the lingua franca for exported public stable ABIs, which is an extremely small subset of any given program's ABI usages.

C++ ABI is just as widely used as C's, just for internal unstable linkage instead of stable exported linkage. So yes you still need to rewrite in bulk to move off of C++, unless your code happened to be tiny & only used C API libraries.


We're starting new projects today, in 2022. (We do Signal Processing code for the Position, Navigation, and Time industry.)

It's all C and C++, and Python if we need a scripting front-end. (Prototypes are done in MATLAB.) "Rust" doesn't even exist in our universe. I don't think the people on HN are really in touch with how real people program in the real world.

And on the few occasions when we have to do a complex desktop GUI app, we'll use C# or F#. We can get cross-platform Windows/Linux support easily this way.


Rust is still fringe even at FAANG level companies. I work at one such and we have a service or two written in Rust. I like Rust as a C++ replacement in the right context, but for 99.999% of applications it’s not better in any qualitative or quantitative way than C++. Or Java. Or Go.


> Rust is still fringe even at FAANG level companies.

Yes, everyone knows that, except for HN users! While I'm sure there are managers at large companies who let some employees play with Rust, it's not used.


I mean, I'm a HN user who just left one FAANG for another and I'm pretty confident this is changing a lot faster than you think.

The thing that obscures this, I think, is that at most of them the surface area where C, C++, and Rust compete (high-availability, security-critical software) makes up a relatively small portion of what they do, no matter what language it's in.

So while there's a lot of C++ at, say, Google and Facebook (but relatively little at Apple, in my experience), very little of it needs to be in C++, let alone Rust.

But where it matters? You better believe big companies are shifting towards "if you're starting new you should seriously consider Rust" (if not a mandate). And once you let one other language into your mix, the question becomes: why's all the high level stuff written in C++? May as well start new projects in Go.

Some are farther along than others but it's a thing.


There is a lot of C++ at Apple. It's of course possible to work there and never touch it, but it's also possible to work there and do almost all your work in C++. Many significant parts of iOS and macOS are written in C++, for example WebKit, the implementations of ObjC and Swift, clang/llvm, dyld, CoreAudio, CoreAnimation, WindowServer, Dock, Finder, etc.

I enjoy this[1] annual series of blog posts on Apple's usage of Swift in iOS. I just did a quick-and-dirty but similar analysis of Apple's usage of C++ on the macOS 12.5.1 install on my computer: I extracted my dyld_shared_cache, and then used find and nm to count up the binaries containing unstripped C++ symbols. This undercounts the usage of C++, since sometimes it's used only internally in stripped binaries; I also think the number-of-binaries metric probably undercounts the importance of C++, because the ones that do use C++ tend to be the more significant binaries. But even so, it gives some idea of the scope of C++'s use at Apple.

About 25% (559 / 2292) of the libraries in the dyld_shared_cache contain unstripped C++ symbols. About 15% (22 / 154) of executables in /System/Applications (so 1st party apps / helpers that ship with the system) do.

That said, I think you're probably right that things are changing. Probably there are lots of people at Apple thinking of how they can replace C++ with Swift in their code. But on the other hand, I would not be surprised if we can still find significant uses of C++ in whatever macOS release we have 10 years from now.

[1] https://blog.timac.org/2021/1219-state-of-swift-and-swiftui-...


To clarify: relative to Google and Facebook, which have built up immense C++ codebases that span the entire companies.

I spent 8 years at Apple, mostly working in C and then Rust, both even smaller pieces of the Apple language pie.

But I also did a little C++. I have commits in Swift, for example (though I wasn't on the team and they were pre-open-source, so you'd have to dig real deep to find them now).

Most of the services side of the company, which is where I was, was Java, in my experience. There's been a lot of shift to Go in the last few years though, and that's a very quickly growing part of the company where you can't do a local check to find out language use. It's also the only part where Rust is viable, because the product side of the company has much stricter limitations on what they'll build for distribution, and as of when I left, adding Rust to that mix was not even on the radar there.

But the places where rust was gaining were small but important, which is the main thrust of what I was saying.

But anyway, with how siloed Apple is, experiences can differ a lot, even beyond what's normal for bigcorps. :/


The number of teams at these companies where “you should seriously consider Rust” is a thing is approximately (not exactly) zero. Its adoption is still a novelty, and most of the impetus behind it is engineers looking to scratch an itch without any legitimate analysis of the benefits or trade-offs involved.

It may be changing, but certainly not faster than I’ve observed (it’s not a matter of speculation, for me).


But the world isn't FAANG. There are thousands of other companies doing interesting things.


I didn't say it was? We're talking about FAANG in this subthread.


> I don't think the people on HN are really in touch with how real people program in the real world.

I mean, I’m a real programmer in the real world. I think most people on here are. Rust is already in heavy use at Microsoft, Amazon, and Google; it’s not some fringe thing anymore.


I have worked at two of the companies you’ve mentioned, and I can confirm Rust is not heavily used in these companies. Maybe a few teams use it, but it’s not really supported, and you won’t find many internal libraries and tooling support for it.


They’re using it to write useful, novel production systems software though. Fuchsia is in Google Nest Hub, AWS Firecracker is used in ECS Fargate and fly.io. I would say those represent major commitments to the language, and they’re both also novel as systems projects too.


How do you write GUI apps in C# with Win/Linux support? Usually people don't pick C# if they're doing GUIs on platforms other than Windows. CLI/server software is a first-class citizen on Linux these days, though.


There are many ways!

https://halfblood.pro/the-story-about-net-cross-platform-ui-...

I like to use XAML. For iOS/Android there are solutions, too.


A lot of companies started with Java or Go, and will consider Rust without ever touching C++. The stigma alone is enough to turn people off.

Hell, Rust is going into the Linux kernel in 6.1.

C++ runs a real risk of surviving only in the embedded/realtime space in the next 10 years.


10 years? That feels too short.

C++ is all the operating systems and the browsers and the games and the JVM.


> The stigma alone is enough to turn people off.

C++ has been hated for decades now. The reason it's used is often because you have to.

> C++ runs a real risk of surviving only in the embedded/realtime space in the next 10 years.

Ten years from now C++ will still be the language underpinning LLVM, web browsers, geometric modeling, machine learning, etc.


Rust means you don't have to anymore. While the old software codebases will keep using it, the new ones won't.


> C++ runs a real risk of surviving only in the embedded/realtime space in the next 10 years.

Embedded/realtime? No way. The people I know in embedded won't touch C++ with your 10 foot pole.

Most of my embedded code is wrangling various "stacks" into cooperation with one another, and C++ helps me not one whit with that. At the other side, when I'm just poking sensors, C is more than enough.

And, as much I would really like embedded communication stacks to be in Rust, all that would happen would be the vendors slapping "unsafe" on everything so they would basically be writing C anyway.


More or less this. Safety as a value proposition loses its luster when "unsafe" is ubiquitous in a code base.


If you had a better syntax and modules that fix C++'s broken (as in slow) compilation model, why on earth would you write anything in Rust?
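
(For context, C++20 modules replace textual #include with compiled module interfaces, so a library is parsed once instead of once per includer. A minimal sketch, modulo the still-uneven 2022 toolchain support discussed above; names here are illustrative:)

    // math.cppm -- compiled once into a binary module interface.
    export module math;
    export int square(int x) { return x * x; }

    // main.cpp -- imports the compiled interface; no textual re-parsing.
    import math;
    int main() { return square(7); }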


This isn't a language declaration, it's a tool declaration. A tool useful for prototyping new language features and compiling them into large existing source bases.


Familiarity breeds contempt?

I'm sure many would dislike Rust if they were forced to use it. Same with Python, etc, etc, etc.

I'm not saying the criticisms are wrong per se, but the reason we have so many languages is probably that there's no 'right' language.


> programming is hard, let's go (language) shopping instead


> C++ is (I'm guessing) currently holding up an order of magnitude more applications than whatever you think is better than it.

You could say the same thing about COBOL.

(Not that I think that C++ is as bad as COBOL.)


> Clearly it has upsides, so it's baffling to me when people greet attempts to reduce the downsides with either "This doesn't make sense, just scrap it for something else" or endless nitpicks about the approach chosen.

People have been working to reduce the downsides of C++ for decades at this point, and the results are incredibly lackluster, and frequently only work in trivial toy examples - if even then. Every now and then I try out the latest and greatest I can find in terms of dangling pointer prevention, and find myself confused as to whether I've even enabled the proper checks, when even my trivial toy examples fail to trigger said checks, only to be horrified when I in fact have enabled said checks and gained nothing.

I, too, welcome any useful results. The problem is: I don't actually find the results useful. I get told garbage like "C++ is safe now!" by people who apparently don't know any better, for things underperforming existing static analysis tools from over a decade ago. It's a distraction, a false promise, and a waste of my time - which could be better spent reviewing C++ code, improving existing static analysis coverage, or going full ham and unironically pushing for incremental rewrites in Rust. Not because it is perfect: I've written and debugged my share of dangling pointer bugs in unsafe Rust code too - but because it is at least better, and provides me with tools to fight the good fight.

And if, from time to time, you find my voice amongst the nitpickers of attempts at C++ "improvements", it's perhaps because the attempt has failed, and I would rather not have others waste their time like I have on false promises. Granted, such warnings are unlikely to work: the allure of something that might ward off yet more late nights chasing yet more heisenbugs in yet another C++ codebase caused by coworkers, third parties, those long gone, or perhaps worse - by me - is just too promising. But it might.

Show me real bugs, in real codebases, caught in bulk where existing tools have failed, and you will have my undivided attention.


I agree, and I don't think it's as hard or complex as people make it out to be


I will probably get some flak for this, but it's an effect of the Python generation. When you can pull in some libraries and build some bloated, slow-as-hell code in minutes, no one cares what's under the hood. The tradeoff from performance (easily 1000s of times; I've seen it swapping out simple Python libs for C++, specifically changing dicts to linked lists) to ease of slapping things together makes me sad daily.


Why would a trade-off make you sad? For most code that gets written, being able to quickly make something work is far more important than a 1000x increase in performance. Some projects do need performance, and for those no one should choose Python. For all the others, I'm glad Python exists.


Because there's lots of projects out there with absolutely horrible optimization. Doing stupid shit inside of loops etc. It turns into the blind leading the blind and churns out bad code.


That's true, but at the same time, modern C++ bears about the same resemblance to the C++ that many (if not most) of us learned years ago, as modern English bears to Middle English.

People have been trying to make the language safer and more palatable for multiple decades now, so it seems reasonable to allow Sutter to have a go at it.


C++'s main upside is that it's the least common denominator. Also its runtime is friendly to embedded systems (kind of).

I welcome good replacements for C++'s syntax, because it's genuinely terrible.


> Clearly it has upsides,

All that you've indicated is that it's used a lot, which is obviously because it has decades on every other option.

Anyway, I love Herb Sutter and I love this work he's doing. Awesome stuff.


C++ is like the English language. Extraordinarily popular, hard to learn, and full of weird cruft. You can write amazing things in it as well as terribly unsafe garbage.


English is not particularly hard to learn.


I’m truly thankful in my life for Sutter doing the work he does to make this language better. He is a giant.


Seems like C++ is having an identity crisis lately


C++ is getting the Perl treatment.

Perl had Ruby, Python, and PHP attacking it. C++ has Rust, Go, Nim, and Swift (though only Rust is a direct challenge; the others are taking market share for things that used to be C++). I've even seen people using Vala in new places.

Perl was hard to use. The new languages addressed many of the ergonomics, semantics, and design concerns. Rust addresses a lot of C++'s dangerous baggage.

Perl had novel tooling at the time, but eventually wound up behind. Newer languages brought more to the table. Rust has Cargo.

Perl tried evolving for a long time, and eventually birthed a new language altogether (Raku), though the old language still receives updates. (Python also saw this.) Google recently unveiled Carbon. Now there's Cppfront. The C++ language committee is doing different things...

It's a question of which ecosystem evolves faster and ultimately which ecosystem attracts more fresh blood as old timers age out. AFAICT, younger developers are flocking to Rust instead of C++. There are a bunch of Rustlang streams on Twitch.


That's an interesting analogy that I hadn't seen before, simplified as Perl : Python :: C++ : Rust in terms of conjectural market adoption.

As someone who did a lot of Perl in web 1.0 times, and who hasn't really been a full-time programmer in a long time but has since occasionally written Ruby, Python, and PHP, I never really felt those languages were that huge an improvement over Perl (certainly their ecosystems now are, but I'd love a much better entrant to successfully "attack" Python). I would sometimes wonder: what if Perl had just managed to evolve significantly instead? That makes me appreciate and even root for efforts to "make C++ 10x simpler, safer, and more toolable" like this one, or perhaps Carbon, even though Rust does seem like a huge improvement.


I wrote a lot of Perl in the first dotcom era and was very happy to switch to Python and Ruby as soon as I had a chance. There were some nice things about Perl but it was just too hard to maintain a large, active perl codebase with multiple engineers making commits. Perl coasted along for a long time on the momentum of CPAN but it wasn't enough.

I think the C++ analogy makes sense. How many programmers entering the field today are choosing to learn C++ when they have so many other choices without all of C++'s baroque complexity and footguns?


Perl lost popularity when people were using Windows, learned programming with Java, and were scared by Unix idioms, implicitness, regexps, contexts, and concepts like anonymous blocks you can map/filter with. Nowadays everyone finally thinks those features are cool (maybe even C23 will get lambdas), so perhaps Perl wouldn't lose popularity today. I'm still waiting for one other interpreted language to have something as convenient as the CPAN ecosystem.

However, I see one similarity between C++ and Perl: for decades they evolved a lot through their ecosystems, and nowadays both C++ and Perl can be difficult to learn just because you can't figure out which module to use to get the job done. Also, Perl suffers the Lisp curse http://winestockwebdesign.com/Essays/Lisp_Curse.html (I don't know about C++).


Actually, I think the PL/I treatment is a closer example.

The language's complexity was famous, and it eventually grew several subsets, before UNIX adoption wiped out most of the systems languages that had sprung up as PL/I alternatives.


Nah... It is just the ISO committee and its members mucking about.

Many of us don't agree with the way the committee has been "evolving" the language and then mandating "the one true way to do Modern C++". At least with this "syntax 2" thing they will leave the language alone.


Rust is eating it for systems, and Go for applications.

IMHO nothing but Rust or Go has any merit as a replacement unless it also brings the same safety. A fully unsafe language in 2022 is dumb. (Fully unsafe, not optional unsafety, which may be needed in a systems language.)


"A fully unsafe language in 2022 is dumb."

I think this is a subtweet at Zig, and while I am not a Zig developer, I will point something out that everyone forgets (in defense of C++ and Zig and the like): Some apps don't need memory safety.

For example, in a single-player game or an application that runs just on your own machine, memory unsafety would just be another type of bug.

If dealing with Rust's ownership rules is annoying and you don't want a GC, then you can certainly use an unsafe language. Not all apps are websites and browsers.

I would also add (though I could be wrong here; I think I heard this once) that things written in WebAssembly can also afford to be unsafe, as the memory is in a sandbox.


> Some apps don't need memory safety.

Even setting security considerations aside, as soon as you have users, using a non-memory-safe language means having users report a very unhelpful "app crashes at startup with 'segmentation fault'", and good luck figuring out what's going on.

> I think this is a subtweet at Zig

Why do you think it's especially aimed at Zig? Zig is only one among others, like Carbon, or Cppfront discussed here.


I would agree that seg faults are not fun to debug, but if you think you can speed up production and get the product done several orders of magnitude faster without ownership rules, AND you can't use a GC, I do think it could be worth the trade-off.

> Why do you think it's especially aimed at Zig?

I honestly don't use Zig so I really don't have a horse in the race, but when I read: "nothing but Rust or Go has any merit as a replacement unless it also brings the same safety. A fully unsafe language in 2022 is dumb"

That seems to be targeted at replacing C++, which both Carbon and CppFront are not doing per se. Zig is the next big "low level replacement language" (that isn't Rust or Go) that HN seems to chat about so I took a guess.

Again, I don't think that's too relevant to my point (although I am curious if OP wants to weigh in on what they meant).


> but if you think you can speed up production and get the product done several orders of magnitude faster without ownership rules

If you genuinely believe that Rust's rules slow developers down by "several orders of magnitude" (which literally means "makes you 100 times slower or more"), then you've probably listened to/read way too much anti-Rust discourse for your own good…

Rust has a learning curve and in the beginning you're obviously going to be slowed down by the learning process, but for 99% of the tasks[1] once you've gone past the initial learning period, the rules aren't slowing you down since you've internalized them, and compiler errors end up being mostly about things that would have caused a memory bug in other low-level languages.

[1]: that is, everything but custom graph data structures, which are really hard to get right without a GC anyway.


I actually have started learning Rust and I do like it. I am hoping to get to a point where the ownership structure becomes obvious in my mind.

If I had to make a game right now though, I would be way faster in C++.

That said though, I do hope as I learn Rust everything becomes obvious. I would be very happy if I get equally productive (since that means I can use Rust for everything).

I guess I just worry that certain things are like custom graph data structures, where suddenly ownership is a huge issue.


> If I had to make a game right now though, I would be way faster in C++.

I've no doubt about that; learning Rust sometimes feels like having to swim with your hands tied behind your back.

> That said though, I do hope as I learn Rust everything becomes obvious. I would be very happy if I get equally productive (since that means I can use Rust for everything).

Good luck :). Don't be afraid to stop and try again a few weeks later if you get too frustrated. I struggled a bit learning Rust in early 2016, and after a few weeks of night hacking I had no more free time to carry on, while still being very confused about all of it. Then I stopped for almost six months, having read a few blog posts on Reddit in the meantime but nothing especially eye-opening, and when I got back to Rust, for some reason everything seemed clear: I completed my first project in Rust very smoothly, as if my brain had been slowly digesting the concepts in a background thread during that time.

> I guess I just worry that certain things are like custom graph data structures, where suddenly ownership is a huge issue.

Advice: avoid them as much as possible (most of my former uses of graph-like structures were in fact related to programming patterns and not strictly necessary for what I wanted to do), and if you're really manipulating graphs, you should probably use a dedicated library (I think petgraph is the most popular, but these kinds of things can be workload dependent, so you might need to go for another one).


> Some apps don't need memory safety.

Even apps that run locally and don’t talk to the net can be vulnerable. How many break-ins happen via bugs in PDF viewers, office packages, and media players? Look it up. PDF and spreadsheet vulnerabilities are very common.

Then there’s the productivity gain you get by not having to spend hours debugging obscure memory and threading bugs. The latter are not ruled out in Go or Rust but they are profoundly less likely to occur, making code that actually takes advantage of modern hardware much easier to write.

After learning Rust and after having used C++ very heavily for years I can say that Rust is far more productive.


> PDF and spreadsheet vulnerabilities are very common.

True. The issue is that they operate on complex input that can arrive from anywhere. So if you download a malicious PDF file it can exploit you.

But GP's other example is valid: A singleplayer game running locally will only run the game assets from the developer. The user input (in almost all games) is very limited. Exploits are not a major issue in such a program.

There is room for languages like Zig. There is also the potential for it to be used in improper places, but so far the majority of applications I've heard of seem reasonable to me.

> Then there’s the productivity gain you get by not having to spend hours debugging obscure memory and threading bugs.

Also true. But there are the other usual tradeoffs, such as that a language focused on simplicity and correctness, like Zig, might reduce other sources of bugs.

Some developers might be more productive with Rust, others with Go, others with Zig. I don't think there's a single answer here. As an example of another related tradeoff, fast compile times often help debugging, which is an advantage of Go and Zig.


> A singleplayer game running locally will only run the game assets from the developer. The user input (in almost all games) is very limited. Exploits are not a major issue in such a program.

Nothing is that isolated anymore. Can you find a modern single-player game on Steam that doesn't talk to the network for analytics or DLC or sending crash reports to the developer? I'm sure they exist, but they're likely exceptionally rare.

For Zig to be helpful in the modern world, it will need memory safety guarantees and it will need them on by default. It's totally OK if those are easier to switch off than in Rust so that you can do the low-level things performantly and easily. But you have to start from a default position of safety.


Even if a game sends crash reports etc., it could be safe enough. Making a secure connection to a known remote server isn't a major source of exploits. (Look for example at the list of exploits against Firefox and Chrome - stuff like that isn't even noticeable.)

I disagree that every language needs to start with the same defaults as Rust. Rust proved its approach is a useful one, and Rust is a huge asset to our industry, but we also benefit from exploring other approaches.


> Some apps don't need memory safety.

Maybe not, but it's still a big pain in the butt to avoid bugs in that area. I imagine that static analysis for memory safety would be something that most developers would use in existing unsafe languages if they could.


Lint was created for C in 1979… to this day most devs ignore static analysis tooling for C and C++.


Static analysis can be excellent, but it doesn't solve the whole problem.


For a lot of applications, that doesn't matter. It solves enough of the problem.


Indeed. Still, using it is better than nothing.


A single-player game still accesses the network. Consoles that trust their game code regularly get jailbroken through, e.g., overflows in a game's save-file format.

Plus it's only single player until they decide to add multiplayer…


Only because liability is still not a thing in software, the way it is in other regulated domains.

> Many years later we asked our customers whether they wished us to provide an option to switch off these checks in the interests of efficiency on production runs. Unanimously, they urged us not to--they already knew how frequently subscript errors occur on production runs where failure to detect them could be disastrous. I note with fear and horror that even in 1980, language designers and users have not learned this lesson. In any respectable branch of engineering, failure to observe such elementary precautions would have long been against the law.

-- C.A.R. Hoare in his Turing Award speech in 1981.


Against the law because if you do this on a bridge, people die.

If I play Civ V and it crashes, I'm just a bit mad.

I will acknowledge, though, that good engineering practices are important; maybe my perspective of "tolerating" these bugs when they are isolated is missing the point.


If I play Civ V and it crashes, I want my money back, just as I won't stand for a broken appliance.

Thankfully, app stores are already making this point clear to the industry.


Doesn’t this counter your first point? A broken appliance isn’t against the law.

Second, I think data shows most people don’t return games after 1 crash. In fact, I think getting to market fast is a bigger factor for profit.


Not refunding a customer after selling them a broken product is against consumer law in plenty of jurisdictions.

People still don't do it because they have been taught that software is special: they have to put up with broken products; that is just how things are with computers.


Similar quote in a more recent video:

    Removing type checking from your running programs is like wearing a life jacket for your practice emergency drills and then taking it off as soon as your ship was really sinking

    https://www.youtube.com/watch?v=YYkOWzrO3xg
Maybe that was right in the 70-80s, but I feel like he's not giving enough credit to modern static verification / type-checking.


> memory unsafety would just be another type of bug

A bug that can be costly to reproduce and fix, and that has the effect of terrible UX and losing unsaved progress. If I had to choose a category of bugs to tolerate, memory-safety bugs would be near the bottom of the list.


It seems like in terms of subjective productivity, memory-unsafe languages have a lot going for them. Otherwise how do you explain so many programmers still willingly choosing them?

I know that in the domains that I'm interested in, I would be miserable programming in something like e.g. Java, because I've figured out certain patterns that I know will work in a memory-unsafe language with very little friction, but that cannot simply be translated to Java because of e.g. reference semantics.

The same probably holds for other approaches to safety, like Rust. I'm not interested anymore in tight, opinionated approaches to problem solving. I _love_ void pointers and memcpy'ing stuff because these let me do abstraction with very low friction: getting the job done very quickly, with a very low rate of bugs, and those that do occur get fixed quickly. (I'm not making a statement about multi-million-line projects in corporate environments. This is about the situations that I run into, personally.)


> It seems like in terms of subjective productivity, memory-unsafe languages have a lot going for them. Otherwise how do you explain so many programmers still willingly choosing them?

Honestly? And this might be a hot take - exactly the same way as I'd explain people rejecting structured programming way back in the day. It was the new thing on the block and people were not used to it, and preferred to just sprinkle GOTOs in their program just like they always used to do rather than learning a new paradigm.

As someone who used to professionally program in C++ on an expert level in the past and now has completely switched to Rust I can tell you that (at least subjectively for me) Rust is way more productive than C++ ever was. Once you internalize how the language works the limitations of Rust just disappear. You learn how to instinctively write code that's idiomatic, and it just stops slowing you down. But it also gives you a ton of new features that end up speeding you up compared to C++, e.g. sum types, pattern matching, cargo, great standard library, destructive moves, proper modules, #[derive], good error messages, etc. Memory safety can also be significant productivity boost because you can (or at least I do) just churn out code without thinking about it too much and just rely on the compiler to make sure that it's memory safe. Can this code trigger use-after-free? Is this iterator valid? What's the lifetime of this pointer? Instead of thinking about all of these you can use those brain cells for something else.


> how do you explain so many programmers still willingly choosing them?

Lack of education + Legacy software / programmers / education material.


It’s also a dumb macho thing. “Real programmers…” Yeah and I bet real home builders don’t use nail guns, prefab sections, or tape measures. They just eyeball it all and use wooden pegs and hammers.




I see what he's going for: with the function signature separated from the identifier, it might improve readability. My initial reaction is I don't love it, but it's not that bad.
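
For anyone who hasn't clicked through, the hello-world in the README has roughly this shape (quoting from memory, so details may be off; the general form is "name: (params) -> type = { body }"):

    main: () -> int = {
        hello("world");
    }

    hello: (msg: std::string_view) = {
        std::cout << "Hello " << msg << "\n";
    }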


how is auto main() -> int { hello ("world\n"); } any better?


Looks like they crossed Rust and JavaScript and came up with something worse, lol.


The semicolon cancer is still there. Good.


Semicolons help the parser: it's easier to point to where an erroneous statement began.




This is work that is 5+ years old. Before casually dismissing work that is long in the making, it's worth taking a moment to evaluate it. There are some amazing talks in the README that Herb has given. I particularly enjoyed his "Thoughts on Metaclasses" talk (https://www.youtube.com/watch?v=6nsyX37nsRs) which starts with a live on-stage user study.

Herb's thoughtfulness and care for tasteful evolution is manifest in the way he describes his work, and it's a shame to see it disparaged so easily.


> ... This is work that is 5+ years old.

And yet it was released today, in a way that is clearly a reaction to Carbon (which is, itself, a reaction to Rust and perhaps Swift to a lesser extent). The comparison is highly relevant.


I don't know when he made the repo public, but the first commit was 2021. If this were published, say, the week or so after Carbon were announced, maybe you could argue it was a reaction to it (though still, it started before Carbon was announced, in what direction does time flow again?). But a more likely reason for it coming out now is that Sutter gave a talk at CppCon last week on this material.


It appears that the CppFront work has been in progress since 2015-2016, which would make it predate Carbon by a few years.




It's Herb Sutter, you don't have to worry about it being enforced except on critics. (https://twitter.com/pati_gallardo/status/1561093468121956352)


For anyone curious, that thread is describing someone very mad that a convicted sex offender who had served his time was allowed to participate in the ISO C++ committee. I have no idea how this is considered a bad thing, when the opposite would be obvious discrimination.


Thanks for pointing out that terrible wart. Codes of Conduct are typically warning signs in themselves, but many of them - including this one - are markedly repressive and authoritarian: Criticism is forbidden except to the extent and in the forms which illustrious project leadership decrees (and is of course determined ex-post-facto), and it is the duty of leadership to excommunicate/cancel people who act in ways they don't like, for the greater good of the community. Naturally, I've phrased it in unforgiving terms, but it's all in the text.


"Code of Conduct" is an unnecessarily formal way of saying "forum rules", but no-one would say a forum is better if you don't write a rules page.

(But the point of forum rules is to constrain the mods, not the users.)


> But the point of forum rules is to constrain the mods, not the users.

Indeed. And these "forum rules" do the opposite: They hallow inappropriate behavior by "moderators": Arbitrary and extreme sanctions, lack of transparency, lack of minimal due process, lack of consistency etc.


... did you even read the code of conduct?


Oh, I did. And with a legal background including arguing multiple cases in different courts in my country. That's why it's scary.


> We've already been improving C++'s safety and ergonomics with every ISO C++ release, but they have been "10%" improvements. We haven't been able to do a "10x" improvement primarily because we have to keep 100% syntax backward compatibility.

You really don't. The vast majority of projects will never change the language version they use. Simple projects are simple to fix. The remaining large projects have enough resources and expertise to make changes across their entire code base: Chromium recently changed more than 15,000 instances (in over 8,000 files) of raw pointers to raw_ptr in a single pull request [0]. How? They wrote a Clang-based tool which did it automatically [1].

[0] https://chromium-review.googlesource.com/c/chromium/src/+/33...

[1] https://source.chromium.org/chromium/chromium/src/+/main:too...
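
For reference, the mechanical rewrite was of this shape (class and member names here are made up; raw_ptr is Chromium's base::raw_ptr<T>):

    #include "base/memory/raw_ptr.h"

    class Browser;

    // Before: an unowned raw pointer field.
    class TabStripOld {
        Browser* browser_ = nullptr;
    };

    // After: same call-site semantics, but the wrapper cooperates with the
    // allocator to quarantine freed memory and mitigate use-after-free.
    class TabStripNew {
        raw_ptr<Browser> browser_ = nullptr;
    };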


That doesn't match my professional experience: many code bases iteratively move forward through language versions. They do vary in how intensely they drive this, which can lead to vast differences across different areas of the code, but that is not necessarily bad; it would be a real problem with hard breaks, though.


There is no known example of a language successfully making (significant) backward-incompatible changes. The few languages that tried usually got caught in a decade-plus-long mire (Python 3).

Now, many languages, including C++, have gotten away with smaller backwards-incompatible changes (e.g. C++ 17 removed dynamic exception specifications from the language entirely - getting away with it because it was an almost entirely unused non-feature).

Edit: removed some reference to PHP6 where I had a wrong recollection.
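
(For reference, the removed feature was the dynamic exception specification:)

    #include <new>

    void f() throw(std::bad_alloc);  // deprecated in C++11, removed in C++17

    void g() throw();                // empty form survived C++17 as a synonym
                                     // for noexcept, then was removed in C++20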


Projects can't keep using old compilers forever, though, because of security patches. At some point the old versions are too old to keep security up to date.

A more common example: what if a library that you use stops supporting the old version, and the latest compatible release of the library has a security vulnerability?


> Projects can't keep using old compilers forever, though, because of security patches. At some point the old versions are too old to keep security up to date.

I'm talking about using the same language version (standard), not compiler version - e.g. the latest Clang still supports the C++98 ISO version.

> A more common example: what if a library that you use stops supporting the old version, and the latest compatible release of the library has a security vulnerability?

You backport the fix or change the library. Nobody said it's easy, but your project should not hold the entire language hostage.


There was a years-long conniption over the Python 2 to 3 migration. I also believe that it's ok for a language to break backwards compatibility once every 40 years, but I completely understand why so many language designers are afraid to.


This is a minor thing, but why is the file extension .cpp2 and not .cpf or .cp2 or similar? Why make it longer/uglier?


Modern C++ is fine, as long as you don't try to get too clever with return types, move semantics, constexpr, and template parameter expansions.

They all have their places, but if your code is littered with them, it feels like every line is a puzzle.

Here's an attempt to wrap sqlite in a typesafe manner using template expansion. All very simple C++11. https://github.com/nurettin/pwned/blob/master/sqlsafe/sqlsaf...

(some types had to be spelled out because the compilers weren't ready)

Here's some more C++11 without using any of the mentioned complexities https://github.com/nurettin/pwned/blob/master/server/server....

This shows that simply spelling out your types and not getting crazy with polymorphic metaprogramming makes code much more readable. Ideally, code should just be a bunch of ifs and loops.
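
As a tiny, compilable illustration of that last point (the types here are made up, not from the linked repo):

    #include <iostream>
    #include <string>
    #include <vector>

    // The "clever" version would deduce std::tuple<int, std::string, double>
    // through a variadic template wrapper. The plain version just names the
    // type and iterates: a bunch of ifs and loops.
    struct Row { int id; std::string name; double score; };

    int main() {
        std::vector<Row> rows = { {1, "alice", 0.9}, {2, "bob", 0.1} };
        for (const Row& row : rows) {
            if (row.score > 0.5) std::cout << row.name << "\n";
        }
    }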



