I think it's fine to go back to C and maybe play around a bit to learn about some of the things that can be done, but I would implore you to bear in mind that the decades have taught us that the "ultimate danger" in question is basically that you're building sand castles in a minefield. We're not talking "oh ha ha, I guess I wrote a few more bugs that a stronger type system would have caught"; we're talking "oh ha ha, I guess remote unauthenticated attackers can run arbitrary machine code as root via my network code because I tripped over one of C's nominally well-known mines that I did not personally know about, and all the attackers had to do was slightly tweak one of the exploit scripts already in Metasploit to install rootkits on my system and add it to their botnet".
The world has gotten a lot more dangerous than people realize. People generally assume, quite correctly, that hackers aren't going to spend person-months attacking their system personally, but they don't realize that attacker tooling is now sophisticated enough that nobody has to. Shoving code into a small buffer overflow to pivot to a larger one, load exploit code over the network, run a pre-packaged root privilege escalation, install a rootkit, and add the machine to a botnet is no longer months of work for a team of hackers. It's all off-the-shelf tech now, more like writing a 20-line function; you don't need to attract much attention to attract that level of sophistication.
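To make that concrete, one of those "nominally well-known mines" looks roughly like this (an illustrative sketch with made-up names, not any specific real-world bug):

    /* handler.c: the classic mine, an unchecked copy of network input
     * into a fixed-size stack buffer. */
    #include <string.h>
    #include <sys/types.h>
    #include <sys/socket.h>

    void handle_packet(int sock) {
        char name[64];                      /* fixed-size stack buffer */
        char packet[1024];

        ssize_t n = recv(sock, packet, sizeof(packet) - 1, 0);
        if (n <= 0)
            return;
        packet[n] = '\0';

        /* BUG: strcpy copies until the first NUL byte, however far away
         * that is; anything past 64 bytes smashes this stack frame. */
        strcpy(name, packet);

        /* The fix is one line the language never forces you to write:
         * snprintf(name, sizeof(name), "%s", packet); */
    }

The compiler accepts this without a murmur; whether it's exploitable depends on stack layout, mitigations, and luck, not on anything the language checks for you.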
We are leaving C behind collectively for very good reasons. If you are playing with C and you do not intimately understand those reasons, you're going to relearn them the hard way.
Why do people have this idea that it's the language's job to protect you? C is a small piece of a much larger puzzle. That puzzle includes things like the memory page protection in your CPU's MMU. It includes things like SECCOMP BPF. It also includes things like ASAN, UBSAN, TSAN, etc. If you work in defense it might even include ASICs. The list goes on. Whatever language you believe in probably depends on C. You live in a C world. The value prop of the programming languages other than C/C++ for systems programming is that they build cohesive social communities and C is in the background serving them all. The whole "we're going to save you from the bogeyman" is just kool aid. No one will be saved.
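For what it's worth, here's a minimal sketch of the sanitizer part of that puzzle (assuming GCC or Clang; the file name is made up). Built with "gcc -g -fsanitize=address overflow.c", AddressSanitizer aborts at the out-of-bounds write with a heap-buffer-overflow report; built without the flag, the same program may appear to run cleanly while corrupting adjacent memory:

    /* overflow.c: one write past the end of a heap allocation. */
    #include <stdlib.h>

    int main(void) {
        int *a = malloc(8 * sizeof *a);
        a[8] = 42;        /* off-by-one: valid indices are 0..7 */
        free(a);
        return 0;
    }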
> Why do people have this idea that it's the language's job to protect you?
Because we have the benefit of hindsight. We tried putting the responsibility in programmers' hands. It didn't work. We're now learning from that experience and building better tools that can do the same job, more safely. It's foolish not to use them.
> Whatever language you believe in probably depends on C. You live in a C world. The value prop of the programming languages other than C/C++ for systems programming is that they build cohesive social communities and C is in the background serving them all. The whole "we're going to save you from the bogeyman" is just kool aid.
No one is expecting the new languages to make computing suddenly error-free from top to bottom. It's about reducing the size of the surface where errors can be introduced. You're attacking a strawman here.
C is definitely on its way down, and that is a good thing.
You may think you can handle C. I disagree. The evidence is on my side. We need safer languages.
Heck, that even overstates it. It's not like we need super-safe languages per se. Maybe the 2070s will disagree with me. But we need languages that aren't grotesquely unsafe. It's not just that C isn't as safe as a language could be; it is that it is recklessly unsafe.
Interrupting that descent is foolishness, and pinning your career to a C resurrection is even more foolish. These technologies always go through this meta-meta-contrarian phase just before they expire. I got into the computer field just as the meta-meta-contrarian "everything must be written in assembler" phase was at its peak. It was a false resurrection, and I pity anyone who overinvested in learning how to write large applications in 100% assembler as a result of reading someone's over-enthusiastic screed about the virtues of pure assembler in 1998.
So I write this in the hopes of helping some other young HN reader not be fooled. C may be a thing you learn at some point, some day, just as assembler is still something you may learn. But don't get overexcited about the last meta-meta-contrarian false resurrection before death. Which is still years away, but at this point I think it's pretty much set on an irrevocable course.
I wrote tens of thousands of lines of C code in the 90s and early 00s (without any buffer overflows that I learned of; I even wrote an evil network layer for inducing buffer overflows in my own and my dependencies' code), then spent years in other languages, and then had occasion to write some more C involving string allocation and manipulation (for an LD_PRELOAD shim to monitor what various programs I lacked source to were doing), and it was absolutely nerve-wracking. The Linux kernel may stay mostly C for a long time, but it would be crazy to start a new thing in C. There are growing projects to rewrite C code in Rust. They would be farther along except the Rust people seem to dislike laboriously recreating decades of GNU long-opt functionality in all these base packages to actually make the Rust versions drop-in replacements for the C ones.
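For anyone curious, that kind of LD_PRELOAD shim looks roughly like this (a sketch, not the actual code from that project; real tools also interpose open64, openat, fopen, and friends):

    /* spy.c: log every open() a dynamically linked program makes.
     * Build:  gcc -shared -fPIC -o spy.so spy.c -ldl
     * Use:    LD_PRELOAD=./spy.so some-program
     */
    #define _GNU_SOURCE
    #include <dlfcn.h>
    #include <fcntl.h>
    #include <stdarg.h>
    #include <stdio.h>
    #include <sys/types.h>

    int open(const char *path, int flags, ...) {
        /* Look up the real libc open() the first time through. */
        static int (*real_open)(const char *, int, ...);
        if (!real_open)
            real_open = (int (*)(const char *, int, ...))dlsym(RTLD_NEXT, "open");

        fprintf(stderr, "open(\"%s\")\n", path);

        /* open() only takes a mode argument when O_CREAT is in flags. */
        if (flags & O_CREAT) {
            va_list ap;
            va_start(ap, flags);
            mode_t mode = va_arg(ap, mode_t);
            va_end(ap);
            return real_open(path, flags, mode);
        }
        return real_open(path, flags);
    }

Every string and varargs detail in there is a place to get it subtly wrong, which is exactly the nerve-wracking part.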
Maybe it's different for embedded, which I haven't done, but for general-purpose work I can't imagine it being worth the risks.