> I know it's meme tier and horrible when a guy who watched some videos tells the tradesmen how to do their jobs, but unfortunately I often end up being right and prove them wrong.
Normalization of deviance is becoming commonplace. There's a lot to unpack here, but there is a severe lack of professionalism, responsibility, genuine knowledge, and experience. Now it's just a bunch of greedy man-children who live by "fake it till you make it" and use every dirty trick to intimidate and mislead the client. Once they secure the job they barely show up to the job site because they've run off to play with their golf clubs, sports cars, and yachts or whatever, while a crew of cheap unskilled labor shits all over the job site. And if you call out these morons and challenge them, they get angry and defensive, because they're the experts and you're a dumb client. Infuriating hubris.
Do you have sources you could point to? I once read that it's a great language mired in a designed-by-committee ecosystem. I really liked the language when I tinkered with it a bunch around 2010, but moved on after work pushed me to other languages like C# for GUI stuff. It was the first language I used with simple built-in concurrency.
For my safety-critical automation software for a machine that will operate around people and overhead, I’m choosing Ada/SPARK2014. Its decades-long track record in high-integrity domains like aerospace, defense, and medical systems ensures reliability for applications where human safety is paramount. SPARK2014’s formal verification tools mathematically prove the absence of runtime errors, aligning with standards like DO-178C and ISO 26262, critical for my Q3 2026 market deadline. While Rust is gaining traction for memory safety, its formal verification tools, like LEAN/Aeneas, are still maturing and lack the production-ready ecosystem of Ada/SPARK2014. Ada’s clear, structured syntax simplifies code reviews, and its tooling generates certification reports familiar to regulators, streamlining approval processes. For my project’s safety and business needs, Ada/SPARK2014 is the proven choice - for now. I am not a fan of Rust syntax or complexity, but that is somewhat subjective. I last dove in about 2 years ago.
That's for automotive. We're shooting for a bunch of various relevant standards. There is no formal language spec. Who did that compiler? AdaCore's been around for a long time, so I'd be quick to use theirs if I were to choose Rust. I'm also following the Ironclad kernel project with its complementary OS called Gloire. Ada/SPARK for both, with partial formal verification progress.
Automotive is the first industry that’s really taking up Ferrocene! They’re adding more stuff as more industries have demand for it.
Ferrous and AdaCore were originally collaborating, but then they parted ways. In my understanding they both largely track the upstream codebase; I know that the Ferrous folks religiously upstream almost everything, but I have no clue whether AdaCore does as well.
This is effectively a fork of the Rust Reference, made by Ferrous, and laid out in a way that allowed the compiler to be qualified. It now lives at this URL, because it's being adopted by upstream as the spec.
Sounds like progress is being made on the language spec front, that's good to see.
I'm not following what you meant by this though, it seems like a contradiction:
> A language specification is not required to be qualified. The behavior of the compiler needs to be described.
But they're putting work into reviving the language spec, to enable certification? Also, if the source language hasn't been described, then surely the compiler's behaviour hasn't been described.
Or did you mean that their documentation is for the Ferrous flavour of Rust and might not reflect the latest version of the Rust language?
It has already been qualified. Upstream has always wanted a spec. It’s being worked on because it’s desirable, not because it’s blocking safety critical cases.
You’re always going to need to have more than a language spec because you qualify compilers not languages.
> Also, if the source language hasn't been described, then surely the compiler's behaviour hasn't been described.
It has. At least to the degree that regulators find it acceptable.
> Or did you mean that their documentation is for the Ferrous flavour of Rust and might not reflect the latest version of the Rust language?
There is no difference in flavors, but it is true that each version of the spec is for a specific version of the compiler, and so sometimes that will lag a bit. But that’s just how this process works.
Can you point to a production language today that doesn't have a committee leading its development?
C, C++, Rust, JavaScript, Python, etc. All have committees leading their development. The only difference with Ada was that, for a long time, that committee was in the DoD (which has plenty of fine engineers, given its practical achievements) instead of ISO/ANSI. And instead of being focused on general purpose, they had a clear domain they prioritized. That's different now, but it's hard to erase a few decades of heritage.
Specifically, I think these three paragraphs near the end are critical:
> I'm reading a great book now called Why People Believe Weird Things, by
Michael Shermer, in which the author explains what rational thinking is,
and how skepticism is a process. Basically, people believe something
because they want to, not because of any scientific arguments you make.
> There are guys out there who dislike Ada, but they do so because they
want to, not because of any rational analysis of its merits or flaws.
Sometimes even their arguments are factually incorrect, like saying that
"Ada was designed by committee," ignoring the fact that Jean vetoed
language design arguments that were 12-to-1 against him. It's not
unlike creationists who explain the "fact" that evolution violates the
2nd law of thermodynamics. (No, it does not, as any book on freshman
physics will tell you.)
> I've explained the reasons why I think Ada is not as popular as C++, and
I'd like to hope that it will convince Ada's detractors that Ada isn't
so bad after all. But as Robert Dewar pointed out, a person who has
made an irrational decision probably isn't going to be swayed by
rational arguments!
That is, people aren't really rational. A choice was made to dislike it, it entered into the culture and to this day people dislike it because they think they should dislike it. They don't even spend 5 minutes studying it to see that half of what they've heard (if not more) is flat out wrong. In several Ada discussions on HN people claim its syntax is like COBOL's, for instance. Not just similar in being keyword heavy, but practically the same. Sometimes they even provide Ada "examples" that won't even compile. That's the kind of nonsense that happens when people turn off their brains or refuse to turn on their brains. You see it in many Lisp discussions as well.
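For anyone who hasn't actually looked at Ada, here's a minimal example of my own (hypothetical, not taken from any of the discussions being criticized) that compiles under GNAT. The syntax is plainly in the Pascal/Algol family of keyword-heavy block structure, which is a very different thing from COBOL's division-and-sentence syntax:

```ada
--  A small, self-contained Ada procedure: Pascal-style declarations,
--  begin/end blocks, and attribute syntax like Integer'Image.
with Ada.Text_IO; use Ada.Text_IO;

procedure Greet is
   Count : constant Integer := 3;
begin
   for I in 1 .. Count loop
      Put_Line ("Hello #" & Integer'Image (I));
   end loop;
end Greet;
```

Keyword-heavy, yes, but structurally it reads like Pascal, not COBOL.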
There may be lots of uninformed post-hoc rationalizations now, but it couldn't have started with everyone collectively deciding to irrationally dislike Ada, and not even try it. I suspect it's not even the ignorant slander that is the cause of Ada's unpopularity.
Other languages survive being called designed-by-committee or having ugly syntax. People talk shit about C++ all the time. PHP is still alive despite getting so much hate. However, there are rational reasons why these languages are used; they're just more complicated than the beauty of the language itself, and come down to complex market forces, ecosystems, unique capabilities, etc.
I'm not qualified to answer why Ada isn't more popular, but an explanation implying there was nothing wrong with it, only everyone out of the blue decided to irrationally dislike it, seems shallow to me.
"Am I out of touch? No it's the children who are wrong."
This argument eats itself. It's just an accusation that people who disagree with you are irrational and that their arguments are in bad faith. It's not a valid argument because it doesn't even depend on any context or facts of the actual discussion in which he's using it. It's the definition of cope.
In the end, even if we can't be sure why Ada failed, it failed spectacularly. It had massive institutional backing and never made it past obscurity. I don't know exactly why people dislike it so much. Maybe because everyone already knew C, C was well supported, every single OS was written in C, and so on; trying to bring in some incompatible Algol-like language (always a popular lineage, hahaha) with very sparse to nonexistent tooling, largely theoretical advantages, and a huge performance disadvantage on the highly constrained computers of the time was not likely to succeed on its face.
The only exception I can think of is early versions of Mac OS, which was still primarily assembly. Even then, I recall people went out of their way to use C despite needing the Pascal calling convention for system calls. They basically immediately regretted using Pascal, started a march towards C, and had essentially given up on Pascal before the PowerPC.
So Pascal had one mainstream OS for about 10 years, most of which time it was being phased out.
> which os was written in Pascal? most were written in assembly in that era, you are talking about some obscure research or toy.
Your first claim: Every OS was written in C. Your new claim: Most were written in assembly.
Pick a position. If most were written in assembly then it would not have had any impact on the adoption of Ada so why make the original claim?
I would respond to your question but you substantially edited your comment and removed the question. I also notice you removed the claim in your edit about most OSes being written in assembly in the 80s. Obnoxious way to communicate with people, altering your comments while they're replying so their replies look like random comments.
> They basically immediately regretted using pascal and started a march towards C and basically gave up on pascal before the powerpc.
Nonsense. MPW Pascal and Think Pascal were well supported developer tools, and a lot of third-party developer code was written in them during the 80's. Photoshop (1987) was originally written in Pascal! Apple's Pascal dialect had object extensions that made OOP simpler than with C or standard Pascal.
Pascal started to leave the building circa 1991, when C++ became viable for OOP. Even then, Metrowerks CodeWarrior supported native Pascal compilation for PowerPC in 1993/4.
> It is in this context that Assistant Secretary of Defense (Command, Control, Communications, and Intelligence) Emmett Paige, Jr., requested that the National Research Council's Computer Science and Telecommunications Board (CSTB) review DOD's current programming language policy. Convened by CSTB, the Committee on the Past and Present Contexts for the Use of Ada in the Department of Defense was asked to:
> * Review DOD's original (mid-1970s) goals and strategy for the Ada program;
> * Compare and contrast the past and present environments for DOD software development; and
> * Consider alternatives and propose a refined set of goals, objectives, and approaches better suited to meeting DOD's software needs in the face of ongoing technological change.
> Paige says he believes industry engineers will be more likely to accept the benefits of using Ada if DOD leaders recommend, not require, the language. Software engineers, who would rather choose a language based on its merits rather than because of a governmental mandate, historically have resisted the Ada mandate on principle.
and
> Chief complaints about Ada since it first became a military-wide standard in 1983 centered on the perception among industry software engineers that DOD officials were "shoving Ada down our throats."
This is basically the story: The DoD tried to mandate it, people resisted, and made liberal use of the ability to be granted an exception, and so they eventually gave up.
The first link contains much more nuance, some excerpts:
> In decisions affecting adoption of programming languages, non-technical factors often dominate specific technical features. These factors include the broad availability of inexpensive compilers and related tools for a wide variety of computing environments, as well as the availability of texts and related training materials. In addition, grass-roots advocacy by an enthusiastic group of early users, especially in educational and research institutions, often has broad influence on adoption of programming languages. These advantages were largely absent when Ada was introduced in 1980. In contrast, C++ and Java both have achieved widespread acceptance and use. The strong military orientation of the publicity generated for Ada also may have served to alienate significant portions of the academic and research communities.
> Ada has not been widely taught in colleges and universities, particularly compared with Pascal, C, and C++; until recently, even the military academies taught programming in languages other than Ada
> Historically, compilers and other language-specific tools for Ada have been significantly more costly and slower in coming to market than those for C and C++.
> Software engineers are likely to be interested in enhancing skills that they expect to be most valuable in the software engineering marketplace, which is now dominated by commercial opportunities. Thus, programmers have moved quickly to learn Java and Hypertext Markup Language (HTML; used to develop pages for the World Wide Web) because they see these as the next wave, which can carry them to new career opportunities. Similarly, software engineers might avoid using Ada if they see it as limiting their careers.
That's what grandparents are for. Growing up my immediate family lived in the same neighborhood. My mother's parents lived two blocks away and walked over. My fathers parents lived ~15 minutes away. Everyone worked locally. Baby sitters were always named grandma :-)
Now you have to move across the country for a lucrative tech job, leaving behind your support network. You either plan for these things or deal with the consequences. Though I have a feeling many young tech-oriented people starting their careers don't have family on their minds...
And lastly, it depends on where you live. An ex-military friend moved to a shitty town in PA to be near his mother and sister and bought a house using the GI Bill. He has a federal job, five kids, and a stay-at-home wife. Pretty wild to have a family of seven these days, but he is happy and doing well. Family support helps big time.
I have been in tech for 7 years and it would be a stretch to afford the house I grew up in. Plus the commute to the city from my parents has increased from 45 minutes to 2 hours over the last 30 years. My high school recently closed down because families can't afford to live in the neighborhood.
The house my parents bought in the early 90s (after local inflation) would cost between 1/3 and 1/4 of what such a house goes for right now. Big surprise I didn't buy one, but I suppose with 2 incomes we would have bought one for that price.
It's a lot to ask grandparents to take care of an infant full time during the work week. Here and there, on occasion, that is a completely reasonable thing to ask for. It helps strengthen family bonds. But I would never ask my parents or my in-laws to care for my toddler 8-5 M-F. They already raised kids.
> But I would never ask my parents or my in-laws to care for my toddler 8-5 M-F. They already raised kids.
This is disheartening to hear. You should not have to ask, like ever. My mother would KILL to have grandchildren and would absolutely love to repeat what she calls the best time of her life. She nags me for not having kids and I feel bad because she sees other grandparents and is saddened she is missing out.
I recently overheard a conversation between two older women who were both new grandparents, and the conversation was about the pure joy of getting to raise kids again - BUT - you get to go home at night. They loved it!
If you have the right parents then you should never have these reservations. Otherwise it sounds like you have parents who had kids "because that's what you're supposed to do." So they never enjoyed it and don't want to repeat it. My condolences.
Which worked great when people had long retirements and were procreating early. Grandparents are working longer, are older when their children have children, and are generally enjoying retirement more instead of grandparenting.
> and generally enjoying retirement more instead of grandparenting.
These are just people who never liked kids. My friends' parents were still working and went out of their way to watch the kids when available. Believe me, there are people who LOVE kids.
I have an AMD APU Linux PC hooked to my TV with a Logitech K400. It's a bit more fiddly than a throwaway Android-based TV-stick thing, but you have complete freedom and control.
I watch everything on my web browser with a fancy OLED monitor. The problem is that many services won't give you even HD, let alone UHD. I'm stuck at 480p for renting movies on YouTube.
Android TV sticks scare me, but the Apple TV seems... okay.
Ryzen 5 4600G on an ITX board with 16GB RAM, 512GB NVMe and hard-wired gigabit. It's hooked to a ~40 inch Sony 1080p dumb TV via HDMI.
I don't use fancy GUI media centers or anything, just a standard Debian XFCE desktop scaled up. Netflix and Hulu work just fine in Chrome and Firefox. No idea about 2K+ performance due to the 1080p limit. TV for me is mostly background noise, so media quality is of no concern to me.
My only gripe is that once in a rare while the audio goes to shit and continually crackles, but a reboot fixes it.
> That or you more or less have to dive in for months and months.
Waaaaay back in the Quake 3 days I had a pirated copy of 3D Studio Max and decided to try and make a Quake (or maybe it was Half-Life) character model. I found an online tutorial that showed, step by step, how to set up Max with a background image that you "traced" over. So I grabbed images of the front, back, and side views of a human, loaded them into the various viewports, then drew a rectangle from which you extrude a head, arms, and legs. Then you manipulate the block-headed human mesh to fit the human background image, extruding more detail from the mesh as you go along. In one day I had a VERY crude model of a person. I also found out I don't like 3D modelling. Though I'd say a person who really enjoys it would pick it up in a week or two with good tutorials.
LLMs just cut out the learning part, leaving you helpless to create anything beyond prompts. I am still very mixed on LLMs, but due to greed and profit-driven momentum, they will eat the world.
One problem I find is that a lot of educational content has moved to YouTube and videos (monetization be damned). I have no time to watch 10 minutes of rambling and ads for a quick tip; LLMs are great at distilling the info. Otherwise, I agree, deep knowledge building only happens through doing stuff…
> Having children younger seems like a solution to a lot of this,
Indeed. I have a friend whose younger brother fell madly in love with a girl his family did not approve of. He left home at 19 to live with her, then returned about a year later married, with his first child, at age 20. Shortly after he had his second child he finished university, then helped his wife finish university and nursing school. They're 37 now: 3 kids, both with careers, a house, and they still go out with friends and have a solid social life. I just saw them this past weekend; his son is a young man looking at universities, his daughter is excelling in school, and there's a toddler (happy mistake).
BUT! He had a lot of help from family which is key.
I tried it too after it was recommended, but it's a detective game that requires you to take notes or cram a bunch of storyline into your head. If you want to "just play a game," this isn't it, because it will get real boring real fast.
That is why I'm self employed.