If you think the facts are "good" or "bad" then take it up w/ the people who can do something about it to make them "better". Typical discussions about stuff like this become nonsensical & incoherent b/c whether you think the facts are "good" or "bad" makes no difference to the material reality, and again, those facts are as I have stated them.
If you believe "costly autocomplete" is all you get, you absolutely shouldn't bother.
You're opting for "sorry boss, it's going to take me 10 times as long, but it's going to be loving craftsmanship, not industrial production" instead. You want different tools, for a different job.
You can only fix that with leverage. The sudo maintainer doesn't have it. sudo is valuable, but if Todd stepped away, you could (and would) find other maintainers because it's so important.
If you want to fix it, you need organizational heft comparable to the companies using it, and the ability & willingness to make freeriding a more painful experience.
Say, I clone sudo. Clearly, a human applying freedom zero. I use it in my projects. Probably still freedom zero. I use it in my CI pipeline for the stuff that makes me money... corporation or human? If it's corporation, what if I sponsor a not-for-profit that provides that piece of CI infra?
The problem is that "corporation or not" has more shades than you can reasonably account for. And, worse, the cost of accounting for it is more than any volunteer wants to shoulder.
Even if this were a hard and legally enforceable rule, what individual maintainer wants to sue a company with a legal department?
What could work is a large collective that licenses free software with the explicit goal of extracting money from corporate users and distributing it to authors. Maybe.
The challenge is that this doesn't really work for community-developed software.
Let's say somebody uses this scheme for software they wrote. Would anybody else ever contribute significantly if the original author would benefit financially but they wouldn't?
Mediating the financial benefits through a non-profit might help, but (1) there's still a trust problem: who controls the non-profit? and (2) that's a lot of overhead to set up when starting out for a piece of software that may or may not become relevant.
And the shades in between account for the large number of new licensing schemes sprouting up, each with different restrictions on what is and isn't allowed. (Not to mention the large number of "just used it anyway" instances.) And the approach struggles for smaller utilities, or for packages that bundle many different things.
It's "worked out" in the sense that it still doesn't really work for a lot of maintainers.
"Our tooling was defective" is not, in general, a defence against liability. Part of a company's obligations is to ensure all its processes stay within lawful lanes.
"Three months later [...] But the prompt history? Deleted. The original instruction? The analyst’s word against the logs."
One, the analyst's word does not override the logs; that's the point of logs. Two, it's fairly clear the author of the fine article has never worked close to finance. A three-month retention period for AI queries by an analyst is not an option.
SEC Rule 17a-4 & FINRA Rule 4511 have entered the chat.
Agree ... retention is mandatory. The article argues you should retain authorization artifacts, not just event logs. Logs show what happened. Warrants show who signed off on what.
I don't think you're missing something. The standards committee made a bad call with "no submodules", ran into insurmountable problems, and doubled down on the bad call via partitions.
"Just one more level bro, I swear. One more".
I fully expect to sooner or later see a retcon on why really, two is the right number.
Yeah, I'm salty about this. "Submodules encourage dependency messes" is just trying to fix substandard engineering across many teams via enforcement of somewhat arbitrary rules. That has never worked in the history of programming. "The determined Real Programmer can write FORTRAN programs in any language" is still true.
The C++ committee tries to design features with room for future extension. They believe that whatever you want from submodules is still possible in the future - better to have a small thing now (as if modules were small) than to try for perfection. We can argue about submodules once we have the easy cases working and hopefully better understand the actual limitations.
Not to put too fine a point on it: The world has 35 years of experience with submodules. It's not rocket science. The committee just did what committees do.
And sure, "future extension" is nice. But not if the future arrives at an absolutely glacial pace and is technically more like the past.
This may be inevitable given the wide spread of the language, but it's also what's dooming the language to be the next COBOL. (On the upside, that means C++ folks can write themselves a yacht in retirement ;)
Those are 35 years of different things tried, some of which work better than others, some of which are incompatible with the rest. Figuring out the best compromise, while also making something that doesn't break existing code, is hard when there are a lot of people who care.
Fascinatingly, I am not aware of any real issues with how Rust did nested modules. It even treated crates as top-level modules for most language-level purposes. I am sure there are nuanced reasons that C++ can't do quite the same, but the developer experience consequences can't be worth it.
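For anyone who hasn't seen it, the Rust approach the comment refers to can be sketched in a few lines. Module names here are illustrative, not from any real crate; the point is that modules nest to arbitrary depth with no extra machinery, and the crate root is itself just the top-level module.

```rust
// Minimal sketch: modules nest freely, and paths mirror the nesting.
mod net {
    pub mod http {
        pub mod headers {
            pub fn content_type() -> &'static str {
                "Content-Type"
            }
        }
    }
}

fn main() {
    // From the crate root, the full path names the nested item directly.
    println!("{}", net::http::headers::content_type());
}
```

Whether the same item lives in one file or is split across `net/http/headers.rs` is purely a filesystem detail; the language-level naming scheme doesn't change.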
I suppose we shall amend to "The determined Real Programmer will fix FORTRAN" ;)
But, for the folks who didn't grow up with the Real Programmer jokes, this is rooted in a context of FORTRAN 77. Which was, uh, not famous for its readability or modularity. (But got stuff done, so there's that)
I wrote a lot of F77 code way back when, including an 8080 simulator similar to the one Gates and Allen used to build their BASIC for the Altair. I don't know what language they wrote theirs in, but mine was pretty readable, just a bit late. And it was very portable - Dec10, VAX, IBM VM/CMS with almost no changes.
I think F77 was a pretty well designed language, given the legacy stuff it had to support.
It was well designed. Hence the "it got stuff done".
But it was also behind the times. And, if we're fair, half of its reputation comes from the fact that half of the F77 code was written by PhDs, who usually have... let's call it a unique style of writing software.
Indeed. Two PhD students came to see me when the polytechnic I worked for switched from a Dec10 to two IBM 4381s.
[them] How can we get our code to work on the IBM?
[me] (examines code) This only looks vaguely like Fortran.
[them] Yes, we used all these wonderful extensions that Digital provides!
[me] (collapse on the floor laughing) (recover) Hmm. Go see Mike (our VAX systems programmer). You may be able to run on our VAXen, but I can't imagine it running on the IBMs without a major rewrite. Had they stuck to F77 there would have been few problems, and I could have helped with them.
Portability is always worth aiming for, even if you don't get all the way there.