Because it's literally a large chunk of the economy? Why wouldn't you worry about that? I certainly don't like to see a society where the young are enslaved by the old to pay for their medical treatment.
I have little clue about ancient Athens. Your statement, however, doesn't contradict the poster. You can despise a party/group while that group holds power over you. In fact, that is a very common way such a relationship comes about.
Same, I wonder if the people complaining have experience with vscodium (to know the difference). Everything I want works with vscodium and I've never had any problems with it. The only thing I know doesn't work is dev containers, but I've never personally wanted to use them.
"most useful things around it are full of DRM and legal traps" is a HUGE overstatement. Vscodium is great and has everything anyone could want (except, maybe, devcontainers).
Indeed, the more efficient you become, the more brittle you will be. You must depend upon the present being static and the future being perfectly predictable from the events of the past. The present and the future need to be dependable not merely within your own domain but across the entire world.
The flexibility necessary to succeed in the real world requires a certain level of inefficiency.
Interestingly, the same effect shows up in communications systems. The more efficient an error-correcting code (i.e., the closer it approaches the Shannon bound), the more catastrophically it fails when the channel capacity is exceeded. The "perfect" code delivers no errors right up to the Shannon bound, then meaningless garble (50% error rate) beyond it.
My point is that error-correcting codes have a precise mathematical definition and have been deeply studied. Maybe there is a general principle at work in the wider world, and it is amenable to precise proof and analysis? (My guess is that mileage could be gained by applying information theory, as used to analyse error-correcting codes.)
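To make that cliff concrete, here is a minimal sketch assuming a binary symmetric channel; the rate-1/2 code and the bisection search are purely illustrative choices, not taken from any real decoder. The capacity of a BSC with crossover probability p is C(p) = 1 - H2(p), and reliable coding at rate R is possible only while C(p) > R:

    fn h2(p: f64) -> f64 {
        // Binary entropy function, with the 0 * log2(0) = 0 convention.
        if p <= 0.0 || p >= 1.0 {
            0.0
        } else {
            -p * p.log2() - (1.0 - p) * (1.0 - p).log2()
        }
    }

    fn bsc_capacity(p: f64) -> f64 {
        1.0 - h2(p)
    }

    fn main() {
        let rate = 0.5; // hypothetical code rate, in bits per channel use

        // Bisection for the crossover probability where capacity drops to the
        // code rate (capacity is monotonically decreasing on [0, 0.5]).
        let (mut lo, mut hi) = (0.0_f64, 0.5_f64);
        for _ in 0..60 {
            let mid = 0.5 * (lo + hi);
            if bsc_capacity(mid) > rate { lo = mid; } else { hi = mid; }
        }
        let threshold = 0.5 * (lo + hi);
        println!("a rate-{rate} code can only be reliable for p below ~{threshold:.4}");

        for p in [0.05, 0.10, threshold - 0.01, threshold + 0.01] {
            println!("p = {p:.4}  capacity = {:.4}  reliable coding possible: {}",
                     bsc_capacity(p), bsc_capacity(p) > rate);
        }
    }

Below the threshold (about p = 0.11 for rate 1/2) arbitrarily reliable coding is possible in principle; just above it, no code of that rate can help, which is exactly the garble-past-the-bound behaviour described above.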
An interesting idea, but I'd imagine you would have to operate within something like the "100-year flood" boundaries that insurance companies use in order to define a constrained domain analogous to the Shannon bound. I suspect you would also have to define the scope of this principle within the company and/or deal with the compounding effects of the multiple layers of the system and its "effective inefficiency."
Yes, just-in-time supply chain systems often become over-efficient and brittle... usually because each link in the chain assumes that someone else is taking on the burden of inefficiency by having excess inventory in order to absorb shocks to the system.
Why would you evaluate it using an interpreter? Since you are using it in the context of a Rust lambda, you compile it. You just have a Rust file that calls cargo-watch as a library. Crafting an interpreter seems like an incredibly bad idea.
Of course it has semantics. Whether someone knows all of them is a different matter. But whether you use Rust or C, you have to know the contract to be able to write correct code. The only reason the kernel devs got away with such sloppiness is that C is the ultimate "I do what you tell me, boss" language.
And yet, Rust is the exact opposite. Why try to make the two meet? Couldn't the memory-safety argument for Rust in the kernel be addressed instead by better memory-safety tools for C? (Not that the Linux kernel project lacks such tools already.)
Why not devote the time to writing a better memory safety tool for the Linux kernel (or C in general) rather than keep trying to force two disparate cultures and ideologies to meet in some fantasy middle?
Rust has the very strong pro of already existing. If those tools existed, I’m sure the conversation would be quite different.
It’s also not just about memory safety. Greg in particular has recognized how Rust’s type system will be helpful for preventing other kinds of bugs, for example.
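As a purely illustrative, self-contained sketch of that point (not actual kernel code; the UserPtr/KernelPtr names are hypothetical), newtypes let the compiler reject passing an unvalidated user-space address where validated kernel memory is expected:

    /// An address handed to us by user space; it has not been validated yet.
    struct UserPtr(usize);

    /// An address known to point at memory the kernel may touch.
    struct KernelPtr(usize);

    /// The only way to turn a user pointer into a kernel pointer is through a
    /// (stubbed-out) check, so an unvalidated address can never reach
    /// use_kernel_buffer by accident.
    fn copy_from_user(ptr: UserPtr) -> Option<KernelPtr> {
        // Real code would validate the range and copy the data; this is a stub.
        if ptr.0 != 0 { Some(KernelPtr(ptr.0)) } else { None }
    }

    fn use_kernel_buffer(buf: KernelPtr) {
        println!("operating on validated address {:#x}", buf.0);
    }

    fn main() {
        let raw = UserPtr(0xdead_beef);

        // use_kernel_buffer(raw); // <- rejected at compile time: wrong type
        match copy_from_user(raw) {
            Some(buf) => use_kernel_buffer(buf), // only reachable via the checked path
            None => eprintln!("invalid user pointer"),
        }
    }

The point is that the wrong call simply doesn't compile: a class of logic bugs, beyond plain memory safety, that a strong type system can rule out.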
> Rust has the very strong pro of already existing
But not in the Linux kernel. Any new effort will be greenfield; why spend the last two years, and many more to come, rallying around an entirely different programming language instead of writing a novel tool?
> Greg in particular has recognized how Rust’s type system will be helpful for preventing other kinds of bugs
There are no decent memory-safety tools for C. Could they theoretically be created? Perhaps, but I doubt it, considering the amount of money flowing through this industry. To solve it you really have to design a new language, which is exactly what happened: it's called Rust.
Kernel devs should just put on their big-boy pants and move with the times. C is simply not the right tool for the job anymore for much, if not most, kernel work.
I don't understand this argument. What does video length have to do with whether it could be denser? This is like looking at a 1 GB file and saying it could certainly be smaller.
The commenter believes the video should take less time and contain a higher percentage of strictly factual information.
A text analogy might be a recipe written in a simple style (steps, ingredients, etc.) versus one you might find on a food blog, with an intro about the author's childhood, how Nana was the best, and, somewhere along the way, how to actually prepare the food.
In this case, the video producer made pretty good choices about info density and content length.
The commenter disagrees and here we are chatting about all that.
Considering that place and route of even synchronous digital logic is at the edge of what we can do computationally, I really don't see this happening anytime soon.