One wonders what the point of this article is besides “we fucked up by choosing something that we should have known did not fit the constraints we had”.
"Now we have dismissed Tesla, we have now decided to purchase tiny clown cars, as we could squeeze many police and suspects in each car"
2 months later
"We have now found that clown cars actually don't go fast enough, as they are only designed to travel at 2-3 miles per hour in a circus ring. Also they are not as big internally as you might think. We are now investing in a drag racing car that will be retro-fit for police work."
Isn't Chromium open source? How hard would it be to fork it and restore Manifest V2? I'd expect the functionality to be fairly isolated, so tracking upstream should remain manageable.
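In principle, tracking upstream is just a fetch-and-rebase loop (a rough sketch, assuming the Manifest V2 patches live on your own branch):

    # One-time setup: point at the upstream Chromium tree
    git remote add upstream https://chromium.googlesource.com/chromium/src.git
    # Periodically re-apply your patches on top of the latest upstream
    git fetch upstream
    git rebase upstream/main

The real question is how often those patches would conflict with upstream changes.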
Brave's shady crypto strategies [1] and history of injecting headers and affiliate codes in your navigation [2] makes it an untrustworthy browser to me.
I am asking because this community has a serious double-standard. This isn't about any “shady crypto” or affiliate codes being inserted, issues which were promptly fixed. It's about something else entirely, and people are being disingenuous about it.
I think you underestimate how complex browsers are. It's not as if this is just an isolated part of the browser; Chromium is huge and full of moving parts. Good luck maintaining your own fork of Chrome, especially if you also want to keep up with security updates.
Compare Vallejo to Frank Frazetta and it's immediately obvious why the latter is considered a master of fantastic art while the former is drifting towards obscurity.
Frazetta's art is full of tension [1] and energy [2], whilst Vallejo is simply drawing his circle of bodybuilders in various poses. The environments he paints have absolutely no effect on his posed subjects, and the end result is disconnected and comes across as "fake" in its intended setting, as it evokes memories of bodybuilding gyms.
You might as well be comparing Slayer and Cannibal Corpse. Both styles have their place and they can be enjoyed in their own right.
I prefer Frazetta too, but Vallejo is rightly recognised as one of the greats. He does his own thing. Not everyone can, or should try to, be Frazetta. That'd be pretty boring.
Or, like I once told a friend: you gotta stop comparing everyone to Bugs (Bunny).
Top down architecture doesn’t scale and puts a hard limit on the problems one can tackle before complexity explodes. The Internet, the largest distributed system we have, is based on bottom-up cell-like biologically inspired models. Kay was prescient and decades ahead of his time.
This is the main reason we have banned Rust across my org. Every third-party library needs to be audited before being introduced as a vendored dependency, which is not easy to do with the bloated dependency chains that Cargo promotes.
The dependency hell issue is not directly related to Rust; the language can be used without any dependencies at all. Have you banned JavaScript and Python too?
And in a similar vein, have they audited the runtimes of all the languages they use? Those are dependencies too, and in many ways even more critical than libraries.
TBH, I have adjusted my programming recently to write more stuff myself instead of finding a library. It's not that bad. I think ChatGPT is really good at those types of questions, since it can analyze multiple implementations from GitHub and give you an answer averaging them together.
Also, if you have a really well defined problem, it's easy to just whip out 10-50 lines to solve the issue and be done with it.
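For example, here's roughly the kind of thing I mean: a human-readable byte formatter is something people often pull in a crate for, but a hand-rolled sketch (simplified, no real edge-case handling) covers most needs:

    /// Format a byte count as a human-readable string, e.g. 1536.0 -> "1.5 KiB".
    /// A hand-rolled sketch of the kind of helper you might otherwise pull in
    /// a crate for; rounding and edge cases are deliberately kept simple.
    fn human_bytes(mut n: f64) -> String {
        const UNITS: [&str; 5] = ["B", "KiB", "MiB", "GiB", "TiB"];
        let mut unit = 0;
        while n >= 1024.0 && unit < UNITS.len() - 1 {
            n /= 1024.0;
            unit += 1;
        }
        if unit == 0 {
            format!("{} {}", n as u64, UNITS[unit])
        } else {
            format!("{:.1} {}", n, UNITS[unit])
        }
    }

    fn main() {
        assert_eq!(human_bytes(512.0), "512 B");
        assert_eq!(human_bytes(1536.0), "1.5 KiB");
        println!("{}", human_bytes(3.0e9)); // prints "2.8 GiB"
    }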
Our main languages are Go and OCaml. We can leverage third-party libraries without quickly running into transitive dependency hell, as there's an implicit understanding in these communities that a large number of dependencies is not a good thing. Or, expressed differently, there is coarser granularity in what ends up being a library. This is not the case with Cargo, which has decided to follow the NPM approach.
At least in my experience, Go packages and Rust crates are much coarser than NPM packages. (Look at the actual direct and indirect dependencies in cargo-watch to judge for yourself.) I think Go prefers, and actually has the resources to maintain, a mostly centralized approach, while Rust crates are heavily distributed and it takes longer for the majority to settle on a single solution.
I'm sorry, but that feels like an incredibly poorly informed decision.
Deciding to vendor everything is one thing - that's your prerogative - but it's very likely that pulling everything in also pulls in tons of stuff that you aren't using, because recursively vendoring dependencies means you are also pulling in dev-dependencies, optional dependencies (including default-off features), and so on.
For the things you do use, is it the number of crates that is the problem, or the amount of code? Because if the alternative is to develop it in-house, then...
The alternative here is to include a lot of things in the standard library that don't belong there, because people seem to exclude standard libraries from their auditing, which is reasonable. Why is it not just as reasonable to exclude certain widespread ecosystem crates from auditing?
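To make the dev-dependency point concrete, stock Cargo lets you compare what the build actually uses against what vendoring copies (flags as of recent Cargo versions):

    # Dependency graph excluding dev-dependencies
    cargo tree --edges no-dev
    # Copies every resolved dependency, dev and optional included, into vendor/
    cargo vendor

The gap between the two is audit burden taken on for code that never ends up in the final binary.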
> Deciding to vendor everything is one thing - that's your prerogative - but it's very likely that pulling everything in also pulls in tons of stuff that you aren't using, because recursively vendoring dependencies means you are also pulling in dev-dependencies, optional dependencies (including default-off features), and so on.
What you're describing is a problem with how Cargo does vendoring, and yes, it's awful. It should not be called vendoring, it is just "local mirroring", which is not the same thing.
But Rust can work just fine without Cargo or Crates.io.
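For what it's worth, the mechanism makes the "local mirroring" framing pretty literal: cargo vendor prints a source-replacement stanza for .cargo/config.toml that simply redirects crates.io lookups to a local directory, something like:

    [source.crates-io]
    replace-with = "vendored-sources"

    [source.vendored-sources]
    directory = "vendor"

Nothing about that step curates or prunes the dependency set; it just changes where the same bits come from.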
It's still much, much slower than SBCL, which is not surprising given the time and effort that went into the latter. User-guided optimizations (type declarations, stack allocation, intrinsics, machine code generation) in SBCL are also more flexible and better integrated.
The high-resolution digitized versions are no longer available for purchase at Amazon, just a single issue (June 1987). They're also not available anywhere else that I know of; the archive.org versions are inferior - bad scans at low resolution.
This and other newcomers (e.g. exa, ripgrep, ...) are not specified by POSIX and thus not ubiquitous. As someone who interacts with multiple UNIX systems daily, I find that the tedium of maintaining (sometimes multiple versions of) and moving these binaries around is greater than the benefits they provide.
The same can be said for the new shells that have popped up in the last 5-10 years versus bash. They're not sufficient to escape the local minimum of 'good enough'.
Until it's in the base OS images, there's an institutional cost for large companies to install everyone's favorite 'enhanced' utility, and so they opt to just not do so.
I've spent many years of my career crafting tooling to sync dotfiles and binaries around, and over time I largely gave up on it, as the juice just isn't worth the squeeze.
What does that have to do with "copying binaries around"? If it isn't in the base OS image, then install it yourself. Or don't. And this has nothing to do with POSIX either, because there is plenty in base OS images that isn't specified by POSIX.
I interact with multiple Unix systems daily too. Some of those systems have ripgrep and some don't. Unless I'm in a particular scenario where I think ripgrep would be useful, I just use grep instead. I don't really see the issue. For a short-lived machine, yeah, I don't bother with syncing dotfiles and all that other nonsense. I just use the bare OS environment and install stuff as needed. But I also have plenty of longer lived Unix machines (like the laptop I'm typing on) where it makes sense to set up a cozy dev environment optimized for my own patterns.
Oh, I can't just install packages on machines. Change management is a super important thing, and bypassing it to get a utility installed via a package manager would be a dumb way to get fired.
Sure, I get it. You don't work in a company that has this sort of culture, but a large number of us do. And you should want us to. Do you want AWS engineers to be able to just install whatever they want on the hosts running your VMs? Of course not.
The entire thread is about how the benefits of these tools are not worth the hassle of copying the binaries around, because not everyone can just install things on or modify the servers.
You keep shifting the argument for some sort of moral win, I guess, devolving into name-calling when people just don't agree with your position. Very professional of you.
Regardless, I presume you're just having a bad day. I hope it goes better for you and you can find some peace and calm down.
The original comment said nothing about modifying servers or AWS engineers installing random shit. That was you. I responded to "moving binaries around," and you started yapping about change management. Two totally different things. Like obviously if you have a locked down environment, then only install what you need. But this is not what the original poster was referring to specifically.
> You need a portable and ubiquitous tool. While ripgrep works on Windows, macOS and Linux, it is not ubiquitous and it does not conform to any standard such as POSIX. The best tool for this job is good old grep.
Are you living in the UK, or just making baseless assumptions? For an ever-increasing number of serious health issues, the NHS has been essentially non-functional for many years. I have friends with chronic conditions who are looking to emigrate because they're dreading their health getting worse in a system that's doing nothing to help them.
"NHS would be great except for the bad politicians" is not a serious argument.
It's underfunded. I'm willing to bet that the countries they're thinking of moving to either spend more on healthcare as a proportion of GDP than the UK does or don't have healthcare that's free at the point of use (or both).