Cross-border price differences have been a major point of public debate in Austria recently. Supermarket prices in particular are often up to 50% higher than in neighboring countries for the same item, even when the product was made in Austria by an Austrian company.
This shows that the EU still has a long way to go before its markets are sufficiently integrated, despite free movement of goods having been established ages ago. Projects like this one may help push things toward a more unified market.
> This shows that the EU still has a long way to go before its markets are sufficiently integrated
Austria's high grocery prices aren't a failure of EU market integration; they're caused by the cartels that dominate the retail sector in Austria and milk consumers for all they're worth.
In other words, it's a domestic, self-inflicted problem that Austria could solve but chooses not to, not an EU problem.
Only once? This is a discussion topic in Austria every other week. It was bound to make it to HN at least a few times once tech-savvy people built platforms to track the data.
The only problem is that even with the data in plain sight, government regulators still don't do anything.
The hate is more geared towards SPAs in general, but there are some shining examples that show that a well-made React/Angular/whatever app can have great UX - Clockify being one of them.
I don’t think the culprit apps would have substantially better UX if they were rendered on the server, because these issues tend to be a consequence of devs being pressured to rapidly release new features without regard to quality.
And to be fair, the problems that Facebook had when they introduced React are not common problems at all.
As an aside, I was an employee around then and I vividly remember that the next half (Facebook plans in half-year cycles) there was a top-line goal to improve web speed. Hmmmm, I wonder what could have happened?
> And to be fair, the problems that Facebook had when they introduced React are not common problems at all.
That’s one of my favorites. It's the exact bug they described during the React launch presentation, the one that React's unidirectional dataflow was supposed to fix: unread-message badges showing inconsistent counts in different places in the UI. They never managed to fix that bug in the ten years between React's announcement and when I eventually left Facebook for good.
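For anyone who missed the original pitch: the idea was that one store owns the unread count and every badge renders from it, instead of each widget mutating its own copy. A minimal sketch of that unidirectional-dataflow idea (all names hypothetical, using React 18's useSyncExternalStore for brevity):

    import { useSyncExternalStore } from "react";

    // Single source of truth for the unread count.
    let unreadCount = 0;
    const listeners = new Set<() => void>();
    const store = {
      subscribe: (cb: () => void) => {
        listeners.add(cb);
        return () => { listeners.delete(cb); };
      },
      getSnapshot: () => unreadCount,
      increment: () => { unreadCount++; listeners.forEach((cb) => cb()); },
    };

    // Every badge (navbar, chat tab, ...) renders from the same store,
    // so the displayed counts can never disagree.
    function UnreadBadge() {
      const count = useSyncExternalStore(store.subscribe, store.getSnapshot);
      return count > 0 ? <span className="badge">{count}</span> : null;
    }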
Fantastic article! It seems to me that the flexibility of low-level git objects would lend itself to being embedded in other software systems for version control purposes (e.g. tracking changes in a CMS).
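For example (a hypothetical sketch, assuming `git` is on PATH and `repo/` is an initialized repository), a CMS could version page edits using nothing but plumbing commands, never touching a worktree:

    import { execFileSync } from "node:child_process";

    const git = (args: string[], input?: string) =>
      execFileSync("git", args, { cwd: "repo", input, encoding: "utf8" }).trim();

    // Store the page body as a blob object and get back its hash.
    const blob = git(["hash-object", "-w", "--stdin"], "<h1>Hello</h1>");
    // Wrap the blob in a tree, and the tree in a commit.
    const tree = git(["mktree"], `100644 blob ${blob}\tpage.html\n`);
    const commit = git(["commit-tree", tree, "-m", "Edit page"]);
    // Point a ref at the commit so the history stays reachable.
    git(["update-ref", "refs/heads/pages", commit]);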
Nowadays there are automated tools like Dependabot (0) or RenovateBot (1) that make it simple to keep dependencies up to date. I imagine the need originated in the JS ecosystem, but from a security standpoint it makes sense for almost any stack.
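The setup cost is tiny, too. For an npm project, a minimal .github/dependabot.yml like this is all it takes:

    version: 2
    updates:
      - package-ecosystem: "npm"
        directory: "/"
        schedule:
          interval: "weekly"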
I think the idea is that for minor updates or patches, any potential breakage should be caught by the build pipeline or by rolling deployments with automated rollback strategies (if you’re at a scale where that’s feasible). Major updates will probably fail in the pipeline and require manual intervention either way.
I don’t think it makes sense for every project, but if recovery options are cheap then I see no real argument against it.
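The "let the pipeline catch it" part can be as simple as running the test suite on every update PR. A minimal GitHub Actions sketch, assuming an npm project:

    name: ci
    on: pull_request
    jobs:
      test:
        runs-on: ubuntu-latest
        steps:
          - uses: actions/checkout@v4
          - run: npm ci
          - run: npm test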
Sure, why not? Are you suggesting that having a human in the loop, robotically bumping the version numbers of your dependencies, would have mitigated it?
Lots of humans upgraded lots of dependencies without noticing, I doubt whoever is doing it in your org is special enough to be the one who would have caught it. And if they are, they should be working in security research, not bumping dependency versions in package.json.
I'm not sure how updating dependencies a few times a year makes you safer from a well-hidden supply chain attack.
Also, not sure what JS has to do with the xz attack.
The argument I was responding to is that automating your dependency updates somehow makes you more vulnerable to a supply chain attack.
I could see an argument that waiting X days from a dependency release to when you pull it in gives you a little time for other people to find issues. But that's orthogonal to whether you update dependencies automatically or manually, or whether you do it once/year or every day.
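Renovate supports exactly that kind of delay, by the way. If I remember the option name correctly, a renovate.json like this holds new releases back for two weeks before proposing them:

    {
      "extends": ["config:recommended"],
      "minimumReleaseAge": "14 days"
    }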
What I meant was that with maybe 3-4 package updates total in a year, automation doesn't buy you much; doing it by hand isn't a big cost. Risk-wise, you can be pretty conservative if you're in a high-risk scenario.
But usually I find that bumping dependencies is detective work, because unlike in JS, where you can depend on multiple versions of the same package, in .NET you can't. So if you update package A, there is a risk that you also update a transitive dependency C, which package B also depends on, but pinned to the previous version. So even what looks like a trivial update is often a chore. Which is why I'm happy to have just a handful of dependencies.
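To illustrate the diamond with hypothetical package names: NuGet resolves a single version of the shared transitive dependency, so bumping A can silently break B:

    <ItemGroup>
      <!-- PackageA 2.0.0 depends on PackageC >= 3.0.0 -->
      <PackageReference Include="PackageA" Version="2.0.0" />
      <!-- PackageB 1.0.0 was compiled against PackageC 2.x -->
      <PackageReference Include="PackageB" Version="1.0.0" />
    </ItemGroup>
    <!-- NuGet unifies on one PackageC (3.0.0 here); if 3.0.0 broke an API
         that PackageB uses, you find out at runtime, e.g. with a
         MissingMethodException. -->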
So according to the company’s 2022 annual report, they source power from 8 hydroelectric and 6 solar power plants that they operate themselves, plus 4 partner hydroelectric plants. The rest of their electricity is bought on the market, but checked for proof of origin.
Obviously this only covers railway infrastructure in Austria as they have no influence over other countries.
It's unfortunate that Kotlin lacks pattern matching, especially now that Java has it. I can only hope that the release of the new compiler will spur further language features.
At the moment Kotlin only has smart casts and exhaustiveness checking (i.e., the compiler makes sure you didn't forget a `when` branch). It doesn't let you destructure records, add guards to cases, etc.
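A small sketch of what you do get today (sealed hierarchy + smart casts + exhaustive `when`), with what you can't write in a comment:

    import kotlin.math.PI

    sealed interface Shape
    data class Circle(val radius: Double) : Shape
    data class Rect(val w: Double, val h: Double) : Shape

    fun area(s: Shape): Double = when (s) {
        is Circle -> PI * s.radius * s.radius // smart cast to Circle
        is Rect -> s.w * s.h                  // exhaustive: no else branch needed
    }
    // No equivalent of Java 21's destructuring + guard:
    //   case Circle(double r) when r > 1.0 -> ...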
After a quick check on crates.io, Axum stands at over 5,000,000 "recent downloads", five times as many as the next most popular contender, actix-web. Seems to me like the Rust community is already converging on Axum as the default choice.
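It helps that the happy path is tiny. A hello-world sketch, assuming axum 0.7 and tokio:

    use axum::{routing::get, Router};

    #[tokio::main]
    async fn main() {
        // One route, served on localhost:3000.
        let app = Router::new().route("/", get(|| async { "Hello, world!" }));
        let listener = tokio::net::TcpListener::bind("127.0.0.1:3000").await.unwrap();
        axum::serve(listener, app).await.unwrap();
    }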
The first thing I do for all of my projects is to add a .npmrc with save-exact=true.
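That is, a project-local .npmrc containing just:

    save-exact=true

With that, every `npm install <pkg>` writes an exact version ("4.17.21") into package.json instead of a caret range ("^4.17.21"), so a fresh install can't silently pull in a newer release.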