
I’ve heard this reading of CRA and I think it’s overzealous on purpose to rile individual devs (well done). But we will have the courts decide and I think they will draw a much more clear line than “any project that received 1 EUR donation or accepted a single corporate committer”. For example, Facebook was claiming that violating users’ privacy was a “legitimate business interest” under GDPR until courts ruled otherwise. I am not expecting projects that merely accept some patches or donations to be on the hook. But https://squidfunk.github.io/mkdocs-material/, for example, would fit the bill (see "Trusted in the industry" on the homepage and https://squidfunk.github.io/mkdocs-material/insiders/).

At the same time, I think the rules laid out in Annex 1 [1] are quite reasonable to comply with for any production-grade software. Interested to see if it would accelerate adoption of Rust/Zig, as [1] requires the software "be designed, developed and produced to reduce the impact of an incident using appropriate exploitation mitigation mechanisms and techniques;"

[1]: https://eur-lex.europa.eu/resource.html?uri=cellar:864f472b-...




Saying let the courts decide, when there is massive ambiguity for small projects and developers, just means that many of us (I know I will) will region block the EU until someone else deals with the court system and provides clarity for the rest of us. It is way, way better for legislators to provide intent and clarity than to leave things uncertain and ambiguous for the courts to decide. If they truly aren't going to care about a single corporate committer, then lay that out; or if there is a maximum donation threshold (indexed to inflation) for what can be considered reasonable corporate donations, then specify it. Otherwise, those of us who can't afford to be caught in a legal battle on another continent will just shut things down.

That results in less software overall for the EU to use and innovate on. Perhaps that will result in better battle-hardened software for the EU in general, but considering how huge the OSS dependency-chain problem is (many of the dependencies being small projects), I'm doubtful that will happen anytime soon.
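As an aside, "region blocking" in practice usually means rejecting requests by country at the edge. A minimal sketch in Python, assuming a CDN or reverse proxy sets a country header with an ISO 3166-1 alpha-2 code (the header name `X-Country` is an assumption; Cloudflare, for instance, uses `CF-IPCountry`):

```python
# The 27 EU member states (as of 2024), by ISO 3166-1 alpha-2 code.
EU_COUNTRIES = {
    "AT", "BE", "BG", "HR", "CY", "CZ", "DK", "EE", "FI", "FR",
    "DE", "GR", "HU", "IE", "IT", "LV", "LT", "LU", "MT", "NL",
    "PL", "PT", "RO", "SK", "SI", "ES", "SE",
}

def should_block(headers: dict) -> bool:
    """Return True if the request appears to originate from an EU member state.

    Assumes an upstream proxy sets an X-Country header (hypothetical name);
    an absent or unknown country is allowed through.
    """
    country = headers.get("X-Country", "").upper()
    return country in EU_COUNTRIES
```

In a real deployment this check would live in the load balancer or CDN rules rather than application code, and a blocked request would typically get an HTTP 451 response.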


I don't see a reason to do this (region blocking the whole EU). The legislation has an enforcement model very similar to the GDPR's – did you get a 15M USD fine for running Google Analytics on your blog? What's similar is that:

1. There will be authorities overseeing this. Court action will not be the first step (Europe is not a litigious place, unlike some other jurisdictions).

2. The first step would be a corrective action to "Ensure that the manufacturer remedies the cybersecurity vulnerabilities" [1]. Essentially, like with the GDPR, where you can email a blog owner or the authority and complain that the blog uses Google Analytics, the first step is a demand to stop. Similarly, under the CRA the first step would be a demand from a user and/or an authority that you address a CVE in your project.

3. Only then, if you ignore this, could there be a court case or fines.

If you are really afraid, you can stop offering your software/libs in Europe at stage 2. And I am quite certain that it would carry minimal risk (though I am not a lawyer, just so we are clear).

[1]: https://blog.huawei.com/2022/09/29/cyber-resilience-act-enha...


I did not get a 15M USD fine for Google Analytics on my blog.

Instead, I work for state agencies that are shy of using even a locally installed Matomo for web analytics, out of fear of collecting too much PII because of the GDPR. It is a daily tax on my mental sanity and a real problem for fellow citizens because of the worsened service.

The GDPR had two effects on the industry in the EU:

1. A chilling effect. Nobody wants to do anything that carries GDPR considerations. Better to mine bitcoins.

2. A grilling effect. There is a proliferation of DPO jobs, and the people landing in them make everyone's life harder because they are incompetent by definition but still want to justify their fake jobs and cushy salaries.

I looked at your other comments and your profile, and I see that you have a vested interest in software lifecycle management. The CRA will certainly help grow its visibility. You may be conflating your personal goals with this intellectual discussion.

Still, the overwhelming majority on HN is aware of the GDPR and knows, at least tangentially, how poorly written EU directives are and how much depends on public opinion and "policy".

It's obvious that if the CRA gets adopted, it will go down pretty much the same path:

1. Lobbyists will seed FUD

2. Businesses will over-protect and waste precious resources

3. Hobbyists will lose

4. The stated goal will not be achieved.


All right, now that someone downvoted you (not me), I feel a mild obligation to respond.

Regarding the ad hominem part, I invite you to watch https://www.youtube.com/watch?v=Gv2I7qTux7g to understand why I think our industry needs to elevate the level of our craft. And also please take a look at the actual requirements CRA puts on devs in https://eur-lex.europa.eu/resource.html?uri=cellar:864f472b-... (pages 2 and 3 – yes, just those two pages* plus requirements on documentation on page 8, which could be a bit more annoying than the requirements on pp. 2 and 3). I hope you will find them reasonable for the most part.

Regarding the GDPR, I am indeed sad that so many people interpret it incorrectly. This happens in part due to the influence of various groups, as you say. I invite you to read the blog of https://noyb.eu/en to understand the spirit of the GDPR (indeed, there is a principle called "data minimization" that could be the reason you find it difficult to collect more data without a solid justification; in a fun twist, §1(3)(e) of the CRA annex also mentions data minimization) and to see that progress is being made slowly yet steadily. If you have noticed big websites recently showing an option to deny tracking cookies directly instead of "manage cookies", you have these folks to thank ( https://noyb.eu/en/where-did-all-reject-buttons-come ). BTW, I donate to them and think they are doing awesome work.

* unless you are doing "serious stuff" (TM) as described on pages 5 and 6.


The requirement to provide a postal address is pretty bad if you are an individual developer, or a decentralized organization without an office.

And the "if applicable" part is kind of vague. For just one example, is it applicable for a database to have built-in support for encryption at rest? Or is it sufficient to depend on the user setting up an encrypted filesystem? 1.3 is a reasonable list for a complete system, but less so for individual components. Some of those items, such as authentication, event monitoring, and high availability, are frequently "enterprise" features in open-core projects. I'm not sure what the impact of that would be. Maybe companies will start including those in their open offerings, or maybe we'll see those projects become completely proprietary.

And a lot of open source projects do the "serious stuff" described on pages 5 and 6. Some of which accept donations but have very small teams.

I don't think putting this burden entirely on the developers of open source projects is the right way to do it. I agree with the spirit of this, but think the implementation has some serious problems. I feel much the same way about GDPR.


> The requirement to provide a postal address is pretty bad

In Germany/Austria, everyone with a blog must give their address in an Imprint (see the bottom of http://armin.ronacher.eu/#contact , for example). In Sweden, everyone's address is in a public register (see https://www.hitta.se/ , for example). In other words, Americans may find this mildly intrusive from a privacy perspective, but in Europe it's common to know who you are dealing with. Often, I check what country a SaaS company is based in before signing up and get quite uncomfortable when neither the Privacy Policy nor the ToS mentions even the country, let alone an address.

> decentralized organization without an office

I think there were quite specific remarks that dev teams without central management, or at least leadership, will not be covered by this regulation, e.g. the Mastodon devs.

> And the "if applicable" part is kind of vague.

Yes, I agree with you that those points especially are a bit stressful because we don't know the precise bounds. Precise bounds, however, tend to make any tech law obsolete very quickly. We had a similar panic with the GDPR, and now everything has settled rather nicely (in my opinion).

> 1.3 is a reasonable list for a complete system, but less so for individual components.

I think this is precisely why "if applicable" is there. You would just write "not applicable (this is a software component for use in a larger software product)". At least, that's what I plan to do.

> Some of those items, such as authentication, event monitoring, and high availability, are frequently "enterprise" features for open core projects. I'm not sure what the impact of that would be. Maybe companies will start including those in the opens offerings, or maybe we'll see those projects become completely proprietary.

I think this is the point where the EU is saying "enough is enough". Just like the GDPR largely "cancelled" the business model of "if you are not paying for the product, you are the product", I think the CRA allows companies to limit open-core offerings in any way they want, except on security. And while I see how this will be painful for some companies, I also understand the hard regulatory line the EU has taken here.

> And a lot of open source projects do the "serious stuff" described on pages 5 and 6. Some of which accept donations but have very small teams.

Again, I think the EU is saying "enough is enough". We can't have our most essential systems be vulnerable just because they are maintained by an unpaid dev in Nebraska ( https://xkcd.com/2347/ ). The EU is essentially forcing businesses to donate enough money to audit those allegedly crucial pieces of software, or to have those projects close down. However, I want to note that companies running such critical software would have to audit it whether it's OSS or not. Therefore, I think that for critical projects like WireGuard or libsodium, there will be enough corporate sponsors to split (!) the costs of an audit. Because otherwise, each company using WireGuard would have to pay to repeat the same audit over and over again.

And to be clear, simply refusing donations will not get a project out of CRA compliance. I think this is FUD being spread here. If a project is usable "in the course of a commercial activity" and gets regular releases, it will have to comply with the CRA. One example that comes to mind is apache2. It is used by millions of business websites, and even if the core devs don't accept donations, it is still software clearly usable "in the course of a commercial activity".


>> "In other words, Americans may find this mildly intrusive from a privacy perspective, but in Europe it's common to know who you are dealing with."

It's not intrusive, it's dangerous. I know it's popular to claim the US is different even when those differences don't actually explain why things aren't done, but this is one of the exceptions. We don't really have much in the way of protections, so it only takes a few pieces of information to find out everything about someone, and an address is one of those. That's useful for creeps, snoops, identity thieves, stalkers, and other unscrupulous characters.

Also, SWATting. I really don't want someone to be able to easily find my address to send a murder squad to because I said/did something they don't like. We also have more guns than people, so someone with sufficient disconnect from reason might skip calling the police and take matters into their own hands.


This is where I find myself agreeing with smarx007.

I ran a community website with hundreds of active users and tens of thousands of occasional visitors for 20 years. I received regular death threats and all kinds of insults, and my address could be found online with some effort, but no one ever showed up at my doorstep over those years to claim what was due.

The only mild doorstep clash I had was with the disgruntled husband of a babysitter we fired on the spot. A totally offline affair.

Dunno about the USA, but a properly functioning society needs neither PO boxes nor fences.

And the only people working hard to take down the UBO register are Putin's cronies (this is no exaggeration; google Patrick Hansen).


> We can't have our most essential systems be vulnerable just because they are maintained by an unpaid dev in Nebraska

I don't disagree with that. But I don't think saying that one dev in Nebraska has to pay for security audits, or at least convince the companies that use their project to pay for them and take charge of coordinating that effort, is the right way to solve the problem. I suspect this will result in some projects distancing themselves from the EU, and will have a chilling effect on new OSS projects in these areas, especially inside the EU.


> the overwhelming majority on HN is aware of GDPR, knows at least tangentially how poorly written EU directives are

No. Most GDPR threads on this website are full of speculation or beliefs that are backed by nothing. Most comment authors would not be able to provide a source for their claims, whether a citation from the legal text, interpretations (e.g. from the EDPB, a DPA, or even GDPRhub), or actual cases. They don't even make the effort to understand its principles (data minimization, transparency). And I don't see how you can say that EU directives and regulations are poorly written when you don't even take the time to read the one that may have an impact on your work. I know developers here are not lawyers, but neither am I; still, I read the thing (and ePrivacy), so I actually know my rights and can ask somewhat relevant questions wrt. data protection in my work.



