
> it will open up a path for devs to get paid for the extra burden.

The thing is, being paid for an extra burden doesn't make it any less of an imposition on devs' limited energy.

Basically, if you want to accept donations so people can show their appreciation for what you share freely with the world, you open yourself up to demands to do work you don't enjoy on a hobby project. That's really shit.




I don't think this legislation will affect hobby projects. The problem is that whether the project is a hobby or not is judged from the consumer's side, i.e. whether the software is usable "in the course of a commercial activity" (for the user). I agree that this creates a certain amount of stress, esp. for individual devs, but I think it was necessary to make sure that projects like k8s, Kafka, and other OSS projects consistently relied on by businesses cannot claim that the OSS version is not for commercial use and, with that run-around, be done with CRA "compliance".


>The problem is that whether the project is hobby or not is judged from the side of the consumer, i.e. if the software is usable "in the course of a commercial activity" (for the user).

Have you seen the dependency trees for commercial software? I'd be surprised if there's any non-trivial OSS project that hasn't been used as part of commercial activity.


I've seen many things in my life I'd gladly unsee, including corp IT devs putting programs with "SNAPSHOT" (unstable) dependencies in production. But just merely having a corp use your software would not place the CRA burden on you. Your project needs to make such an impression.

The most negative outcome of this legislation that I can see is that OSS projects like Nix, Debian and others will start aggressively pruning packages from their software repositories wherever there is any indication that devs/packagers are not reacting to CVE reports (or are doing it too slowly).


I see you are very active on this topic. I have a question regarding an interesting point you're making:

> But just merely having a corp use your software would not place the CRA burden on you. Your project needs to make such an impression.

What does "needs to make such an impression" mean? Sloppy code and PRs with "Fix, Fix, Fix, Another Fix" commits are hobby projects? And having some integrity implies "you make an impression of commercial activity"?


I just think our industry is long overdue for some regulation ( https://www.youtube.com/watch?v=Gv2I7qTux7g as an inspiration ) and I also found the CRA requirements on pp. 2 and 3 ( https://eur-lex.europa.eu/resource.html?uri=cellar:864f472b-... ) quite reasonable. What shocked me is how nobody in the thread considered/appreciated those two things. Having said that, there is a chance the CRA implementation will get bungled, but I hope for the best.

Regarding the commercial activity: I found a screenshot of the 3-part test from a recent Eclipse call and I hope it's OK to post it ( https://imgur.com/a/70a6cQt ). I think it's important to understand that in a multi-part test, you typically need to seriously hit a few points to "pass" the test. Some examples of what I would consider passing each point (but not necessarily the whole test):

1. Rust, with its 6-week release cadence, will quite likely pass part 1 of the test.

2. Nginx, k8s, and Ubuntu LTS will quite likely pass part 2 of the test.

3. A project like Eclipse 4diac ( https://projects.eclipse.org/projects/iot.4diac ) would quite likely pass part 3 of the test. That's the part of the test that worries GitHub and others, because receiving money and significant contributions from corporations contributes to scoring high on this point.

But then, Eclipse 4diac could claim that it's mainly an R&D project and thus does not pass part 2 of the test. And even part 1 would be hard to establish with barely one release a year.


>Your project needs to make such an impression.

And how much are the legal fees going to be if you are accused of this and need to defend yourself? Given the fines if found guilty, this seems very much a "safer to just not risk it" type of situation.


The distros don't need upstream to handle CVEs; they can probably handle many of the CVEs themselves or by sharing patches on the oss-security or other mailing lists.


Do you know if the requirement is:

* that a project is developed AND supplied commercially?

* or rather that a project is developed OR supplied commercially?

For example, if I write an experimental project at work which might have vulnerabilities (developed commercially), which my employer has no intention of selling yet (not supplied commercially), should I still follow the CRA processes in case someone reports a vulnerability? What if someone else decides to take my toy project and put it into their product?


I was on the Eclipse Foundation call a few days ago regarding this topic and they said there was a well-established 3-part test for this in the EU courts. But I don't think I managed to take a screenshot, sorry.

Here is a snippet from the EU Blue Guide linked from the Eclipse blog post:

"Commercial activity is understood as providing goods in a business related context. Non-profit organisations may be considered as carrying out commercial activities if they operate in such a context. This can only be appreciated on a case by case basis taking into account the regularity of the supplies, the characteristics of the product, the intentions of the supplier, etc. In principle, occasional supplies by charities or hobbyists should not be considered as taking place in a business related context."

I would consider GCC or React to fit this definition, while a hobby project like https://github.com/rui314/chibicc does not.

Edit: I don't think you would have any obligations under the CRA unless you make a project release available, whether commercially or on GitHub. The 3-part test I mentioned above only kicks in when there is a release of some sort in the first place.


That roughly tracks with my gut reaction from reading what appears to be the current draft: if you're doing any sort of formalized release process, you're probably at the point where you're doing commercial activity. By the time you're supporting old versions of the software, and cutting new point releases, you're almost definitely in commercial activity land.

It definitely looks like a higher bar than GitHub is implying; I don't see any indication that merely soliciting donations would qualify as commercial activity.


So from what I understand, this means:

- you can't accept donations or other small payments for your side project anymore, unless you take on a substantial burden

- you can't give commit access to, or possibly even accept contributions from, employees of entities that use your side project in a commercial setting. And what if you work for a software company but contribute to OSS on your own time? Does the project need to comply with the CRA then?


I've heard this reading of the CRA and I think it's deliberately overzealous, meant to rile individual devs (well done). But the courts will decide, and I think they will draw a much clearer line than "any project that received a 1 EUR donation or accepted a single corporate committer". For example, Facebook was claiming that violating users' privacy was a "legitimate business interest" under GDPR until courts ruled otherwise. I am not expecting projects that merely accept some patches or donations to be on the hook. But https://squidfunk.github.io/mkdocs-material/, for example, would fit the bill (see "Trusted in the industry" on the homepage and https://squidfunk.github.io/mkdocs-material/insiders/).

At the same time, I think the rules laid out in Annex 1 [1] are quite reasonable to comply with for any production-grade software. I'm interested to see whether it will accelerate the adoption of Rust/Zig, as [1] requires that the software "be designed, developed and produced to reduce the impact of an incident using appropriate exploitation mitigation mechanisms and techniques;"

[1]: https://eur-lex.europa.eu/resource.html?uri=cellar:864f472b-...
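
To make that last requirement concrete (my own illustrative sketch, not language from the CRA): one everyday exploitation mitigation is that safe Rust turns an attacker-controlled out-of-bounds read into a controlled failure instead of silent memory corruption:

    // Toy example: the index stands in for hypothetical untrusted input.
    // Safe Rust either returns None here, or panics (a controlled crash)
    // if you write data[index], instead of reading past the buffer.
    fn main() {
        let data = vec![1u8, 2, 3];
        let index = 7; // pretend this came from untrusted input
        match data.get(index) {
            Some(byte) => println!("byte = {byte}"),
            None => eprintln!("rejected out-of-bounds read at index {index}"),
        }
    }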


Saying "let the courts decide" when there is massive ambiguity for small projects and developers just means that many of us (I know I will) will region-block the EU until someone else deals with the court system and provides clarity for the rest of us. It is way, way better for legislators to provide intent and clarity than to leave things uncertain and ambiguous for the courts to decide. If they truly aren't going to care about a single corporate committer, then lay that out; if there is a maximum donation threshold (indexed to inflation) for what can be considered reasonable corporate donations, then specify it. Otherwise, those of us who can't afford to be caught in a legal battle on another continent will just shut things down.

That results in less software overall for the EU to use and innovate on. Perhaps it will result in better battle-hardened software for the EU in general, but considering how big a dependency-chain problem OSS has (with many of those dependencies being small projects), I'm doubtful that will happen anytime soon.


I don't see a reason to do this (region-blocking the whole EU). The legislation's enforcement is very similar to the GDPR's: did you get a 15M USD fine for Google Analytics on your blog? What's similar is that:

1. There will be authorities overseeing this. Court action will not be the first step (Europe is not a litigious place, unlike some other jurisdictions).

2. The first step would be a corrective action to "Ensure that the manufacturer remedies the cybersecurity vulnerabilities" [1]. It is essentially like GDPR, where you can email the blog owner or the authority and complain that a blog uses Google Analytics, and the first response is simply "stop". Similarly, here the first step would be a demand from a user and/or authority that you address a CVE in your project.

3. Only if you ignore that could there be a court case or fines.

If you are really afraid, you can stop offering your software/libs in Europe at stage 2. And I am quite certain that this would carry minimal risk (though I am not a lawyer, just so we are clear).

[1]: https://blog.huawei.com/2022/09/29/cyber-resilience-act-enha...


I did not get a 15M USD fine for Google Analytics on my blog.

Instead, I am working for state agencies that are shy of using even a locally installed Matomo for web analytics, out of fear of collecting too much PII because of GDPR. It is a daily tax on my sanity and a real problem for fellow citizens because of the worsened service.

GDPR had two effects on the industry in the EU:

1. Chilling effect. No one wants to do anything that has GDPR considerations. Better to mine bitcoins.

2. Grilling effect. There is a proliferation of DPO jobs, and the people landing in these jobs make everyone's life harder because they are incompetent by definition but still want to justify their fake jobs and cushy salaries.

I looked at your other comments and your profile and I see that you have a vested interest in software lifecycle management. CRA will help grow its visibility for sure. You probably conflate your personal goals with this intellectual discussion.

Still, the overwhelming majority on HN is aware of GDPR, knows at least tangentially how poorly written EU directives are, and knows how much depends on public opinion and "policy".

It's obvious that if the CRA gets adopted, it will go pretty much the same way:

1. Lobbyists will seed FUD

2. Businesses will over-protect and waste precious resources

3. Hobbyists will lose

4. The stated goal will not be achieved.


All right, now that someone downvoted you (not me), I feel a mild obligation to respond.

Regarding the ad hominem part, I invite you to watch https://www.youtube.com/watch?v=Gv2I7qTux7g to understand why I think our industry needs to elevate the level of our craft. And also please take a look at the actual requirements CRA puts on devs in https://eur-lex.europa.eu/resource.html?uri=cellar:864f472b-... (pages 2 and 3 – yes, just those two pages* plus requirements on documentation on page 8, which could be a bit more annoying than the requirements on pp. 2 and 3). I hope you will find them reasonable for the most part.

Regarding GDPR, I am indeed sad that so many people interpret it incorrectly. This happens in part due to the influence of various groups, as you say. I invite you to read the blog of https://noyb.eu/en to understand the spirit of GDPR (indeed, there is a thing called "data minimization" that could be the reason you find it difficult to collect more data without a solid justification; in a fun twist, §1(3)(e) of the CRA annex also mentions data minimization) and to see that progress is being made slowly but steadily. If you have noticed big websites recently showing the option to deny tracking cookies directly instead of "manage cookies", you have these folks to thank ( https://noyb.eu/en/where-did-all-reject-buttons-come ). BTW, I donate to them and think they are doing awesome work.

* unless you are doing "serious stuff" (TM) as described on pages 5 and 6.


The requirement to provide a postal address is pretty bad if you are an individual developer, or a decentralized organization without an office.

And the "if applicable" part is kind of vague. For just one example, is it applicable for a database to have built in support for encryption at rest? Or is it sufficient to depend on the user setting up an encrypted filesystem? 1.3 is a reasonable list for a complete system, but less so for individual components. Some of those items, such as authentication, event monitoring, and high availability, are frequently "enterprise" features for open core projects. I'm not sure what the impact of that would be. Maybe companies will start including those in the opens offerings, or maybe we'll see those projects become completely proprietary.

And a lot of open source projects do the "serious stuff" described on pages 5 and 6. Some of them accept donations but have very small teams.

I don't think putting this burden entirely on the developers of open source projects is the right way to do it. I agree with the spirit of this, but think the implementation has some serious problems. I feel much the same way about GDPR.


> The requirement to provide a postal address is pretty bad

In Germany/Austria, everyone with a blog must give their address in an Imprint (see the bottom of http://armin.ronacher.eu/#contact , for example). In Sweden, everyone's address is in a public register (see https://www.hitta.se/ , for example). In other words, Americans may find this mildly intrusive from a privacy perspective, but in Europe it's common to know who you are dealing with. Often, I check what country a SaaS company is based in before signing up and get quite uncomfortable when neither the Privacy Policy nor the ToS pages mention even the country, let alone an address.

> decentralized organization without an office

I think there were quite specific remarks that dev teams that don't have central management, or at least leadership, will not be covered by this regulation, e.g. Mastodon devs.

> And the "if applicable" part is kind of vague.

Yes, I agree with you that those points especially are a bit stressful because we don't know precise bounds. Precise bounds, however, tend to make any tech law obsolete very quickly. We had a similar panic with GDPR and now everything has settled rather nicely (in my opinion).

> 1.3 is a reasonable list for a complete system, but less so for individual components.

I think this is precisely why "if applicable" is there. You would just write "not applicable (this is a software component for use in a larger software product)". At least, that's what I plan to do.

> Some of those items, such as authentication, event monitoring, and high availability, are frequently "enterprise" features for open core projects. I'm not sure what the impact of that would be. Maybe companies will start including those in the opens offerings, or maybe we'll see those projects become completely proprietary.

I think this is the point where the EU is saying "enough is enough". Just like GDPR largely "cancelled" the business model of "if you are not paying for the product, you are the product", I think the CRA says open-core projects can be crippled in any way companies want, except when it comes to security. And while I see how it will be painful for some companies, I also understand the hard regulatory line the EU has taken here.

> And a lot of open source projects do the "serious stuff" described on pages 5 and 6. Some of which accept donations but have very small teams.

Again, I think the EU is saying "enough is enough". We can't have our most essential systems be vulnerable just because they are maintained by an unpaid dev in Nebraska ( https://xkcd.com/2347/ ). The EU is essentially forcing businesses to donate enough money to audit those allegedly crucial pieces of software or have those projects close down. However, I want to note that companies running such critical software would have to audit it whether it's OSS or not. Therefore, I think that for critical projects like WireGuard or libsodium, there will be enough corporate sponsors to split (!) the costs of an audit, because otherwise each company using WireGuard would have to pay the same costs to repeat the audit over and over again.

And to be clear, simply refusing donations will not get a project out of CRA compliance. I think this is FUD being spread here. If a project is usable "in the course of a commercial activity" and gets regular releases, it will have to comply with the CRA. One example I can think of is apache2. It is used by millions of business websites, and even if the core devs don't accept donations, it's still software clearly usable "in the course of a commercial activity".


>> "In other words, Americans may find this midly intrusive from a privacy perspective, but in Europe it's common to know who are you dealing with."

It's not intrusive, it's dangerous. I know it's popular to say "the US is different" in cases where those differences don't account for why things aren't done, but this is one of the exceptions. We don't really have much in the way of protections, so it only takes a few pieces of information to find out everything about someone. An address is one of those. That's useful for creeps, snoops, identity thieves, stalkers, and other unscrupulous characters.

Also, SWATting. I really don't want someone to be able to easily find my address to send a murder squad to because I said/did something they don't like. We also have more guns than people, so someone with sufficient disconnect from reason might skip calling the police and take matters into their own hands.


This is where I find myself agreeing with smarx007.

I ran a community website with hundreds of active users and tens of thousands of occasional visitors for 20 years. I received regular death threats and all kinds of insults, and my address could be found online with some effort, but no one ever showed up at my doorstep over those years to claim what was due.

The only mild doorstep clash I had was with the disgruntled husband of the babysitter we fired on the spot. A totally offline affair.

Dunno about the USA, but a properly functioning society does not need PO boxes or fences.

And the only people working hard to take down the UBO register are Putin's cronies (this is no exaggeration; google Patrick Hansen).


> We can't have our most essential systems be vulnerable just because they are maintained by an unpaid dev in Nebraska

I don't disagree with that. But I don't think saying that one dev in Nebraska has to pay for security audits, or at least convince companies who use their project to pay for them and take charge of coordinating that effort, is the right way to solve the problem. I suspect that this will result in some projects distancing themselves from the EU and will have a chilling effect on new OSS projects in these areas, especially inside the EU.


> the overwhelming majority on HN is aware of GDPR, knows at least tangentially how poorly written EU directives are

No. Most GDPR threads on this website are full of speculation or beliefs that are backed by nothing. Most comment authors would not be able to provide a source for their claims, whether a citation from the legal text, interpretations (e.g. from the EDPB, a DPA, or even GDPRHub), or actual cases. They don't even make the effort to understand its principles (data minimization, transparency). And I don't see how you can say that EU directives and regulations are poorly written when you don't even take the time to read the one that may have an impact on your work. I know developers here are not lawyers, but neither am I, and I still read the thing (and ePrivacy), so I actually know my rights and can ask somewhat relevant questions wrt. data protection in my work.


I probably missed it (or it's in one of the many documents on the ITRE page), but in the quoted Recital (10) on the GitHub blog I'm not seeing a distinction with regard to the use of a product.

Commercial or not (within the context of the CRA) seems to be based on the development structure and the offering of related services. For example, they explicitly allow for a distributed model where "no single commercial entity" has full control.

I'm not doubting that what you said is true; I'm interested to learn more, because in general this directive seems to be a big step in the right direction.

Do you happen to have a link to where I can read the current ITRE draft in its entirety?


The full draft is under https://eur-lex.europa.eu/resource.html?uri=cellar:864f472b-... ; the body starts on page 14.

Upd: the amendment from 18.4.2023 is available under https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX:52...


Thank you very much.

That seems to be the original draft from the Commission, before the considerable changes added since then by the various committees.


Thank you for checking carefully! I updated my original message with the extra link for the 18.4.2023 amendment.


These suckers badly need version control.



