
> With that said, what do you make of the point that things that are clearly not suitable for production (and sometimes labeled accordingly) have a nasty habit of making their way into production anyway? Do you think it has salience here? You're clearly a thoughtful person with your heart in the right place, so I'd very much like to hear your opinion.

I don't think I have a good answer. I'm currently serving a little over 6 million requests a month for a service I never advertised, yet some non-zero number of developers copied code from Stack Overflow, or wrote code based on my GitHub project, to hit an endpoint that was just a demo. Since it was unauthenticated, and I didn't notice the traffic until much later, I have no good way to shut this down without pulling the rug out from under people. Eventually I will have to, but since it's not really costing me anything extra, I've left it.

Now, I did put a warning on it when I noticed the traffic, but that was 3 years (and 2 million fewer monthly requests) ago, so...
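For what it's worth, one way to wind an endpoint like this down without an abrupt rug-pull is to start advertising the shutdown in the responses themselves, via the Deprecation and Sunset HTTP headers (the latter from RFC 8594). A minimal sketch; the endpoint, shutdown date, and notice URL here are all hypothetical:

```python
from datetime import datetime, timezone
from email.utils import format_datetime


def deprecation_headers(sunset: datetime) -> dict:
    # Headers a demo endpoint could attach to every response so callers
    # get advance notice before the endpoint goes away. "Sunset" is the
    # standard header from RFC 8594; the Link target is a hypothetical
    # notice page, not a real URL.
    return {
        "Deprecation": "true",
        "Sunset": format_datetime(sunset, usegmt=True),
        "Link": '<https://example.com/demo-shutdown-notice>; rel="sunset"',
    }


headers = deprecation_headers(datetime(2026, 1, 1, tzinfo=timezone.utc))
print(headers["Sunset"])  # Thu, 01 Jan 2026 00:00:00 GMT
```

Well-behaved clients (or at least their monitoring) can then see the deadline coming, though of course the ones who copied a snippet off Stack Overflow probably aren't checking response headers either.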

I guess, yes, we as developers should be better about putting warnings and the like on our projects, but by the same (if not greater) token, people using existing code/services bear the responsibility to vet the things they use. I don't know how to get people to vet stuff better. Hell, I don't know how to make myself do that. That said, you take risks every day: you risk your life when you get behind the wheel, and you risk security holes when you don't vet the code you use or build upon. I've weighed the risks of driving against the rewards and found the risk acceptable. For a lot of the software I use, I've made the same decision. I just don't have the time and energy to vet everything myself, and timelines and other pressures at work make it equally impossible there.

At what point do you say "Ehh, it's secure enough below this point"?

* An npm library with thousands of dependencies?

* An npm library with no dependencies?

* npm itself?

* Node.js?

* The OS?

* The hardware?

I honestly don't know a consistent/standardized way to handle it.
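Part of what makes that first rung so hard to draw a line under is that trust is transitive: vetting one package really means vetting everything it pulls in. A toy sketch of that fan-out (all package names here are made up; real npm trees are resolved the same way, just orders of magnitude larger):

```python
from collections import deque

# Hypothetical dependency graph: each package lists its direct deps.
DEPS = {
    "my-app": ["web-framework", "logger"],
    "web-framework": ["router", "http-utils", "logger"],
    "router": ["path-match"],
    "http-utils": ["encoding"],
    "logger": ["colors"],
    "path-match": [],
    "encoding": [],
    "colors": [],
}


def transitive_deps(pkg: str) -> set:
    # BFS over the graph: everything you'd have to vet to trust `pkg`.
    seen, queue = set(), deque(DEPS.get(pkg, []))
    while queue:
        dep = queue.popleft()
        if dep not in seen:
            seen.add(dep)
            queue.extend(DEPS.get(dep, []))
    return seen


print(len(transitive_deps("my-app")))  # 2 direct deps pull in 7 packages
```

Two direct dependencies already mean seven packages to audit; a typical web app's tree runs into the thousands, which is exactly why "secure enough below this point" gets decided by exhaustion rather than analysis.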

As you so correctly and wisely point out, we take all kinds of risks every day. It's an inherent part of life. Yet we also generally accept that things that can be done in a safer way should be when reasonably possible.

At this point I'm leaning towards deprecating the idea that encouragement, positivity, and documentation will lead to developers making good decisions. As you've demonstrated, they're clearly insufficient, and I suspect your experience is an outlier in degree but far from unique in kind.

Increasingly, I think we may have to consider measures that ensure developers get it right the first time. And we have to be sure our peers do too, because we're all living in the same environment and context. Otherwise, sooner or later, someone who didn't read a warning label is going to try to build a PII-handling business on Zero Server (or similar) and it's going to be a dumpster fire.

I would love a world where friendly encouragement, niceness, and documentation could scalably do this. Sadly, it's perhaps possible that that world and this one could be slightly different.

Your conclusion seems to eschew the "honey" for... "vinegar"? I'm not disagreeing; I'm just trying to figure out what this would look like.

Some might imagine it as eschewing heedless amateurism in favor of mature professionalism.

One starting point might be re-examining whether all the negativity shown in reaction to Zero Server is actually excessive. Some of it surely is! It's also perhaps possible that some of it could be viewed as the horrified reactions of professionals who care about quality (and think about downstream effects) coming face-to-face with reckless engineering.

> Some might imagine it as eschewing heedless amateurism in favor of mature professionalism.

I don't fully understand what you're advocating, so I apologize if I'm misinterpreting or mischaracterizing your position. But I can't imagine "mature professionalism" including berating or embarrassing another developer for creating a security hole. Some people may need that, but most don't. Many devs I've worked with are horrified when they find out they wrote in a vulnerability, and they try to grow and find better approaches. That's startup culture, though.

When I worked in enterprise-type environments, it was the opposite. There, you could send the dev a working exploit and most of them would groan and bitch about how "you security guys just want to tear things down and break things." I can understand being more of a dick in those situations, but I would still contend that's a venting of frustration more than a desire to actually teach. Most people just get defensive and put up walls when they feel attacked or embarrassed, and the learning is over at that point.

Berating someone is never helpful. Setting out to embarrass someone is never helpful. The goal should never be to blame someone. The goal should always be to educate someone about what has happened, what exploits have been enabled, and how this could cause harm.

One issue is that full-throated encouragement coupled with suggestions of problems couched in uncertainty is the Dale Carnegie approach. I suspect you're using it now! It's well-suited to a great many situations, as documented in the man's seminal work.

Unfortunately, this suitability is not universal. It is fallible, and in security those failures can be quite dangerous. The Carnegie approach thus described makes it very easy for devs to notice the encouragement and ignore the suggestions of criticism. I have personally encountered this reaction in both open source and enterprise-y contexts, generally from developers who might be charitably described as highly enthusiastic. Including right here on HN!

I've also encountered the hostility you described in reaction to kind, generous, compassionate security reports of the sort you suggest. This has happened in open source, startup-type, and enterprise-y environments.

The key to what I'm advocating is this: you are not your code and the other person is not their code. Who wrote a vulnerability is not as important as that it exists. How it can be fixed, and how it can be prevented in the future, are what matter.

Professionalism means understanding the distinction between a craftsperson and what they have produced. It also means understanding the distinction between yourself and your work. It means understanding that ignorance isn't a character flaw, it's a temporary state of affairs that can be fixed.

It also means realizing that someone who refuses to participate in fixing their technical ignorance is being unprofessional. Such a person might benefit from correction.

Ok, I think I understand your position better, and if so I agree.

> The Carnegie approach thus described makes it very easy for devs to notice the encouragement and ignore the suggestions of criticism.

This is definitely true; I've seen it too. It's something to watch out for.

I guess at the end of the day I think it comes down to "know your audience." Similar to the Principle of Least Privilege, I like to follow (what I call) the Principle of Least Criticism: don't use any more criticism than necessary, but (and I suspect this is where we will both agree) use enough criticism that the person understands your point.

That's a wonderful idea! It's an ideal way to engage with people.

With that said, there might be some significant limitations to the approach. The principal one is that it relies on knowing the individuals involved fairly well. This is easy in a close-knit and small startup environment! It could perhaps be more difficult in a sizable enterprise or open source context where you don't know the other party, don't have time to build a relationship, and/or can't rely on a long conversation to slowly build up to the least amount of criticism required. There's also the question of how to handle groups or meetings in which different people have different thresholds. When one person's minimum required criticism is someone else's excessive negativity and they're both in the room, there might not be a winning outcome with this approach.

Knowing your audience is an excellent and wise maxim to live by. It might not always be as simple to live by as could be hoped.

Wise words, I completely agree :-)

Thanks for the discussion
