
It's time to have mandatory certification for people who develop critical systems. Once such certification exists, a negligent implementation can be treated as malpractice, and the developer can be sued for it (with the penalty paid, of course, by the insurance company that sold the malpractice insurance).

Doctors, lawyers, and many other professions have such a system; why can't we have one as well?




"Critical systems" pretty vague, and could be used to describe any system that processes payments or other basic things we use.

It's fundamentally different from malpractice, in my opinion. In health care, malpractice comes with obvious pieces of data: we know who the doctor is, we know their credentials, we know what information they had and when they had it, we know what they decided, what they prescribed, and what they said.

Software engineering is a team-based endeavor. Who exactly is responsible for unrecognized vulnerabilities? Everyone? No one? One dude who everyone sorta thought handled security stuff? It's as clear as mud.


>Who exactly is responsible for unrecognized vulnerabilities? Everyone? No one? One dude who everyone sorta thought handled security stuff? It's as clear as mud.

Security team with people who do it full time. Betting your security on the one dude who sorta did everything should be criminal.

Aka, not this: http://i.imgur.com/a7S95nG.jpg


>Who exactly is responsible for unrecognized vulnerabilities? Everyone? No one?

Here's a quote from Equifax's early release on the breach [0]:

Equifax said that it had hired a cybersecurity firm to conduct a review to determine the scale of the invasion.

So, to your question, I'm going with "no one", at least internally.

It's beyond belief (well, not really anymore): not only do they not have security covered internally (criminal in itself), but they don't even appear to have a regularly engaged cybersecurity firm. They had to go out and hire one after the fact.

[0] https://investor.equifax.com/news-and-events/news/2017/09-07...


What does "professional" even mean (from her past)? To me it means useless middle management that accomplishes nothing apart from moving numbers around to make them look good.


I always thought "professional" as the sole job description (i.e. not "professional X") was used as a euphemism for "prostitute", so I'm wondering why someone would put it on their resume like that. Did I just learn the word in the wrong context?


What if management doesn't hire a security team? What if management hires an incompetent security team?


>What if management doesn't hire a security team?

That's clearly negligence.

>What if management hires an incompetent security team?

That's harder to do, because you have to establish competence, which is what has led to a bunch of whiteboard hazing rituals in general software development and a lot of other insecurities. Being a security professional isn't regulated by law, so you can't check the law to determine whether someone is competent. So whose opinion do you trust, and why do you trust their competence? An expert witness, maybe?


>What if management doesn't hire a security team?

"That's clearly negligence."

Great, so you just made it illegal or impossible to create a startup. Congratulations.


>Great, so you just made it illegal or impossible to create a startup. Congratulations.

All of this is in the context of handling a lot of PII or sensitive information, in which case, yes, I don't want just any startup working with PII without some kind of security team.


If you're handling sufficiently private data, then there shouldn't be a low barrier to entry. Starting a medical startup without the requisite expertise would be negligent; I don't see why certain classes of private information should be different.


Real engineers have a system in place for this. It's called "Professional Engineer" and it's managed by NCEES. There is no possible reason that practice cannot directly apply to software engineering, except for the cultural refusal of software engineers to take responsibility for anything.


In fact, there has been a Software Engineering PE exam since 2013. It's not surprising that you don't hear a lot about it, because most of the topics on the test would make the average CS student groan (requirements, maintenance, software development lifecycle, etc.).

https://ncees.org/ncees-introduces-pe-exam-for-software-engi...

Exam specs: https://ncees.org/wp-content/uploads/2015/07/SWE-Apr-2013.pd...


I think one of the problems is that it's just not societally necessary for 95% of software. If a game is shitty or an order entry system crashes occasionally, nobody dies. Nobody really even cares. Normal social and market mechanisms mean most software at least approaches adequacy.

In at least some of the areas where we really care about software quality (e.g., banking, medical devices) there are existing regulators who will fuck your shit up if you don't take certain aspects of quality seriously. Which is good, but I think it's part of why we don't have an industry-wide program.

Maybe we should take a lesson from Hammurabi:

"If a builder build a house for some one, and does not construct it properly, and the house which he built fall in and kill its owner, then that builder shall be put to death. If it kill the son of the owner the son of that builder shall be put to death." [1]

The occasional execution would probably make people much more serious about unit testing.

[1] http://mcadams.posc.mu.edu/txt/ah/Assyria/Hammurabi.html#Ham...


That would be acceptable if we were talking about buildings, a blue-collar job. But if you try to apply it to a white-collar executive, you're going to run into social resistance of a great magnitude. White-collar crime is a social norm and is only very rarely even lightly punished. It is, to a degree, expected. White-collar crime kills more people and does much more economic damage every year than street crime, but our society has established a norm of treating street crime harshly while turning a blind eye to white-collar crime. If the builder's company gave the builder substandard materials to build with and refused to supply him with the tools or the time he needed, few would get behind the idea of executing the executive who got his shareholders a 0.1% bump in profitability that quarter through those cuts, no matter who it killed.

Just look at Toyota's "unintended acceleration" case. If their firmware engineers had had access to static analysis tools (a few grand for a license), the bug would have been pointed out to them immediately. Instead, Toyota hired inexperienced engineers, deprived them of appropriate tooling, and pushed the cars out to the marketplace, where they killed people. The result? Toyota was cleared of any wrongdoing. They're computers. They're too complicated. No one can know how they work.


Yeah, far too much executive crime gets to a "who could have known" resolution when they certainly should have known. Or a "few bad apples" resolution when the system the executives designed created and rewarded bad apples.

I would love to see that change. Right now, though, we're in a big wave of "inequality is great", which I think strongly contributes to this problem. Let's hope that wave crashes, letting us start to hold executives and managers accountable.


I agree in part, but I think there are a few things about this scenario that highlight the problems with software. First is its extreme mutability: you can endlessly patch it, and often have to when vulnerabilities or flaws are discovered. Unfortunately this tends to lower the bar for a first release. Second, if you want to be cost-effective you must leverage many existing components of mostly unknown provenance and quality. Finally, the security aspect is extremely difficult because both the cost and the risk of mounting an attack are extremely low.


> First is its extreme mutability: you can endlessly patch it, and often have to when vulnerabilities or flaws are discovered.

Sometimes, instead of patching, the software should be decommissioned. Search the news for planes that were grounded when serious flaws were found.

> Second, if you want to be cost-effective you must leverage many existing components of mostly unknown provenance and quality.

There are different components for different kinds of requirements. You wouldn't use components meant for two-story buildings to build a skyscraper.

> Finally the security aspect is extremely difficult because both the cost and risk of mounting an attack are extremely low.

If the risks are high, systems shouldn't be deployed. There's a reason we don't allow people to have machine guns for self-defense.


This concept is really not at all portable to software, especially security. It's a tempting analogy, but an invalid one.


No, it's not even an analogy. The precise methods and regulations are almost directly transferable. People are doing it. It works.

It just needs to be industry-wide.


No, it doesn't work at all.


While I agree, how do you apply software engineering practices in a field where a good chunk of the workforce doesn't have formal computer science education?


The same way real engineers work: classroom training in formal engineering, followed by years of experience under an accredited engineer in the field. There is testing at each transition to weed out the skaters.

Software engineers don't need to be computer scientists, in the same way civil engineers don't need to be materials scientists.

There is a bootstrap process, and even in other industries not all engineers are PEs... but all projects are reviewed and stamped by PEs.


Even if the whole workforce had formal computer science degrees, most of us still wouldn't have formal engineering education. The CS programs turn out computer scientists, not professional engineers.


And even if they did have formal engineering education and formal CS degrees and formal whatever, most (all?) would still be incapable of writing/designing/implementing bulletproof code.


We have literally centuries of history in engineering in the physical sciences to use as an example.

This industry resists because it's filled with CS folks who either can't or won't believe that there is anything more to engineering than data structures and algorithms trivia.


The management who told the developers "we need this done by tomorrow, figure something out or it won't be good for you"


>The management who told the developers "we need this done by tomorrow, figure something out or it won't be good for you"

Imagine that management tells their lawyers, "we need this done by tomorrow, figure something out." Most likely, the lawyers will either refuse to do the work or report the management to law enforcement.


truth!


Structural engineers have to deal with these sorts of issues. They do not build a bridge and say "this bridge is safe." They build it and say "this bridge will function within X, Y, and Z parameters for A number of years if maintained in this way" and similar things. They're dealing with a system which is known to not be totally invulnerable. They do it through comprehensive testing, scientific methods, and, above all, through trusting those technical concerns to the total exclusion of business goals. If it is 90% cheaper to use a weaker concrete, they do not substitute it in and cross their fingers. And if the CEO goes behind their back and does the substitution, or he refuses to provide them with the expensive physical simulation software necessary to do their job, or he ignores safety concerns raised by his engineers, that CEO goes to prison and the company is usually destroyed. This is starkly different from technology companies where suggesting such practices is basically asking them to completely restructure their entire organization fundamentally.


How does this work in civil engineering, or construction in general? It is also a team-based endeavor. The way I understand it, only the engineers or managers who direct the project need to go through certification.


We certainly can have such a thing, and exactly that has been discussed for well over a decade (probably much longer, but I'm only so old) in the ACM, among other organizations. It's a difficult issue. Creating a set of standards and a certification process has a lot of pitfalls. Failing to create such a process has a lot of pitfalls. Knowingly choosing to step into one set of pitfalls over another is never a comfortable choice, and people are generally very bad at it until something really, really bad happens that gives a large number of people enough irrational fear of one option to push them toward the other (and in that case they will rabidly and aggressively oppose any discussion of acknowledging or compensating for the pitfalls they're moving toward; they feel entitled to a "clean" option and will demand you pretend the option they're going for is that).

Rather than compare it to doctors, lawyers, etc., I would compare it to structural and civil engineers. Those are the sorts of regulations we require. If the CEO of a construction company ignores warnings given by one of his structural engineers while building a bridge, that CEO is held responsible for criminal negligence and put in prison for a long time. The same needs to happen for technology company management who cut the development timeline, deprive developers of adequate tools and a decent work environment, and hire inexperienced development staff simply because they're cheap.

Would you like to drive across a bridge if you knew the company operated the way tech companies operate? Viewing their engineers as a cost center to be reduced, as little more than spoiled typists whose technical concerns are always treated as unimportant in the face of business goals, crammed into office spaces shown by over 1000 studies to damage productivity, and constantly pressured to rush through everything in defiance of the basic biological fact that human beings are not capable of extended periods of mental exertion, especially in the face of constant interruption? Would it make you more or less confident in that bridge if there were court precedent for companies whose practices resulted in people's deaths being let off without punishment? That's the situation we're in.


The sheer scale is incomparable. A doctor's mistake during surgery doesn't quite compare to losing the data of 147 million people.

"Critical systems" developers would need astronomically expensive insurance to even exist, and therefore prohibitively high salaries.

I personally believe there should be some measure of a corporate death penalty to emphasize the responsibility involved, though.


>The sheer scale is incomparable. A doctor's mistake during surgery doesn't quite compare to losing the data of 147 million people.

Then companies shouldn't have such a high concentration of risk in one place.

The problem wouldn't be such a disaster if an SSN alone weren't enough to get a loan. For example, if we had a password in addition to the SSN (stored in hashed form), the problem would be much less severe.
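To make that concrete, here is a minimal sketch (Python, standard library only) of how such a secondary secret could be stored and verified; the function names, the PBKDF2 parameters, and the enrollment flow are illustrative assumptions, not a description of any existing lender's system:

    import hashlib
    import hmac
    import os

    def hash_secret(secret: str, salt: bytes | None = None) -> tuple[bytes, bytes]:
        # Derive a salted, deliberately slow hash of the user's secret.
        # Only (salt, digest) is stored, never the secret itself, so a
        # database dump alone is not enough to impersonate anyone.
        salt = salt or os.urandom(16)
        digest = hashlib.pbkdf2_hmac("sha256", secret.encode(), salt, 600_000)
        return salt, digest

    def verify_secret(secret: str, salt: bytes, stored_digest: bytes) -> bool:
        # Recompute the digest with the stored salt and compare in constant time.
        _, candidate = hash_secret(secret, salt)
        return hmac.compare_digest(candidate, stored_digest)

    # Hypothetical flow: store (salt, digest) at enrollment, then require the
    # secret (not just the SSN) before approving a loan application.
    salt, digest = hash_secret("correct horse battery staple")
    assert verify_secret("correct horse battery staple", salt, digest)
    assert not verify_secret("123-45-6789", salt, digest)

The point isn't the particular primitive; it's that the stolen copy of the data, by itself, stops being sufficient to open credit in someone's name.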


If the risk of such a thing failing is so large, this seems like a point against your argument. We MUST be able to reduce the risk of such systems failing, and do so provably. If we cannot reduce the cost, we must reduce the risk, such that the calculation makes the system affordable again.


Developers don't control budgets and deadlines at large companies, management does. So what does this "certified" individual do when he's given a project without resources allocated for proper security auditing? Does he intentionally get fired for refusing the assignment? That works if he has bountiful savings, no mortgage, no kids. Surely no unethical contracting company will pick up the job after he leaves...


If only there were more software jobs out there, then they wouldn't be hemmed in so intractably.


Finding another software job is insufficient. You need to find one that gives its employees the time and space they say they need, regardless of other competitive or financial pressures. Not so easy.


Nice red herring


> Developers don't control budgets and deadlines at large companies, management does.

True, but that's why good organizations almost always have technical people on the management team who advocate for the technical arm of the company and ensure that it is appropriately resourced.


Seems very beside the point. Equifax obviously isn't a good company.


How does it work with lawyers or engineers?


Engineers say "no, we can't build it that way" and people respect it because they know that engineer: knows their work and has recourse through their professional society as well as government oversight agencies if undue management pressure compromises safety. I don't see it playing out the same way for most software teams.


As licensed professionals, they work under the knowledge that there are practices and outcomes that can cause them to lose their licenses.


Once again, maybe it is time for such a thing to exist for software engineers who aren't web devs.


It's been suggested, but AFAIK some argue that fundamental aspects of software development make it hard, if not impossible, for it to ever be True Engineering. I'm pulling this out of my back pocket on a Saturday evening, so Google for more, but there are arguments on both sides that go back some years. Heck, google "is software development engineering" and you'll find long Quora threads on the topic.


"who aren't web devs"? Why throw that in? This breach occurred through a hacked website, as do so many others.


They will simply refuse to do the work. It's better to lose a job than to lose a license.


That doesn't really seem comparable.


Why not? The proposal was to license software engineers (at least for "critical" systems) the same way as those.


I'm not a lawyer, but it doesn't seem to me like they work in the same way that software developers do, with management breathing down their necks to do things in less time and telling them to cut this or that corner. If anything it seems more like the opposite, with the lawyer directing a lot of his staff to do legwork.

If anything I'd think accounting is a better fit, but even then, I can't say I've met many people in the software industry who think very highly of any sort of existing test or certification; why would this one do so much better at measuring actual skill?


something something software developers need a union something


The best and most legendary developers still make mistakes that cause this kind of thing. And certified or not, all developers run into deadlines that result in production of imperfect code (and that word is key -- code has to be perfect ... always ... if that's your only defense).

While writing secure code using best practices is a big part of the security equation, any company that stops there will be pwned. The most oft-used illustration in securing systems is "the onion". You need to layer on protections, from firewalls (both traditional and application layer), to solid access control, to simply making sure that systems are configured properly (just hit up any SSL testing site and pop in just about any non-tech business' main web page - might as well just make the result page static with a big letter "F"). Heck, even technologies like ASLR/DEP are an extra layer.
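As a tiny illustration of the "make sure things are configured properly" layer, here is a sketch (Python, standard library only; the host name is a placeholder, and a real audit would check certificates, header policies, and far more) that just reports which TLS version and cipher a server actually negotiates, which is roughly where those SSL testing sites start:

    import socket
    import ssl

    def report_tls(hostname: str, port: int = 443) -> None:
        # Open a TLS connection using the platform's default trust store and
        # print what the server actually agreed to negotiate.
        context = ssl.create_default_context()
        with socket.create_connection((hostname, port), timeout=5) as sock:
            with context.wrap_socket(sock, server_hostname=hostname) as tls:
                name, _protocol, bits = tls.cipher()
                print(f"{hostname}: {tls.version()}, {name} ({bits}-bit)")

    # Placeholder host; point this only at infrastructure you're allowed to test.
    report_tls("example.com")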

The goal is to make an attacker have more than a few hurdles to hop[0] in order to breach you, and to ensure that if you are breached, what is exfiltrated is either worthless (e.g. properly hashed passwords) or the theft is detected and stopped before it's all gone (partial breaches aren't awesome, but it's easier to explain 1% of your data being leaked than it is to explain all of it being leaked).

[0] I've always liked that sarcastic advice of "If you and your friend encounter a bear ... run ... you needn't outrun the bear, you only need to outrun your friend". If you make things difficult enough, your attacker might move on to another target. And hopefully, if they succeed in breaching part of the defenses, someone will discover it before they return and shore up what wasn't "perfect".


That's not what's needed. I guarantee you if the companies responsible for these leaks and their executives faced real penalties with teeth things would get better, licensure scheme or no.


Like PCI compliance?

Ask people who've gone through that process how rigorous it is...


It's rigorous, but in all the wrong ways.

At $DAY_JOB our security falls into two buckets: (1) PCI and (2) stuff that keeps us secure.

IDK if it's possible to have a widely accepted security standard that isn't checking nonsensical and out-of-date boxes.


Well, it could be worse:

1) Stuff that keeps you insecure (a.k.a. ISO 27001 ISMS stuff)

2) Stuff that somewhat helps, but is covered by fluff (a.k.a. PCI-DSS)

3) Stuff that actually keeps you secure.

PCI-DSS at least gives you a sledgehammer to convince lazy low-level managers to dump ciphers like RC4 and encrypt some of their data. It's not utopia, and it does err too far on the side of perpetuating banks' infatuation with 3DES, but I've seen good stuff get done by using it as an excuse.

You still need to have a competent security team, of course, but it helps them not be ignored. Well, sometimes.


PCI is why we have weak passwords that we have to reinvent every 90 days.


I'm not a big fan of most tech certification efforts. They mainly document attendance at classes and/or ability to regurgitate trivia. They also tend to reward formality instead of quality, and slow progress. Consider, for example, if we'd had this sort of certification 10 years ago. It'd likely be filled with waterfall-style process requirements. Those don't actually increase safety; they just look impressive.

I'd be happier to see licensing and accountability. But it would have to have significant teeth. E.g., companies can't build systems of type X without somebody licensed. If there are problems with the system, then the person with the license faces personal fines and risk of suspension or loss of license. That would be less bad than certification, but it could still substantially slow industry progress if the licensing review board had a conservative tilt to it.

Of course, the real problem with most places is not engineers not knowing. It's with managers who push for things to happen despite what engineers advise. Licensing could sort of fix that, in that it could force engineers to act like professionals and refuse to do negligent work. But it still lets shitty managers off the hook.

So what I'd really like to see is a regulatory apparatus for PII. In the same way that the EPA comes after you for a toxics spill, an agency would come after you for a data spill. They investigate, they report, they impose massive fines when they think it's warranted. And when they do ream a company for negligent management of data, executives at all the peer companies get scared and listen to the engineers for a while.


I may agree, but Equifax is by no reasonable definition "a critical system".


There are levels of "criticality". Credit monitoring is in a different category from, say, spacecraft or nuclear power plants. But if a security breach can lead easily to identity theft, and therefore to a catastrophic compromise of one's financial stability, that should require some higher level of mandatory certification.


I don't know what definition of "critical" we're going with, but the fact that they have the SSN of basically every American makes them an important weak point.


It is really time to move past SSNs


Tell that to people who are denied loans thanks to credit reporting agencies.


Maybe it is, but that's also just a way of trying to push the costs onto the low-level employees. The attitudinal problem here is with the senior management and shareholders, who are the actual responsible parties. If they really cared, they'd have established such a certification or announced an intention to adhere to some existing standard.




