Software projects with a significant impact on public safety or well-being are relatively rare. You can easily apply additional policies to them - and indeed that is what is done (see airplane software, medical software, etc.).
But will you really call your union because your manager forces you to skip writing unit tests for each and every class you write? Bearing in mind you are just churning out some crappy e-commerce website?
Another thing entirely is that even if you are careful with your own code, the third-party components you use may be more lax. This is well recognized, e.g., in medical software, where any third-party code must be identified and held to the same standards as the software you write.
It is very difficult to say who is to blame for a given software bug. Is it the person who configured the software, because they chose invalid config parameters? Is it the intern who wrote an invalid piece of parsing code? Is it the system architect who put wrong data into the specification?
For any given piece of software there are many people responsible in various ways for its creation. In such contexts you cannot blame individuals, especially when their intentions were good. Even very careful programmers introduce bugs.
> just churning out some crappy e-commerce website?
That can affect people's lives dramatically though. An off-by-one error charges someone's debit card $2,000 instead of $200 and now they can't buy food. Ecomm is important.
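A hypothetical sketch of how that kind of tenfold overcharge can happen: assume amounts are stored as integer cents (a common convention) and someone formats them with the wrong divisor. The function names and the cents-based storage are assumptions for illustration, not from the comment above.

```python
# Hypothetical billing code: monetary amounts stored as integer cents.

def format_charge_buggy(amount_cents: int) -> str:
    # Bug: one zero missing from the divisor (10 instead of 100),
    # so every charge is inflated tenfold.
    return f"{amount_cents / 10:.2f}"

def format_charge_fixed(amount_cents: int) -> str:
    # Correct: 100 cents to the dollar.
    return f"{amount_cents / 100:.2f}"

price_cents = 20000  # $200.00

print(format_charge_buggy(price_cents))  # "2000.00" -- customer billed $2,000
print(format_charge_fixed(price_cents))  # "200.00"
```

A one-character mistake like this sails through a build and through code review that isn't looking for it; a unit test comparing formatted output against a known amount would catch it immediately, which is the point the parent comment is making about skipping tests.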
This comes back to how software is designed and built. On its own, a single bad weld should never endanger a bridge, but we accept that a single mistake can bring down software. At its core, that's why software fails so often AND that's what we need to fix.
A single bad weld is isolated; a single bug can propagate a lot of bad data. Software crashing isn't the same as a bridge collapsing, and in many cases crashing is the best thing that can happen.
> But will you really call your union because your manager forces you to skip writing unit tests for each and every class you write? Having in mind you are just churning out some crappy e-commerce website?
Firstly, engineering bodies are not a union. They are legal organizations that take action to protect the public from actual harm. In the case of the software described in TFA, quantifiable harm was caused to people because of poorly tested and poorly integrated software.
Secondly, not every piece of code requires this level of oversight. As an example, you don't need a civil engineer to sign off before you fix a picture frame on your wall; however, if you decide to remove a load-bearing wall, you had better have someone sign off that you're not collapsing your house, or selling it to someone else who won't know about the problem.
In the same way, the biggest restriction this creates for most people is that they won't be able to call themselves a software "engineer" and won't have the authority to sign off on a project (in the same way you sign off that work on a bridge has been completed). This already exists in Canada and doesn't prevent anyone from creating a crappy e-commerce site. Protecting the word "engineer" is good for multiple reasons, and holding software to the same standards as we hold the rest of our public projects is not only a good idea in theory, but probably saves cost in the long run too. Think of all the people who will sue over being wrongfully arrested or put on the sex offenders registry.
> For any given piece of software there are many people responsible in various ways for its creation.
As with any large project. We don't blame the individual electrician and make them legally culpable if the light sockets start a house fire. However, if the sockets are faulty and the lead engineer _knew_ that (or failed to do the due diligence that would have revealed it), yet still signed off to let the construction finish and endangered someone who purchased the home, then the engineer is liable (to an extent for damages, but more importantly, they will lose the right to work as an engineer in the future). The most salient point is that you create a code of responsibilities for engineers towards the public, and you build the legal framework so that projects that harm or endanger the public can be reined in.
For projects like the one in TFA, I can imagine that under an organization like APEGA, you could treat software similarly to traditional projects by adding integration requirements to the contract, and requiring a certain period (90-120 days, maybe?) after all the data has been transferred to soft-test the system in production and see if it meets the on-site requirements (i.e. matches what is done manually). Does this make the contract more expensive? Yes. By having processes like these, can we be reasonably sure that we aren't going to cause a large amount of external harm or grief? Maybe. Building a culture where we don't just ship a proof-of-concept that can potentially imprison people because our manager had short-term goals starts by making it clear who is liable, who is accountable, and where your responsibility lies.