Not knowing is even worse.
The US, meanwhile, has been engaged in wars almost non-stop since WW2. It's "unprecedented world peace" for only a select few nations.
And, most importantly, the costs of failure have gone up dramatically with the invention of nuclear weapons. The Cuban Missile Crisis is just one of several near-catastrophic events. Sooner or later, we will get a madman in a position of power, fall victim to a misunderstanding, or suffer a technical failure.
And that'll be that...
You don't need territory (at least at a price that steep) and you certainly don't need slave labor in 2018.
Aside: the book looked interesting. Skimmed it -- beautiful typesetting, not gratuitously mathematical, looked readable. I may be wrong, but for me this is just an exercise in becoming aware of its existence.
How many doctors are willing to give up their title to do unethical stuff?
An ethical code will change people's perception, and that's a big thing. It will morph the perception of Google and Facebook from cool tech companies into dark places where no decent people want to work; unless of course these companies are willing to change. And there will be a huge effect on politics, other businesses, and media.
Software engineers are surrogates for ALL jobs because software is used everywhere. When we serve medicine, we should be obliged to take the Hippocratic oath. When we serve the military or intel agencies (with a security clearance), we ARE obliged to take a formal oath of secrecy. IMHO, many other S/W roles bear no less responsibility.
I think it's time that we software infrastructuralists take our role behind the scenes as seriously as those on the front lines do.
I've worked for several past employers whom I now disrespect (and whose leadership has since earned them this disrepute), so this issue isn't merely hypothetical for me.
The principal question isn't about policing and punishment. It's about civic duty as an enlightened human being. Each of us either takes responsibility for our actions and does no harm, or we willingly do harm. On our part, that necessitates continuous diligence: taking an interest in how the products of our work affect others.
Software has become an inescapable part of our society's technical and social infrastructure. Like scientists and engineers, S/W pros bear responsibility for how our work is used. And how it's abused. That's all I'm saying. Each of us has to work out the details for ourselves, but dismissing these questions outright shirks that duty and, I believe, diminishes our humanity bit by bit.
You had shitty leaders, and I'm sorry about that. But maybe they were trying their best in a difficult situation -- it's probably not all fun choices. Or maybe they were just assholes -- I've updated.
I do not disagree with you on the "why" -- as Grove said, I want to know "how?" You assume that each person can be trusted to figure this out for themselves. Maybe some people can be, but if you look at the entire population, you will end up with a distribution where more and more force is needed to coerce the fringe elements into compliance. These fringes can destabilize the entire equilibrium, since it might snowball out of control as more and more people pile on after seeing the benefits that defection brings.
Then let them prove it! That's one thing an ethical code will ensure.
Morally, this argument quite obviously fails because it could be used to justify anything.
Practically, we see many professions where people of character have successfully resisted temptation: every government employee who is not corrupt comes to mind.
And even if they'll just hire the next guy/gal: If you say no to an offer, you were, by definition, their top choice. That next one down the ladder will therefore be slightly worse, making such businesses less profitable.