I actually believe that there's a culture war implied in this debate; the question of who deserves to reap the gains of automation is more than just philosophy or ethics. The question "is there inherent nobility in work itself?" seems to be just as much a political divide as any of the current popular hot-button issues. Your gut reaction says a lot about the regional values of where you grew up, whether you'd ever support a basic income, and whether you believe that someone's refusal to work should condemn them to destitution.
The closest comparison is the attitude people have if they find a wallet. In Japan, you will get your wallet back with cash intact. Yet in the west, there exists a large contingent of people who believe with all of their heart that God wanted them to find it, that the person who lost it should have been more careful, that they are just having a lucky day. Unless God shows up and declares one side to be ethically correct, it will remain a toss-up.
One thing I find fascinating about the article is the assumption that the cleverness was in the code written to automate the job. This is incorrect; the cleverness is in noticing that a task can be automated. Typically, the code itself is trivial.
Anyhow: automation is surely one of the best reasons every person should learn a little bit of programming. And even with that task accomplished, I suspect that the rate of people seeing the opportunity to automate will stay roughly flat.
I have told my work more than once that if they wanted to get the most out of me they should put me on the "front line" for a month or two, and let me decide what needs to be automated. It has always been laughed off as me being cute.
Try something like "In order to write better tools and automation, I need to empathize with the people using the tools and experiencing day-to-day pain. Can we find somebody who is willing to let me watch over their shoulder for a few minutes on one day?" Note the implicit bargaining for small amounts of time and only small intrusion, as well as the understanding that you are not a decider, but an observer; not an automator, but a pain-reliever.
There is a strange desire on this site to assume good faith. I have no idea where that comes from; nearly everybody is clearly posting with a heavy, thick personal bias. I suspect that folks only feel that they are arguing in good faith, and that their feelings blind them to their biases.
Here, I will show you.
I ask because I directly attribute almost every bit of "luck" I've experienced to people acting in good faith. It's literally been the most important decision I've ever made and stuck with.
Personally, I have no energy for discussions where people assume the worst by default. That falls way too close to "someone is wrong on the internet" for me. It's easy to find people to disagree with, and difficult to find smart people that can illuminate why you might be holding onto incorrect assumptions.
I've also spent time in our sandblast units and our loading docks and sales team.
How can I architect and build good software if I don't have a good understanding of the domain? And you can only get that by immersing yourself in it head first.
I'm lucky in that my boss cottoned onto the fact we get better software when I see the problem in the raw and not just his interpretation of the problem.
Wouldn't work for everyone; we are a small production/import company, so it's possible for me to learn the basics of each user's role (and I can physically go ask them if need be), but it's a huge win imo if it's possible.
The woman who had been wasting this time for years came up and hugged me in tears. No one was getting fired, but I'd just freed up enough time that she didn't have to work overtime every day. It was such a simple change, and it lifted such a weight off the users' shoulders; it went undone for years because nobody actually watched them using our software.
Unfortunately someone's authority was undermined and I was let go a few weeks later.
That is so depressing.
I have a huge amount of latitude in how I spend my time so if boss sees me in the loading dock scanning deliveries he assumes that I'm doing it for a good reason.
One of many reasons you'd need a crowbar and a tub of grease to get me to leave. I've had offers with 20% better compensation or more, including some developer management roles, but never even considered them. I'm well paid for a developer relative to the local market, I work a 37.5hr week, week in/week out, and my work is both appreciated and has a measurable impact. It's close to programmer heaven (except for the mess I inherited, but that's slowly getting sorted).
There is more to life than money including actually having time for a life.
Show, don't tell.
If I want something repetitive to work smoothly, I'll let my best devs do the work themselves for a week and dig their way out of it.
Heck, that's exactly how I got serious about coding in the first place. The guy in the post automated his job, decided not to tell anyone, and played games all day. I told my employer about my automation back then and they were like, "so you wanna quit now or join Engineering?"
Whatever floats your boat; I'm CTO by now.
Also, yes - anybody can do management, and anybody can do coding. Learning to do either effectively is a skill.
Are you just making these decisions off the top of your head? Or are you talking to people, having research done, soliciting feedback, etc? In the latter case, congrats, now you're a manager! :p (If you're at a small company, you've probably still got enough time to do some hands-on work too...)
IME bad technical managers don't occur at any higher frequency than bad engineers; I haven't worked in other industries to know what it looks like there.
But you get a lot more immediate pain working for a bad manager than next to a bad engineer...
Perhaps people are worried you'll automate them straight to a layoff?
There is such a thing as "process debt", analogous to technical debt: some things are done manually when they could be automated. This can be mundane routine work that might be 80% easy to automate, but doing it manually allows for oversight and flexibility, and it often involves systems that are not trivial to work with. Other times the manual process is simply cheaper to maintain than a "good enough" automation would be to build. You can also see this in the practice of offshoring and the amount of automation invested in offshored service centers (which benefit most from these solutions via specialization and scaling).
So even if the company hires a goddamn genius, if you want to automate stuff without breaking existing flows, you need quite a few months of understanding of the environment and the real needs of the process (business wise as well), etc.
And that's why managers think it's a joke idea.
(And it also comes off as extremely arrogant, because it's essentially calling your coworkers stupid by saying "I'll see something that you haven't noticed in years!")
It's also arrogant to assume that a fresh perspective will never yield fresh results.
I'm starting to think that the only way to work around the problem (if you actually are good at seeing things that the rest of the organization has blind spots for) is to become a consultant who is expected to suggest changes. Otherwise you will be stuck fighting against primate hierarchy instincts.
There's this saying in my country, roughly translated to "Guest for a second, sees a mile far" (sorry... it even rhymes and everything in Polish...)
Being immersed in something for a long time has a good chance of limiting your perspective to that thing only. While it's true that perhaps 90% of newcomers and their ideas simply miss the intricacies of the current process and would be disastrous if implemented, the remaining 10% is a genuine innovation which would never come from the inside.
Using your software for the intended work can be very eye opening. I have had many occasions where I had to use stuff I had written and quickly saw a lot of inconveniences that could easily be fixed. When I asked users about this they agreed but they never asked because they thought it would be a difficult change.
Recently I had to use our SAP system and there would be a ton of opportunities for small changes with huge time savings but somehow it seems they never trickle up to the devs.
That's what's easy to miss when thinking in terms of "automation". Automating existing work doesn't move us forward very much. Making the work unnecessary does.
A large part of how America sees work, poverty and wealth was historically influenced by the Protestant Work Ethic, and I think that lingers quite a bit today. I think it's why a lot of people view the poor as lazy (because obviously if they weren't lazy, they wouldn't be poor, right?).
If you fervently believe hard work will lead to success, how else do you explain the unsuccessful?
Those who work hard, keep a positive outlook, maintain themselves and their families, and are resourceful ... will likely be healthy and abundant.
The farmer who idles away doing not much will have a bad crop. The farmer who is responsible and diligent, mends his fences, ploughs well, keeps his equipment clean and proper and minds the livestock ... will likely do well.
Except in case of disaster of course, and then maybe some kind of metaphysical excuses might come into play!
> Except in case of disaster of course, and then maybe some kind of metaphysical excuses might come into play!
there's a particularly distressing variation of this kind of belief that, upon hearing news that a terrible disaster has stricken a distant country, killing tens of thousands, concludes that those victims must necessarily have deserved such a fate -- by not living life according to the well-understood principles, or not worshipping the correct god.
that conclusion is perhaps a reasonable and logical one given belief in enough axioms like "there is an omnipotent god", "if we live a certain way god will reward us", "the universe is generally fair".
This happens, but it's barely a thing.
I've never seen or heard of such a thing in real life, it's something you only hear about on CNN.
Services in mainline Churches don't really ever go into this kind of thing, the congregation would find it as wacky as you or I would.
Hard work combined with personal agency leads to success. Are you responsible for yourself? That is, are you taking responsibility for yourself? If so, you will find a way to make yourself successful. If you are looking for someone else to take responsibility for you, for your healthcare, for your retirement, for your general environment and well-being, then naturally you will be at the mercy of whoever happens to assume that responsibility.
Hard work results in being able to take advantage of more opportunities. But you still have to have those opportunities.
I think this is a case where people confuse correlation and causation. Hard work correlates with success, but it doesn't cause success.
Not true, hard work is a cause of success, just not the only cause. Luck (opportunity) is a factor, but a much more unreliable one at that. Genetics is also a factor (intelligence, passion, curiosity, playfulness).
I like to see this problem from the point of view of Reinforcement Learning - an agent playing a game in an environment, having to maximise rewards - which is the definition of success. The degree of success depends on many things, including:
- the environment (how complex it is, how many potential rewards it has for the agent)
- the internal structure of the agent (its ability to learn and perceive)
- bodily affordances (what the body can do)
- the exploration/exploitation strategy (will it try something new when it already has a strategy that is sort of ok)
- the amount of time it has to learn (life experience)
- if the agent exists as part of a multi-agent environment, then the strategies of the other agents count (cooperation/betrayal, tit for tat, forgiveness).
So there are many causes of success.
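The listed factors can be made concrete with a toy reinforcement-learning sketch. This is my own illustration, not anything from the thread: a two-armed bandit in which "success" (total reward) depends on the environment (the arms' payouts), the strategy (the exploration rate), and learning time (the number of steps). Rewards are deterministic to keep the sketch simple.

```python
import random

def run_agent(arm_means, epsilon, steps, seed=0):
    """Epsilon-greedy two-armed bandit. 'Success' (total reward) depends on
    the environment (arm_means), the strategy (epsilon), and time (steps)."""
    rng = random.Random(seed)
    estimates = [0.0] * len(arm_means)  # what the agent believes each arm pays
    counts = [0] * len(arm_means)
    total = 0.0
    for _ in range(steps):
        if rng.random() < epsilon:
            # explore: try something new even though a known strategy exists
            arm = rng.randrange(len(arm_means))
        else:
            # exploit: stick with the best strategy found so far
            arm = max(range(len(arm_means)), key=lambda a: estimates[a])
        reward = arm_means[arm]  # deterministic rewards keep the sketch simple
        counts[arm] += 1
        estimates[arm] += (reward - estimates[arm]) / counts[arm]
        total += reward
    return total

# A pure exploiter locks onto the first mediocre arm it tries; a little
# exploration discovers the better arm and ends up far more "successful"
# with identical effort (the same number of steps).
```

Same amount of work, different strategy, very different outcomes: exactly the sense in which "hard work" is coupled with strategy rather than being a sufficient cause on its own.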
That requires a substantial amount of hard work but is unlikely to result in a measurable amount of success. Certainly less than if the proper tools were used.
And I do agree with you that it is not the only factor. But this is why I differentiate cause and correlation, and the disagreement is likely down to how we are using those terms. I am using "cause" in the colloquial sense (because of the context above), in which most people do not see multivariate equations but a single factor (which is an unfortunate thing), i.e. "cause and effect" (but we both know better). I am using "correlation" in the more rigorous sense: related to, a statistical variable, or in association with.
And going back to your RL example, I think you mention one part which proves my point as well.
> the exploration/exploitation strategy
You may notice that many learning algorithms go through a substantial amount of failure. After all, it is the failure that leads to success. BUT a key point is that it is not the effort put into the attempt that led to success; it is the adoption of a new strategy after recognizing the failure. You could just as easily never have visited that attempted method, and your efforts would have yielded a higher success/work ratio. You could also never change strategies, and your efforts would yield a substantially lower success/work ratio. More explicitly, "hard work" is a variable that is coupled with "strategy": the value of "hard work" can be astronomically high while the value for success is zero. And for this reason we say it is not "the cause" but rather "a factor", or that it "correlates with" success.
Also, as mentioned in a grandchild to my post, this is a saying which I gladly will admit does not cover the entire scope of what factors lead to success. Even the unpacking of implicit information cannot do that, as a full description of the path to success is rather cumbersome.
We could contrive an example whereby "working hard" allowed you to "create more widgets in less time than anyone else" and so you receive a promotion (success). But even then we know that your success isn't from just "working hard", but is instead from the result of doing so. Given the context of this thread, I think we can agree that "creating the most widgets in the least time" can be done without "working hard".
I would argue pointing to "hard work" as a cause of success is confusing how a causal factor was reached with the factor itself.
> Hard work correlates with success, but it doesn't cause success.
Absolutely, that's a key part, I think.
What if "luck" was actually just an individual's ability to perceive an opportunity... intersected with their own feeling that they have permission to act upon it?
> Hard work results in being able to take advantage of more opportunities. But you still have to have those opportunities.
That's what is meant by "being able to take advantage of more opportunities". You are better able to see that they exist. You are also primed to look for them. You also believe that you can succeed at that opportunity. BUT a key thing is that they actually have to exist. There are situations where you can work really hard and there are no opportunities to be successful (financially, that is; we can redefine what successful means, but that ignores context). As illustrated above, you can work really hard in jail and gain no success from it.
But we're talking about sayings. Sayings are never exact but rather hold a generalization in a nice and easy to remember package. Sayings always include a lot of implicit information. We could quibble about finer details but whether you should or shouldn't depends on the context of the conversation and whether the context the saying is provided in contextualizes that implicit information.
What this saying packages is:
Though hard work is highly correlated with success, it does not necessitate it. There are opportunities for success frequently around us and frequently unseen. Hard work will not by itself result in seeing or being able to take advantage of said opportunities; rather, it correlates with the ability to. Furthermore, hard work often correlates with improvements in skill, which grant access to new opportunities that previously were not obtainable. While there are a lot of things that lead to success that correlate with hard work, and while success itself in turn correlates with hard work, success is not granted to all those who work hard.
I suspect luck is, in fact, simply the event of uncertainties crystallising favourably (or unfavourably, if the luck is "bad"). Like it says in the dictionary.
Which is also why many people believe we make our own luck: because in many situations that people attribute to luck, we can influence the outcome, consciously or otherwise.
Unless you get hit by a truck. Or unless you have a stroke. Or unless you are otherwise unlucky.
Because we do not live in a just world.
As an employer .. if something can be automated, and you're paying someone to do it manually—do you deserve to get burnt?
That said, I was mostly pointing towards what I see as an explanation for the difference in attitudes regarding wealth and poverty and how it intersects with people's opinions of personal responsibility, which I think does have a place in that scenario, even if it's not always the primary driver.
Yeah OP is way off on his God scapegoat. People in the West are far more likely to not return a lost wallet due to the wide variance in cultural identity. Japanese people have such a shared identity which promotes a level of empathy that only Americans in small towns could hope to replicate. Call it hospitality. Religion has nothing to do with it.
> The so-called Protestant Ethic then prevalent held that man was a sturdy and responsible individual, responsible to himself, his society, and his God. Anybody who could not measure up to that standard could not qualify for public office or even popular respect. One who was born "with a silver spoon in his mouth" might be envied, but he could not aspire to public acclaim; he had to live out his life in the seclusion of his own class.
In theory, but in practice that just means that those with inherited wealth expend some of it on building a personal myth of being self-made independent of their family legacy. It's often paper thin, but as long as they make the gesture to ritually acknowledge the norm, it seems to suffice in practice.
Obviously, no parallel equivalent exists in the other end of the economic scale.
This happens a fair bit in the UK. Due to the media's obsession with entrepreneurs, we see the wealthy gambling a bit of family wealth on a series of startup ideas that are clearly not made for the reality we inhabit but sound snappy and fin-techy (blockchain for banks, with no understanding of banks).
> Obviously, no parallel equivalent exists in the other end of the economic scale.
If you take away the spending of money bit, there is an equivalent at the other end of the economic scale.
There exists some person with inherited wealth that builds a personal myth of their inherited wealth not being responsible for their current wealth.
There exists some person in inherited poverty that builds a personal myth of their inherited poverty not being responsible for their current poverty.
The latter does happen.
Not really. The idea is "to become successful you must be hard working", and if we take that as an "iff" (hardworking <-> successful), then "if successful, then hardworking" holds. That's why so many people think the rich are hardworking, even if they inherited their wealth (or a sizable portion of it).
It also comes down to it being easier to spot someone in poverty than to spot someone that's rich but hasn't worked for it themselves. That might explain most of it.
There's a sleeps-on-the-street homeless guy living in my neighborhood. I talk to him on occasion. A couple days ago he said he was bored off his ass, and that he was thinking of getting a part time job. I did not verbally respond to his potential quest for employment, but I did find myself wondering how he'd pull that off. He's perpetually drunk & unshowered. I doubt he could get a job if he wanted one. Are there any attainable jobs for an alcoholic who doesn't have a shower to use?
Dumpster diving (if it's still worthwhile, which it might not be). If you're serious about it, it's not really any less of a job than what those American Pickers guys do.
There's lots of jobs out there, especially if you widen your criteria for "job".
1: I had an interesting run-in with a homeless man at a taco truck almost two decades ago. He was in clean clothes, had a bike with a trailer, and was trying to sell a 49ers jacket. My friends and I started talking to him and he explained that he found a few of the jackets in a dumpster behind one of the big sports stores around, and had a few other things to sell too. He explained that he and a few friends actually had a fairly nice camp (carpeted areas, make-shift shower with water in the day through solar of some sort, DVD player with TV hooked up to solar trickle cell). He camped a little outside the stalled development of a local apartment complex, where they had an agreement with the developers that they could stay there and they would keep people from squatting in the partially completed complex so it didn't sustain damage. He said he was on disability from the military, and liked living outdoors, but would probably rent an apartment for the winter with his girlfriend and friend. Honestly, I left the interaction feeling a little envious of his freedom.
Possibly he could find a job that includes access to a shower. I know of one case where that has occurred. It was a job in a warehouse, and there were showers and change rooms he could use. I believe he was actually provided with an unused room tucked away somewhere where he could stay overnight, before someone higher up in the company found out about it and put an end to that arrangement.
"Perpetually drunk" is more difficult and likely a deal breaker.
That said, it seems incredibly wrong to judge someone for being clever enough to relax and get paid for it. And if someone enjoys relaxing as much as I enjoy coding, who am I to say that I'm righteous and they are a loser?
I fall somewhere towards the side that you might guess, since I'm defending it here. ;-)
I don't necessarily automate my job, but I depend on things like math and theory, to solve problems that the other engineers tend to solve laboriously through brute force and trial-and-error. I suppose a math equation is a sort of automation. And of course programming is how one does math these days.
Maybe I avoid resentment, because I'm doing work in a way that most people hate even more than doing hard work. It's also hard for them to pin down how much I work because they don't know how to estimate it.
Hypothetically, if someone wanted to debate with me about the value of work, I'd ask them how work is measured: What are the units of measure? Joules? Hours? Indulgences? How about... dollars?
The only time that we don't measure work in dollars is in a social setting, including within the workplace. For instance, we don't give motivational speeches and other kinds of attention such as meetings in proportion to your wages, but tend to treat all workers relatively equally in those things.
Often, a mature codebase shrinks as code quality improves. This drives the efficiency wonks absolutely bonkers.
what do you mean by that?
I'm not putting this very well.
When a codebase shrinks AND increases in quality, you've now demonstrated that the most obvious way to quantify the utility of a resource - units of productive output - is not useful. So now people who are used to tallying widgets and graphs that are "good" if they move up and to the right, now have to understand things like refactoring, technical debt and design patterns.
What sort of work do you do?
I think it's fair to say they will, but will they do so for their company's benefit? Possibly, but why? This is effectively gifted time.
But even though RPA is mostly screen scraping and macros, we’ve found it impossible to teach to people who aren’t developers.
Similarly, with better BI tools it’s become possible for SQL-savvy business process people to do BI. But without a fundamental understanding of data efficiency, which we can’t seem to instill in these highly educated people, what they make is often useless because it slows down the systems.
Programming is completely trivial, but understanding what you’re doing, and why, is much harder to teach than programming. Especially when you’re working toward productiveness, where you can’t just write best practices for everything, because best practices are unnecessary for some systems and necessary for others.
I know this wasn’t really that related to the article, it’s just a general observation, and believe me, we tried to decentralize our RPA processes because maintaining them is hell.
What I have heard of - and directly witnessed - is MS Access and MS Excel being used by non-developers to build surprisingly powerful tools to solve their most acute problems.
They often don't even realize that they are programming.
It’s an immensely valuable digitization tool in enterprise setups where you operate a lot of closed and/or legacy systems. We operate around 600 IT systems, and very few of them talk to each other.
So if you want to transfer data from one system to another, you’d need to have someone manually do it. Or you could automate it by using RPA software.
Basically, it will do to the office what the assembly line did to factories.
It’s also really terrible, because it breaks when the UI changes. But when companies refuse to sell open apis, it’s what you have to work with. The most ironic part is the IT companies that now sell RPA solutions for their own shitty software, but that’s the world we live in.
The tricky part is that there is a finesse involved to make it reliable and the exercise becomes more process improvement than programming. Most people struggle with this and you end up with a 2000 line-of-code Rube Goldberg machine that always breaks.
Part of it is because our IT department is underfunded and dysfunctional and RPA is the only “controllable” way to improve processes (in execution it’s not), and part of it is because our management have drunk the Kool-Aid that Automation Anywhere puts out about the impending singularity.
I’m all for automation, and even more for process improvement; but at 2.5MM for a bunch of brittle macros running critical functions, it’s way oversold and too immature.
As you have learned, RPA is mostly screen scraping, macros, and as another person referred to it below "Rube Goldberg machines" that break when UIs change. For all of these reasons above (the need for automation and the limitations of RPA), we started Soroco. Like you said (and we agree) you can't build serious automation without real development.
We take a fundamentally different approach which is to build a platform/SDK for automation that provides a significant amount of reliability on top of a number of unreliable automation layers. There is no other way to automate Windows applications or Java applications than through either A) Screen scraping, or B) Their accessibility layers (e.g., Windows UIA). The problem is that BOTH are extremely unreliable. For web, that equivalent is Selenium (at least as the primary layer).
As another commenter mentioned below, a "watered down version of Selenium" is exactly what is NOT needed -- nor are Windows UIA or other accessibility layers great. They are all extremely unreliable. When something like Selenium fails to find an element, what do you do? Halt your business critical process?
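As a hedged illustration of what a "reliability layer" over an unreliable find call might look like (this is a toy sketch of the general idea, not Soroco's platform; the `ElementNotFound` exception and the locator names are made up), one can wrap retries and an ordered list of fallback locators around whatever the underlying framework provides, rather than halting on the first miss:

```python
import time

class ElementNotFound(Exception):
    """Stand-in for whatever the underlying framework raises on a miss."""
    pass

def find_reliably(find, locator, retries=3, delay=0.0, fallbacks=()):
    """Wrap an unreliable find(locator) call (Selenium, UIA, ...) with
    retries and ordered fallback locators instead of failing outright."""
    last_error = None
    for loc in (locator, *fallbacks):
        for _ in range(retries):
            try:
                return find(loc)
            except ElementNotFound as exc:
                last_error = exc
                time.sleep(delay)  # back off before the next attempt
    raise ElementNotFound(f"all locators failed: {[locator, *fallbacks]}") from last_error
```

With Selenium, `find` would typically be `driver.find_element` and the caught exception `NoSuchElementException`; a production layer would add exponential backoff, logging, and escalation to a human operator rather than silently retrying forever.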
All of these frameworks were never built to handle business critical processes, they were mainly built for application testing. The RPA industry has basically tried to leverage them and pretend they are enterprise ready by hiding them behind flow chart builders. They are NOT enterprise ready.
To draw an analogy, it's like having the Internet and IP, which do NOT provide or guarantee reliability. You need the equivalent of TCP and reliability layers.
At Soroco, we have dealt with all of this and have put a tremendous amount of effort into building reliability layers on these unreliable systems, along with security, scalability, and machine learning. We construct automation systems using a full programming language (Python) and have built flow control layers and reliability on top of it. Sort of like an automation SDK for Python with a full IDE. We have lots of supporting microservices for storing credentials securely, storing information scalably, and deploying. We've even open sourced how we encrypt Python (https://blog.soroco.com/)
What does "cognitive" mean in this field? At Soroco, we don't pretend automation systems can "self heal" or "self construct" -- that's just silly. For Soroco, cognitive means that if you had your automation system do the same thing over and over (e.g., during User Acceptance Testing)... what can it learn about the distribution of inputs or outputs to flag anomalous behavior? We also apply Machine Learning and Computer Vision to specific use cases to make them achieve what is not possible through a laundry list of conditionals. For example, ML/NLP to classify intents of text and to do things like binary or multi-class classification.
There's a lot of noise in the RPA space. I'd be happy to share more with you if you want to reach out directly to me: firstname.lastname@example.org or email@example.com
I was hoping that your company offered a product similar to Automa, but it looks like your business model is different to that.
Anyway I ended up having to re-implement my Automa-using Python scripts using a different technology (TestStack.White and C#) and I 100% concur with your analogy about TCP over IP and building on an unreliable foundation. It's also analogous to the sensor fusion problem of AI: picking out what to do when different methods of interacting with the automation layer report conflicting results. (And you have multiple ways because you need the redundancy to detect and deal with problem #1: automation interfaces are incredibly unreliable.)
For example between IT-mandated invasive antivirus software injecting shims in the OS and/or screwy win32 code in the program-under-test, or bugs in TestStack.White, (I've found indications of all three causes) there is some kind of randomness or ambiguity in the Window handles you get when you ask for certain controls on a dialog. You might ask for the OK button and get the button 50% of the time and 50% of the time get a handle which turns out to be the textbox above it.
So I've taken a belt-and-suspenders approach and built a database of all of the controls I expect to find and their screen positions, a folder full of screenshot snippets which I intend to make my test automation program compare, and use that to augment what Windows UIA is telling me is on-screen. Due to the problem of receiving the wrong window handle (on this app) when I ask TestStack.White to find a control at a certain position or by name, I resort to just cross-collating the entire list of controls on the window with the best match from the database. However even this is 4x harder than it should be, because for some reason the positions and sizes in the database (obtained by a previous attempt with a different automation tool) differ by small pixel amounts from what TestStack.White reports. So I had to implement a fuzzy match.
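The cross-collation with fuzzy position matching described above might look something like this minimal sketch (control names, the geometry tuples, and the pixel tolerance are hypothetical; in the real setup the expected geometry comes from the database and the candidates from what UIA/TestStack.White report):

```python
def match_control(expected, candidates, tolerance=5):
    """Match an expected control (name, (x, y, w, h)) from the database
    against the controls actually reported on-screen, allowing a few
    pixels of slop, since two tools rarely report identical geometry."""
    name, (x, y, w, h) = expected
    best, best_score = None, None
    for cand_name, (cx, cy, cw, ch) in candidates:
        if cand_name != name:
            continue  # only consider controls with the matching name
        # total absolute geometry difference across position and size
        score = abs(cx - x) + abs(cy - y) + abs(cw - w) + abs(ch - h)
        if score <= 4 * tolerance and (best_score is None or score < best_score):
            best, best_score = (cand_name, (cx, cy, cw, ch)), score
    return best  # None if nothing plausible was found
```

A real implementation would also fold in the screenshot-snippet comparison as a third signal and break name ties by geometry alone, but the shape of the fuzzy match is the same.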
And then the entire object-oriented architecture of my automation utility program collapsed on itself like a house of cards, because now I have 2 or 3 different derived classes to represent "the same" conceptual object: if you code "button.Click()" should it click the button with the handle found by TestStack.White? Or click the coordinate from the database? Or click where the screenshot matched? Or do nothing because this is a mock object? Worse, I now have a constellation of 1..N OOP instances associated with each conceptual object: maybe the database says I should have a button, but I can't find a handle for it thus I cannot create a ButtonFromHandle but no matter I can still click on the center coord of ButtonFromData. But if I have both I really want to default to one (which?) that is most reliable. Object-oriented programming design methodologies are completely inadequate to deal with the case of "some number between 1 and N instances actually represent one conceptual object and should be treated like a sheaf of overlaid transparencies". Yes, it can be done, but the design turns into something that makes sense from that aspect but makes no sense from the API-user's point of view: you just want to click "the" button you don't want to do "button.Implementations.First().Click()" or some such. Then add in multiple layers of mocks for unit testing the automation tool and you get N*M combinations...
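One way to express the "sheaf of overlaid transparencies" idea without pushing it into the API user's face is a facade that holds the 1..N concrete implementations in order of descending reliability and delegates. This is a rough sketch of that design choice only; the class and method names are mine, not from TestStack.White or any real framework:

```python
class Control:
    """One conceptual control backed by 1..N concrete implementations
    (handle-based, database-coordinate, screenshot-match, mock, ...),
    ordered most-reliable-first. The API user just calls click()."""

    def __init__(self, *implementations):
        # any implementation may be None (e.g. no handle was found)
        self.implementations = [i for i in implementations if i is not None]

    def click(self):
        errors = []
        for impl in self.implementations:
            try:
                # delegate to the most reliable implementation that works;
                # an implementation may also fail at click time (stale handle)
                return impl.click()
            except Exception as exc:
                errors.append(exc)
        raise RuntimeError(f"no implementation could click: {errors}")
```

The sheaf stays an internal detail: callers write `button.click()`, and the fallback order encodes which layer is trusted most.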
Not sure where I was going with this, I guess I'm just excited to find someone who's worked on the same (or similar) problem.
I love this article because it changed my mind. I was one of the redditors who urged FiletOFish1066 to fess up to his employers on the basis of some moral high ground I felt I occupied. I see now that I made the mistaken assumption that work == goodness and went on from there.
Nietzsche would hate this question: https://en.wikipedia.org/wiki/Master%E2%80%93slave_morality
Working is the opposite of noble in the traditional sense of the word.
Most of us work because we have to.
I'd like to reach the point where I work 100% because I want to, not because I want to avoid the issues created by lack of money.
Perhaps Nietzsche's position here is best seen as a reaction against traditional Christian values.
Nobles wouldn't work because that was for serfs.
But work has long been considered noble, too; if you consider the kind of work we do not so much as 'labour' but as creative endeavor, then it's even closer to both 'noble' and 'Noble'.
The amount of effort required to produce value helps dictate market conditions and the amount of competition; beyond that, I don't think effort ultimately matters much when you consider it in terms of B2B transactions.
Eventually, if a service is easy to supply, competition will drive down the price and the economic system will readjust accordingly.
Strangely, it's only when you consider an individual in permanent employment that automation seems to present itself as a dilemma.
There's definitely a lot being said about automation. We're regularly told via news outlets that a large percentage of the population's jobs will be replaced by robots.
But aside from the idea of universal basic income, I believe there's a more general trend towards working independently, i.e. in a freelance capacity.
Maybe automation becomes far less of a dilemma when this is the case?
Maybe it is only paranoia, but over the course of my career I have seen employees increasingly treated like chattel. The rights of employees have eroded, almost by choice, through social media, smart phones, and monitoring technology.
I, personally, had had enough. I decided to remove myself from that environment. I believed that it was dehumanizing and degrading. I started working remotely on contracts. That worked pretty well for a while. But I started working for one company where slowly they wanted me to become more and more like one of their employees and less of an independent contractor. In addition to the work covered by the contract, I was asked to help on other projects or with interviewing. Initially I didn't mind because I wanted to help. After a while I realized what was happening and that is what prompted me to create my own company to work through and to be paid for the end result rather than the time spent on the result.
With respect to the article, there are some things that I think you have to accept as an employee. You probably have signed away the rights to the IP you create at your place of employment. After all, you probably did it on company time with company equipment. But ultimately that is the agreement you entered into so you cannot turn around after the fact and complain.
However, my sympathy lies with the person who automates their work rather than the employer. I think the ethical problem is interesting. Recently I watched the Crash Course on "meta ethics," and the problem they considered was a burglar who intended to steal from a little old lady but inadvertently saved her life instead. The question is, 'was the burglar good or bad?' I think there is a parallel to the issue of automating your job. I am no philosopher, but I find this an interesting question.
The code is almost never trivial either. It often involves integrating complex systems and making sure that they work which, if done right, involves writing several unit tests.
Invoking God is a poor way to determine ethics, because there's no evidence one exists. It's especially a problem when you don't share the code: instead of improving the job for the company as a whole, you've written yourself a pass to be lazy. The real work comes from supporting whatever you write for everyone else. If you have no interest in that, you're simply being selfish.
The problem is that people react in different ways, and the most common seems to be not telling anyone, which to my mind is foolish: even if you do get fired, you've improved your skills enough to find a better job anyway. Not only that, I can just as easily come up with an example of someone getting promoted for automating job tasks.
> Anyhow: automation is surely one of the best reasons every person should learn a little bit of programming. And even with that task accomplished, I suspect that the rate of people seeing the opportunity to automate will stay roughly flat.
Absolutely. A minor observation from living in Operations/Finance: don't merge cells in Excel. A spreadsheet is a fantastic indicator of an individual's knowledge of automation vs. manual process. Someone who uses lists without merged cells can not only save days out of the year with pivot tables and various kinds of lookups, but probably already has the ideas of 'tables', 'if', and 'for' in their head.
This is a great kicking-off point for coaching through ugly VBA. I like VBA. It's also often the only choice in corporate environments. But it is an option, and the chance to coach someone that's on the edge of...
> I suspect that the rate of people seeing the opportunity to automate will stay roughly flat.
...to push over that edge.
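Even outside Excel, the pivot-table mental model really is just 'tables', 'if', and 'for'. A toy Python sketch of the idea (nothing Excel-specific; the field names are invented): sum a value grouped by a row field and a column field over a clean, unmerged list:

```python
from collections import defaultdict

def pivot(rows, index, column, value):
    """Tiny pivot table: sum `value` grouped by (index, column) --
    the same mental model as an Excel PivotTable over a clean list
    with no merged cells."""
    table = defaultdict(lambda: defaultdict(float))
    for row in rows:                       # 'for' over the table...
        table[row[index]][row[column]] += row[value]  # ...bucketed sums
    return {k: dict(v) for k, v in table.items()}
```

The point isn't the code; it's that someone who keeps their data as a flat, unmerged list already has this structure in their head, whether they know it or not.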
Your story reminds me of my son who graduated with a liberal arts degree right after the financial crash. He completely automated his first job (using excel and VBA) and was not rewarded. Long story short, he went back and got a masters in EECS and works where productivity is recognized.
Good he found a better employer.
Agreed, there's a lot of meaty ethics and philosophy tied up in this issue.
For most of my life I've tended to believe in the nobility of work, that it gives my life meaning and purpose.
But as rates of automation increase my children (or more likely their children) may well seek different ways of defining themselves.
Before that can happen though there'll have to be some fraught and difficult changes to our societies.
> the cleverness is in noticing when a task can be automated. Typically, the code itself is trivial.
The devil is in the details. Robustly automating anything of reasonable complexity involves a lot of implementation hiccups: dealing with corner cases, failing gracefully, and adding structure and regularity to otherwise disparate, ad hoc, unstructured components.
It takes skill, experience and a lot of effort to do this without it immediately falling over.
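"Failing gracefully" usually comes down to small patterns like retrying a flaky step with backoff. A minimal Python sketch (the function name and defaults are my own, not from any particular library):

```python
import time

def retry(action, attempts=3, delay=0.1, backoff=2.0, swallow=(Exception,)):
    """Retry a flaky step with exponential backoff -- one of the small
    robustness details that lets automation survive contact with reality."""
    last = None
    for i in range(attempts):
        try:
            return action()
        except swallow as exc:
            last = exc
            # Wait a little longer after each failure before retrying.
            time.sleep(delay * backoff ** i)
    raise last
```

Deciding which exceptions to swallow, how long to wait, and when to give up is exactly the kind of judgment the code itself can't supply; that's where the skill and experience come in.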
It's similar to how most people will complain about the traffic while a very small number of people will start companies to dig tunnels for maglev vehicles to travel on.
Now, I choose to believe (I'm a realistic optimist) that some of these instincts - to recognize patterns, to ask orthogonal questions, to pay attention to what others take for granted - can be acquired. It's less about training and more about practice. You can decide that you're going to learn how to ask good questions.
It's a conscious act, and that is key.
Anyhow, there's certainly skill and experience required to automate well. However, it's SHOCKING how many people learn to build simple systems in MS Access or Excel to solve their own problems.
Well to be honest there's only a small number of people with the means and connections to even start such a venture.
Called the guy when I found it. He totally panicked. Was odd.
When I handed it to him, that face! Worth it.
Amazingly, he proceeded to try and give me the cash. Was an effort to decline.
At the time, I really needed a warm fuzzy. The lost wallet was a great opportunity, and I got a good one.
But, our conversation basically centered on, "who does that?"
Indeed. The stories of Japan make me want to visit just to soak a little of that in.
My experiences are similar. People really vary in their response to a good deed directly favoring them, and I suspect the nature and quantity of deeds they do can be correlated to those responses.
I think it all adds up. More of us just need to keep doing it.
I thought about this too. Pretty much everything I've automated turned out to be way easier than I assumed and I ended up feeling dumb for not realizing I could have saved myself time by doing it sooner.
I don't understand what you're getting at there. If someone refuses to work, that would seem to mean that they are able to work but choose not to.
We're not talking about someone who is actually unable to work because of some physical or mental limitation and may need assistance of some sort. We're taking about someone who could work but doesn't want to.
What possible obligation do I have toward that person? I don't see any. Of course I don't think they should be, as you put it, condemned to destitution. But what on earth makes it my responsibility to give them anything at all? Do I owe them a living, simply because they refuse to work?
There are tremendous financial opportunities waiting behind automation of expensive tasks usually performed by developers.
For many of those tasks, the moment is right now, and the code itself is not trivial.
This absolutely increases in complexity as the benefit increases, mostly because the low-hanging, prime fruit has already been picked... but there's still tons of stuff being handled by humans. Moreover, as techniques and computing power become cheaper and cheaper, stuff that used to be hard (OCR!) becomes easier all the time.
these days I get brain chills whenever I do something un-intellectual but very sensory-based: crafting wood, molding things, tai chi...
Work today is a twisted version of sharing group survival/happiness.
I think you'll find that most people who 'keep the wallet' are probably not religious and don't think much about the morality of it, whereas regular temple/church/mosque attenders would be among the least likely to 'keep it'.
The notion that religious people are less likely to do bad things is highly problematic, objectively and logically - and it carries the unfortunate assumption that non-religious people are more likely to do bad things.
People should not need to subscribe to a paranormal belief system to understand that theft, rape and murder are bad.
There is overwhelming evidence that religious people, particularly those who attend services, are less likely to be involved in criminal activity.
Your assertion that the religious, or church attenders, are mostly about 'belief in paranormal activity' really misses the point, to the degree that it's almost offensive - though it's a common misunderstanding.
Most classical Churches, Synagogues etc. are basically old-school community centers, where morality, behavior, and conscientiousness are passed on from generation to generation.
Devotion to 'a higher purpose' and 'the community' is basically paramount and instituted into all ideals.
FYI - it's actually not that much about 'belief', funny enough it's mostly about 'behaviour' which is highlighted in the research that hints that 'attendance' is the strongest predictor, less so 'faith'. (Sorry, I tried to find that specific article and could not).
I'm not some kind of 'holy roller' or whatever, I'm just constantly shocked at the odd misunderstandings many super secular types have about religion. The original poster's bit about 'God wanted me to keep the wallet' triggered me because that's basically the opposite of what most religious people would instinctively think: to most service attenders, that the 'wallet belongs to someone' is the most obvious observation. Most mainline religious groups are like 'boy scouts' and 'girl guides': a little conservative, generally very conscientious.
As you deduced, I missed your initial reply because we both posted at the same time.
For what it's worth, I'm also in Canada and my best friend is a Christian. She was not born into it and has always been really great about being willing to admit when she doesn't know the answers to something. This has, in turn, inspired me to be less of a judgemental asshole.
Don't get me wrong: I still have a really hard time with the idea that intelligent people can justify worshipping malevolent invisible sky wizards. And a lot of my bias against religious influence in modern society comes from the organizations themselves, and not their members. Evil priests, etc are over-staying their welcome.
However, I am heartened to know that research suggests religious people are less likely to be criminals. Happy to acknowledge that I learned something. I do wonder if causation/correlation are skewed, here... as in, would those people still be good if they'd grown up in secular families?
Anyhow... great chat. Feel free to update if you think of anything else.
But it's also worth noting that what constitutes a criminal act largely depends on the laws enacted by a majority (a broad generalization, admittedly). Since a majority votes these laws in, the laws will also reflect that same majority's religious (or non-religious) beliefs.
This 'hypothetical' religious majority will probably never vote for laws that 'criminalize' its own beliefs and actions! So it's easy to see why religious folks from this majority will rarely be deemed criminal.
As a thought experiment: imagine that attempting to obstruct or make it more difficult for someone to have an abortion was illegal. How many more religious people would become 'criminals'?
In conclusion, one person's definition of 'criminal' is another person's definition of 'hero'. Claiming that religious people are less likely to be criminals rests on shaky assumptions, in my opinion.
Do you have any evidence to back up your claim in the OP?
> The notion that religious people are less likely to do bad things is highly problematic, objectively and logically.
What exactly do you mean by “highly problematic, objectively and logically”?
> People should not need to subscribe to a paranormal belief system to understand that theft, rape and murder are bad.
“Should not” and “do not” are two very different things.
However, I do spend a lot of time in conversation with my best friend - who is a late-convert Christian - about the distressingly common phenomena of people who call themselves Christians because it flatters them, but don't actually participate or behave in the way they are supposed to.
You know, like the POTUS.
I do happen to believe that religious people shouldn't assume that they act better than non-religious people. We all have to draw personal lines somewhere.
Someone who is a 'huge believer' is something utterly different from someone who was born and raised Jewish or Anglican.
Growing up Jewish/Catholic is mostly a matter of culture, tradition, community, it actually has less to do with faith. That's why I focused on 'attendance'.
They tend to be small-c conservative: normal jobs, kids attend school regularly, don't do drugs, eat reasonably well, vote, take out the garbage, stable family etc. etc..
'Converts' are a whole other issue because ironically people who grow up in a regular faith aren't likely to convert to much at all. Seekers, people who are emotional about stuff, people who have had major tragedies, drug problems, who had major family problems, maybe lived on the street ... those 'new beginning' people are often the most outwardly 'believing' but they represent a totally different cohort of behaviours.
The guy who was abused by his parents, lived on the street and smoked crack and then 'saw the light' and is a 'true believer' - this guy is going to bring most of his broken habits with him. This guy is not Ned Flanders.
The other issue is the type of religion: Evangelicalism is kind of a 'Church of Converts' - they are the 'living in faith' type, the type who mostly watch 'Church on TV'. This is not the camp of people I mean. Again, this is not about how someone outwardly talks about or believes in Jesus or whatever.
Old-school, traditional, mainline Churches are full of boring, very conscientious, small-c conservative people. This is my point. I'm not saying it's so much about faith per se, although that's probably related.
I don’t think it’s so much a carried or implied assumption as his central thesis, and his justification down the thread is a page of more claims and anecdotes. Call me cynical, but I don’t think there is any “there” there.
I provided links to three articles to support my primary thesis; moreover, if you do a Google search you will find overwhelming research in support of the claim that 'those who attend regular services are less criminal'.
My 'anecdotes' are there to help contextualize it for you.
So the 'there' is so obviously 'there'... the question is: why, even when faced with actual data, do people keep doubting? That's the interesting bit.
And to be clear I'm not religious nor do I attend services.
So we have a HuffPo article, an article from MarriPedia (a religious site) and an abstract from the Sociology and Religion department of The Association of Sociology and Religion. Three famously unbiased, data-driven outlets (/s) which, even then, only tangentially support part of your point, and only through the loosest of correlations.
Quantity is no substitute for quality.
But there's overwhelming evidence of it, and BTW I'm not saying 'religious' per se, but 'church attendance', which has a much stronger overlap with conscientious behaviours.
People who doubt that regular church attenders are more likely to 'return wallets' I think have little exposure to classical religious communities. I don't mean to offend anyone, but I've found so many people who have no exposure to it have weird and cynical ideas about mainline religions. I'm constantly surprised.
In the very example given, Japan, where 'nobody keeps found wallets': this is a culture of strong 'small c' cultural conservatism and communitarianism, obviously with shades of ethnocentrism. They're not religious in the same way Westerners are, but they have a whole set of super conscientious behaviours perfectly consistent with those of 'religious service attenders' here.
The best analogues in the West are basically agrarian communities, who by the way happen to be religious, or at least 'church attenders'. About 2 generations ago, in the West, most people lived in agrarian communities, even if they weren't farmers, they were tied to that lifestyle, attended service, yada, yada. You lose your wallet there, and nobody will steal it.
In my home town, my grandparents generally did not lock the door to their house. My father still generally does not lock his door.
Now - 99% of my grandparents and their peers attended Church, but these were mainline religions (I'm in Canada; we don't have a lot of this Evangelical, Baptist or Mormon kind of stuff) - and they were not what we would refer to as specifically 'religious'. I mean, everyone did it; it's just what people did. Nobody ran around talking about 'Jesus' etc. - rather, it was more or less part of cultural tradition. Watch UK television series set in smaller towns to get an idea.
In my cultural background the thought of keeping someone's wallet is unthinkable. It's obviously and clearly immoral. You probably 'know the person' who lost it anyhow.
The oddest thing about making the transition from agrarian/communitarian cultures to more urban ones is how self-oriented everyone seems in the city (i.e. how people could even consider 'keeping wallets'). But doubly odd is how cynical they are: they can't fathom that there are groups of people for whom 'returning wallets' is a normative, moral action, not something of note. And I'm not harping on 'city folk' (I was born and mostly raised urban), and there are many great things here; I'm just pointing out the agrarian/communitarian perspective.
Tons of data support the relationship between religion and low crime, but FYI I think it's a little overstated: look carefully and you again find 'attendance' more of a predictor of low crime than 'faith', for example. I think it's just as much about conscientious behavior.
They breed a lot of narrow minded small thinking bigots, even if they do return wallets.
The idea we should as a society somehow return to a romantic past that never actually existed isn't a good one in my opinion. And we can't do that anyway because our understanding has changed.
Not that there aren't a lot of good things from the past that somehow got dropped by the wayside, there are.
Conscientiousness is not a 'romantic past'; it's a perennial ideal.
But these communities do tend to have a dark side along with the positive attributes you correctly mention.
Additionally, if the basis is faith, and people no longer have that, what then?
I'm experiencing a shift in that thinking right now. I write reusable, modular, tested code. My co-workers don't. I use a project management system (Phabricator, on my own personal server, which I maintain); the company doesn't. For most of the last year, I've used these to produce more and better code than others have, but my responsibilities have only expanded while my pay hasn't.
I've vocally evangelized these practices and I'd love nothing more than to see them get adopted by the other developers, but so far it's not getting any traction.
So I'm shifting towards logging the value of the work I produce -- regardless of whether I'm just copying in a file from my extensive library -- rather than the time it took me to accomplish the task.
And starting to look for a new job.
I hate to burst GP's bubble, but the attitude and environment at his next job probably won't be any better in regard to "best practices." It's extremely difficult to gauge how your attitude toward development compares with a prospective team's after an hour of chatting. I became a manager last year and quickly discovered that changing others' behavior and habits is next to impossible. And if he does happen to find a new team that matches his work ethic, he can be certain that 1 or 2 years down the line it no longer will.
I think the only two ways to avoid this are to become a contractor and switch gigs every 2 years, or to start your own company and hope the people you hire share your attitude.
If you're interested in progress and growth, I can say with a high degree of certainty that what you don't want to do is apply to work on a very stable team. These people are always confident that they know what they're doing - even when they agree that the outcomes aren't great. They've had lots of time to practice their rationalizations and that kind of cognitive dissonance can be pretty jarring/aggravating.
By definition, starting your own company or contracting will avoid that situation, but you don't need to take your ball and go home just to get the right kind of environment.
Fifteen years later I sit in meetings telling people (some of them still older than me) not to touch the stove because it will hurt, and I have to patiently wait until someone gets hurt before we can discuss the proverbial oven mitts.
The problem with doing something right the first time is nobody appreciates how hard it was.
Sometimes if you do put in quality and do it right in development it costs you in time and then by perception.
When you prioritize the ship date over finishing the iteration/version/product, you have problems after ship from external customers; when you do it right and handle the problems in development before ship, you take a hit in perception internally.
The problems after ship can harm a product/company more than a slight delay in shipping, but today it seems people don't care as much, as there is more specialization and larger teams where it is 'not my problem'. Some clients/project managers see a bad product shipped on time as good, and a good product shipped late as always bad. External perception should play more into that judgment, not just hitting their part of the project goals/milestones.
Hitting dates is hugely important, but messing up a product with the customer can be deeply damaging. No one truly remembers a late product once it ships and is quality, creative and functional; they remember a bad product.
I prefer the Valve Time philosophy: get the product right over hitting the date, and don't set dates until the product is actually ready.
Does a development agency that focuses on this model exist? Of course they should be paid more per hour (or other unit) than your average, less optimized agency, but the sum total should be less than hiring inefficient agencies.
(P.S. I understand the best developers probably don't want to start or work at development agencies, but assuming there are some.)
If you know of a development agency, odds are that it is probably a body shop.
The places that do work like this are more like Unicorns. And Unicorns are damned hard to find and hire -- or get hired by them.
This sort of thing is how I now pay the bills.
In many cases, what somebody does (and why) is not documented, or if it is the documentation is most likely out of date and inaccurate.
To effectively automate a task, you need to fully understand what needs to be done as well as have the skills to do the actual automation. Bringing in somebody unfamiliar with the job can work, but in most cases you will end up with an off-the-shelf system that has been customised to perform the original task yet comes with a maintenance overhead that may take more work than the original task did.
My suggestion would be that rather than getting an outsider to create the automation tools, up-skill and authorise the staff doing the work now so that they can gradually automate the process. And make sure you make it worth their while to automate.
1) As person with business knowledge, vision, plan, etc, I should fully document the business goal and why we're doing it and what we're trying to achieve at the highest level, then work with the developer to figure out what should be developed to achieve that and where to automate.
2) Don't hire outside person or agency, but bring someone in full-time or as a part-time partner for the long term and incentivize them to get better and better?
Seems to be a good idea to separate out the architecting from the actual implementation. Makes sense, because if someone knows they'll have to be implementing, it could cloud their plan for the ideal solution. Similar to separating out design, UX, and development roles.
I think we instead do the same amount of labor, but accomplish more.
Modern web dev is fantastically more productive and efficient than it was 10 or 15 years ago.
Most/many SAAS apps are substitutes for aspects of what previously would have been part of a developers job to put into place.
From a purely utility standpoint, I was much more productive back then because I only had to make it work and be easy to use, not "pretty" and animated with toys or the latest style fad.
Is that even true? Is there any survey showing that users consider those bad or old tech? Because with things like the Reddit, eBay, PayPal, and Gmail redesigns, the common thing I've noticed is that nobody asked for them.
As far as old-looking sites like Ebay, if an Ebay competitor appeared with equivalent services and product choices, yet LOOKED fancier/stylish, Ebay would start to lose customers. Lucky for Ebay, no such site exists. (If they appeared, Ebay would probably spend on a visual revamp.)
And perhaps even then not so much: at a minimum you probably need to write unit tests.
That's not always true. In at least 2 different organizations I worked at they were creating a combinatorial mess of search screens and/or reports.
Using a little bit of metaprogramming, query-by-example forms, data dictionaries, clickable drill-downs, and modular design, such "reporting stacks" could often be simplified into fewer screens, and/or designed so that a non-programming power user could configure most reports on their own. Allowing CSV exports of query results also reduces the number of "paper" report requests, because Excel users can then format their own.
The result is something that needs roughly 1/3 as many programming hours to update and maintain.
One caveat is that you have to know the domain fairly well for it to be practical. You have to learn the domain patterns and habits in order to factor those patterns into meta-patterns. When I tried it as a newbie to the org, I usually did it wrong.
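The query-by-example idea can be sketched in a few lines. A toy Python version (field names invented; a real reporting stack would do far more): blank criteria are ignored, filled ones must match, and results export to CSV so Excel users can format their own reports:

```python
import csv
import io

def query_by_example(rows, example):
    """Query-by-example: keep rows whose fields equal every non-blank
    criterion, the way a QBE search form filters a report."""
    return [
        row for row in rows
        if all(row.get(k) == v for k, v in example.items() if v not in (None, ""))
    ]

def to_csv(rows):
    """Export results so spreadsheet users can format their own 'reports'."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=rows[0].keys())
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()
```

One generic filter plus one generic export replaces a whole family of near-identical hand-coded report screens, which is where the maintenance savings come from.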
They do that too, and always have. It's barely remarkable because it's expected and part of the job. The problem is that you can't benefit from it: as soon as you automate one part, you're given other tasks.
I don't hand-create machine code. I don't write assembly.
I write in the highest-level language that still lets me fully specify everything I care about, and then all of the lower levels of code generation are completely automated. Programmers have always automated programming jobs as much as possible.
And really, the existence of computers is the result of automating the job of human computers away. The entire history of computer science is this.
The difference is that these guys hit a local maximum and just... stop, instead of moving up to the next level and making more money.
That's assuming your employer will "move you up" and you'll make more money. Or that a future employer will see value in you automating your job. Then there's this story from TFA:
> a user posting as AcceptableLosses wrote, “They took what I had developed, replaced me with an idiot that they showed how to work it, and promptly fired me for ‘insubordination.’ I had taken a business asset that was making them $30 grand a year profit and turned it into a million dollar a year program for the company, and they fired me for it to save ~30 grand a year on my salary. Job creators my ass.”
Which sounds so entirely stupid as to be almost incredible. Why not just take this guy and move him, even sideways, to another role he can automate? But I believe him, because I've seen a lot of stupidity in Big Companies, especially the old school ones.
CS encompasses way more than automation. Graph theory, ADTs, complexity analysis (just by way of example) are all very relevant, even if the computation is done manually, by hand.
A lot of computer science presumes a very abstract computer; a human and a modern machine apply equally well.
Not pretending for a second I'm safe.
Operations is a pure cost center. Cost centers only exist for the express purpose of being compressed twice as much this month as they were last month, so that you can make them half as expensive.
Developers are valuable. They create value.
When all the good Operations staff has left because they're tired of being treated like mushrooms, all the Operations stuff gets thrown over the wall and then the Developers get told that they are now DevOps.
One such tool is very popular here in South America:
GeneXus™ streamlines application development by automatically generating everything from databases to code, frontend to backend, and server-side to client-side services. It’s not magic — just a smarter way to create smart technology.
It ends up being a kind of DSL.
As a sysadmin who started out replacing power supplies as a datacenter tech, and nowadays wrangles AWS auto-scaling groups, the idea of _not_ automating away the tedious parts of my job is pretty hilarious. I had one operations/support job where this was the case, and thankfully, I quit after one month.
If anyone out there has the gumption to cobble together a dev environment and automate away their BS job, yet can't get recognition from their employer: it's time to seek out a new job. You've certainly got the skills for something better!
I remember going up to one of the senior management people at a company I worked at a long time ago and pointing out how, through automation, I'd saved the company from having to spend salaries and other overhead on 12-20 more people. If the salary for each person was X, I asked for a salary bump of 3X. They gave me a bump of only 0.5X. I eventually pushed it to 1X, but that experience taught me that the extra value you deliver rarely flows back to you in a reasonable way.
And from that point on, I stopped treating my job as an extension of my identity or fulfillment of purpose. It became a place where I primarily had a cold hard contract to deliver X value in exchange for Y returns. And if I was ever to work again in a company with similar opaque practices on salaries, any increase in Y returns would be best negotiated before showing my entire hand on how much I could increase the delivery on value X.
Indeed, the best work is done when you do it for yourself. People basically sign their brains away when they go to work for Facebook and Google. Those companies will be likely to get B players no matter how many seemingly great people they hire. The 4.0 MIT grad who can recite chapters from his CS401 class isn't necessarily an A player. Most of the A players I know are either ex-Facebook/ex-Google or turned down working at those places because they have more important things to do with their time.
The real A players are much harder to find and are much harder to value. Without ownership in what you do, there is a high probability that you will not be at your best. If you need to be paid to do something, it is unlikely to be what fulfills you.
The output is actually even better in my experience. I know in the tasks that we've automated the output is always 100% consistent, which is more than can be said for the previous human output.
Also there's a lot of value in having someone on hand who actually understands the process rather than blindly trusting in a script. They'll more than pay for themselves when it comes to future enhancements or debugging.
And I say all of this as an employer - my employees are positively encouraged to automate as much as possible.
I think in the long run, acting in the best interest of the business when you are employed for that business would result in superior income/career than doing otherwise.
Another option is finding possibilities for automation that could apply to other companies with similar problems, and starting a separate company selling such automation services. A consultancy could work, too.
Their boss cares. This is capitalism, and capitalist ethics say you should be fired if you're idling and not continuing to actively labor for the boss anymore. She's not getting anything in return for your salary. You more than likely signed away all rights to the fruits of your labor when you took the job. That means "your" automation is really your boss's capital, and you have no rights to be perpetually compensated for it.
So, if you ever automate your own job, don't tell anyone. Also make sure to make your automation so arcane and impenetrable that no one else can hope to use it.
That is the crux right there.
If you automate something and don't tell your boss then you have cheated the capital owning class out of their right to all capital.
Wrong. Capitalism has no ethics. The only 'ethic' it may have is "Make money and resources". Humans are secondary. Ethics are secondary. The environment/public commons is secondary. Even the laws are secondary, given the fine is less than the gain.
More realistically: as a laborer, you shouldn't put in extra effort to automate your job away, because the benefits will likely be kept from you, for the most part. They might throw you a bone as a reward, but bones are leftovers. They'll keep the meat for themselves. This kind of automation is properly understood as a kind of charity for the people who need it least.
The appeal of automating away your job is the freedom it seems to bring to the automator, but that's an illusion. You'll remain a laborer until you're put in a position to go live under a bridge, which is a freedom you've always had.
If you want to openly automate away your job, you want to be a software engineer. At a minimum, negotiate a software engineer's salary from your employer before you start that work. However, if you can manage it, it's better for you to start a company and license the automation back in perpetuity.
This is why, in the 19th century, wage labor, as opposed to self-employment, used to be called "wage slavery".
You are selling your time, not the goods you produce.
Rent people at sub-subsistence wages? I guess it's okay.
Edit: also, is this feature of salaried employment so misunderstood as to require an analogy? You get paid a fixed amount, or with specified bonus structure. It's literally the number one fact people know about their employment contracts.
I would argue yes. Many professionals like engineers and programmers can point to contributions they made to the corporation that have been sold for many times their annual salary. How many of those professionals consider the fact that they could have sold their design for themselves and pocketed the revenue?
This is related to Coase's question: why have a firm in the first place? In modern times it seems that ongoing disaggregation of businesses and reduced transaction costs in software would cause more professionals to consider working for themselves, rather than selling their (potential, uncertain, unlimited) profits for a (certain, fixed) income.
If you only have had one employer for a long period, this is a large blow. Taleb makes the point that the taxi driver is in an enviable position compared to the salaryman for this reason.
Although, slavery existed within the feudal system as the lowest class of serf.
If you don't value what you did, why should your employer?
Also, why should an employer raise your salary based on the fact you are automating stuff? It's part of your job, after all. That's what programming is about.
To clarify something first, this was not a programming job. I was managing a team of content uploaders for an e-commerce company. I used software in multiple ways to keep the team at the same size when my original brief was to simplify the process so much that they could hire more people at cheap rates to do the work. So I went above and beyond the brief. The job was so simple they didn't need to hire anyone anyway.
The question of why 3X comes down to alignment of interests. I can't offer savings of 20X and ask for all of that to be added to my salary. Yes, I'm creating that value, but the company also wants to increase profits; if I ask for all the savings, then what makes my solution worth it to them? Calculating our business margins at the time, I concluded that 15% of the savings flowing to me was the right ask, given the extra business our team was able to handle due to the automation.
The lesson here is how to calculate your ask. My team unit with 20 extra people meant the new work being brought in would come in with no profit; the team (without the extras) was just breaking even at the time, by my calculations. The profits made by automating to accept new business at an accelerated rate were above our usual margins (I think it was about 45%). My ask lowered that figure, but it stayed above the company's expected margins on business.
My mistake though was that the automation was nearly infinitely scalable. I showed my entire hand and me leaving wouldn't necessarily burn anyone. 6 years since I left, at least two of the tools are still running. Twas a great lesson on negotiation and leverage. Live and learn :)
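The negotiation arithmetic described above can be sketched with purely hypothetical numbers. The headcount, per-person cost, and the 15% share below are illustrative assumptions, not the poster's actual figures:

```python
# Hypothetical numbers only -- a sketch of the "take a share of the
# savings" negotiation arithmetic, not anyone's real compensation.

headcount_avoided = 20        # assumed: people the automation made unnecessary
salary_per_head = 30_000      # assumed: fully loaded annual cost per person
savings = headcount_avoided * salary_per_head

share_to_employee = 0.15      # the ~15%-of-savings rule of thumb from the comment
ask = share_to_employee * savings

print(f"Annual savings: {savings:,}")   # 600,000
print(f"Reasonable ask: {ask:,.0f}")    # 90,000
```

The point of keeping the share well under 100% is that the employer must still see most of the savings, or the automation isn't worth approving from their side of the table.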
My advice is to not fight this state of affairs. Figure out your own way to stay productive. Divide up your spare time between coming up with ideas for your employer and doing side projects. Take long lunches with your coworkers and leave at 4.
You can find happy professional nirvana but you have to believe in it first.
As long as that is properly understood you can get a contract that is permissive enough to allow you to work on something else at work when nothing is burning. Even if you have to offer a reduced salary, it's worth it.
Another option is to go the OSS route, especially contributing to tools you use for work. After a couple of years of doing semi-fulltime OSS work, you'll be commanding double the salary.
In practice, achieving some sort of "steady state" or status quo doesn't work, for a few reasons:
1. Inherent system complexity ensures something always breaks. Human attention is always needed.
2. Markets evolve rapidly due to technology changes, regulatory changes and the very nature of markets: Strategies and ideas that worked in the past cease to work in the future.
Once two different traders discover how to "perfectly" trade, the game changes, and, as they say, "here we go again."
- currency is very different from equities - while one could argue that a central bank deciding to peg its currency to the dollar is a realistic scenario, it simply doesn't make sense for equities (an equity is basically a claim on the present and future earnings of a company - good luck "perfectly" predicting that!)
- as you note, there would still be speculators, since there is always uncertainty. There are numerous examples of countries pegging their currencies and then, when the unexpected happens...
- traders / sales would still compete for order flow of the clients (and get the cut), so there would still be a game to be played albeit for smaller margins
* actions of any given platform don't meaningfully influence the market
* all have ability to forecast weather with 100% accuracy
* all have ability to perfectly predict human behavior
* all acquire the exact same news at the exact same rate
* all connect to the same exchanges with the same latency
* all have the same amount of capital to leverage trades
Simply put, there are incalculable ways for a trading platform to gain advantage over another, so I don't think this will be an issue in many lifetimes.
Yes - this... Buying or selling even 1 share in the markets adds information to this giant calculating engine. Anything you do will alter the course of the future.
I've never seen a backtested strategy that didn't look great on paper. The moment you drop it into the market, you can expect a steep discount to those paper results.
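One reason every backtest looks great on paper is selection bias: pick the best of many candidate strategies on the same history and you've fit the noise, not an edge. A toy illustration, assuming coin-flip "strategies" with zero true edge and no real market data:

```python
# Selection bias demo: the best of many random strategies looks great
# in-sample, then its "edge" vanishes on fresh data. All numbers are toy.
import random

random.seed(42)  # deterministic for reproducibility

def simulate(days: int) -> list[int]:
    # Each day the strategy wins (+1) or loses (-1) at random: zero true edge.
    return [random.choice((1, -1)) for _ in range(days)]

n_strategies, days = 200, 250
backtests = [simulate(days) for _ in range(n_strategies)]

# "Research": keep the strategy with the best in-sample total P&L.
best = max(backtests, key=sum)
in_sample = sum(best)

# "Deployment": the same (still random) strategy on out-of-sample data.
out_of_sample = sum(simulate(days))

print("best in-sample P&L:", in_sample)
print("out-of-sample P&L:", out_of_sample)
```

The in-sample winner is almost always well into positive territory purely by chance, while its expected out-of-sample P&L is zero, which is exactly the "steep discount" you see going live.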
Doesn't make the decisions for you but manages the other 80% of the complexity involved in deploying a trading strategy.