You Are Not Paid to Write Code (bravenewgeek.com)
328 points by tylertreat on Nov 17, 2016 | 167 comments



My working theory about this is that the industry is driven by resume-driven development. Recruiters filter resumes with a keyword-based search: 'Do you have React experience?', 'What about Docker?', 'Do you do Ruby on Rails?'. If you cannot show experience in the current hotness, your chances in the job market diminish. So people pick up the new hotness and implement their next solution with that technology, whether or not it is the right tool for the job. If we fix the job market, this problem might automatically disappear.

Another solution could be encouraging personal projects. Many companies overwork their employees to the point that they cannot do anything in their spare time. Give employees free time and encourage them to play with new technology in their personal projects. The curious ones will have an outlet to channel their energies, and you will get a rock-solid stack. Also, they will be knowledgeable enough to take you forward when a problem that requires their newfound knowledge arises in your organization.


> Another solution could be encouraging personal projects.

Agree. This is necessary to encourage learning, which helps keep people sharp, which in turn benefits the projects that they are working on in the day job. Even if the technology is the "same old".

A couple of decades ago, many software developers were content using the tools they had and knew. Few demonstrated eagerness and openness to try something new. Those few were also the ones who were better at coming up with out-of-the-box solutions, because they were willing to try new things -- both tools and approaches. Now we have reached a point where the bare minimum required of a software developer is familiarity and comfort with the new and shiny, leading to the reverse of the earlier problem: people spend so much time and energy learning and trying out new tech that they miss the opportunity to stay with a technology long enough to learn from experience.

In other words, today we spend far more time on accidental complexity than on essential complexity.


> encourage learning

Hell, I'd settle for just not discouraging it so damned hard.


"..today we spend way too much time on accidental complexity than on essential complexity"

This. So much this!


The root cause of writing unnecessary code is the way the work day and compensation systems of the modern workplace are structured. Employed software and infrastructure engineers are not paid to just drop in the simplest, most appropriate solution, do the minor customizations/reconfigurations that may be needed, and get out. They're paid to be butt-in-chair for at least 8 hours per day, and the presumption is that they're engineering that whole time. In reality, places often must make up tasks to keep their employees busy, and using the most efficient solution is disincentivized, because you have to say you're doing something for those 8 hours.

I think the fact that we have the white-collar workforce organized after the assembly line pattern is the primary cause of unnecessary, borderline intentional complication, which includes mixing in unstable/untested components. If we could find a way to compensate engineers fairly without monopolizing their time, we'd be much better off, because the need to continually invent additional work for oneself would go away.

The bliss of a stable, low-maintenance project can be had through side projects, which often work fine for years with minimal modifications. It's really rewarding to go back to a utility you wrote and know that the comparatively small amount of time you spent on it is still paying dividends years later. There's an awesome sense of pride attached to that.


Yes, that. I always feel managers are pressured to make sure devs are worked 8 hours a day, whether necessary or not. As for side projects, it is really hard to keep a flow doing that, knowing that at any time you may be given a "real task" or a maintenance issue to work on. Maybe contractors have a different work day, since they are paid by the hour to do specific things, but full-timers always face having nothing to do yet not being allowed to go home or to work on something that builds knowledge.


> In reality, places often must make up tasks to keep their employees busy, and using the most efficient solution is disincentivized, because you have to say you're doing something for those 8 hours.

This does not match my experience. Slow times are for experimenting with things and catching up on less-urgent tasks.


> Slow times are for experimenting with things

You've led a charmed life. In my 25 years in this business, I've found that very little induces the same level of panic in management as "experimenting" with something, especially something that you can't put a solid "business case" behind and "estimate" down to the nearest half hour.


Which is only to be expected. Middle management is not about leadership but all about mitigating risk (Seth Godin explains this quite well in 'Tribes').

It's both a shame and moronic that assembly-line paradigms of management are so often applied to non-assembly-line work (i.e., work that isn't the same on each iteration).

What's needed is to reframe work in terms of value instead of time wasted.


What's interesting is that if you look at history, there are different 'assembly line' paradigms of management: one that could perhaps be termed the 'classic American' style (which, it could be argued, necessitated the rise of old-school unions, leading to anti-productive standoffs between management and labor), and the 'Toyota style', which ironically was founded/seeded in Japan by Americans who could not get the larger American car companies to listen at the time.

The 'Toyota style' could be argued to be more responsive to worker initiative, and it fits better philosophically with software work. (And Agile is influenced by Toyota-way thinking too, though in a mutated form.)


The "Toyota style" has already played a substantial role in the philosophy of influential technology companies like Apple and Pixar. Steve Jobs was a known admirer of W. Edwards Deming, who created it.


> Recruiters filter resumes with a keyword-based search

Although true, this is somewhat shooting the messenger. If, as a client of the recruiter, I ask for COBOL developers and all I get are Clojure resumes because the recruiter thinks it's cooler, then the recruiter won't be my recruiter for very long.

Surely where there's an issue, it's that the recruiters are being asked for developers with those attributes. They're presumably being asked because the employer, for good or ill, is seeking developers with those attributes.

Now, I've personally seen 4 reasons for hiring devs with cutting-edge knowledge: because it's cool, because it's the only way to attract good talent, because we checked and we absolutely need that technology, and because we run a research lab looking into cutting-edge tech we may or may not use.

The latter two are very good reasons to seek hotness. They're also the least likely to be the reason, in my experience. Using shiny, shiny to attract talent is often a successful approach. And who doesn't like cool stuff (though it's a terrible justification)?


It has been suggested several times on HN that another reason is that it attracts younger coders.

Younger coders tend to be more willing to just follow the lead of the management, rather than ask "is this really going to solve the problem?"

Younger coders will put in massively more hours -- this may result in problems like reinventing the wheel, piles of garbage code, etc. -- but it generally "feels good" to be the manager with an energetic, around-the-clock team of young coders.

Younger coders can be paid less, and are willing to forego benefits, stability, medical plans, or even pay sometimes.


Almost half of my time in the past few months has gone into interviewing and hiring, and I approached the subject with a complete beginner's mindset and tinkered to see what works. Much to my chagrin, it turns out there is a positive selection effect when filtering candidates' resumes for hot tech. Maybe the smart ones know this increases their employability? Maybe they are naturally more curious? Who knows; it just seems to be a good signal.


> hot tech

your bias is showing.

I'm not sure what you consider hot tech, but I can assure you that whatever it is, it is based on tech that has been around for ages and has just been rebranded. I find that people that hop from "hot tech" to "hot tech" don't ever build up a depth of knowledge and understanding to tackle hard problems in an elegant way. It is essentially a major contributor to the "1 year of experience repeated 10 times" phenomenon.


>I find that people that hop from "hot tech" to "hot tech" don't ever build up a depth of knowledge and understanding to tackle hard problems in an elegant way

You immediately assumed that the folks I found are just jumping from tech to tech without building anything substantial. Which is probably also bias, just the opposite of mine. You also assumed that I didn't vet them or test them on "elegant solutions". Maybe your heuristic worked for you. But this heuristic, in my current situation, is thus far working for me.


Tackling a hard problem is often as much a social endeavor as a technical one. You have to convince everyone the solution works despite all the biases they might have.

Maybe that's why many of the major changes you see in tech, like Linux, Git, Vim or Redis, started as a single person's work rather than as a clever corporate product.


I saw this at one large UK telco. Someone was going for a promotion, so they spent a year and a team of 15 people redeveloping an existing Perl system into an Oracle-based one, as Oracle was the company standard.

One of the markers for promotion was managing a team of a certain size with a >1 million budget, so they were gaming the system. Not the best use of the shareholders' money.


The overwork thing has to be culture. I'm sure they think they're minimizing costs, but I doubt the additional labor costs more than the additional time of the current staff doing it all alone while stressed out and so on. The "stressed out and so on" adds up to negatives that your company will pay for.


Remember that managers too respond to incentives. People working 9-5 are probably more productive than those doing 12 hour days, but which is more likely to keep the VC money flowing?


Also, most managers won't act on anything they can't measure directly.


It definitely factors into employee turnover rates. That is probably also where employees hurt themselves. People don't like looking for work and for good reason.


It's worth noting that the buzzword-keyword-madness is driven by web stuff, and amplified to an extreme the more a job is web and/or frontend related.

If you don't work in that field, you're fine :D


> You are not paid to write code

Well, actually yes, yes I am. You can make a compelling case that this shouldn't be so, or that the people paying me are shooting themselves in the foot by running things the way they do, but the fact remains that in today's daily standup/weekly status report/what-have-you-done-for-me-lately "agile" corporate environment, I am absolutely paid to write code, and if I don't write code, they'll stop paying me.


Well, actually no, the author is right. You are paid to deliver results, be it adding new features, processing data, making widgets or whatever. The business really doesn't care how you do that.

Obviously you do write code in the course of providing those results, and all of the author's examples of what you should be doing are still examples of writing code, but they are the minimum code required to deliver results.

Another example of how you're not paid to write code is the frown you receive from stakeholders every time you say you are going to refactor something, or that you are going to rewrite the API to be RESTful.

They don't care, and they shouldn't.


I think a thought experiment illuminates the difference. Two separate but identical companies with unlimited vacation get two people to do the same project.

Workaholic Walter shows up to work 7 days a week and works 10-hour days for 3 months, busting his butt to write a piece of software that ultimately fails because he spends so much time working that he misses the big picture.

Smart Steve does some research, discovers the project already exists in open source, downloads it, and sets it up in a week. Then he takes a 10-week vacation because he just saved the company so much money.

In these two situations Steve objectively did far better, but I bet your sweet butt that he gets fired, and Walter gets promoted. The problem is no one ever really knows how much value you added, or how much work you did.

I've been on projects that were incredibly successful from a technical perspective. Objectively the product performed exactly like the stakeholders wanted, the performance was great, and it could run for months at a time, even while being actively configured by people completely unfamiliar with the software, and not crash. It was still a failure because no one wanted to buy it. We spent a year kicking butt and still added 0 value to the business. But I still got paid. And I wouldn't have been paid any more if the project had instead resulted in an extra $12 million in revenue.

You pay a hair stylist to cut your hair, not for the business deal you may or may not get because of it. Why someone pays you is different from what you are paid to do.


> In these two situations Steve objectively did far better, but I bet your sweet butt that he gets fired, and Walter gets promoted.

I wouldn't bet on that.

No one is aware that Walter overworked because no one saw him. Plus, he didn't deliver anything in the end.


He delivered a shit ton of code; managers saw the progress pretty much daily, or at least weekly. The failure of the project can be blamed on many things, so ultimately, maybe Walter was just digging the right (albeit crude) ditch in the wrong place. Someone else's ass may or may not get busted for the project flop (usually depends on the size of the company).

In the meantime, Steve gets sacked because he's obviously not working - so why should he get paid?

The point is, companies would want you to focus on business goals, but in fact they pay you for the amount of code produced (and punish you if it's low).


> No one is aware that Walter overworked because no one saw him

I know Walter. He made a big point of sending e-mails at 2 AM, CC-ing the recipient's boss, and then following up at 8:00 the next morning asking why you hadn't responded yet.


This behavior is seriously wrong. It's insane that it could go on for months without feedback or blame.

The question is whether it comes from Walter or from the company. Either let him go or move to a better environment.


Technically speaking, no, I'm not paid to write code; if you ask that question during an interview, the company will tell you that they don't pay you to write code as well.

However, when most companies' idea of an "agile" development cycle is to have smaller goals and very short deadlines, you end up being a code monkey. A project might run better if you had time to design a better object structure; instead, because of a tight deadline, we have to slap on 10 if and else-if statements and call it a day.

Making an enhancement? Add 2 more if statements. Fixing a bug? Put some if statements under the previous if statements.

Bottom line: because of the tight deadlines, better code and better design mean nothing. We become if-statement-writing code monkeys.


> Obviously you do write code in the course of providing those results

Well, then you _are_ being paid to write code. We can yammer on all day about how we're adding value, but the programming skill and talent have to be there first. So you're being paid both for the value of your skills (being able to code proficiently and efficiently) _and_ for increasing the equity or profit of the business.

An analogy might help. When I pay a contractor to add improvements to my house, my goal is to increase my home equity. So I am paying for value. But I'm also paying for his skill, because if I get an unskilled contractor I might end up underwater. An unskilled contractor can cause me to go backwards, where the cost of fixing his bad work exceeds the equity boost. The same dynamic applies to bad programmers. Skill is the primary driver of any increase in value.

> but they are the minimum code required to deliver results.

We're not looking for the minimum amount of code, but the best solution. Writing tests will add overhead to your code. At a minimum you could leave them out. But the benefits of rigorous testing can justify the investment of time.


You are buying into the author's conflicted statements and continuing to spread the dissonance. Dissonance is what got us into this stupid Trump situation. I'm telling you that you can't tell someone who says they are paid to write code that they aren't paid to write code, and then rationalize some weak argument for why you are "right". You aren't right, because op is telling you they are paid to write code. All that matters is that "op" is "writing" "code" because they said they were. You don't get to say whether they are or not, and the author doesn't get to either!

It's completely asinine and blaming to speak for others' actions. Stop doing it. It's ruining this place. <-- This last part is a blanket blaming statement speaking for all others, and I really shouldn't say it.

P.S. I've flagged the article given it's making "you" statements.


Well sure, but you're missing the point: if your managers are only measuring LOC produced, they're playing on a lower "logical level" than if they were measuring something closer to the actual intended goal(s).

Dijkstra: 'My point today is that, if we wish to count lines of code, we should not regard them as "lines produced" but as "lines spent": the current conventional wisdom is so foolish as to book that count on the wrong side of the ledger.'


Another relevant quote, by Bill Gates:

'Measuring programming progress by lines of code is like measuring aircraft building progress by weight.'


That is my experience as well. I have to do my personal development on my own time, buy my own books, and pay for my own subscriptions. For the other 8 hours, if I am not writing code or behaving like I am, things go south pretty quickly.


Agreed, but to tie it back to the article... At first you are paid to write code; then you are paid to maintain it. You become part of the system you helped build.

You are the steering wheel that aligns the overall thing's behavior with the interests of the organization that owns the thing. That's what the author calls "adding value".


Whenever solving a problem entails deleting a whole bunch of code and just writing a few clear lines, I love it. I think 'how wonderful! Look at all that code we are deleting!' I think this is in line with what the author is getting at: it's about solving problems, and the fewer systems and the less code, the better.



I came from learning all about toothpicks and glue, then seeing every problem as something to build with toothpicks and glue. It was quick and fun at first, but my creations were brittle and sometimes even dangerous. Gradually I became aware of duct tape, saws, plywood, nails, hammers, steel, etc., then pre-fabricated parts. Eventually, I even ended up buying some from vendors that followed compatible standards, so I could piece things together more quickly. In the end, having to reinvent things that were already made seemed stupid and a waste of my time, but unfortunately that idea only seems to make sense to people who have gone through the same.

But it ain't all bad, see. I hear some young folks are coming up with brand new types of toothpicks and glue; hell, some of 'em I hear put themselves together.


I had a new developer decide to make their own charting library. It seemed like the best idea was to let him try and learn why that was a bad idea. Looking back, I think many other people took the same approach with me.

Honestly, I think programming is at the point where apprenticeships are a good idea. Thirty years ago the tools were changing too fast, and eventually we can codify the correct body of knowledge to have useful formal education. But education seems to default to handing people a bunch of tools and getting them to learn on their own.


> It seemed like the best idea was to let him try and learn why that was a bad idea

This is exactly why it's a good idea to try your own implementations. You get intimately familiar with the problem. You understand the trade-offs, corner cases, api design, etc. far better that way. And there might be a slight chance that you come up with a better solution.

I've heard that Feynman had a motto, something like "If I can't build it, I can't claim to understand it."

Also, I like the idea of apprenticeships. At some point our academic culture seems to have shifted from liberal education to some kind of quasi-vocational preparation. This is probably in response to the death of apprenticeships when we started requiring a bachelor's degree for entry level professional jobs.


"What I cannot create I do not understand."


> Honestly, I think programming is at the point where apprenticeships are a good idea.

I strongly agree with that. There's still a lot of important "tactile" experience that is easier to transfer via mentoring than to learn from written material.

> Thirty years ago the tools were changing too fast

My impression is that the pace of change of tools is only increasing (to pretty ridiculous levels if you look at the web ecosystem).

> and eventually we can codify the correct body of knowledge to have useful formal education.

This should be the goal, yes. But I don't feel like we have identified much of the knowledge that's worth codifying. Instead, the industry seems to be running in circles, each iteration less efficient and more bloated than the previous one. I wonder what the way out for us is?


Programming should have always been an apprenticeship system. Maybe add in formal education like a Comp Sci degree, but I've been saying since I entered this industry (which isn't THAT long ago, but still) that apprenticeships would be much better than just going to college.


Yah, when I was first learning JavaScript, I made my own date library, if you can believe it. I learned a lot, but at some point it seemed idiotic; only from the perspective of having done it, though, not before. We are 4-dimensional after all, and information assimilation seems to be also.


Not sure what the article's point is. That you shouldn't write an application if you can use an off-the-shelf tool? Who does that?

I'm paid to write code because someone wanted an application. I don't think I could tell the customers who buy our application that they should learn some Bash or Excel and solve their business problems themselves.

I assume that under some rare circumstances a developer isn't faced with writing an app (site, CAD program, OS, ...) but instead with solving some business problem internal to a business - then you have the option of not writing code.


I think his point is that engineers tend to favor complicated solutions, and that they shouldn't. Every line of code you write or modify is potential technical debt that you'll have to own. The author indicates that this is bad, and recommends making wider use of pre-existing solutions to prevent it. That's fine as far as it goes, but it's a simplification.

The piece does feel incomplete in that it doesn't discuss the incentives behind the decision to "write code" or not.

The simple fact that we always seem to come back to is that once you learn the basics, good engineering just comes down to good judgment, which is only gained through years of dedicated trial and error. Is it better to write some glue code to bounce data between 8 different extant programs and make final transformations or to write your own program that handles the process soup-to-nuts? The answer is now and will always be "it depends" (and quite frequently, it involves elements of both). We need to make sure that the social incentive structures align with the engineering goals (i.e., don't tell someone that if they make something stable and low-maintenance, they'll unemploy themselves) and then we can trust people to do good, iterative work.
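For a flavor of the glue-code option, it often amounts to little more than a sketch like this (the endpoint, JSON fields, and table name are invented for illustration):

  # Hypothetical glue: pull records from one service, reshape, load into a database
  curl -s https://api.example.com/orders \
    | jq -r '.[] | [.id, .total] | @csv' \
    > /tmp/orders.csv
  psql -d reports -c "\copy orders FROM '/tmp/orders.csv' CSV"

Four lines, three battle-tested tools, and the judgment call is whether the seams between them will hold up better than a bespoke program would.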


Every line of code you write or modify is potential technical debt that someone will have to own.

With the turnover rate in this industry, I'm not sure anyone plans to be around when the monstrosity that they wrote starts to become a maintenance nightmare.


> solve their business problems

Your job is to solve business problems. You happen to use code as the tool to do so because it lends itself to building better, more reliable, more scalable solutions.

Nobody pays for a lump of code, they pay for a tool that solves a problem they have.


Depends really. A surprisingly large number of organisations don't seem to be equipped to handle business input from the development staff or any kind of "productivity multiplier" contributions. E.g: making the team more efficient, reducing signup friction, fixing UX annoyances, inventing a new feature or product, improving security practices, reducing defect rates by introducing CI/static analysis/whatever.

In this case the job of the developer will be to write code and it doesn't make sense to pay high salaries because the ROI is not there.

This leads to the situation where very good developers aren't paid good salaries because in such a company they are literally not worth the money.


Except there are two common problems here:

- NIH. If you think it's programmers who primarily suffer from NIH syndrome, check again. How many times have you ended up writing something that already exists because your bosses insist they must have an in-house solution?

- Nonsolving a nonproblem. A lot of code we write is meant to solve a problem that doesn't really exist, is created by stupid decisions higher up the chain, or sometimes just shouldn't be solved in the first place (e.g. tools facilitating an unethical business model).

So yeah, I'm often paid to write code - because the decision that code must be written is made by people above me.


The customer pays my employer for a solution, and my employer pays me to create it, but I get your point. I just didn't agree with the argument of the article in general - most devs can't use a piece of shell script to solve a problem they would otherwise code, simply because we work within the boundaries of a single product, such as a large desktop application.

I don't solve my employer's business problems; I solve the customer's business problems, and the only way to do that is to code it in the application.


> Nobody pays for a lump of code, they pay for a tool that solves a problem they have.

A more cynical situation, which I have seen all too often, is that they pay for the line items on the RFP. Whether or not those features will ever be used or are in any way beneficial.


Incidentally this very often comes about by writing code. Or installing new systems. There's a strange broken circularity to that blogpost.


Sometimes, as a developer, I ask higher level questions about the business. Who is the customer? What problems do they have that we are trying to solve? How does this business make money? How are the executives measured?

These are questions that are relevant if you want me to add value in a non code monkey way. But, sometimes, when I ask these questions, I get looks that say "You're paid to write code, so stop asking non code questions." Or maybe I'm just projecting.


> Nobody pays for a lump of code, they pay for a tool that solves a problem they have.

Of course they're paying for code. They're paying for code that does something useful, and which may or may not produce value for their business. You can't separate the two.


>That you shouldn't write an application if you can use an off-the-shelf tool? Who does that?

Lots of people? Everyone who ever bought, downloaded or otherwise acquired any software not written by themselves?


Yes, sorry if I was unclear: that was my point. When I as a developer am paid to do something, usually someone (my manager, the customer, etc.) has already decided that whatever problem there is can't be solved without making a program specifically for it.


The major problem of bringing in external technologies is that they take over your architecture, not that they might introduce bugs.

There is no catch-all architecture, so there is guaranteed to be some impedance mismatch between the expectations of your project, and the provisions of the 3rd party tool. Heaven help you if you need the facilities of multiple architectures, and try to marshal and connect disparate datatypes and calling/threading assumptions together.

As programmers, we work with general purpose programming languages. Many project-specific problems are not difficult to solve in a custom manner, given somebody with enough experience and hindsight to know how to write such a forward-looking thing robustly. It is a serious consideration whether or not to defer your architecture to generic external sources that were not written with your unique needs in mind. And even if you do, it is by far a best practice to ensure that such things live behind an application-specific abstraction separating out your project-specific code from entangling 3rd party code, allowing you to perform the inevitable migration to a different platform later.
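A minimal sketch of that abstraction idea, assuming a shell-scripted project and an invented wrapper name: give the 3rd-party dependency an application-specific front door, so the inevitable migration later means editing one file instead of every call site.

  #!/usr/bin/env bash
  # app-thumbnail (hypothetical wrapper): our interface; ImageMagick is an
  # implementation detail. Callers run `app-thumbnail in.png out.png`, and
  # swapping in a different converter later means changing only this script.
  set -euo pipefail
  convert "$1" -resize 128x128 "$2"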


This is an issue we're grappling with at present, as we move all our sites out to AWS: do we build our apps to use the incredibly tempting AWS services, or do it ourselves?

Our present approach is "hell yeah, if it can be outsourced it should be" and that applies to services too. We do keep an awareness of what it would take to do each thing by hand again though.


With AWS, I think cost becomes a significant issue in selecting which services to use, more so than development issues.

But then again I've also always found that once you start coding to it, AWS gets you 90% of what you want, and you spend the other 90% of the time fleshing out the exact retry, migration, and error-handling behaviors you need manually on top of it. :-P


This is employer-centric in my opinion - sure you're not paid to write code, and sure you can reuse existing code base in new and exciting ways.

But - there's also value in learning new things, in new skills, in new toolsets. Maybe it's not best for the employer, but for you the employee it can help quite a bit.

There's a fine balance - ideally your employer encourages you to learn things on the job, and to try new things that may not necessarily make it to production, but improve your skillset overall.


Agreed. Employees are investors. They invest their time and future technical direction. Making sure those investors get an acceptable return on their investment is just good business. Encouraging side projects, mentoring, and growth is a way to improve the returns on the investment employees make in their employers and their customers.


That's an interesting take. I think it's important to also remember there's a fine line in breadth vs depth learning in this field as well. So called 'Taco Bell' programming does encourage you, in a way, to learn the basic tools in greater depth than perhaps you would if you weren't forced to.


I completely agree with that - it's exactly like math: the only way to truly learn it is to practice, over and over again, until you've mastered the concept.

This is something I'm coming to terms with in my career right now (~3 years in): do I learn something to a mastery level, even if it's not the most marketable thing, or do I focus on breadth first and shoot for depth later?


> In fact, code is a nasty byproduct of being a software engineer.

This is the core of the matter.

This is something Dijkstra, Edward Cohen, Jayadev Misra, and others in and around the formal methods camp have been saying for decades. It is more worthwhile to solve the problem than to guess your program into existence and patch the errors later. To dismiss them is to say you do not appreciate the true difficulty of programming and designing systems.

In practice I don't write formal proofs for every line of code -- how exhausting! But I do often write high-level specifications in tandem with the software implementing them, and often one informs the other as to the true nature of the problem I'm trying to solve. And I find that as I improve in different logics and the predicate calculus, I can spot the design errors in the structures formed by code that you don't see if all you're doing is trying to "solve the puzzle."

The whole approach to guessing your program and patching the bugs later is far too addictive. It saddens me when otherwise good programmers fall into this trap. It creates work for oneself but it keeps you from solving the real problem at hand!

Nice article. Not sure about the allure of "Taco Bell," but the spirit is in the right place.


Where can I read more about this? "Formal methods"? Being able to write proofs and tests/frameworks around the things I need to write would help me be more confident in what I do, and also help me keep myself on track and not end up going down rabbit holes.


Most people are writing business logic, not algorithms (a website, not a video encoder). If that's you, you should also check out BDD, especially frameworks like Cucumber, as that may be useful.


Programming in the 1990s by Edward Cohen is an interesting book for introducing software developers to a method of proof writing by way of predicate calculus. It's a book one can read from cover to cover.

The TLA+ book by Leslie Lamport is also an interesting read even if it is specific to a particular logic for creating specifications. Any of his talks on TLA+ are also good for a high-level overview of the why and a bit of the how as well.


There are several different kinds you might use. One group is designed for formal specification rather than verification, where you're mainly making the design easier for people to understand, with some tool checking. Another type is designed to help a formal logic prove specific things about your software. Then there are lighter-weight versions, like model checkers, that explore abstract models of the system to prove properties about it but operate on constrained problems.

Here's an introduction to Z specification language that illustrates how it uses simple formalism to precisely specify requirements and design going from abstract to concrete:

https://people.csail.mit.edu/dnj/teaching/6898/papers/spivey...

Alloy is a model-checker in a similar vein as Z that's designed for easier use and automation:

https://www.doc.ic.ac.uk/project/examples/2007/271j/suprema_...

Although TLA+ is popular now, SPIN was the goto model-checker for analyzing software for concurrency errors:

http://www2.imm.dtu.dk/courses/02152/CP/spin.pdf

Software Foundations teaches you formal verification of the kind that is used in cutting-edge academic projects like CompCert compiler or seL4 kernel:

https://www.cis.upenn.edu/~bcpierce/sf/current/index.html

Language designers can also include lightweight, formal methods into their type systems like Design by Contract which helps knock out interface/integration errors:

https://www.eiffel.com/values/design-by-contract/introductio...

Finally, since logic programming is so fast these days, one can even translate logical specs into logical programs to straight-up execute the specification:

http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.34....

Note: The Z, Alloy, SPIN, and Coq tools referenced above all have websites with FOSS software, tutorials, etc. Just Google them.


Eiffel, I hear, was (is?) quite cool. I've read some of Bertrand Meyer's work and have learned a lot from it.

One area that is becoming really interesting to me is in dependently typed languages and their ability to encode at least some of the properties of the formal specification in the type system.

I haven't had much time to take a deep dive into Agda but what I have challenged myself with gives me hope that this may be the way we write programs in the future (even if it's not Agda per se).


Re Meyer

The neat thing is his method was adopted commercially with years of success despite never showing up on Hacker News, etc. It combined efficiency, OOP, basic safety, and concurrency safety (SCOOP). Most talk about adoption of formal specs in theory but he got it done in practice with DbC in industry. Now, tools like SPARK, Ada 2012, and Perfect Developer are using it. You can do it in C, C# and Java, too.

Btw, in case you missed it, his Theory of Programs defines most aspects of programming in an elegant, precise way using set theory:

https://arxiv.org/pdf/1507.00723v3

Re dependent

Learn from the master:

http://adam.chlipala.net/cpdt/

As far as whether they're good, I'm not sure, as formal methodists have endless arguments about it. The critics basically say you have to put in enough spec and proving work that you might as well do the real thing in a prover and then extract it. Chlipala actually does something like that, called Programs as Proofs I think. Memory getting fuzzy. Just remember that dependent types vs. proving is in dispute at the moment.


Every now and then we'll see an HN thread that asks something like: what do you know now that you wish you would have known when you started programming?

This. This is the thing that took the longest to sink in and had the biggest impact. There were a lot of cool languages, tools, platforms, and systems along the way, and I was stoked picking up each one and coding -- but after decades of that, I realized I was focusing on the wrong thing.

I think the thing that convinced me was when I got to start watching lots of technology teams in the wild across multiple organizations. So many times I would see conversations and tons of problem-solving effort being spent on the tools to solve the problem instead of the problem itself.

A couple of years ago I was teaching tech practices to a team that was deploying on iOS, Droid, and the web. After we went over TDD, ATDD, and the CI/CD pipeline, I emphasized how crucial it was not to silo. When I finished, a young developer took me aside.

"You don't understand. We have coders here who just want to do Java. They want to be the best Java programmers they possibly can be."

I told him that he didn't understand. Nobody gave a rat's ass about people wanting to program Java. They cared about whether the team had both the people and technical skills to solve a problem they have. It would be as if a carpenter refused to work on a cabinet because it wouldn't involve using his favorite hammer. You're focusing on the wrong thing.

Sadly, once you get this, the industry is all too happy to punish you for it. That's because the resume/interview/recruitment world is interested in buzzwords and coolkids tech, not actually whether or not you can make things people want. This means sadly, in a way, if you continue growing, it's entirely possible to "grow out of" programming.


> I think the thing that convinced me was when I got to start watching lots of technology teams in the wild across multiple organizations. So many times I would see conversations and tons of problem-solving effort being spent on the tools to solve the problem instead of the problem itself

Your insight really hit home, especially since I've been in a new software dev position for about a month now. This is the overwhelming issue that has already reared its ugly head; it's massively frustrating, and I feel (nearly) powerless to stop it.


Actually I'm paid to do this: http://devhumor.com/content/uploads/images/November2016/mode...

Writing code is just the only way to do the above since all attempts at making it less text file driven have failed so far.


That only applies to web dev and to a subset of user-mode PC, server and mobile software. While I understand that represents the majority of the software being developed, that article is IMO an incorrect generalization.

Some of us do CAD/CAM/CAE, the proven tools either don’t exist, or are targeted towards companies like Boeing and GM and cost a fortune.

Some of us work on console games. The environment is pretty close to the bare metal/hypervisor, and typically, there’re significant costs involved in integrating any third-party stuff.

Some of us program embedded firmware. Only recently has the hardware become fast enough to reuse those proven tools ported from the PC; you can't use them on a PIC16, there just aren't enough resources.

That's just what I personally have done/am doing over my 16 years in the industry. I'm sure there are other fields where the proposed approach is inapplicable for various reasons.


This is a good article, and the quote in the middle is absolutely amazing -- it belongs up there with some of the most insightful quotes about software ever (and it was not even directly about software).

I sum the quote up as, "Systems are sentient beings like the One True Ring, and they will absorb you. Soon, though you believe you are thinking freely, you will actually be merely a part of the System, thinking what it wants you to think." So true!

But ... Taco Bell programming still creates a new system. That's a flaw in the article's premise.

If you solve a problem by stringing together 11 tools, then yes, you should get some benefits from reusing preexisting tools. But now you have a system with some Rube Goldberg characteristics, plus you've written a bunch of "glue code" (which is "new code") in the process.

Those systems can often turn out to be more complex.


It's pretty amusing when you have to keep repeating to a client that the off-the-shelf tool is better, cheaper, and is immediately available to them. Yet they still argue for the development of a new system. So I don't think this push towards custom builds only comes from the developers side of things.

And another thing worth pointing out is: most employers hire you to write code. That's the job spec. So don't be surprised that that's what we end up doing.


The worst is when people buy off the shelf and it's close to what they want. Then they ask for changes.


> they ask for changes

And expect those changes to take an hour (maybe two).


Wake me up the next time a 5-line shell script in Bash that uses only standard tools and runs on any default Debian sells with a license cost of $20 (only) -- you know, for the 5 lines, not anything else -- and anyone pays it.

Doesn't matter what it does. It is literally impossible to sell even $20 worth of the solution the person advocates. Go ahead: link me to anywhere on the web selling a 5-line Bash file without anything else. I'm waiting.

You can do a lot in 5 lines, too. lalalala. waiting. While I wait I think I'll make some money coding something people actually pay for. /s

Seriously though, while the author's point might stand in many sysadmin and even systems-integration roles, most of the software world actually pays for deliverables: the clearest example being consumers doing so. People would rather spend 20 hours cursing and then give up than pay for a systems-integration script that generalizes and solves their problem. It's what the market demands. This article would really make sense if it came from the person paying -- but it doesn't. Nobody who is paying actually says the words the article chose for the title. Yes, you are paid to code.


> Wake me up the next time a 5-line shell script in Bash

Or when somebody can write a 5-line shell script in Bash that:

a) typical non-programmers can and will use, and

b) produces enough diagnostic information that anybody (programmer or not) can use to troubleshoot when something inevitably goes wrong - whether it's user input, network connectivity, a missing dependency...

I get what he's saying, but he's talking about a pretty small subset of what we really do.
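To make b) concrete, here's a rough sketch of the diagnostics even a trivial fetch-and-save script needs before non-programmers can run it (the URL and paths are made up for illustration):

  #!/usr/bin/env bash
  # Hypothetical example: download a report, with enough diagnostics to troubleshoot.
  set -euo pipefail
  url="https://example.com/report.csv"   # made-up endpoint
  out="/var/tmp/report-$(date +%F).csv"
  command -v curl >/dev/null 2>&1 || { echo "error: curl is not installed" >&2; exit 1; }
  curl --fail --silent --show-error "$url" -o "$out" \
    || { echo "error: download failed; check network and URL" >&2; exit 1; }
  echo "ok: wrote $(wc -l < "$out") lines to $out"

A couple of lines of happy path, and nearly as many again just to say what went wrong.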


I'm not 100% sure why you're asking for a link to a product when I think it's fairly clear that he's talking about business solutions. Or to give a weak example: "we have system X and system Y and need them to talk."

After you write something like that you don't turn around and sell it on the web.


I was paid pretty well yesterday to judiciously delete a lot of code, and I had a blast.


I love deleting code. It's probably when deleting code that I feel most productive. And of course: deleted code is debugged code.


People get weirdly apologetic when they have to tell you that code you have written isn't getting released/merged.

I think it's great. My future support burden just decreased. That is a big piece of code that will result in zero bug reports.


I don't know about you, but I'm paid to solve problems.


Me, I'm paid to translate someone's idea from one language to another. The value to the business is already understood by the one conveying the original idea. My only job is to ensure that the entity on the other side fully understands what was originally communicated, without hindrance from language barriers.


That sounds great, but I never see clients who actually understand what they need.


I am paid to mainly fix bugs. "Writing code" is mainly for free time and enjoyment.


Cool. I am a janitor.


I think that a sysadmin is the modern equivalent of one. Heck, administrator is another name for a janitor. Janitors also solve a set of problems.

Except they might have input into the processes.


I explain my job as computer roadie. I get all the black boxes into place so the devs can get up and be Eric Clapton.


You know those occasional threads about engineer burnout we see and post on HN? Well, I actually feel that when you are "burned out" and don't feel like writing complex systems is "fun" anymore, you become a better engineer - exactly for the reason described in this post.


Exactly. After burnout, I spend a lot of time trying to kill code and make simpler designs. However, my outsourcing firm does not like that.


I can't emphasise enough how true this is. I've written tonnes of temporary scripts that parse some files or rename some directories, and then later discovered that I could have done it with a single UNIX command.

I'm not employed as a professional software developer, so I still don't actually use the helpful UNIX commands. Takes all the fun out of it. :P
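For instance, two classic cases where a hand-rolled script collapses into a line of standard tooling (filenames hypothetical):

  # Pull the second comma-separated column out of a file
  cut -d, -f2 data.csv

  # Rename every .txt file in a directory to .log
  for f in *.txt; do mv "$f" "${f%.txt}.log"; done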


The other day I had problems with grep, so I rewrote it in JavaScript ... I could have used awk, but at least my script is more than twice as fast as awk. I think it's some sort of procrastination that comes with being self-managed. Reminds me that I should get off HN and start working.


> I could have used awk, but at least my script is more than twice as fast as awk.

I find this very hard to believe. Are you sure your awk implementation was okay?


I used this:

  awk '/pattern/' file
I could upload my JavaScript to GitHub if anyone is interested.


What problem did you have that the easiest way to solve it was by rewriting grep?


I was investigating an e-mail problem, searching log files across several e-mail servers ... Did it go through this server, or that server? After a lot of frustration I found out grep didn't show all matches ... It does report the right count with the -c argument, though.
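One guess at what happened there: GNU grep heuristically treats a file containing NUL or other binary bytes as binary and prints "Binary file ... matches" instead of the matching lines, while -c still reports the count. If that was the cause, forcing text mode would have avoided the rewrite entirely (pattern and path hypothetical):

  # Treat the file as text even if it contains stray binary bytes
  grep -a 'message-id' /var/log/mail.log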


... in JavaScript, of all things? And it's somehow fast?


If you see yourself as a software developer instead of a problem solver, you tend to solve all your problems by writing code. In many cases, problems can be solved by adjusting processes, improving culture, education, or just by learning to use the current tools in more efficient ways.


Heh. It's much easier to write a kludge and patch it 500 times than perform a culture change at my place. I'll get rewarded for "fixing" the application and be seen as valuable in the former, and quite easily despised in the latter.


Solving technical problems is, unfortunately, usually far easier than solving people and political problems.

Computers do what you tell them to, most of the time.


"Developer" rather than mere "coder" should mean you're aware of the full import of the system from requirement to end user deployment and production. Sometimes it does.


Enter the insane notion of separating developers from end users. You have sales and managers talking to customers, then writing you a half-assed specification that's broken and illogical because they're not trained in noticing general ideas behind specific points. sigh.


To be fair, extracting coherent requirements has pretty much always been the problem.


True, but it's good to at least have someone deeply technical on the discussion too, just for the ability to come up with generalizations and - after presenting them to the customer - to confirm whether they're sound, or whether the customer has a completely different model in their head.

It's easier to do that with the customer than with the spec - specifications can't talk back to you.


I'm paid to turn a business requirement into software. I create products. Coding is an unfortunately complex step in that process.


No, you are paid to turn a business requirement into a SOLUTION. Yes, unfortunately most of the time you have to write software to reach that.


If the business requirement is to produce something to sell to other companies, more likely than not you will end up writing software. You simply cannot build enough entry barriers and protectable IP by glueing together a bunch of standard components in a clever way.


You're just a slave to business people, just a high-IQ one.


Every single human being is "slave" to others. We call that "specialization". The only exception - and not even completely so - are people like the old 18th/19th century trappers who lived alone in the West. But even they relied on civilization for traps, weapons and a lot of other things and had to deliver something (furs), so they too were just "slaves"? Even people living in hunter-gatherer societies are "slave" to one another. Try being useless to your tribe in such a society because "you feel like it".

Related: https://medium.com/@kevin_ashton/what-coke-contains-221d4499...


Good read. Reminded me of "Why I Strive to be a 0.1x Engineer"

http://benjiweber.co.uk/blog/2016/01/25/why-i-strive-to-be-a...


It is all about efficiency.

Can you dig a canal with a tried and true shovel? Yes, you can, but you will need a lot of time and many shovels.

Can you cook up a bioinformatics analysis in shell and awk? Yes, you can, but if you want a large-scale analysis done in a reasonable amount of time, you roll up your sleeves and reach for compiled languages, distributed systems, and so on.

There are downsides to introducing new systems, but they should be weighed against the upside of suitability and efficiency.


It's interesting that you use the analogy "can you dig a canal with a tried and true shovel". The attempts at building the Panama Canal [1] were basically just this, and failed. It's a great point though, and on-topic. I think a lot of the time large-scale software projects are approached in exactly this way: "Let's just hire a bunch of mid-level developers (shovelers) and get them writing. We can always add more developers." It seems to be the prevailing attitude with outsourcing as well.

[1] https://en.wikipedia.org/wiki/History_of_the_Panama_Canal


It would be cool if that were true in our industry. People digging canals know that to do it well, there are particular types of tools that are useful depending on the constraints of a particular canal design and environment.

In our industry? Your bioinformatic analysis can probably be done well with shell and AWK (however idiotic those tools are, but that's a topic for another day[0]), and this solution will be more efficient and can even be scaled up. But no, most people will pick the "Big Data" solutions that require a full-time person just to configure them and manage the machine clusters they use, all because it's trendy these days. There's a saying that "if it fits on a consumer-grade hard drive, it's not big data", and it's absolutely true.

What we need is more focus on efficiency across the board, and that means things like how much it will cost (in money and time) to develop / update / maintain the solution, taking into account the actual problem that's being solved, and not some imaginary version of it.

--

Also, food for thought: over the past decades, computers have gotten orders of magnitude faster. The software we use maxes out the hardware, and it still offers essentially the same set of features we had 15 or 30 years ago. Where did all that extra power go? It definitely didn't go into valuable work we can do with our computers[1].

--

[0] - coming back to probably the single biggest failure introduced by UNIX culture: doing everything in unstructured text. Every tool you use thus has to contain a half-assed parser for its input data, and produces output in a random, undocumented and often inconsistent format; a lot of shell scripting is just adding additional parsers to glue those tools together. The sad thing is, people knew how to deal with structured data long before UNIX, but it became forgotten, save for the PowerShell team and the web folks who try to backport JSON into UNIX-land.

[1] - except in scientific computing, video games and maybe CAD tools.


Your Shell and Awk example is funny since simple, single-CPU tools are often faster than the Big Data tools the industry is standardizing on:

http://aadrake.com/command-line-tools-can-be-235x-faster-tha...

http://www.frankmcsherry.org/assets/COST.pdf

It's true, though, that you should use a more complex tool if it's necessary to get the job done. It's just not clear that the complexity users and developers are dealing with is intrinsic to the problems being solved. It seems more social than anything. :)
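In the spirit of the first link, a sketch of the kind of single-machine pipeline that often outruns a cluster (file names and fields hypothetical):

  # Tally chess game results across a pile of PGN files
  cat *.pgn | grep '^\[Result' | sort | uniq -c

  # Top 10 client IPs in a web access log
  awk '{print $1}' access.log | sort | uniq -c | sort -rn | head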


I sorta get the idea of the article. Working smart, etc. However, the incentives in organizations often run counter to that. Gluing together existing pieces to do some grunt work is rarely good for your career. Building glorious new systems is much more likely to do that.

Also, for orgs that do development for other orgs, it isn't really optimal to offer the quick and efficient solutions unless competitors are doing the same.


In my current job I get paid for writing emails, reports and filling out forms. Programming is just something I sneak in when no one's watching.

The company I work for is super successful so there must be something to it.


Advice for the 99%. Some of us do embedded, where you need an actual driver or app. Can't dispatch interrupts in bash.


I'm paid to keep software systems up and resolve issues. Sometimes this means adding, removing, or modifying code. A lot of the time, it just means clicking the right buttons at the right time. Really, I'm paid to know which of those actions is appropriate in various circumstances, and to get it done when it needs to be.


Hello fellow Operator.


My current team suffers from a disgust for Taco Bell programming. I love it, and I push hard to use it for the projects I lead. I've even had other leads scoff at my use of Python because "the GIL will throttle your application!" Possibly, but I understand the performance requirements of my systems, and the GIL is the last thing I'm worrying about. I'm not going to write my own programming language, or use one that doesn't have a good ecosystem, because an existing language is deficient in some negligible aspect.

Engineers who build things from scratch do not understand a very profound fact about software engineering: a computer performs menial, repetitive tasks so you don't have to. Be lazy and use someone else's solution.

When I was 14 I remember writing bubble sort from scratch dozens of times, as well as input-scrubbing methods and binary search trees. I didn't know about the concept of libraries. Granted, the internet (SourceForge, GitHub, etc.) wasn't around, and I was writing everything in BASIC or Pascal. By the time I graduated high school I understood that writing code was painful.

Consequently, it's sometimes alarming that most systems I build rely on thousands of lines of code that I didn't write and rarely ever inspect. But I have to trust other developers in order to get my work done in a reasonable amount of time. I don't even consider myself a good programmer, mostly because I hate writing code. Nevertheless I do deliver useful, valuable systems. I know my problem has already been solved by someone else. I like being lazy. It's my best quality.

This isn't addressed in the article, but avoidance of Taco Bell programming is a symptom of a disease. The disease is ignorance; mostly in managers who are out of touch with the technical landscape in which their teams work. Any time I hear of a team building their own tools or doing significant amounts of de-novo work, I try to limit my dependencies on that team. You're never going to ship that shit, buddy, but I solemnly admire you for trying.


I fail to see the point of the article. What's the proposed alternative? To become an admin?

I also doubt most people write software from scratch. We use a well-proven language, IDE, compiler and deployment tool. We use existing libraries, frameworks, databases, caches and services.

Why doesn't that count as Taco Bell programming? When's the last time you wrote code for all these things instead of just assembling it all together?

This article is forgetting the lesson learned: highly configurable generic software that doesn't need a code change to adapt has actually failed us, and is a source of failure. That's why Domain-Driven Design was born. Write simple software that targets a single domain only. Write one of those for every domain. Do not try to use the same software, with multiple tweaks and config hacks, for all your domains.


I guess generic-ware like Microsoft Dynamics CRM is putting this to the test. Written once, configured for whatever domain you wish.


"You may not be paid to write code, but when being hired we're going to throw every convoluted or esoteric brain scratching coding test at you for you to pass, which you'll then never have to do again. Until you apply for your next job, that is."


The basic premise is true. Every line of code that you write is a liability. The best line of code is the one that you don't write. So the corollary is that the less code you use to solve your problem, the better.


That's how I read it. We also know the cost of software, from fixes to extensions, multiplies in the maintenance phase. Many organizations are stuck on huge piles of software too complicated to change. Better not to have huge piles, then, unless one absolutely has to.


FTA:

> Gall’s Fundamental Theorem of Systems is that new systems mean new problems. I think the same can safely be said of code—more code, more problems. Do it without a new system if you can.

I agree with the basic idea here, yes. But I would have to disagree strongly with the last part.

The problem with "Do it without a new system if you can" is that we need judgment to decide when to make a new system and when not to. Sometimes just writing a small piece from scratch, instead of using (and repurposing!) a poorly matched prior system, is a hell of a lot simpler.


Right, sometimes a new system happens to be the best solution.


This is a great post. I think the "paid to write code" mindset is often a byproduct of environment - whether it's poor engineering leadership, bad productivity measurement, etc.


No, but job adverts are written asking for people to write code, and candidates often take tests to show they can write code.

Resumes and interviews seem to be heavily focused on whether you can write code. And there will be lots of talk about how much code you have written in the past.

Which I guess is what the article is really railing against - but what it comes down to is that in theory you're not paid to write code; in practice you are.


Agreed. I have always felt that technical people approach their jobs from the wrong side. They cram every technical acronym onto the CV in the hope that something will stick. They don't understand business. Business wants value. The CV should concentrate on the value created in your past. If it doesn't get past the recruiters, then the job wasn't meant for you anyway. Most techies are programmers who happen to make money at coding, whereas you should be a businessperson who programs; the difference in attitudes is night and day.

Some time ago, I went for an architect role at a telco which was the largest Progress shop in the southern hemisphere. The interviewer asked if I knew Progress and I said "No, but that is only knowledge" and talked about successful projects, and I later found out I was hired because of that sentence. But I knew 15 languages at the time, so what's another language? And sure enough, 3 months later I was reviewing Progress code. You have to convince them to hire you because of your perceived value. I have interviewed a lot, and it always cracks me up when I ask candidates if they know some technical thing and the job-seeker goes "No, but I've heard of it", meaning they are unwilling to admit that they don't know one of the millions of technical languages/platforms/etc. We aren't Leonardo da Vinci, who was probably the last person on Earth to know pretty much everything. It's OK to say no. So say no and explain why that doesn't matter.

Regarding the tools we use, I agree again. I have used the same DBMS for 15 years. I know it, I feel safe using it, I know what it likes and dislikes. The majority of my coding is in stored procedures in the DBMS, and I don't really care what the front-end is, whether it's C or some new-fangled hot platform. The lower-level your coding, the better. Find something that works at a low level and stick with it.


I'm paid to kill jobs. I meet with a group of internal customers to ask/watch/learn what they do. Then automate the business tasks so that one person can do the job of the entire group more effectively than before. Then provide pretty charts to the managers as they "downsize" their groups.


Meta question: do software developers typically enjoy these types of officious, bossy articles that state what is and what must / must not be done?

The tone of these types of posts always makes me chuckle. And I see similar articles on here often. When I read these, I feel like Dad is mad at me for being a bad programmer.


Nowadays it pays well to be a programmer (oh wait, developer; engineer; whatever title helps you rationalize), programming-qua-profession has exploded (it even draws the feminists... why aren't they into carpentry? draws governments, America Codes!), and articles are written with the assumption that everyone wants to be a good cog (oh, employee) and consider business needs above all, play nice with the team, and understand why company interests are always more important. If you don't do (think) this or that, well, you're just not being a professional! Why would BigImportantEntity, or anyone for that matter, hire you?

It's much easier for many people to come up with these articles than it is to write something technical.

I've been writing code since forever; addicted to it. I happen to be in a field that pays good money for my addiction. So far, despite the stupidity of the profession, things have turned out mostly OK. Why does someone pay me? Don't care. As long as the relationship is beneficial to me, it's OK. I feel I have more in common with starving artists in other fields than with some corporate official looking out for business interests. So I'd be a starving programmer in a parallel world where programming is not a hot profession. Now, if I had my own business, then I would have brought all these nontechnical issues upon myself. But then, I'd pay myself for whatever reason I want.


I've been a programming addict since my early teenage years. I like writing code. But I also like that code to do something fun or useful.

Then came the day I was supposed to enter the job market. So I looked at what companies - around me and worldwide - were doing, I read the stuff about "solving business problems" instead of "just writing code"... and I realized that if I really cared about this, I would be unemployable. Because most of the "business needs" in our industry are trivial bullshit stuff that's at best neutral (more often harmful) to society, and only exists to make money for some entrepreneur. I don't give a shit, and will never give a shit, about those problems. So - through many years of depression and burnout - I learned to ignore most of the business goals and focus on doing my part the best I can.

For example, right now I'm paid to develop an application that shouldn't really exist in the first place, and makes little sense to anyone - not to the developers, not to the people who will end up using it - except the management. But well, management decides. So if I'm making this application anyway, I'm focusing on making it as useful as possible. Management asks for features. I do the ones that make sense and politely suggest changes to the ones that are idiotic. They're happy. I'm paid. When they're not looking, I'm venting on Hacker News.

--

So yeah, to the people constantly writing about focusing on delivering business value: please tell me how to look for values worth delivering in the sea of moral bankruptcy that the current job market is.


Right on. Can we cut the bullshit about how a wage worker is supposed to care about the customer, the business, or any of that? Does the person working the drive-through at McDonald's care about the customer? Of course not.

Now, if I was a partner in a business with some equity stake, then I would mop the floors if that's what I felt would increase the value of my stake. But as a wage worker, when you tell me to mop the floor, I tell you "that's not my fucking job".

People act in their own self interest. Why is this so god damn difficult to understand? Fuck!


I think it's inherently a part of HN, where 'Founders' make up the majority - it's more about business than technology.


I take it as an opinion. He's trying to sell that opinion. What he's leaving out is:

1. Leaky abstractions.
2. Software bloat (using a massive framework to solve a simple problem).
3. Oops, I picked a lemon; now let's roll back all that glue code.
4. Code dependencies.
5. Upgrade hell.

It's a fad; it too shall pass, assuming programmers still remember how to program non-trivial things after the fad.


And people wonder why software quality is going downhill.

Of course a software engineer is paid to write code - if they're not, you probably want a different person.

Maybe we need a new job title: "Business Value Expander"

Requirements: Glue together everything else that's already been written by other people, and make the business rich.

What could possibly go wrong.


It's more likely going downhill because everyone is jumping on the latest hipster bandwagon. NoSQL is a perfect example: many people / projects would be far better off learning how to use a relational database efficiently rather than adding a new trendy tech to their CV while having limited experience with that new database (most of the projects I hear about are storing their data on a single server).


> And people wonder why software quality is going downhill.

Quite the opposite! Most of the poor quality is due to excessive complexity, little care for reliability, and the use of too many libraries and frameworks of questionable quality. All of this comes from the idea that a developer is paid to write code.

A lot of the strong engineers I met at Amazon and elsewhere would use the minimum amount of code and libraries required, and spend as much time reducing complexity and deleting other people's code as writing new code.


I'm not really sure what they are paying me for anymore.

I do analysis, planning, testing, development, client presentations and when I raise this concern with my TL I always get the 'cross-functional team' argument.


It seems that Software Engineer (or Analyst Programmer, etc.) is being lost as a profession and becoming a 'career'.

You just do everything you're asked to, in order that the business makes money - code? Meh, anyone can do that.

How many other professions suffer the need to be 'cross-functional'?


> Maybe we need a new job title: "Business Value Expander"

The "full stack developers" have got it covered


"Somebody else has had this problem."

(I'm gearing up to build a big distributed "fabric" and reading the SaltStack docs; it's a weird feeling of mixed happiness and disappointment as I discover all the problems they've already solved for me.)


Love this. Much of my client handling is explaining why they don't want what they thought they wanted; how simpler is easier for them and their users; how functionality driven by need is more powerful than grand all-inclusive visions; and how, as I am seeking a long-term relationship, we are here to adapt to change, not to implement a 'final solution'. More often than not they appreciate the honesty and I get the work, and when I do the work I'm not questioned on decisions as often, as the trust is there. It's a win-win scenario.


I was in the UK a few months ago for a meeting, and they had a plaque of sorts in the room with a list of company guidelines. One of them was 'Everybody sells'.

It seems obvious but I'd never seen it put so starkly.


My beep-boop sysadmin job is at least 50% internal public relations, in one form or another. Pretty much any job interacting with humans is, I think, which is basically all of them.


Yes, we are paid to write code. The final goal is to create a product and solve problems, yes. However, as a coder you should think about the code; you have to breathe the code; you need to feel the code. Quality of code should become your obsession. Succinct, nice, readable and reasonable code is more likely to lead to a better product. On the other hand, failing, shitty software is often a direct result of badly written code.


I'm pretty sure I am paid to write code that solves problems the business needs. What's this you're not paid to write code meta bullshit?


Yes code reuse is important, but most of our current abstractions suck. It's good to rewrite old junk so you never need to rewrite it again.


> a million other people have probably done the same thing

If this were true, then 99% of software wouldn't be such crap.


Of course they did! Especially in the web world. It's just that these million other solutions are all crap you can't reuse even if you wanted to, so you too have to do the same CRUD thing everyone else is doing, only in a slightly different way.


Now there are 1,000,001 crap solutions.


I think I have the opposite issue at my current job, as the person in charge wants to put every site, every db, every cron job, every filesystem on one server and then wonders why we have an IO problem...

One instance on AWS... sure hope our availability zone stays up next year.


To be fair, running everything on the same host is very efficient. It's just risky in terms of (lack of) redundancy and unpredictable loads.

When hardware was expensive the idea was to make everything as tiny and efficient as possible. Now that hardware is cheap the idea is to make everything as scalable as possible so as to take advantage of all that extra hardware you can now afford.

The next phase of this evolution is to go back to making everything tiny and efficient again as we reach the theoretical limits of horizontal scalability. A good way to "force" such practices is to make your developers use single (or just limited) hardware for everything.

I doubt your boss is coming from this angle but I've actually witnessed this put into practice by forcing a small development team to run their code on Raspberry Pis. It really brings home the idea that, "No, really: Your code is slow and inefficient."

It makes people re-think their architectures and often forces them to divide tasks up into smaller pieces that can be written efficiently. Also, nothing says, "ditch your ridiculously complicated and slow build process" than forcing it to run on embedded hardware.

If your build forces a multi-core embedded system with a gig of RAM to thrash and crash, you need to re-think things a bit =)


I agree with the insight, the one that says you are trying to solve problems, not write code. But I don't think we're just mixing the same 8 ingredients.


It would be nice if he provided a list of UNIX tools deemed essential knowledge for any programmer. Anyone care to provide a list of their own essentials?


I'm no expert but have always found the O'Reilly books helpful.

Also... well I'm not sure what HN's policy is on sources/linking but let's just say that you could search right now for "Classic Shell Scripting pdf" or "Learning the Bash Shell pdf" and find those fairly quickly online...


Bill Gates said measuring software progress by lines of code is like measuring progress on designing a new aircraft by weight.


"We’re not paid to write code, we’re paid to add value (or reduce cost) to the business"

I'm curious to know how many companies Brian has worked for as a software engineer. The majority of the places where I've worked just want you to code. They don't really care about adding value and just want the job done as quickly as possible so customers aren't upset (usually because the sales team promised a feature that should take 2 months in 2 weeks).

It's the main reason I quit and started my own company. I was tired of getting into battles with management over the well-being of their own code base, and I was tired of being forced to write terrible and hacky code.


> They don't really care about adding value

They also want you _doing_ something every hour of every day - and learning/studying/researching/prototyping/experimenting/refactoring/migrating doesn't count as _doing_ anything, because there's no "business driver" behind it. So you have to invent things to do because there's going to be a daily standup at 8 AM tomorrow morning where management engages in the daily "who can I fire because _I_ have to _do_ something, and firing somebody is something" ritual.



