The Hard Thing About Software Development (linkedin.com)
197 points by bcl 8 months ago | 100 comments

I agree completely. The hard problems in coding are rarely thinking of a clever algorithm or solving a particularly nasty scaling problem. Usually I've struggled the most with understanding and building what the customer needs. A spec can be interpreted many different ways and will always have to be polished after it's completed. Someone who doesn't know what the customer expects will almost certainly create an implementation with these micro-problems that make the product miss its mark.

It makes me think that we're training the wrong people in college by making CS a very difficult, math-heavy field, which often causes the more human-skilled people to drop out. Programming doesn't have to be any more math-heavy than building a house, yet we force undergrads to implement algorithms on paper. The amount of potential talent wasted by college is staggering.

> It makes me think that we're training the wrong people in college by making CS a very difficult, math-heavy field, which often causes the more human-skilled people to drop out.

I disagree. I think instead there is a category error being made: that CS is an appropriate degree (on its own) to become a software engineer. It's like suggesting a BS in Physics qualifies somebody to work as an engineer building a satellite. It doesn't, but that doesn't mean "physics is too math heavy." In fact, engineering a satellite requires almost as much basic mathematics education as a BS in physics requires (some exceptions might include the specialized mathematics required for upper-level theoretical physics concepts that may not apply at an engineering firm).

> I think instead there is a category error being made: that CS is an appropriate degree (on its own) to become a software engineer.

Completely agree here. I often find myself wondering why 'Software Engineering' isn't the degree required to be a Software Engineer and further doesn't really exist as a major, whereas 'Mechanical Engineering', 'Electrical Engineering', 'Civil Engineering', 'Chemical Engineering', etc. are the degrees associated with those professional titles. To your point, I don't think it's a simple matter of nomenclature (i.e. that CS and Software Engineering are synonymous). Not a CS major myself, but amongst my friends who did their undergrad in the US I don't think they had any classes that covered requirements gathering, putting together schedules, etc. Any CS majors here have a class/classes that covered those topics?

> I often find myself wondering why 'Software Engineering' isn't the degree required to be a Software Engineer and further doesn't really exist as a major

Because we don't yet understand software engineering. There has been insufficient empirical study of what yields maintainable/performant/what-have-you code, of what sorts of abstractions are "good" for maintenance/reuse/etc., and of what's needed to reliably predict a program's resource requirements. And many software developers balk at design tools that restrict their style in order to detect certain errors as early as possible, like type systems.

Software development is not yet an engineering discipline for all of these reasons, and more.

We will never understand software engineering in that way, because we are in the business of automating ourselves out of work. As soon as we understand part of the process well enough that it could become an engineering discipline, we simply build some new tools and let the robots handle that part of the job, moving the humans along to wherever today's frontier of uncertainty happens to be. We will never be engineers in the traditional sense, because that would be a waste of human brainpower.

That seems to imply that there is a surplus of automatable jobs or job functions in "traditional" engineering disciplines, which is ... not really consistent with my experience. Those fields have had nearly as much computing power applied to them as software development, yet their projects tend to be much better defined and have a much greater success rate (would you hire a PE firm that had 40% of its bridges fall down?).

Personally I think the high failure rate of software projects is mostly because people on both sides of the equation regard it as generally acceptable, and aren't willing to pay what it would cost to bring software development in line with a traditional engineering discipline, where failure is typically worth guarding against, even if it drives costs up significantly.

You overestimate AI. Incompleteness is everywhere in CS. Overcoming these limitations is not trivial at all.

Besides, software hasn't automated away any other engineering discipline, and those are much more straightforward because they're more mature and their principles are understood.

I'm not talking about AI. I am making the claim that software engineering will never be "mature" in the sense we ascribe to other engineering disciplines, precisely because we will never completely understand what we are doing; once we do understand the principles involved, we build some new libraries or languages or other tools which automate that part of the process, and we move on to thinking about other things we don't understand yet.

+1 to this - not to make the cliché "AI is the future" statement, but careers span decades, and if the fundamental dynamics of writing software change mid-career you'll be glad you have a strong background in CS fundamentals rather than having taken "Java", "OO Programming", and "Scripting 101" classes.

A strong background in fundamentals is sufficient for a very small number of programming tasks. Do not mistake my top-level comment as an endorsement of the status quo focus on CS minutiae that reigns in this industry.

Your comment is tantamount to claiming an engineer ought to have a strong understanding of theoretical physics fundamentals in order to adapt his career over the span of decades. The reality is substantially different.

CS major in Australia from around 2001 here. Our final project was a group project involving requirements documents, meetings (including finding times that suited everyone), creating development schedules, milestones, and resource allocation in Microsoft Project, and meeting with the 'client' (in this case, our professor) to clarify requirements we weren't sure about. Then hacking together the product in Visual Basic. It was the closest we came to building commercial / end-user software, and the most memorable part of my degree.

At my uni (Oregon State) we had MIS (Management Information Systems), which covered a wide range of these topics. I switched to this from CS my junior year because I felt CS wasn't preparing me for the professional world, and I was right, to a degree.

My CS curriculum had some software engineering classes (I believe they were actually called Software Engineering I and II) where we did requirements gathering, estimation, etc. It felt like all of SE II was requirements gathering and documentation. Besides those, though, it was very theoretical, and since I was already working in the field while going to school, I realized that the theory would be helpful, but not nearly as helpful as being able to plan, document, and manage tasks in the context of a project.

If CS is producing developers, MIS is definitely producing technical analysts. It was extremely light on any development (there were a few classes where people had to write code, and it was as if every other student were being asked to build a rocket to go to Mars). But it did a great job of teaching how to identify problems and apply technology-based solutions to them - not just fixing things by writing code, but organizational-level solutions.

Armed with MIS and a minor CS (and one class short of a math minor) I hit the ground running and was quickly leading projects, then dev teams. Six years in I was a Director and was also leading a skunkworks Innovation Lab where I got to keep my hands dirty.

The absolute fundamental difference is that I got the hard-core tech depth from so much time in CS (and actually using the tech in practice), but I also got the study I needed for its useful application in a business environment: selling to leadership, planning, source control processes, etc.

Most developers show up with the mindset of a craftsman toiling away in their shop, wanting only to emerge with their beautiful creation when it's ready. Pragmatism and the ability to accept trade-offs because of time/money/whatever will always make developers stand out.

> Not a CS major myself, but amongst my friends who did their undergrad in the US I don't think they had any classes that covered requirements gathering, putting together schedules, etc. Any CS majors here have a class/classes that covered those topics?

We did this as part of our open ended senior project.

Interesting, maybe a more informative question for me would be - if you did have a class/classes that covered those topics, did you feel that they prepared you sufficiently for a job as a software engineer? But now I'm starting to sound like an alumni survey... :)

My degree required a class like this. It was the most useless class I took.

The theory heavy classes (automata, algorithms etc...) covered material that changes very slowly.

But, as a previous poster said, "we don't yet understand software engineering." Because of this, software engineering best practices change with the wind. As a result, the software engineering class was a hopelessly outdated survey of how software was built in the early '90s, when our professor last worked in industry.

Software Engineering doesn't change that rapidly.

The idea of iterative software development - what we now call "agile" - was first documented in 1957.

The Mythical Man Month was published in 1975 and is just as relevant today as it was when it was written.

Code Complete came out in 1993 and is still relevant.

Even when you start talking about programming languages, both Java and C# have been popular in the enterprise since 2005.

I took a class titled "Software Engineering" (or maybe "Software Development") that covered these things. I believe it was one of several upper-level electives I could have chosen, though, so I could have avoided taking it and still gotten my CS degree.

Requirements gathering, scheduling, and the like are but a part of computer science. They shouldn't be; it is a different discipline. This is what a systems analyst or a designer should be doing. Now, maybe in a small company a programmer has to wear multiple hats, and it might be a good idea to learn some of that. But strictly speaking, it isn't a computer scientist's job.

That may be true, but if that is the case and you want to define a "computer scientist's" job that way, not many people are actually interested in hiring computer scientists.

What most firms are interested in are software engineers. The dearth of software engineers has led them to hire computer scientists (and, in fact, a rather large percentage of computer scientists that I know are quite happy for this).

But if you proceed down that road, I am not sure what's left over, in terms of value-add, for the pure-play "computer scientist" who doesn't want to deal with requirements or schedules or that other messy stuff. Once you have the problem suitably defined and the requirements nailed down... you can offshore the hell out of that job, or farm it out to some bidding-war site, and let people chisel each other out of a living wage while you laugh all the way to the bank.

I wouldn't want to hang my career on that. If you're willing to deal with the messy human-factors stuff, and sit in on the requirements-gathering meetings and deal with the client and work on setting the schedule and doing the estimation with the BD guys and all of that other shit ... you're never going to have to worry about some dude in India taking your job. That's not to say you don't need the technical skill. But if you're gunning for a job and it's between two people and one is a little better on the technical side but the other person can get involved in the process that much earlier, maybe iterate on the problem as part of requirements development (as many modern methodologies basically require)... I think they're going to do a lot better.

If you're a "throw it over the wall" coder, well, good luck with that. There are certainly jobs around for pure coders, but I tend to be suspicious of exactly how many (particularly if you want to make a top-end First World salary), and I suspect strongly that supply is going to outstrip demand for a generation or two.

FWIW, some of the most highly compensated people I know are the ones who have some combination of really impressive technical skills and the business understanding to participate in the process from kickoff, or close to it. It's also easier for them to negotiate, since they have the option of going out on their own at pretty much any point, whereas a pure-play coder essentially depends on having someone around to feed them requirements and technical problems to solve.

The good thing is that I think a lot of people who think of themselves as pure coders probably have more business understanding (in some area) than they give themselves credit for; it's just a matter of developing that understanding, which can be hard in an organization that intentionally tries to keep its developers away from "the business".

"Software Engineering" does exist as a degree. Or it used to (15 years ago).

Indeed, it does exist, but IME SE education is just "CS lite". I'm not saying that to be disparaging for the sake of it[4], but there's nothing close to traditional engineering in the software development world[2]. It's basically business interests co-opting the term to try to seem more legit - and of course they established the equivalent of trade schools for CS in the form of SE programs. For those in the know, SE education is a complete joke. (Disclaimer: anecdotal, obviously, but I've been a university educator, been in a position to hire, etc.)

Just to avoid the inevitable pedantry: Obviously there are people who are serious about this type of thing[1], but in the mainstream and in practice "software engineering" is a complete joke.

[1] E.g. Greg Wilson. WATCH HIS TALKS. Seriously. He's amazing at exposing how absurdly irrational we are when it comes to education and development in general.

[2] I'm not sure why we would expect there to be. Programs, thanks to Turing completeness, are absurdly non-linear and unpredictable. I'm not sure engineering in such circumstances is even possible, never mind practical. See also [3] to have your mind blown. Engineering is ultimately based on physics, which is "linear" in our everyday world (pedantry alert), but "computing" apparently doesn't quite submit to those parameters.

[3] https://en.wikipedia.org/wiki/Busy_beaver (specifically non-computability)

[4] Also, I'd like to add: CS doesn't even remotely prepare you for actually dealing with customers (which is probably the eventual fate of most CS students), but that just means CS people might need supplemental courses in "requirements analysis"... though they should have picked up a bit of that while trying to game tests?!? I know that I gamed "expectations" massively during my university experience.

Software is less like mechanical engineering and more like industrial engineering or systems engineering.

Designing an assembly line to build a car vs designing a car.

The difference even with those disciplines is that with code, any specification sufficiently detailed to replicate the product is the product.

An architect can design a blueprint for a house and send it to 3 different builders and they will each build more or less the exact same house.

But if you write a software spec and send it to 3 different software teams, you will get 3 very different products. If you try to write a sufficiently detailed spec to avoid this problem, you'll just end up writing code.

This is not exactly a "response", but as you point out, I failed to point this out in my comment: There is little actual value in (what are you doing here? go outside).

Obviously, I'll elaborate if necessary, but really... do you actually need other people to tell you how to live? (Ding, another achievement realized. No, not really, I just thought it would be funny.)

EDIT: For a more comprehensive treatment see the film "Scott Pilgrim ..."

I'm with you. Even many engineering degrees are more theoretical than people need on the job.

Really, this goes back to the purpose of an undergraduate degree. Some people think they should be job training programs while others think they should teach more fundamental skills / topics so students can learn what they need to know for a job.

In the end, CS is a technical degree about a technical topic. There may be an argument for a less technical CS degree, but CS without mathematical rigor is not CS.

I agree 100%. The most useful "software engineering" course I took was in the school of business called "Systems Analysis."

It seems like your suggestion is a little backwards -- computer science is its own thing, and for some reason, people started thinking that computer scientists are ideal software developers. Since that clearly isn't the case, employers should look for some other certification that's more in line with what they require from an employee.

> for some reason, people started thinking that computer scientists are ideal software developers

Musicians generally are. But all the talented ones are drawn to more fun, lower paying careers.

I think not having enough breadth in "one skull" is largely created by psychology and structure at companies and less by curriculum deficiencies. When I used to manage developers, most of them told me that they were pained by how little they were expected to interface with non-developers. These were very technical people, but they still wanted variety and connection/relevance to the product.

Even developers with good interpersonal skills or business sense are discouraged from participating in more producty discussion. They may be encouraged by way of somebody taking them aside early in their career and saying "Hey kid, you really _get it_. You're not like the rest of these nerds. How about you start calling shots on what to build and switch to a product role?" But they'll rarely be encouraged to stay as technical while simply getting listened to more by management. Management may argue that after 2 or 3 years of getting one's hands dirty, you understand programming as well as you need to, and that for the rest of your career, persuasion trumps skill acquisition. It's somewhat taboo to cultivate both skill sets at the same time. So, I think even if you got more "people people" to study CS/whatever, a reverence for specialization will silence a lot of voices.

This article also struck a chord with me. About 3 years ago I started working half-time as a software developer at a civil engineering company. Originally they wanted to hire me because they had problems with some 3D visualizations in their software and I, coming from a game development background, could help them out. By now, I'm deeply immersed in geotechnical topics on a daily basis, and I think it's fair to say that I've become a valuable asset for them.

However, I've been having doubts about how long I should stay there, because of this 'digital nomad' lifestyle which seems to be so popular now (and which appeals to me too). So that article was a little relief for me, seeing that there are indeed still people who value a willingness to immerse oneself deeply in a topic and come up with novel, simple solutions to problems your users face in that field (something I've liked about software engineering from the start).

Also, to add to your point about the CS courses: I think the math-heavy courses, although they really may be overdoing it in a lot of CS courses, can at least help sharpen your analytical thinking skills. This can help tremendously when you're thrown into something new where the most important thing you need to do is figure out what the problem actually is and what people need from you to solve it.

Lovely to hear you have a position at the intersection of software and civils like this. I am a software dev that switched careers shortly after graduating with a civil & structural degree, always thought I'd be of most value combining the two at some point. There is hope!

Another civil & structural guy here. The ability to code does magic at times. Many design steps can be automated, and it need not be something huge. Once I automated the transfer of data from one piece of software to another using Excel VBA; the task used to take 2+ man-days done manually. Within a month almost the entire department was using this Excel sheet for that particular task. Fun times.
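The original automation was Excel VBA; as a rough illustration of the same idea in Python (the file names and column mapping below are made up for the example, not from the original story), here's a minimal sketch that reshapes one tool's CSV export into another tool's import format:

```python
import csv

# Hypothetical mapping: design tool's export columns -> analysis tool's import columns.
COLUMN_MAP = {
    "Member ID": "element",
    "Axial Force (kN)": "N_kN",
    "Moment (kNm)": "M_kNm",
}

def transfer(src_path, dst_path):
    """Read the export CSV, rename and reorder columns, write the import CSV."""
    with open(src_path, newline="") as src, open(dst_path, "w", newline="") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=list(COLUMN_MAP.values()))
        writer.writeheader()
        for row in reader:
            writer.writerow({new: row[old] for old, new in COLUMN_MAP.items()})
```

Even a glue script this small can replace a copy-paste workflow; the real win, as in the anecdote, is that the whole department can run it instead of redoing the manual transfer each time.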

Engineering requires every bit as much requirements gathering and need-finding. It probably requires even more up front, because physical artifacts are much harder to change.

But I don't see anyone suggesting that we remove math from engineering curriculums because the design programs can do the math for them.

The solution is to train more experts in human computer interaction, the same way we train architects and interior designers to work with civil engineers.

What's worse is that companies hire that way too. They don't care about your ability to communicate or to understand a problem from a business perspective. They just care about your ability to write some contrived algorithm on a whiteboard in 30 minutes without asking any clarifying questions.

And 9 times out of 10 the algorithm question is copied verbatim from HackerRank, TopCoder, LeetCode, etc., with no regard for the problem content. The only reason it gets used is "because Google used it once, so it must be good at finding 'top tech talent'".


The issue there is that CS != programming. The conflation of the two has created an artificial demand for CS-trained people (to jump through the stupid interview tests), but the vast majority of jobs need almost no academic knowledge. We're an industry starving for mechanics, bitching about the fact that mechanical engineers want to build better engines.

I think that's an argument for fewer people doing math-oriented CS courses as "training" for being a developer rather than changing the nature of CS courses.

Agreed, I think "Computer Science" is being incorrectly marketed as the program for aspiring software developers. It does not focus on the skills and knowledge needed to actually do the job, with the exception of some rather niche problems.

Yup, Sarah Mei had some good thoughts on this, in the context of bootcamp grads vs CS grads: https://twitter.com/sarahmei/status/862569071631491072

She basically just throws out the claim that bootcamp grads have better soft skills, with nothing to back it up.

If that's true - and I've never seen any actual evidence that it is - my guess is that it's because bootcamp grads are older than CS grads in general.

They're usually career switchers, and they have experience from that previous career. You really shouldn't compare a 35-year-old ex-teacher or a 28-year-old law school grad to a 22-year-old CS grad on the soft-skills front.

Favorite line: "... the software development process is exploratory by nature."

I always say the customer/client does not have requirements, they have problems. You will not discover all the requirements until you start solving some of the problems and providing solutions. Only then will they say "oh but...." and drop more requirements on you that they didn't think of up front.

Back to that quote. It's not that software development is exploratory in itself. It's that the development is intertwined with an exploration of the problem being solved.

> and drop more requirements on you that they didn't think of up front

I think one of the important qualities of an architect is to anticipate what these requirements are going to be and define solutions to them ahead of time.

I have this conversation all the time with our client-facing team.

Me: "What is supposed to happen if this data changes?"

Colleague: "Well the customer didn't give us a requirement for that so we don't have to worry about it"

Me: Screams inside

Great points from OP and above comments ..

> the software development process is exploratory by nature

>> customer/client does not have requirements, they have PROBLEMS. You will NOT DISCOVER all the requirements until you start SOLVING SOME of the problems and providing solutions.

> "The most valuable asset in the software industry is the synthesis of programming skill and deep context in the business problem domain, in one skull."

> But If Someone Else Knows the Business, Why Can't They Just Give Me a Spec?

> The Unmapped Road

> Miles and miles of a software project are spent roving this vast territory that is not exactly computer related, nor exactly business related. Rather, it is a new and emergent problem space that explodes into existence while bridging the middle distance between the business problem and the technology.

> In reality, this vast field of micro-problems was there all along. But the only way to discover them was to encounter them one at a time, like boulders in the path of a road-laying crew

> What is Deep Context?

> Deep context is the state of having achieved a kind of mental fluency in some large percentage of this immense field of micro-problems that appears in the space between technology and a business domain.

> I always say the customer/client does not have requirements, they have problems.

And those problems change as the business adapts to changing market forces.

As a lead developer building a platform dealing with the intricacies of union agreements and labor restrictions, this summarizes exactly the thought process that my team has gone through in the last year.

We started with a simple problem that plagues HR departments in every conceivable industry with unions - finding substitute personnel - and erroneously assumed that it would be a simple fix. Over the past year and a half we have accumulated a great deal of knowledge after interacting with as many people as possible, and we have finally released a version that meets our original criteria (and much more). It was obviously not a simple fix.

If I have one thing to tell anyone who is looking for business ideas to try out their new programming skills on, I strongly suggest taking the time to learn as much as possible about the people to whom you want to provide a solution, then recruiting one of them to help you build it, lest you become another project that solves a non-issue beautifully.

Yes. The most successful projects I have worked on, we (the dev team, not project managers, not designers, etc) went on-site often. We observed our customers. We had lunch with them. We stood behind them as they worked. We asked them questions. We were on very friendly terms with them.

We built systems that literally had people say, "Oh, thank God!" when demoed. I haven't seen any other development methodology that matches it. You really have to understand a problem at a deep level in order to reason well about it.

Absolutely! Being able to watch our customers use our application in production environments, observing how they used it and how they perceived certain features has been so enriching.

We would have spent at least 10 times longer trying to get these insights otherwise, if we would even have arrived at the same conclusions at all.

It takes a little time to get over the fact that you are no longer building this product for yourself (unless you are building dev tools), but seeing customers use your product happily and telling you how much they value it is well worth the investment.

I'd love it if that methodology were more common. The "traditional" method usually involves companies insisting we provide waterfall-style proposals, and they don't want to invest in an up-front "mini-project" of having people analyze the problem _first_.

And heaven-forbid any developer actually calls up a customer or subject-matter expert without routing absolutely everything through one or more layers of middlemen and bureaucracy.

"Just send me your questions and I'll pass them along..." Because no one has _ever_ had a question answered and had a follow-up question based on the answer before... /s

True story: myself and a bunch of internal customers are literally less than 50 meters apart in the same office. The blessed middleman is in a different timezone, 8 hours ahead.

So instead of a quick 30-minute conversation with helpful whiteboard doodles, I have to compose my questions and decision tree into an e-mail and wait at least a day or two for a reply that may or may not be useful.

I came into my company as a team lead for a very dysfunctional department: the developers were frustrated because of unclear requirements, and the business side was upset because they could never get their deliverables, despite "very clear" 100-page documentation specs and a "detailed" list of requirements.

They would email each other even though they were in the same office, more to document everything and CYA than for communication.

On the dev side, I banned internal email for discussions. I told them that they must talk to people first if they are in the office, and only then send emails with documents, screenshots, or whatever.

I also took it on as a personal mission never to say "no" to any request, and to always talk to the person making the request - in person - to see if we could come up with an acceptable compromise and to tell them how it would affect their deliverable.

Are there repercussions to just walking over and asking them then forwarding the middleman a summary / recap?

There was a "Hey, don't do that, X needs to be in the loop" complaint. (I didn't see it myself, but a technical program manager who was also in on the meetings told me.)

Later, that remote person left the company, the position was left unfilled for a year, and some projects got mothballed.

The book Talking to Humans shows this concept well - plus, the book is free.


Thank you for the suggestion, it was an excellent fast read, quite concise and insightful.

"Being able to program is not the problem. Understanding the problem is the problem" - one of my lecturers in University.

Agree 100%. In fact, this is how I built my career -- I knew the tech ok, but realized that I was never going to have the type of influence I needed unless I got into the product side as well.

In the consulting world, we call this job "enterprise architecture". It does, in fact, pay very well: it requires someone with both a sharp business mind and comprehensive technical skills, and those are very difficult to find in one person. I personally am more of a "jack of all trades" type; but you can be a successful architect by focusing on specific technologies as well.

I honestly find that it's easier to take someone who's a hacker type, and teach them the business. You look at the business itself as a large, complex system and model your application development around that. But you also have to be a good enough technologist yourself so that you can tell your dev team when their designs don't match up to the business problem (this is a common problem when requirements are not clearly communicated).

A good architect is the person who understands both the business context and the technology implementation. You don't have to be in-the-weeds building the product, but often you do have to build quick POCs to prove out an approach before handing off the designs to development - so being able to code is a necessity IMO.

> The nature of the beast is that software requirements rarely change

Put like this, it's hard to agree with the above statement.

Following the spirit of the article, I assume the author means that the problem domain is pretty stable. But I've been in this for more than 20 years, and I know that requirements always change. Not only our understanding, but also the customer/user's understanding of their needs and priorities change.

(Edit: typo)

> Not only our understanding, but also the customer/user's understanding of their needs and priorities change.

And not only does understanding change, but the business requirements actually change too depending on market conditions.

I think the notion of "requirements" the author refers to is what the business actually does. When writing say health-care management software for say, a hospital chain, they're not going to suddenly decide, no matter how much the small-r requirements change, that they instead want their software to be able to do capacity management for lumber mills.

The business domain constrains the field over which the requirements can change, and the sort of deep context which the author mentions will also range over that field.

> When writing say health-care management software for say, a hospital chain, they're not going to suddenly decide, no matter how much the small-r requirements change, that they instead want their software to be able to do capacity management for lumber mills.

But they might start with wanting billing software, and then change their minds and ask for software to do triage and scheduling operating rooms. That's almost as drastic a change in scope as your big-r from health to lumber mills, so I don't really agree with the distinction you're trying to make.

Yeah, programmers / developers / software engineers act as an interface between other people and computers. It's not surprising that these qualities of that 'interface' affect its price:

- intuitiveness, i.e. how easy is it to communicate with this person (language fluency, etc...)

- quality, i.e. how well does this person understand not only the requirements but also the actual goals

- 'latency' i.e. how convenient and how fast can you communicate with this person (time zone, can you both see facial expressions, hear changes in voice, etc...)

This is insightful. These are possible remotely, though:

* intuitiveness, i.e. how easy is it to communicate with this person (language fluency, etc...)

Native English speakers have an advantage here.

* quality, i.e. how well does this person understand not only the requirements but also the actual goals

Experience, empathy, critical thinking, intelligence. Not necessarily common or easy, but on-site vs. remote doesn't really affect this.

* 'latency' i.e. how convenient and how fast can you communicate with this person (time zone,

Hire people from your country or even your time zone.

* can you both see facial expressions, hear changes in voice, etc...)

Use video chat constantly.

Remote work is a skill like any other. It makes sense that most employers who offer it require 5+ years of prior remote experience. The article author makes a good point: a great way to get this experience is to work at a company with many remote employees and start on site before transitioning to full-time remote.

Maybe some people are just cut out for remote too. I remember at the beginning of my career running a business where I talked to the CEO of a mid sized company regularly about his needs, and always delivered. He was thrilled and amazed. It was just good listening, communication, programming skill and hard, applied work. Nothing fancy.

The weird thing about the Bay Area is if you want to live on 5 acres in a home built in the last five years somewhere quiet and pretty that is 20 minutes from the office in traffic, you're basically looking at Woodside. On the low end, those houses start at around 3 million. Good luck paying that mortgage on the income of even two software engineers.

Whereas you could buy the same house somewhere else in California for 300k.

So even though it is indeed pleasant to have coworkers to talk to in person for social needs, the compensation to housing cost math just flat out doesn't come close to working unless you are willing to make some serious housing quality sacrifices.

Edit: typo.

This is a post which touches on a subset of "hard" parts while downplaying other hard parts. I agree that a developer with deep domain knowledge is precious. In fact, I have seen quite a few developers rise up the hierarchy despite being mediocre on technical skills. The code they write is a big ball of mud, but they are good at communicating with stakeholders and understand the business well (and are also good at drawing boxes). This does not mean there is no cost to it; the cost is just hidden from others. The memory leaks, the bugs introduced by multiple if-elses encoding the business rules, the swallowed exceptions, the failure to set the right client timeout properties: these can all exist despite the "deep context", and most of the time junior developers are blamed for them because the so-called architect is busy all day in meetings with management.

Also, experience may vary. In my admittedly not-so-long career (less than a decade), I have seen teams where business rules are the major source of complexity, and other teams with fewer business rules (for example, database/data-warehouse/build-system teams). Admittedly there are fewer teams of the second type in the world, so the general perception is that the hardest part is communication and understanding of business context.

Coming to domain knowledge: even the mainframe and COBOL chaps make a lot of money while smart freelancing open-source contributors don't. Money is not the only way to judge what the hardest problem in software development is.

The way I see it, my job isn't to just blindly implement what the business thinks they want. It's a two-way thing. If a set of business rules is too complex and convoluted, I explain that this incurs a cost: an increased likelihood of bugs, and a slowdown of future development. We then think about how we can simplify the rules, or reduce complexity elsewhere. In one such case, we agreed with the business to remove an entire feature to make the new feature more robust and easier to develop. I deleted 1/3 of the codebase as a result, and the overall complexity of the system was substantially reduced once the new feature shipped. It was immensely satisfying.

This is where a good product team helps. Understanding the threshold at which technical debt would undermine a business feature. Glad that it worked out well in your case

Great points.

> multiple if-elses encoding the business rules

A good architect would add a linter with a cyclomatic complexity check that fails the build. There's always a way to avoid these hideous nested else-ifs. Code review helps here, to train people in the abstractions that avoid them, such as polymorphism. This is also the code with the highest value for unit tests.

Not saying rules encoded in if-elses are good, but they do put all the rules in one location. Polymorphism can spread those rules out in somewhat obscure ways. I use both, depending on what I want to do.

Right, I wasn't saying polymorphism was the only solution. I was saying deeply nested if elses are hard to maintain. There's many solutions to avoid them. So I think we agree, use what makes sense given the context.
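To make the trade-off concrete, here's a minimal sketch (in Python, with a made-up shipping-cost rule; the rule names and values are purely illustrative) of the same business logic written both ways: nested conditionals that keep everything in one place, versus a dispatch table where each rule is a small, independently testable function:

```python
# Nested conditionals: all rules in one place, but every new rule
# deepens the branching and raises cyclomatic complexity.
def shipping_cost_nested(region, weight):
    if region == "domestic":
        if weight <= 1:
            return 5
        else:
            return 10
    elif region == "international":
        if weight <= 1:
            return 20
        else:
            return 40
    else:
        raise ValueError(f"unknown region: {region}")


# Dispatch table: each rule is a small function; adding a region
# means adding one entry, not another elif branch.
def _domestic(weight):
    return 5 if weight <= 1 else 10


def _international(weight):
    return 20 if weight <= 1 else 40


RULES = {"domestic": _domestic, "international": _international}


def shipping_cost(region, weight):
    try:
        return RULES[region](weight)
    except KeyError:
        raise ValueError(f"unknown region: {region}")


# Both encodings agree on the same inputs.
assert shipping_cost("domestic", 2) == shipping_cost_nested("domestic", 2)
```

As an aside on the linter point above: flake8's bundled mccabe plugin can enforce this in CI via `flake8 --max-complexity=10`, failing the build when any function's cyclomatic complexity exceeds the threshold.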

If you found the article interesting or insightful, I would strongly encourage you to take a look at the book "Peopleware" by DeMarco and Lister. The first edition was published in 1987; the edition I have is from the early 90s, and I think there are newer versions since (at the usual astronomical cost of books that get sold as required reading for college courses), but whatever used copy is cheap will do. Most of it hasn't changed much.

That, ironically, is part of the authors' point: software engineering hasn't changed that much. They were saying that in the late 80s, looking back over the previous decades (all the way back to the late 50s, when software was typically written in assembler or machine language and had to be rewritten if you bought a new computer!), and it hasn't become much less true since. The technology changes, sure, but the failure of software development projects has never been mostly the result of technology. It's always been on the human side -- failure to understand the requirements, failure to meet them, failure to estimate the effort appropriately, failure to work with the customer, failure of the customer to understand what they were paying for, etc. There's a long list of things that go wrong, and I suspect anyone who has been in software for a while has seen many of them.

It was eye-opening for me, the first time I read it, to realize that people had been dealing with the same issues I was dealing with, for longer than I'd been alive. (And a bit depressing, too, that we seemingly haven't gotten much better.) Languages and project-management methodologies come and go, and the tech skills and understanding are certainly necessary, but they are not sufficient. The business knowledge and human factors seem to be the difference, or at least the largest controllable variable that leads to a difference, in a successful or failed outcome.

> Sadly, offices are perhaps the closest modern equivalent to a "village" that we have left.

Ha! We need a "tribal leadership" discussion forking off this point.

Why does one independent contractor make bank while another fights for pennies? Because it's not what you know, it's who you know.

The author could have just left the "Remote" part out. This piece is really about hiring independents for gigs vs. building and educating your own team.

I thought something quite similar, but upon further reflection the "remote" buildup and conclusion sets the narrative frame for the rest of the article.

Rather than just a "wisdom dump" it becomes more of a story with a purpose.

It does create the framing but I just don't see how it is relevant to the conclusions drawn. There are entire companies that are remote and are obviously able to communicate problem domain expertise and execute effectively.

Yes, the entire article is an attack on remote working and this suits Amazon well since they keep trying to get people to relocate to Seattle.

It is not an attack on remote working. It is an attack on the naive assumption that there is worthwhile value in being handed a stack of detailed requirements and implementing them with no back-and-forth.

I read it as: remote devs don't gain the domain knowledge.

Which is very wrong: You get domain knowledge by mapping out the field (as stated in TFA), which can happen regardless of where you physically sit.

Case in point: I've been working remotely for 2 years now, and I've gathered as much domain knowledge (if not more) as I had in previous roles over the same timespan.

What's important is to get your software out there, so you can map the field.

It is exactly this which makes interviewing so flawed in the field.

Solving well defined algorithm puzzles has zero bearing on the skills described by the author.

Once you are experienced enough, you'll realize two things: all programming languages let you solve problems, and programming in itself is not the problem. Applying it to solve business requirements is the true problem to solve.

This is an insanely useful article to read before a major project kick off. Kudos!

Here's the problem:

- There's a pool of knowledge about software development: hardware, operating systems, memory, disks, file formats, databases, networks, protocols, languages, debuggers, design patterns, security, accessibility, UI/UX, distributed systems, paradigms, typical algorithms & data structures, and CS problems

- There's a pool of knowledge about whatever industry you get into as a developer: user demands, existing workflows, existing infrastructure, previous decisions, legal regulation & compliance, physical laws, profitability, and practical limits.

Your software development skills should reach a point where you don't write "Bad Code" -- anything that's plainly wrong, like loading an entire database table into memory when you could read individual rows, storing passwords in cleartext, or doing nothing for accessibility (this is not about design patterns or space/tab debates). These mistakes have been made hundreds of times by new and 'experienced' people alike.
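The table-loading pitfall is easy to demonstrate. A small sketch using Python's stdlib sqlite3 (the `events` table and its columns are invented for illustration): `fetchall()` materializes every row at once, while iterating the cursor fetches rows incrementally.

```python
import sqlite3

# Build a throwaway in-memory table to query against.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER, payload TEXT)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [(i, "x" * 10) for i in range(1000)],
)

cur = conn.execute("SELECT id, payload FROM events")

# Anti-pattern: rows = cur.fetchall() pulls the whole result set
# into a Python list. For a large table that eats memory.
# Instead, iterate the cursor so rows are fetched as you go:
count = 0
for row in cur:  # rows are produced lazily by the cursor
    count += 1

print(count)  # 1000
```

The same principle applies with any database driver: prefer cursor iteration (or `fetchmany()` in batches) over `fetchall()` when the result set can be large.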

It takes time to get to this point. More time than anyone likes to admit because the pool of knowledge grows and shrinks daily, but has undoubtedly had a net expansion since computers were a thing.

It takes time to get deep knowledge about whatever industry you get into. This is different for every industry. There's a practical minimum that you need to work on solutions or do maintenance on software within this industry. This is to avoid "Bad Code" which will hurt you, other people, or your business.

You can gain industry knowledge by just being given problems and being shown. This is probably how most of us know our industries from the get-go. A minority of us came from those industries and transitioned to programming later, so we already had a base level of knowledge of our problems.

If I've got the definition of Deep Context right from this article, it means to get to that point, you have to spend a good amount of time within the industry. It's not something you can gain completely by reading out of a book.

If you're to gain deep context within an industry, you have to devote some time away from software. You can't do both at the same instant (but certainly within a day). When you study an industry, there's an opportunity cost to not learning something new about software and vice versa.

When you add more requirements to a single job, it increases the time we have to spend before we're employable. Not every industry changes as fast as software does, but some certainly do, possibly catalyzed by software.

If you increase the time requirements, it's going to reduce the available pool of engineers as long as all of the engineers are honest and don't apply for jobs or remote contracts until they're ready.

If you don't want the time requirements to increase, you have to pay the opportunity costs from one of your pools of knowledge.

So really, we need a much better "good enough" for employing developers and career development, including teaching software and industry knowledge. Because eventually the time requirements are going to become steeper and steeper. It can't go up forever.

There is hope.

How about instead of becoming the expert at whichever business domain the software is for, we become experts at helping business domain experts find and express the business rules that need to be implemented?

You can fit a decent amount of that skill and the technical knowledge you mentioned in one skull, and it still lets you be quite effective in more than one domain.

That's what I'm going for anyway. Wish me luck.

I think the deep context is what the author is using to solve that issue. Devs are fluent enough and have enough context that business domain experts don't have to spend an inordinate amount of time going into detail. The level of detail cascading happens in the developer's head as opposed to conversation.

It's really an argument about who's going to spend the time, and it appears simple:

- Devs learning software and industry knowledge

- Business experts learning software knowledge (incl. technical writing) and industry knowledge

Product Managers with some technical knowledge and writing skills are best as a middle layer between raw customer requests and development specs, in my experience. PMs and customers struggle when they don't have a good vocabulary for describing the features they want. That's when a dev has to translate or teach the person. Then again, that's asking a PM to learn industry knowledge and technical knowledge and product management knowledge. This is especially true if you have a good QA pipeline.

I've seen analysts and PMs that didn't have a good UI/UX vocabulary or weren't exposed to different UI/UX's, and usually their requests were the most vague and resulted in the most unspoken details.

I've also had PMs that knew how to write a good technical spec down to quick UI mock-ups and error handling. They also had technical writers to pose questions about some of the details.

Pretending I could be as good as the latter is foolish, and if I could, my salary should have been combined pay for doing 3 jobs well. I think one-man-army, $250k/yr full-time positions are rare, though we seem to be inching closer to them, maybe without the salary.

This article sounds a lot like a big argument for Domain Driven Design without naming it.

Is there a way to read this article without logging in?

The hardest thing about software development is having people in charge of you that know jack shit about software.

Man does that ring true on this Friday morning.

I'm working with an intern who's implementing a "proof-of-concept" for voice control using both Alexa and API.AI (Google). He's a CS freshman. It's amazing what he's gotten working so far, but his code is unreadable noise. Like he never learned what a subroutine is for.

But he's giving a demo to the CEO today, and said CEO will undoubtedly say "Nice, finish it and let's ship!" when it's probably not even usable code in any way. The hard part -- connecting thousands of users through a db and thousands of persistent cloud connections from individual IoT devices, hasn't even been sketched yet.

So he looks like the hero (with a demo that does something amazing), and I'm going to look like the can't-get-it-done idiot because no one in the organization understands the complexity of going from that proof-of-concept to a working product, or even a next-level demo that uses actual connections from actual devices.

Of course, that's when you're supposed to quit, I guess.

Eerily similar circumstances here: intern, alexa, demo this morning. First round is on me...

Funny, I've always thought the hardest thing about being an engineering manager is that by default all software engineers think they automatically know way more about everything than their management.

Maybe they just know way more about software development than the engineering manager if said engineering manager has never done actual software development.

Of course both viewpoints are valid, I'm just really feeling the commenter's point today because of my particular circumstance.

Vice versa: many developers don't know jack about the business they work for, and make little or no effort to.

I thought the hard thing in software development was naming things and off by one errors?

Naming things means understanding the problem. Preventing off by one errors means being fluent in the technical minutiae at the same time.

So you could read this essay as just an expanded version of the saying.

Close. It's cache invalidation and naming things. But I feel ya

No, it's "There are only two hard things in Computer Science: cache invalidation, naming things, and off by one errors."

To me, the article falls into an expanded understanding of "naming things". As in, once everybody is talking about the same things using the same terminology and referring to the same abstract concepts, etc, the job is straightforward (it may still be a lot of work, but it's straightforward).

As the article suggests, though, getting there is not easy!

To make the joke work, I could only pick one. I thought naming things was the funnier option.

The joke is "There are only two hard problems in software: Naming things, counting, and naming things."
