It makes me think that we're training the wrong people in college by making CS a very difficult, math-heavy field, which often causes the more people-oriented students to drop out. Programming doesn't have to be any more math-heavy than building a house, yet we force undergrads to implement algorithms on paper? The amount of potential talent wasted by college is staggering.
I disagree. I think instead there is a category error being made: that CS is an appropriate degree (on its own) to become a software engineer. It's like suggesting a BS in Physics qualifies somebody to work as an engineer building a satellite. It doesn't, but that doesn't mean "physics is too math heavy." In fact, engineering a satellite requires almost as much basic mathematics education as a BS in physics requires (some exceptions might include the specialized mathematics required for upper-level theoretical physics concepts that may not apply at an engineering firm).
Completely agree here. I often find myself wondering why 'Software Engineering' isn't the degree required to be a Software Engineer and further doesn't really exist as a major, whereas 'Mechanical Engineering', 'Electrical Engineering', 'Civil Engineering', 'Chemical Engineering', etc. are the degrees associated with those professional titles. To your point, I don't think it's a simple matter of nomenclature (i.e. that CS and Software Engineering are synonymous). Not a CS major myself, but amongst my friends who did their undergrad in the US I don't think they had any classes that covered requirements gathering, putting together schedules, etc. Any CS majors here have a class/classes that covered those topics?
Because we don't yet understand software engineering. There has been insufficient empirical study of what yields maintainable/performant/what-have-you code, of what sorts of abstractions are "good" for maintenance/reuse/etc., of what's needed to reliably predict a program's resource requirements; and many software developers balk at design tools that restrict their style in order to detect certain errors as early as possible, like type systems.
Software development is not yet an engineering discipline for all of these reasons, and more.
Personally I think the high failure rate of software projects is mostly because people on both sides of the equation regard it as generally acceptable, and aren't willing to pay what it would cost to bring software development in line with a traditional engineering discipline, where failure is typically worth guarding against, even if it drives costs up significantly.
Besides, software hasn't automated any other engineering discipline either, and those are much more straightforward because they're more mature and their principles are understood.
Your comment is tantamount to claiming an engineer ought to have a strong understanding of theoretical physics fundamentals in order to adapt his career over the span of decades. The reality is substantially different.
My CS curriculum had some Software Engineering classes (I believe they were actually called Software Engineering I and II) where we did requirements gathering, estimation, etc.; it felt like all of SE II was requirements gathering and documentation. Beyond those, though, it was very theoretical, and since I was already working in the field while going to school, I realized that the theories would be helpful, but not nearly as helpful as being able to plan, document, and manage tasks in the context of a project.
If CS is producing developers, MIS is definitely producing technical analysts. It was extremely light on any development (there were a few classes where people had to write code, and it was as if every other student were being asked to build a rocket to go to Mars). But it did a great job of teaching how to identify problems and apply technology-based solutions to them: not just fixing things with code, but organizational-level solutions.
Armed with MIS and a minor CS (and one class short of a math minor) I hit the ground running and was quickly leading projects, then dev teams. Six years in I was a Director and was also leading a skunkworks Innovation Lab where I got to keep my hands dirty.
The absolute fundamental difference is that I got the hard-core tech depth from so much time in CS (and from actually using the tech in practice), but I got the study I needed for its useful application in a business environment: selling to leadership, planning, source control processes, etc.
Most developers show up with the mindset of a craftsman toiling away in their shop, wanting only to emerge with their beautiful creation when it's ready. Pragmatism, and being able to accept trade-offs because of time/money/whatever, will always make a developer stand out.
We did this as part of our open ended senior project.
The theory heavy classes (automata, algorithms etc...) covered material that changes very slowly.
But, as a previous poster said "we don't yet understand software engineering." Because of this software engineering best practices change with the wind. As a result, the software engineering class was a hopelessly outdated survey of how software was built in the early 90s when our professor last worked in industry.
The idea of iterative software development, what we now call "agile", was first documented in 1957.
The Mythical Man Month was published in 1975 and is just as relevant today as it was when it was written.
Code Complete came out in 1993 and is still relevant.
Even when you start talking about programming languages, both Java and C# have been popular in the enterprise since 2005.
What most firms are interested in are software engineers. The dearth of software engineers has led them to hire computer scientists (and, in fact, a rather large percentage of the computer scientists I know are quite happy about this).
But if you proceed down that road, I'm not sure how much value-add is left over for the pure-play "computer scientist" who doesn't want to deal with requirements or schedules or that other messy stuff. Once you have the problem suitably defined and the requirements nailed down... you can offshore the hell out of that job, or farm it out to some bidding-war site, and let people chisel each other out of a living wage while you laugh all the way to the bank.
I wouldn't want to hang my career on that. If you're willing to deal with the messy human-factors stuff, and sit in on the requirements-gathering meetings and deal with the client and work on setting the schedule and doing the estimation with the BD guys and all of that other shit ... you're never going to have to worry about some dude in India taking your job. That's not to say you don't need the technical skill. But if you're gunning for a job and it's between two people and one is a little better on the technical side but the other person can get involved in the process that much earlier, maybe iterate on the problem as part of requirements development (as many modern methodologies basically require)... I think they're going to do a lot better.
If you're a "throw it over the wall" coder, well, good luck with that. There are certainly jobs around for pure coders, but I tend to be suspicious of exactly how many (particularly if you want to make a top-end First World salary), and I suspect strongly that supply is going to outstrip demand for a generation or two.
FWIW, some of the most highly-compensated people I know are the ones who have some combination of really impressive technical skills but also have the business understanding and can participate in the process from kickoff, or close to it. It's also easier for them to negotiate, since they have the option of going out on their own at pretty much any point, whereas a pure-play coder essentially depends on having someone around to feed them requirements and technical problems to solve.
The good thing is that I think a lot of people who think of themselves as pure coders probably have more business understanding (in some area) than they give themselves credit for; it's just a matter of developing that understanding, which can be hard in an organization that intentionally tries to keep its developers away from "the business".
Just to avoid the inevitable pedantry: Obviously there are people who are serious about this type of thing, but in the mainstream and in practice "software engineering" is a complete joke.
 E.g. Greg Wilson. WATCH HIS TALKS. Seriously. He's amazing at exposing how absurdly irrational we are when it comes to education and development in general.
I'm not sure why we would expect there to be. Programs, by Turing completeness, are absurdly non-linear and unpredictable. I'm not sure engineering in such circumstances is even possible, never mind practical. See also https://en.wikipedia.org/wiki/Busy_beaver (specifically the non-computability results) to have your mind blown. Engineering is ultimately based on physics, which is "linear" in our everyday world (pedantry alert), but "computing" apparently doesn't quite submit to those parameters.
 Also, I'd like to add: CS doesn't even remotely prepare you for actually dealing with customers (which is probably the eventual fate of most CS students), but that just means that CS people might need supplemental courses in "requirements analysis"... but they should have had a bit of that when trying to game tests?!? I know that I did game "expectations" massively during my university experience.
Designing an assembly line to build a car vs designing a car.
The difference even with those disciplines is that with code, any specification sufficiently detailed to replicate the product is the product.
An architect can design a blueprint for a house and send it to 3 different builders and they will each build more or less the exact same house.
But if you write a software spec and send it to 3 different software teams, you will get 3 very different products. If you try to write a sufficiently detailed spec to avoid this problem, you'll just end up writing code.
Obviously, I'll elaborate if necessary, but really... do you actually need other people to tell you how to live? (Ding, another achievement realized. No, not really, I just thought it would be funny.)
EDIT: For a more comprehensive treatment see the film "Scott Pilgrim ..."
Really, this goes back to the purpose of an undergraduate degree. Some people think they should be job training programs while others think they should teach more fundamental skills / topics so students can learn what they need to know for a job.
In the end, CS is a technical degree about a technical topic. There may be an argument for a less technical CS degree, but CS without mathematical rigor is not CS.
Musicians generally are. But all the talented ones are drawn to more fun, lower paying careers.
Even developers with good interpersonal skills or business sense are discouraged from participating in more producty discussion. They may be encouraged by way of somebody taking them aside early in their career and saying "Hey kid, you really _get it_. You're not like the rest of these nerds. How about you start calling shots on what to build and switch to a product role?" But they'll rarely be encouraged to stay as technical while simply getting listened to more by management. Management may argue that after 2 or 3 years of getting one's hands dirty, you understand programming as well as you need to, and that for the rest of your career, persuasion trumps skill acquisition. It's somewhat taboo to cultivate both skill sets at the same time. So, I think even if you got more "people people" to study CS/whatever, a reverence for specialization will silence a lot of voices.
However, I've been having doubts about how long I should still stay there, because of this 'digital nomad' lifestyle which seems to be so popular now (and which appeals to me also). So, that article was a little relief for me, seeing that there are indeed still people who value a willingness to immerse oneself deep into a topic and come up with novel and simple solutions to problems your users face in that field (something which I've liked about software engineering from the start).
Also, to add to your point about the CS courses: I think the math-heavy courses, although they really may be overdoing it in a lot of CS courses, can at least help sharpen your analytical thinking skills. This can help tremendously when you're thrown into something new where the most important thing you need to do is figure out what the problem actually is and what people need from you to solve it.
But I don't see anyone suggesting that we remove math from engineering curriculums because the design programs can do the math for them.
The solution is to train more experts in human computer interaction, the same way we train architects and interior designers to work with civil engineers.
If that's true, and I've never seen any actual evidence to believe that it is, my guess is that it's because bootcamp grads are older than CS grads in general.
They're usually career switchers and they have experience from that previous career. You really shouldn't be comparing a 35 year old ex-teacher, or a 28 year old law school grad to a 22 year old CS grad on the soft skills front.
I always say the customer/client does not have requirements, they have problems. You will not discover all the requirements until you start solving some of the problems and providing solutions. Only then will they say "oh but...." and drop more requirements on you that they didn't think of up front.
Back to that quote. It's not that software development is exploratory in itself. It's that the development is intertwined with an exploration of the problem being solved.
I think one of the important qualities of an architect is to anticipate what these requirements are going to be and define solutions to them ahead of time.
I have this conversation all the time with our client-facing team.
Me: "What is supposed to happen if this data changes?"
Colleague: "Well the customer didn't give us a requirement for that so we don't have to worry about it"
Me: Screams inside
> the software development process is exploratory by nature
>> customer/client does not have requirements, they have PROBLEMS. You will NOT DISCOVER all the requirements until you start SOLVING SOME of the problems and providing solutions.
> "The most valuable asset in the software industry is the synthesis of programming skill and deep context in the business problem domain, in one skull."
> But If Someone Else Knows the Business, Why Can't They Just Give Me a Spec?
> The Unmapped Road
>Miles and miles of a software project are spent roving this vast territory that is not exactly computer related, nor exactly business related. Rather, it is a new and emergent problem space that explodes into existence while bridging the middle distance between the business problem and the technology.
> In reality, this vast field of micro-problems was there all along. But the only way to discover them was to encounter them one at a time, like boulders in the path of a road-laying crew
> What is Deep Context?
> Deep context is the state of having achieved a kind of mental fluency in some large percentage of this immense field of micro-problems that appears in the space between technology and a business domain.
And those problems change as the business adapts to changing market forces.
We started with a simple problem that plagues HR departments in every conceivable industry with unions: finding substitute personnel. We erroneously assumed that it was a simple fix. Over the past year and a half we have accumulated a great deal of knowledge after interacting with as many people as possible, and have finally released a version that meets our original criteria (and much more). It was obviously not a simple fix.
If I have one thing to tell anyone who is looking for business ideas to try out their new programming skills on, I strongly suggest taking the time to learn as much as possible about the people to whom you want to provide a solution, then recruiting one of them to help you build it, lest you become another project that solves a non-issue beautifully.
We built systems that literally had people say, "Oh, thank God!" when demoed. I haven't seen any other development methodology that matches it. You really have to understand a problem at a deep level in order to reason well about it.
We would have spent at least 10 times longer trying to get these insights otherwise, if we would even have arrived at the same conclusions at all.
It takes a little time to get over the fact that you are no longer building this product for yourself (unless you are building dev tools), but seeing customers use your product happily and telling you how much they value it is well worth the investment.
So instead of a quick 30-minute conversation with helpful whiteboard doodles, I have to compose my questions and decision tree into an e-mail and wait at least a day or two for a reply that may or may not be useful.
They would email each other even though they were in the same office more to document everything and CYA than for communications.
On the dev side, I banned internal email for discussions. I told them that they must talk to people first if they are in the office and then send emails of documents, screen shots or whatever.
I also took it on as personal mission to never say "no" to any request and always talk to the person who was making the request - in person - see if we could come up with an acceptable compromise and tell them how it would affect their deliverable.
Later that remote person left the company, and the position was left unfilled for a year and some projects got mothballed.
In the consulting world, we call this job "enterprise architecture". It does, in fact, pay very well: it requires someone with both a sharp business mind and comprehensive technical skills, and those are very difficult to find in one person. I personally am more of a "jack of all trades" type; but you can be a successful architect by focusing on specific technologies as well.
I honestly find that it's easier to take someone who's a hacker type, and teach them the business. You look at the business itself as a large, complex system and model your application development around that. But you also have to be a good enough technologist yourself so that you can tell your dev team when their designs don't match up to the business problem (this is a common problem when requirements are not clearly communicated).
A good architect is the person who understands both the business context and the technology implementation. You don't have to be in-the-weeds building the product, but often you do have to build quick POCs to prove out an approach before handing off the designs to development - so being able to code is a necessity IMO.
Put like this, it's hard to agree with the above statement.
Following the spirit of the article, I assume the author means that the problem domain is pretty stable. But I've been in this for more than 20 years, and I know that requirements always change. Not only our understanding, but also the customer/user's understanding of their needs and priorities change.
And not only does understanding change, but the business requirements actually change too depending on market conditions.
The business domain constrains the field over which the requirements can change, and the sort of deep context which the author mentions will also range over that field.
But they might start with wanting billing software, and then change their minds and ask for software to do triage and scheduling operating rooms. That's almost as drastic a change in scope as your big-r from health to lumber mills, so I don't really agree with the distinction you're trying to make.
- intuitiveness i.e. how easy is it to communicate with this person (language fluency, etc...)
- quality i.e. how well does this person understand not only the requirements but also the actual goals
- 'latency' i.e. how convenient and how fast can you communicate with this person (time zone, can you both see facial expressions, hear changes in voice, etc...)
* intuitiveness i.e. how easy is it to communicate with this person (language fluency, etc...)
Native English speakers have an advantage here.
* quality i.e. how well does this person understand not only the requirements but also the actual goals
Experience, empathy, critical thinking, intelligence. Not necessarily common or easy but on site vs remote doesn't affect this really.
* 'latency' i.e. how convenient and how fast can you communicate with this person (time zone,
Hire people from your country or even your time zone.
* can you both see facial expressions, hear changes in voice, etc...)
Use video chat constantly.
Remote work is a skill like any other. It makes sense that most employers who offer it require 5+ years of doing it previously. The article author makes a good point: a great way to get this is to work at a company with many remote employees and start on site before transitioning to full-time remote.
Maybe some people are just cut out for remote too. I remember at the beginning of my career running a business where I talked to the CEO of a mid sized company regularly about his needs, and always delivered. He was thrilled and amazed. It was just good listening, communication, programming skill and hard, applied work. Nothing fancy.
The weird thing about the Bay Area is if you want to live on 5 acres in a home built in the last five years somewhere quiet and pretty that is 20 minutes from the office in traffic, you're basically looking at Woodside. On the low end, those houses start at around 3 million. Good luck paying that mortgage on the income of even two software engineers.
Whereas you could buy the same house somewhere else in California for 300k.
So even though it is indeed pleasant to have coworkers to talk to in person for social needs, the compensation to housing cost math just flat out doesn't come close to working unless you are willing to make some serious housing quality sacrifices.
Also, the experience may vary. In my admittedly not-so-long career (less than a decade) I have seen teams where business rules are the major source of complexity, and other teams with fewer business rules (for example, database/data warehouse/build systems teams). Admittedly, there are fewer teams of the second type in the world, so the general perception is that the hardest part is communication and understanding of the business context.
Coming to domain knowledge: even the mainframe and COBOL chaps make a lot of money, while smart open-source contributors who freelance don't. Money is not the only way to judge what the hardest problem in software development is.
> multiple if-elses encoding the business rules
A good architect would add a linter with a cyclomatic complexity check that fails the build. There's always a way to avoid these hideous nested else-ifs. Code review is helpful here for training people in the abstractions, such as polymorphism, that avoid them. This is also the code with the highest value for unit tests.
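As a sketch of the refactoring that such review would push toward (the rule names and discount values here are invented for illustration, not from the thread), a chain of nested if/elif business rules can be replaced by a dispatch table, which keeps cyclomatic complexity flat no matter how many rules are added. A linter such as flake8 can then enforce a ceiling via its `--max-complexity` flag.

```python
# Hypothetical example: replacing a nested if/elif chain of
# business rules with a dispatch table of small functions.

def price_standard(amount):
    return amount

def price_member(amount):
    return amount * 0.9   # members get 10% off (invented rule)

def price_employee(amount):
    return amount * 0.7   # employees get 30% off (invented rule)

# One dictionary lookup replaces the whole branch chain;
# adding a rule means adding an entry, not another elif.
PRICING_RULES = {
    "standard": price_standard,
    "member": price_member,
    "employee": price_employee,
}

def price_for(customer_type, amount):
    try:
        return PRICING_RULES[customer_type](amount)
    except KeyError:
        raise ValueError(f"unknown customer type: {customer_type}")

print(price_for("member", 100))  # 90.0
```

Each rule function is now trivially unit-testable on its own, which is exactly where the comment above says the testing value is highest.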
That, ironically, is part of the authors' point: software engineering hasn't changed that much. They were saying that in the late 70s, looking back on the past two decades (all the way back to the late 50s, when software was typically written in assembler or machine language and had to be rewritten if you bought a new computer!), but it hasn't become much less true. The technology changes, sure, but the failure of software development projects has never been mostly the result of technology. It's always been on the human side -- failure to understand the requirements, failure to meet them, failure to estimate the effort appropriately, failure to work with the customer, failure of the customer to understand what they were paying for, etc. There's a long list of things that go wrong, and I suspect anyone who has been in software for a while has seen many of them.
It was eye-opening for me, the first time I read it, to realize that people had been dealing with the same issues I was dealing with, for longer than I'd been alive. (And a bit depressing, too, that we seemingly haven't gotten much better.) Languages and project-management methodologies come and go, and the tech skills and understanding are certainly necessary, but they are not sufficient. The business knowledge and human factors seem to be the difference, or at least the largest controllable variable that leads to a difference, in a successful or failed outcome.
Ha! We need a "tribal leadership" discussion forking off this point.
Rather than just a "wisdom dump" it becomes more of a story with a purpose.
Case in point: I've been working remotely for 2 years now, and I've gathered as much domain knowledge (if not more) as I had in previous roles over the same timespan.
What's important is to get your software out there, so you can map the field.
Solving well defined algorithm puzzles has zero bearing on the skills described by the author.
- Here's a pool of knowledge about software development: hardware, operating systems, memory, disks, file formats, databases, networks, protocols, languages, debuggers, design patterns, security, accessibility, UI/UX, distributed systems, paradigms, typical algorithms & data structures, and CS problems
- There's a pool of knowledge about whatever industry you get into as a developer: user demands, existing workflows, existing infrastructure, previous decisions, legal regulation & compliance, physical laws, profitability, and practical limits.
Your software development skills should reach a point where you don't write "Bad Code": anything that's plainly wrong, like loading an entire database table into memory when you can read individual rows, storing passwords in cleartext, or doing nothing for accessibility (this is not about design patterns or space/tab debates). These mistakes have been made hundreds of times by new and 'experienced' people alike.
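To make the table-loading point concrete (a minimal sketch using Python's stdlib sqlite3 and an invented `users` table): iterating the cursor streams rows one at a time, whereas `fetchall()` materializes every row in memory at once.

```python
import sqlite3

# Set up a throwaway in-memory database with some rows.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
conn.executemany(
    "INSERT INTO users VALUES (?, ?)",
    [(i, f"user{i}") for i in range(1000)],
)

# Bad: materializes all 1000 rows in memory at once.
# all_rows = conn.execute("SELECT * FROM users").fetchall()

# Better: iterate the cursor so rows are fetched as needed.
count = 0
for row in conn.execute("SELECT * FROM users"):
    count += 1

print(count)  # 1000
```

With a thousand rows the difference is invisible; with a hundred million it is the difference between a working batch job and an out-of-memory crash.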
It takes time to get to this point. More time than anyone likes to admit because the pool of knowledge grows and shrinks daily, but has undoubtedly had a net expansion since computers were a thing.
It takes time to get deep knowledge about whatever industry you get into. This is different for every industry. There's a practical minimum that you need to work on solutions or do maintenance on software within this industry. This is to avoid "Bad Code" which will hurt you, other people, or your business.
You can gain industry knowledge by just being given problems and being shown. This is probably how most of us know our industries from the get-go. A minority of us came from those industries and transitioned to programming later, so we already had a base level of knowledge of our problems.
If I've got the definition of Deep Context right from this article, it means to get to that point, you have to spend a good amount of time within the industry. It's not something you can gain completely by reading out of a book.
If you're to gain deep context within an industry, you have to devote some time away from software. You can't do both at the same instant (but certainly within a day). When you study an industry, there's an opportunity cost to not learning something new about software and vice versa.
When you add more requirements to a single job, it increases the time we have to spend before we're employable. Not every industry changes as fast as software does, but some certainly do, possibly catalyzed by software.
If you increase the time requirements, it's going to reduce the available pool of engineers as long as all of the engineers are honest and don't apply for jobs or remote contracts until they're ready.
If you don't want the time requirements to increase, you have to pay the opportunity cost from one of your pools of knowledge.
So really, we need a much better "good enough" for employing developers and for career development, including teaching software and industry knowledge, because eventually the time requirements are going to become steeper and steeper. They can't go up forever.
How about instead of becoming the expert at whichever business domain the software is for, we become experts at helping business domain experts find and express the business rules that need to be implemented?
You can fit a decent amount of that skill and the technical knowledge you mentioned in one skull, and it still lets you be quite effective in more than one domain.
That's what I'm going for anyway. Wish me luck.
It's really an argument for who's going to spend the time and appears simple:
- Devs learning software and industry knowledge
- Business experts learning software knowledge (incl. technical writing) and industry knowledge
Product Managers with some technical knowledge and writing skills are best at being a middle layer between raw customer requests and development specs in my experience. PMs and customers struggle when they don't have a good vocabulary to use to describe features that they want. That's when a dev has to translate or teach the person. Then again, that's asking a PM to learn industry knowledge and technical knowledge and product management knowledge. This is especially true if you have a good QA pipeline.
I've seen analysts and PMs that didn't have a good UI/UX vocabulary or weren't exposed to different UI/UX's, and usually their requests were the most vague and resulted in the most unspoken details.
I've also had PMs that knew how to write a good technical spec down to quick UI mock-ups and error handling. They also had technical writers to pose questions about some of the details.
Pretending I could be as good as the latter is foolish, and if I could, my salary should have been a combined one for doing 3 jobs well. I think one-man-army, $250k/yr full-time positions are rare, though. We seem to be inching closer to that, maybe without the salary.
I'm working with an intern who's implementing a "proof-of-concept" for voice control using both Alexa and API.AI (Google). He's a CS freshman. It's amazing what he's gotten working so far, but his code is unreadable noise. Like he never learned what a subroutine is for.
But he's giving a demo to the CEO today, and said CEO will undoubtedly say "Nice, finish it and let's ship!" when it's probably not even usable code in any way. The hard part (connecting thousands of users through a db and thousands of persistent cloud connections from individual IoT devices) hasn't even been sketched yet.
So he looks like the hero (with a demo that does something amazing), and I'm going to look like the can't-get-it-done idiot because no one in the organization understands the complexity of going from that proof-of-concept to a working product, or even a next-level demo that uses actual connections from actual devices.
Of course, that's when you're supposed to quit, I guess.
Of course both viewpoints are valid, I'm just really feeling the commenter's point today because of my particular circumstance.
So you could read this essay as just an expanded version of the saying.
As the article suggests, though, getting there is not easy!