Why Is There So Much Crap Software in the World (rajivprab.com)
45 points by whack 11 months ago | 26 comments



I think it has less to do with the quality of developers and more to do with perverse incentive structures surrounding commercial software design. A piece of software that optimally solves a problem for the user is rarely optimal for generating revenue. Hence the common practice of companies breaking and sabotaging their own products to get the upsell while slowly turning them into spyware. The fundamental problem is engineers being forced to operate in bad faith, which interferes with them actually doing their jobs.

Obviously there is also bad free software out there, but it has more benign afflictions. Free software also tends to steadily improve with time, but commercial software is a roll of the dice with every update.


To me, the issue seems to be one shared with "hard" engineering and infrastructure - it's easier to build than to maintain. The demo is easy; the thing people count on for their own work is a Sisyphean monstrosity.

The difference is just one of speed and scale. The software starts paying for maintenance almost immediately. People see an open source codebase that is now "good enough" for their problem, and within months it becomes a popular dependency, and then a singularity. Construction that goes wrong, on the other hand, often has a narrative that plays out over decades. The featureset of a bridge or building grows little with time, and often slims down, and so while the maintenance can be physically costly, the building doesn't get remodelled and rewired each year to do a new thing, which happens to software all the time.

Software companies that can really embrace the "maintenance of a singularity" role usually last, even if they accrue some tremendous technical debt: see Microsoft, Oracle, Adobe for some examples. Companies that try to elude that outcome, on the other hand, have to have a business model of disposability: Apple tends to obsolete software when new hardware hits, as do most game companies. The companies that let the product rot are the common failure case, and the companies that keep the product small are themselves small.


Often the end user isn't even the one making the decision about buying/installing the software. (see: consumer OSes/firmware, "ed tech," most corporate software.)


Of course it has nothing to do with developer skills. Follow the money! Fixing bugs/selling upgrades makes money. Writing perfect software makes you bankrupt.


I would think that depends on the type of commercial software - if, for example, it's used in a large bank where faulty software could screw up a few million customer accounts - anyone who rolls the dice with that will soon be jobless. THAT software (application, OS, utilities, etc.) is usually VERY well tested.


Even there, you can make the software work and be secure, and still get away with loads of maddening practices and making software that just barely does what it does, waiting for the customer to fund a feature.

Is it a smart business move? Debatable (I'd argue overdelivering on a SaaS product also helps bring in new customers). From a moral standpoint, I find it incredibly difficult to justify penny pinching and filling hours with malpractices and bureaucracy.

And for what it is worth: personal experience tells me a lot of customers are quite accepting of faulty software, as long as the damage can be reverted and minimized. Even those in finance. It wouldn't be the first time I've seen people casually solve a bug that caused millions worth of damage.


Nice article, but from the viewpoint of an Ancient IBM Mainframer, I wonder how ANYONE can become really expert?? The mainframe ecosystem was created to solve one major problem - stability. Up to that time, upgrading one's computer system meant rewriting the application software - simply untenable for organizations with ever-growing libraries of critical applications. It also allowed the upgrading and addition of new software (application and operating system) and hardware with minimal impact on existing software. This relative stability allowed competent (but non-genius) folks to become real experts.

When I see the current hurricane of ever-new and ever-changing (evolving?!) software, the need to use an endless variety of differing products and technologies (including many libraries one has no control over), and the endless discussions about the subtle differences, advantages, and disadvantages of these competing / cooperating technologies... how can anyone become really expert? How many can even become the fabled "full-stack" developer??

To me this really does not seem fair to the developer, especially those in smaller shops, where the developer almost seems to have to perform as a "full-stack" developer, whether they are or not. Just my 2 cents - I hope my impressions are wrong, but the app dev world sure seems scary!


This article seems to be based on several assumptions that feel a bit off in my experience.

Splitting developers into three equal groups makes it look like it will take the same amount of time to get from noob to mediocre as it takes to get from mediocre to good. If I were to draw these lines, the first one would be after a few weeks. You don't need more time than that to write useful software. You maybe can't solve hard engineering problems yet, but it's enough to work in a team and be productive.

I'm not sure where to draw the second line, though. I don't know if there's a point where you simply stop causing stupid problems. You get better at solving problems and might shoot yourself in the foot less often, but to write bug-free software, you really need dedicated QA who will concentrate on that. I don't think it's a problem that can be solved by individuals, but only by a team.

And this leads me to the conclusion that software quality is mostly driven by the company and economics. If the project doesn't pay for solid quality assurance or has too much time pressure on developers, the software you get might be bad at least to some degree. Even good developers can't make up for that.

I'm not sure this makes any sense outside my current bubble, so please add your 5 cents.


> And this leads me to the conclusion that software quality is mostly driven by the company and economics. If the project doesn't pay for solid quality assurance or has too much time pressure on developers, the software you get might be bad at least to some degree. Even good developers can't make up for that.

I think this is absolutely the case. As an extreme example, an organization that uses formal methods is probably going to ship higher quality software because they've meticulously worked out design bugs before they've even written a line of code and, once the program is implemented, they've rigorously verified its correctness – the economics of this methodology, however, are not really feasible in most circumstances and so instead we opt for duct tape fixes and crunch time.
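
To make that contrast concrete: where a formal-methods shop would state a property and prove it, a much cheaper habit is to state the property and merely check it against lots of generated inputs. A minimal sketch of that mindset in Python, assuming the hypothesis library and an entirely made-up dedupe function (so an illustration of the idea, not anyone's actual methodology):

    # Lightweight stand-in for the formal-methods workflow above: instead of
    # proving the invariant, property-based testing checks it against many
    # generated inputs. Illustrative sketch only; the function is made up
    # and the `hypothesis` library is assumed to be installed.
    from hypothesis import given, strategies as st

    def dedupe_keep_order(items):
        """Remove duplicates while preserving first-seen order."""
        seen = set()
        return [x for x in items if not (x in seen or seen.add(x))]

    @given(st.lists(st.integers()))
    def test_dedupe_keeps_members_and_order(xs):
        result = dedupe_keep_order(xs)
        assert set(result) == set(xs)                    # nothing lost or invented
        assert len(result) == len(set(xs))               # no duplicates remain
        assert result == sorted(set(xs), key=xs.index)   # first-seen order preserved

    test_dedupe_keeps_members_and_order()  # hypothesis runs many random cases here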

The author also mentions the lack of gatekeeping for software engineering – this is often pointed to as one reason for bad software existing. I tend to agree, since "good code" and "bad code" that both do what they're supposed to under the common case can appear the same to the user and the manager who signs off on it (excluding bad UI). Point being, if someone can code just well enough to get by in their field, they will probably remain employed. This is a tricky problem to solve though, because so much of the practice of software engineering is qualitative (bordering on aesthetic, I would argue) and involves designing abstractions. While much of software engineering is analytical and must obey a certain set of axioms, higher-level systems and software design is really more akin to art and architecture than civil or mechanical engineering.


We have crap software because the (perceived) consequences of crap software are generally low: wasted time, rework, correctable errors. As the consequences of crap software become more serious, the tolerance for it goes down.

The reason there are so many terrible software developers is simple: the world tolerates the software they write.

This feature isn't limited to software. As Theodore Sturgeon put it, "90 percent of everything is crap".


Because we don't value software quality enough to prioritize quality over other factors.


> Such incompetence would never fly in other engineering disciplines. We would never put up with bridges that are as buggy as your average software system. So why is software engineering so bad? Why is there so much crap software out there?

Software is (generally) much cheaper than a bridge too. And a lot of buyers are especially aggressive in cost-cutting.

Having a proper profession for Software Engineers doesn't sound so bad.


> Having a proper profession

The author torpedoes his entire thesis with his footnote:

> I am certainly not suggesting that we gatekeep the software profession


Not gatekeeping. That would be preventing folks from coding by law. A ridiculous idea.

More like, enforcing certain standards. I.e. "When you hire from X, you can assume they know Y and are proficient with Z". It already happens with recruiting. When I see resumes from X I know what to expect.

If someone decides to engage in a race to the bottom and hires "engineers" from something other than X then, good luck. Maybe you found a diamond in the rough, but likely not. In that case don't whine that it's impossible to get nice software.


The reason we have so much crap software is not necessarily the developers. In my experience (10 years), it's almost never the case.

1. Bloated and useless functionalities. More functionalities make any software more complex. If these functionalities are not well thought out or lack logic, they're more difficult to implement.

2. Beliefs versus empirical proof. The big difference we have with other engineering fields: we deal with beliefs. The building of a plane relies heavily on the laws of physics; we rely on the laws of SOLID. The first is a scientific discipline, the second is a guy talking on a stage.

3. Ego-centered field. We all think we know better. We all think we have the truth ("Microservices are better!", "OOP is the best!") even if we have no idea about it.

This article is full of beliefs without any study to back them up, with a cool-looking curve which doesn't measure anything. The hero expert programmer who will save us all is a myth. We all struggle, we all make mistakes, and we build things together. The group synergy is more important (see re:Work from Google).


I think the problem is a little different. The problems I see are in design and the value steak holders can actually see.

If you are building a CRUD system for a small business, they are unlikely to pay enough to make it worth spending time on making the interface easy to use. Admin staff are cheap; developers are not.

Small business owners don't want to spend money on testing and continuous integration for apps when they don't have large amounts of capital around. Nor do they push for security audits when they see no immediate value.

They want the cheapest piece of software that will get the job done. Not until they feel the pain of cheap development will they start to think about reworking their software.

Yes there are some that can produce incredible software in a short amount of time but these are exceptional people. Those people should be demanding a higher rate.


I say this in a kind manner — it's spelled "stakeholder" and not "steak holder."


>> Yes there are some that can produce incredible software in a short amount of time but these are exceptional people. Those people should be demanding a higher rate.

Most hiring I've seen in Australia has a lot of resistance to scaling remuneration to output.


Writing perfect software that nobody wants to pay $ to upgrade/fix is a really bad business model. You want to create software that people want to pay $ for even if it isn’t perfect. It’s like the medical industry. They don’t want to invent medication that cures you, only medication that “manages” your symptoms and guarantees a long-term $ income from your sickness for years until you die.


Same with lawyers. They are billing by the hour. So it is in their interest to make contract negotiations as complicated and drawn out as possible, and the resulting contract vague enough for the client to require more lawyer hours in the future. If you are a lawyer and you help clients to quickly get high quality contracts finished, you will never become a partner.


Bridges are relatively easy to specify. There aren't so many variables and they aren't likely to change after the bridge is built.

Most software is underspecified (an unsolved problem) and thus in exchange for "hacking it together somehow" the vendor is allowed to disclaim any responsibility.


This focuses on the people, but the processes and the tools should even out performance and limit failure and are not doing that effectively. People's performance is often a reflection of their environment with many factors involved in any particular success or failure.


I think the situation can be described pretty accurately like this:

1. A developer's job is more complex than most, and takes a very long time to learn, because you have to become proficient in so many aspects of the job, e.g.:

- the technical stack, which is huge (languages, protocols, how hardware works, networking, etc.)

- plus technical tooling (IDEs, source control, platforms such as clouds, programming frameworks and so on and on)

- programming models (desktop UI, web UI, state management, API consumption, API creation, security models, and so on)

- teamwork (project management systems, soft skills, team dynamics, professional writing and so on)

- user level software (operating systems, office software, communication software, anything users use that you need to interact with)

- understanding your industry (how all the above changes and evolves over time)

- one or more business domains - to know what software to write and the nuances of how to create it to be truly useful

- all the other areas, such as high performance, or high quality, or design for testability, or how systems fit into companies' management frameworks, or designing a system for manageability over the span of its lifetime, or for easy extensibility, and so on and so on

2. Skill levels

- This is a hard ask; many people don't have the kind of mind to ever do this well, and will be mediocre forever

- Learning time: it tends to take 5-10 years for talented people to become proficient in ALL the above areas they need to combine to truly be a journeyman developer

- True mastery requires a commitment to learn so many areas at a deep level, and 10-20 years is probably realistic for smart people with the right mindset to truly reach the level of master across so many related disciplines

3. Skills triangle

- The industry has been growing so fast. There are ten or a hundred times more recently started developers than developers with decades of experience, because software development is growing by orders of magnitude as every company becomes an IT company, where in the past that was rare.

- People often move up and out. Of the rare developers with the capacity to become real masters, who then have the 10-20 years of experience to have time to learn it all deeply - many of them have left the field by that time. This is for various reasons such as: having made their fortune and moving to investing, getting sick of computing or the industry and wanting a change, moving across to management, not updating their skills to keep up with technology etc

- Quick napkin guesses to put metrics on these -- 1/10 of developers have more than 10 years of serious development experience (90% have less than 10 years, because of the fast-growing industry) -- 1/3 of developers have the capacity to really master it -- 1/3 of those remain in development instead of moving up and out

= on these completely made up but semi-realistically guessed figures, 1/10 * 1/3 * 1/3 works out to roughly one master developer for every 90 developers in the industry (spelled out in the snippet below)

So a true master developer is very, very, very rare.
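
To spell out the back-of-napkin arithmetic (same completely made up fractions as above):

    # Back-of-napkin estimate from above, spelled out. All three fractions
    # are guesses, not measured data.
    experienced = 1 / 10   # share with 10+ years of serious development experience
    capable     = 1 / 3    # of those, share with the capacity for real mastery
    stayed      = 1 / 3    # of those, share still doing hands-on development

    masters = experienced * capable * stayed
    print(f"{masters:.2%} of developers")                  # ~1.11%
    print(f"roughly 1 master per {1 / masters:.0f} devs")  # ~90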

~

As well as being very rare, there are other factors which mean their experience and guidance is not always utilised:

- management that often suffers from lack of experience technically

- perverse commercial incentives, as one of the other comments here mentioned

- fashion storms of technology, driven by the hype from a significant proportion of the hundred times as many less experienced developers filling the blogosphere and management mindshare with the latest thing they've re-discovered -- things that are already known, if they cared to learn more deeply instead of trying to 'thought lead'

- ageism and overemphasis on the 'new shiny' vs tried and true techniques

The last points might seem a bit bitter, but please stay on topic: they are mentioned because they are significant, real reasons why there is so much crap software in the world!


This expresses the problem perfectly and echoes everything I've seen in the industry at least. Great comment.


Everything seems to get crappier every year, not only software, due to economics I guess. Sure, there are also benefits, but the general trend seems to go towards crappier.


Because there are so many crap software developers



