Hacker News
The shittiest project I ever worked on (2013) (plover.com)
732 points by luu on May 24, 2019 | 296 comments

Having followed HN and similar dev culture (Dilbert etc.) for some time now, I feel a growing contradiction that I cannot fully resolve.

On the one hand, big corporations are presented as incompetent behemoths locked in petty internal office politics: managers are clueless about what is being developed or should be, tons of money is wasted, they are wooed by empty marketing, etc.

On the other hand, these organizations are immensely wealthy and successful.

It may actually be that they are right, and for non-tech-centered companies this whole software stuff is really peanuts and their mental energy is better spent on other business-related stuff. So they will come up with random low-effort comments on all sorts of details because ultimately they (rightly) don't care. What actually matters is to negotiate deals like referral fees, or coming up with other contract ideas that will bring a lot of revenue in.

Or maybe it's just an equilibrium, a local optimum. You don't have to produce "good" stuff (in the developer's sense) without waste, precisely because the other company that you're targeting is also inefficient in similar ways.

I'm really not sure but I think it's important to look beyond just "haha, stupid managers, they can't make up their mind". There must be deeper reasons.

I've had this same thought, and have a pet theory about it. I think the answer is something like this: momentum is so unbelievably useful in business that having it is enough to make a company immensely wealthy and successful.

As someone who recently created yet another startup, I am reminded daily that literally everything I do has to be done for the first time. It's like molasses covering every part of the machine. When a client needs a proposal, you have to write one from scratch; when someone wants payment terms or to negotiate a contract adjustment, you have to figure out contract language. You don't have expense reimbursement forms, you don't have job descriptions, you don't have anything to use as a starting point or template for anything you do.

Big companies have all that. They have clients that are planning on using them again this year. They have meetings they took two years ago that are about to become a sale. They have a guy on the third floor who has already seen the problem they're about to have and knows how to avoid it. People instinctively trust them, so they don't have to explain themselves as much. People instinctively fear them, so their contracts get upheld. People that used to work for them now work at potential customers and call them up.

And on and on and on. Having a slightly busted software development process is, in fact, peanuts compared to all this.

Or, perhaps another way to say it is that there's a lot of different kinds of "software" besides the digital ones-and-zeroes kind we think about here. A group of humans all organized around a business purpose with tons of experience and momentum is a high functioning machine as well.

This theory, by the way, explains why efforts to turn around failing large companies are so rarely successful. Once that momentum consists of a bunch of bad decisions and the like, it is working against you. But up until that point, it is extremely powerful.

It's worth thinking about how companies become #1 in their respective industries.

A local restaurant becomes #1 by having the best food and/or location, and providing it at a price people are willing to pay.

A car dealership probably succeeds by having the best prices on cars and the best customer service.

Most banks compete on customer service and some amount of product differentiation.

Insurance is some combination of customer service, market share, and correctly pricing things.

To the extent that better software contributes to these core "capabilities", companies will invest in it. Investment means it gets management attention, willingness to spend, willingness to hire annoying top-shelf talent that demands flexible working arrangements and huge salaries.

Many -- maybe most -- businesses don't have competitive dynamics that software enhances. A hotel is a great example: it's location, price, service, and cost control. The software they use at the front desk has approximately zero bearing on whether the place will stay in business or fold. Owners are right to invest as little as possible in it and move on to something more important, like building a brand.

I think you're far too optimistic about why certain companies succeed and others don't. It's much more random than that. Why does the restaurant succeed? A rave review at the right time, a big regular corporate client, a good cook who works for less than she'd usually take because her mother lives right next door, ...

This idea of momentum reminds me of Jim Collins' Flywheel Effect.


I have to disagree with the broad statement that a busted software development process is peanuts. In highly regulated industries like finance, bad software can cost you billions in fines from regulators.

In these industries, they have to invest in technology and good processes if they want to stay in business.

High-tech is similar. Where I work there are literally 100s of insanely bright people with backgrounds in math, physics, etc. who struggle for years to get their ideas into results and (ultimately) products. Usually things get stuck on simple software tasks like connecting things together, driving them in a feedback loop, or just identifying that all the parts are already there, scattered around the company, and someone just needs to write a framework to put them together. I honestly think huge business opportunities are lost because of inefficient software development processes.

In a sense, this lends credence to the theory that once you're big and powerful and have lots of clients, you can afford to lose these opportunities. But I still think that you would have a huge competitive advantage without all the molasses.

> In highly regulated industries like finance, bad software can cost you billions in fines from regulators.

You're assuming the regulators are low-level enough to identify a flawed system. A lot of regulations are applied at a policy level (aka process level), and auditors (internal or external) are not always able to identify severe gaps between what's documented in a policy and the day-to-day reality. Good people and technology matter way more than process.

A good process is supposed to protect you from not having good people (or at least, from people who can't be good every day).

Sure, if we hire perfect people there's no reason to even have a process. In reality, when that breaks down you have to rely on your process to close those gaps. But, like you said, a policy/process is only as good as the paper it's printed on if nobody follows it.

Edit: I'm using security as an example of a regulated aspect of software/IT systems. I am aware there are others.

100% agree - My point was more that regulators tend to use policy and process artifacts as proof of security compliance. The idea that more highly regulated/audited IT sectors are inherently better at security is probably false. The audits just aren't stringent/detailed enough, and there aren't enough incentives for executives/boards to do more than the minimum. Which is why the values/opinions/ethics of those decision makers become key.

The most secure 'company' I worked for was a large research university, because security was a priority for the director and we were constantly getting probed by overseas IP addresses. On the other hand, the worst company I worked at in terms of security was in the payment processing world. We were PCI DSS compliant and audited almost continuously by various interested parties (banks, card brands, third parties, etc.). I don't think any of our people were bad/inept, but at the end of the day our policy artifacts made us sound/appear way more compliant than we were. The problem was upper management didn't care (wouldn't allow us time or $$) about closing the gaps in implementation unless we failed audits or pen testing. We didn't fail, but no one in IT thought we were doing enough. We didn't have enough auditing and regularly had inexplicable 'bugs' reported that honestly could have been evidence of network intrusion, but we had no proof either way.

It is incredibly important in those industries, BUT... for those same reasons it is also more change-averse. Most (granted, not all) of the time, using the same software that worked just OK last year is far safer than getting new software made.

The safest decision, 99% (but not 100%) of the time, is to cancel that software project before it goes to production, because if you've been successful up to now, by definition you have software that is Good Enough (if not good). So, in highly regulated industries, changing your software may very well be an unwise decision.

Not least because, if it's the same software the regulators were ok with last year, they probably won't fine you billions for using it this year.

Caveat to all of the above: "probably". But the chance of new software getting you into trouble is still higher.

Funny enough, I can think of very few businesses that went under due to fines from regulators.

Most businesses fail because they don't have customers, not because they're inefficient or badly-managed.

The key is right there in your statement: "have to". Good places to work don't build software because they "have to"; they do it because it's viewed as an important competitive differentiator and execs WANT to spend on it.

As a wise person once told me: go where you're appreciated, not where you're tolerated (there are obvious limits to this).

His recent podcast with someone whose name HN hates was unbelievably informative and inspiring. He also talks about the flywheel applied to his own life.

What is a "comparison company" in that article?

In that context, a comparison company would be a company that tries one big push to move the flywheel. That is not enough to build momentum; you need repeated effort.

I got more involved in the sales aspect recently, and I find the part about trust especially true. Company size and references have a disproportionate effect compared to the reality of a firm's reliability. A lot of big service companies fail to deliver, yet it doesn't seem to have an impact on future buyers. We've all heard the phrase "nobody ever got fired for buying <market leader>", but it only recently started to resonate with me, as I experience first-hand that when you are buying software in a bureaucratic environment you are incentivized not to buy the best solution, but to minimize your responsibility in case of failure.

Weirdly enough, one little-stated aspect of that dynamic is that as a client you can sue a major consulting or services firm for delivery failure and expect to get something out of it, whereas that option is rarely available for smaller firms that can’t absorb the cost of a single lawsuit.

Good analysis.

I think for software sales, this type of thing is slowly fading as software becomes more and more commoditized. For all but the largest of businesses, you go to AWS and pay the market price for compute and other services; there's not much need for contracts and negotiations and meetings, because the price is the price. Going to a competitor or doing business the old way simply costs much more. Eventually, businesses align on consuming commodities as commodities, and spending slowly moves from CapEx to OpEx.

AWS is the power company now. You agree to their terms, and you pay their bill. Unless you're doing a major industrial facility that requires megawatts of power, you don't need to interact with the power company's sales staff; the same applies to AWS (and ironically, you can probably use power as a proxy for relative deal size here, too).

100 years from now, all the software that could conceivably exist will exist, at least for 'business processes.' If your business is selling potatoes as a farmer to a distributor, the software you need to participate efficiently in that market will already exist.

The upper end of the market is still contracts all the way down. My company has around 250 employees and every single service we use, software and otherwise, is a negotiated old school contract.

The market price isn't what most companies pay. If you pay significant amounts to Amazon (maybe one employee's salary worth), you're screwing yourself if you don't negotiate lower rates.

Edit: Just asked the corporate overlords. It's actually against our finance rules to pay for any recurring service charge without a negotiated contract.

You're still paying the market price, just not the published one. Negotiating B2B deals is a little like ordering from the secret menu at In-N-Out. I've been doing tons of that lately, and it's quite the mechanical endeavor. The only really annoying part is that deals move only as quickly as the sales drones want them to. I have to sit through their "sales process" even if I know exactly what I want, I know what I can spend, and ultimately the deal will be done on my terms. It's funny to think that some guy got high-fived to death for selling me a contract at about 35% of the original quote.

He’s being high-fived because you could have got it for 10% of the original quote.

> It's actually against our finance rules to pay for any recurring service charge without a negotiated contract.

Your company probably also employs a professional shark whose job it is to squeeze the blood out of every stone he is pointed towards. With even a modest number of sizeable contracts, such a person would pay for themselves very easily.

At my old company we were paying millions for contracts and software we didn't use or really didn't need. One of the analysts volunteered to be the "undertaker" and renegotiate. After a year of screaming at people on the phone (I sat next to him), he managed to save us around a million dollars, which was around a 60% savings. He pocketed maybe 30% of the money saved that year. Not bad for a year of being a dick.

Yup, I know one of them; we have a couple of sales sharks who do it when they're not busy attacking potential customers.

Technically speaking, you might enter into a contract with the power company. The 'negotiation' probably goes like this:

Power Company: "You'll pay public rates on the same terms as literally every other customer."

Your Company: "Deal!"

Power companies give significant discounts to large customers.

>100 years from now, all the conceivable software is going to exist that could exist, at least for 'business processes.' If your business is selling potatoes as a farmer to distributor, the software you need to participate efficiently in that market will already exist.

I disagree. Unless we're talking about Artificial General Intelligence or some other general purpose technology, I think bespoke software will always be better--if not more cost effective--than off-the-shelf.

It's not just momentum in that sense; it's also the fact that wealth begets wealth. (Which you could also call momentum in a way.)

That seems to be begging the question.

Having recently transitioned from small companies and start ups to a big successful corporation, this all seems right to me. A lot of things are way easier because they were just figured out already. A lot of stuff (both technical and non-technical) has already been in the pipeline for a few years and will be ready pretty soon. A related thing is that because of that pipeline thing, it's a lot easier to start thinking about things that may well take two or three years to come to fruition. These are all enormous advantages.

On the other hand, there are entire markets, even quite large ones, that you can't pursue because they're way too small. That's what I'd look for if I were thinking about starting a new business.

I think this is indeed nailing it. Companies are inefficient, but they get away with it because the market is even more inefficient.

I just want to point out that "Companies are inefficient, but they get away with it because the market is even more inefficient." is not at all what GP wrote.

Momentum and delay is a market inefficiency. The most efficient market is instantaneous.

The Dilbert perspective is a shallow one.

At the end of the day, a big company or government agency is too big for any human to internalize. It's a machine that pumps out process, but the downside of consistent process is that change is more difficult.

When I worked at an agency that supported social services, some guy was bitching and complaining about how everyone who wasn't him was stupid, the agency was awful, etc. One of the more respected directors walked by and heard this. She popped in and said something like "There are x million people who depend on us to feed their children. And y million people who expect us to do so in a responsible and equitable way. We've never missed a payment, ever, because of this team."

Another example is the military, which is probably the dumbest possible bureaucratic institution ever devised. Yet it is capable of executing the mission with ruthless efficiency, and it can survive in the face of adversity and loss in a way few organizations can.

Smaller organizations are always better at reacting to change, because there are fewer stakeholders. But, they suck at many things, including consistent delivery, paying things on time, and organizational resiliency. Software makes them better, but software is its own form of bureaucracy with its own set of problems.

There is a phenomenon that happens when things scale. A certain level of detail begins to fade into static fuzz around the edges.

Think about the difference in terminology between a coach and players. When a coach takes a team through a play, they usually don't tell the players which hand they need to dribble with or which foot to plant first when running, they use much broader strokes and leave the implementation details to the player.

Let's take the example of... perhaps a Jamaican bobsledding startup. In the early days, you don't know what you're doing so everything matters. You scrap to purchase airfare to get to key industry events so you can be where the players are. If you mess up almost anything, it's game over. But you make it there, and you perform like you knew you could. You pick up sponsors. Expenses like airfare are now no longer a problem. What before used to be a mad scramble to organize and scrape together funds and make sure you have a good pitch for the big meeting is now a smoothly oiled proven machine with accounting and assistant cogs. Your jerryrigged home workouts are now streamlined by an Olympic training facility. Your mind is now no longer occupied by what has become minutia to you. You don't really worry about grocery shopping because you have a team nutritionist. You don't care about how efficiently or inefficiently the process of grocery shopping and cooking goes down now, just that it does and is on time.

You no longer make music with an instrument. You conduct an orchestra. Your mind is now thinking in bigger blocks.

Software is just one section of most orchestras.

I think success doesn't come because of these bad decisions and red tape, but despite them. I have the impression that in many areas, a certain "critical mass" makes a big difference, e.g. more market presence may generate a lot more sales even if the product is mediocre.

I still think that doing things the right way and getting rid of those famous inefficiencies could leverage success even further. With regards to software I think that's the difference between large companies with a known engineering culture (like Facebook, Google, Apple, Microsoft) as opposed to those known to be a bit more red-tapey (like IBM, Adobe).

But of course, "doing things the right way" is terribly hard to quantify, and so is "how much success would be possible". If numbers at the end of the year are already positive, good luck trying to convince C-level to introduce fundamental changes to existing processes.

> But of course, "doing things the right way" is terribly hard to quantify

I don't think it is hard to quantify, actually: shorter build times are better than longer ones, branch merges that take seconds are better than ones that take up to a day, spending people's time on higher-level thought processes beats squandering it going through pages of manual steps that could just be a script, etc. It's just a lack of awareness of what the company is doing that turns it into molasses and puts projects at risk of being canceled. The companies are still successful overall somehow, but individual employees learn and grow at a much slower pace than they're capable of.

Note that not one item you quantified even remotely relates to any business outcome.

Those are not quantifications, just binary comparisons. You could also argue that new Ferraris are better than second-hand VW Polos, but should the company really use them as company cars? The question is whether the benefits are worth the additional cost, and that's where quantification (actual numbers) helps.

These organizations succeed because these inefficiencies are not the result of incompetence, but the result of prioritizing other things that have a bigger impact on the bottom line.

Software is viewed as a cost center at a lot of companies, and in many cases that is the correct decision. The company officers and managers understand this, which is why they act the way they do.

Their goal is not to make the best possible software -- it is to ensure the software is good enough that it does not hold back other areas of the business, and to achieve that within the budget they are given.

This is completely true until software eats their industry up as it almost invariably does. Just in the examples provided by the author, insurance is undergoing a shift to direct-to-consumer, and I firmly believe that mobile apps are going to become super important differentiators in that industry. Real estate takes longer to take the sledgehammer to because of entrenched agents' smart use of regulation as a moat, but it's clearly coming next.

Put another way, a company's organization can be seen as its software, except it's running on a kind of wet, carbon-based hardware that does certain things much better than the silicon-based ones, and other things much worse. As the capabilities of the silicon-based computer increase relative to the capabilities of the carbon-based, there are few industries where it would be wise to continue purchasing software like one purchases pencils.

The incumbents don't always have to be "eaten". Was banking swallowed up by new tech companies? No, banks gradually increased their use of software and adapted.

A lot of advances are gradual, and if you are a large company you will have time to react.

Oh right I didn't mean that a different breed of "tech company" gets to outcompete incumbents, just that an industry's prevailing view of software as "just another cost center" cedes way over time to software as core to the business — which can happen through internal transformation or through external disruption. Software eventually becomes a core component of almost every industry.

I agree with you 100%. However, I think one of the aspects of "bad" software that is misunderstood is the increased risk of catastrophic failure. Two applications might have no perceivable difference from the user perspective. But one was built to withstand attack or corruption of the database, and the other was not.

Having worked for a while at one of these enormous companies I will take a stab at answering this:

Tech is an afterthought for most of these companies. Most of them are natural monopolies. Insurance, for instance, is a business that works such that the larger you are, the lower the risk and cost. Take the ACA: many nonprofit startups tried to join the ACA marketplace to compete with the Big Boys. Essentially all failed.

These companies have so much money that they can afford to do things like 10 million dollar POCs (which I've seen). Often they have literally 5+ different teams doing the same exact thing and will pick the best out of that list.

In some cases (I won't say which) they essentially launder profits through the "tech" parts of the company. They basically optimize for cheap people they bill out to the "real" company at some enormous fee.

As I wrote in a previous post, this is about to change, I think (and hope).

How fast this change is coming for a particular company depends on how inert the customer base is, and how much of the customer "journey" is digital.

For example, insurance -- I've worked at one of the largest (if not the largest) insurance companies in the world. Obviously they make some money, or turn over at least, but the customer base is usually pretty static.

What I mean is, you buy an insurance policy, and you rarely change it unless something drastic happens.

This allows a big enough company to completely manhandle IT/tech/dev: spend hundreds of millions of dollars on projects spanning years, and end up with a lot of money in the pockets of "the usual suspects" and, often, software that does not work at all. All the legacy cruft is still intact, with just another layer of expensive crap that business users hate.

Still, the business of insurance is heavily tech & software dependent. So much so, in fact, that systems down means “no business”.

I've become somewhat cynical with regards to the enterprise tech business, but I've seen these patterns repeat at too many places. It's truly sad and disheartening.

> These companies have so much money that they can afford to do things like 10 million dollar POCs (which I've seen). Often they have literally 5+ different teams doing the same exact thing and will pick the best out of that list.

Ugh. Reading stuff like this gets me deeply sad; what could have been done with that money and time instead? It makes me want to go escape to live in the hills, maybe even a full-on Wonko the Sane.

> Ugh. Reading stuff like this gets me deeply sad; what could have been done with that money and time instead?

You're assuming that an organization can whip out the best proposal possible at a moment's notice, which is highly unlikely. What you perceive as waste is simply development costs, without which the organization could not come up with the most competitive option possible.

Complaining that having different teams working internally on a product is wasteful is just like complaining that a sports team hiring backup players and running development-league teams is wasteful.

Don't forget that capitalism (with its companies), and life itself, do things that way. It's not necessarily elegant (although some might say it actually is), but it works.

Off on a tangent here, but I also see it the way David Graeber describes: "the rise of the bullshit job", which in many ways makes me think that a lot of it is not far from privatized basic income.

A lot of us are being paid handsomely, but are we _really_ adding value to society by any means other than just being consumer machines?

I find it interesting to ponder these things. =)

> which in many ways makes me think about the fact that a lot of it is not far from privatized basic income.

Research grants also fit that description quite nicely.

"It works", yeah, you're right, it does what it's meant to. Unstated, behind my angst, is the question of whether what it's meant to do is actually worthwhile.

> In some cases (I won't say which) they essentially launder profits through the "tech" parts of the company. They basically optimize for cheap people they bill out to the "real" company at some enormous fee.

Can you explain this differently? I don't understand how you would launder profits through yourself (or why you're laundering them at all; tax reasons?).

> Can you explain this differently? I don't understand how you would launder profits through yourself (or why you're laundering them at all; tax reasons?)

Same as licensing your name to "another" company that's actually your company, for (coincidentally I'm sure!) all of their profit, in order to shift money around.

Have umbrella company. Put your tech folks either directly in the umbrella company or in some child company. Make your other child companies (or even ones you don't have directly related to the umbrella corp) "pay" the one with the tech folks for their dev work, some arbitrary amount of money (however much you want to move). Now the umbrella company or tech child company has all the profit from your other companies, which you've done for tax or liability reasons or whatever. Mission accomplished. Whether actual tech work was done may or may not matter, depending on whether you were also trying to accomplish actual work at the same time, or just shifting money.

I guess outsourcing the actual work for very cheap and charging the end client a much higher amount. Very common.

Basically, that's how all profitable businesses work, more or less. A fun game to play in order to understand a business is to figure out what they actually sell. For example, a software business might sell developers' time.

> A fun game to play in order to understand the business is to figure out what they actually sell.

I think it's more fun to understand how much they can charge for what they sell.

That doesn't sound like laundering at all - isn't that just good business?

I would never call it laundering. Just what the previous poster labelled it.

Actually, I have seen small companies do just as badly at things, but they don't stick around for long, because they die from these mistakes. The ones that CAN make mistakes like this again and again must be huge, and their mistakes are also more visible because they're big.

Plus, small companies that make big mistakes that do survive, are scarred by the experience and tend not to repeat that particular kind of mistake again. Companies that can make a blunder of the sort described in this story, repeatedly, have to have been big in order to survive long enough to repeat it.

IMO, the way to resolve this paradox is to realize that the equilibrium adjusts slowly, then all at once. IBM is a pretty clear example of your clueless behemoths. On the other hand, their net income per employee is now closing in on like $24,000 annually, which sounds good until you realize it was $29,000 two years ago, while Apple is around $600,000 and Facebook around $2 million.

IBM will continue to target their most expensive employees for layoffs while hiring cheaper new grads, but even after those alleged practices, their profit per employee fell by 5k. Seems reasonable to expect they won't be around in a decade, though I'm no financial expert and this is a projection from two datapoints.

"I'm really not sure but I think it's important to look beyond just "haha, stupid managers, they can't make up their mind". There must be deeper reasons."

You might take it as evidence that, of all the myriad aspects of a business, scale far outweighs all other factors in generating a profit.

You don't have to be "good" (whatever that might mean) to generate profit; your production costs just need to be marginally lower than your prices.

The managers and directors, if you talk to them, will frankly tell you they have no idea what's going on: they hire people, and somehow the problems those people were hired to solve don't get solved. I had a founder tell me he created an entire department to improve the developer experience, and nearly 20 employees later they've not made any successful improvements; they just limp along with the broken systems.

Large companies are inefficient because they suffer from a number of problems (calculation problems, internal politics, the Peter principle, Parkinson's law, and so on). However, they can realize savings through lengthening the production process, economies of scale, (external) political power, and other factors.

The fact that a large company is rich does not necessarily mean that the inefficiencies don't exist.

I somewhat agree with you on "this whole software stuff is really peanuts and their mental energy is better spent on other business-related stuff". For most companies this software stuff can be mostly irrelevant, and it's our fault this is the case. As Erik Dietrich says [1], we are "efficiencers" -- we're supposed to make all kinds of processes more efficient, not just be code monkeys and copy code from StackOverflow.

[1] https://www.amazon.com/dp/B0722H41SG/ref=dp-kindle-redirect?...

I think many people, especially developers, often fail to appreciate how much of decision-making in organizations is the result of sustained negotiation between groups that have different conceptual models of what the business does. In the OP’s article, what I heard was that his problems mostly resulted from requirements not being developed or understood in a cross-silo way (plus the not-uncommon estimation miss on bad data sources, but that’s a mechanical estimation issue). The key lesson here, I think, is not “corporations are stupid,” but that “corporations are too complex for linear requirements-gathering to in many cases support.”

> The key lesson here, I think, is not “corporations are stupid,” but that “corporations are too complex for linear requirements-gathering to in many cases support.”

This comment should be featured higher in the discussion. The "stupid" label is thrown around rather fast, although more often than not the problem lies in how critics fail to understand the problem and, more importantly, the restrictions applied to the decision-making process. Critics base their view on a personal, over-simplified, and under-informed picture of the problem, and so don't understand how an outcome is a valid and reasonable solution. Instead of assuming that they have limited information and understanding, they assume they have perfect information and complete insight into the problem, and therefore any outcome they didn't approve of or understand can only mean stupidity.

It's perplexed me, as well. What I've learned over the years with my current company may offer an insight. When I joined, we were 50-100 employees. The tech was good. Revolutionary, in its context and industry. We were actually doing something useful and good and positive for other people. In the last 3 years, however, we have seen tremendous growth. We're now closer to 300 employees, if not more. The tech is less revolutionary. We're now in an innovation-through-acquisition phase.

What I see is: most of this new cohort has _no_ idea what our business is and they are overwhelmingly concentrated in middle management.

I think this is very common; I'm seeing it where I currently work. This theory provides a depressing take on the phenomenon. People at the top like having an insulation of middle management under them, but these people don't actually understand the business or product, so they just run around parroting business and tech words in an attempt to not get found out.


Reminds me of Microsoft. They only had two engineers working on MS-DOS 1.0, their most important and profitable product at the time. Despite recurring industry criticism of the product and the launch of the competing DR DOS (faster, better, and a third of the price), it took Gates two years to react, first with commercial tactics (and eventually with overdue product changes)...

Source: http://www.ariplex.com/tina/tcfact01.htm

In most cases, the large companies being picked on as "not knowing what to develop" are not software companies. They make their money from selling things or human services, not programs.

If software is not your core competency, you don't need to be any good at it to make money.

It's really not any more complicated than this. It's disappointing to see this response so far down the page. If the answer to the question 'How does the company make its money?' is anything other than 'by continually producing high quality software', then it's not even a conclusion, it's just a tautology to say that it makes its money for reasons other than its ability to continually produce high quality software.

I think many programmers want to believe our work is more important than it is.

That leads to ignoring simple observations about reality like this one, since this one implies that in most companies software doesn't matter much and the CEO is right to treat it as a cost center. Nobody wants to hear that about what they do with their life.

To be fair, you could run 'sed s/programmers/humans/' on my first sentence and leave its truth value unchanged.

I will invoke the Peter Principle [1], but for businesses.

These now big corporations were really good at something...so good that they beat the competitors, grew their market share, expanded their head count, etc.

However, operating at a larger and larger scale (or with more modern technology) brings new challenges that weren't part of the game before, ones they are not necessarily good at. Eventually those challenges might grow severe enough to threaten the company itself (AOL, AT&T, Yahoo, etc.)

The inefficiencies you see are characteristics that accrued as a result of that success and the exposure to new challenges; they are not the characteristics that got the company to that success.

[1] https://en.wikipedia.org/wiki/Peter_principle

Economists have been scratching their heads over that contradiction for generations.


I think a fair summary is to say that there must be some value in having thousands of people who start each day with some vague idea of what the organization needs from them. And this value must be huge, because it has to outweigh all the value destroyed by corporate inertia, dead weight, infighting, etc. If it doesn't, the firm can't survive very long.

I believe that this is what the "digital transformation" is about, for real. I'm not even joking.

Only a wealthy company can spend $400,000,000 on an SAP project, for example, but in reality it would probably be better for the business, long term, to build something custom and embrace the fact that IT & tech are important to achieving its goals. If they're not, why spend the money?

Doing it right probably means creating a "software factory" or tech hub within or outside of the company.

If we're talking about a company with the muscle and will to plow hundreds of millions into a massive tech/automation project, this should be feasible, right?

For an example, see Lufthansa's tech initiative: https://lh-innovationhub.de

(I'm not in any way affiliated with LH, I just happened to read about it a while back, while trying to convince my then CEO that something like that would probably be better than spending millions on proper crap from a 3rd party)

I'm sure there's many other examples.

Traditionally, the problem is that with money comes politics. Everyone wants a piece of this pie, and a lot of non-tech/dev employees are used to turning to vendors.

The thing is, what you want to achieve is a cultural change, not simply an organisational update (which is why, in the LH case, the tech hub is not located at LH HQ. It is not a cultural match, according to the CEO).

Does this change even require money, except salaries for the "right people"?

Big businesses have done IT & tech a huge disservice by outsourcing and off-shoring things not considered "core business", such as tech & coding. The pendulum is swinging, at least in my experience, and suddenly everyone needs dev expertise again. Of course they do, because most likely you will be disrupted by a company without the legacy, one that is sprung from tech.

Well, not a disservice to those of us still around, but it's obvious that there's a lack of expertise to just pick up off the streets.

How long you can keep a lot of people employed in a traditional IT dept. and buy a lot of half-crap vendor solutions that you glue together in half-assed ways depends, I guess, on the inertia of your customer base and also, as pointed out below, on already accumulated wealth.

My take on this: you're right that the other large competitors aren't better. That's just basic economics: you only get good enough to stay close to your competitors.

The other side is that a large company often gets large contracts. If you sell a piece of software 10 million times, it doesn't really matter if you spent 100x as much on it as a small company would have. Compared to the small company with 10,000 customers, you still have 10x the revenue per unit of work done.

Could you make more profit? Sure, but lacking market pressure you're rarely concerned with efficiency.

I just noticed my company always books flexible airline tickets. They cost about 2x as much as non-flexible ones. I've never missed a flight, and most of my colleagues haven't either. It makes sense for a busy manager whose plans often change, but for most engineers the flexibility costs more than it's worth.
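A back-of-the-envelope expected-cost check of that intuition (all fares here are made-up illustrative numbers, not real prices):

```python
# When does a flexible ticket (~2x the price) beat a non-flexible one?
# It pays off only if the expected cost of rebooking a non-flexible
# ticket exceeds the flexibility premium. All numbers are assumptions.
base_fare = 400.0           # hypothetical non-flexible fare
flex_fare = 2 * base_fare   # flexible tickets cost about 2x
rebook_cost = 350.0         # assumed fee + fare difference to rebook

premium = flex_fare - base_fare       # what flexibility costs you up front
p_break_even = premium / rebook_cost  # rebooking probability needed

print(f"Flexible pays off only if you rebook >{p_break_even:.0%} of flights")
# → Flexible pays off only if you rebook >114% of flights
```

Under these assumed numbers the break-even probability exceeds 100%, so the flexible fare can never pay off for an engineer who rarely reschedules, which is the point being made here.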

Flexible tickets are cheaper in the long run because control fraud is very expensive.

If they release the leash and let people book "extra" tickets when they miss a flight, next thing you know people will be taking their S.O.s along to conferences using "rescheduled" tickets, or having the company pay for vacation tickets, and so on. An interesting unofficial benefit, and not necessarily a bad idea if it can be controlled. However, people always push the boundaries, and next thing you know...

So it saves money in the long run to ensure that one employee traveling equals precisely one (admittedly expensive) ticket. The oversight labor can then go into preventing more elaborate forms of control fraud, such as using the sales-meeting budget for dating purposes, padding wedding receptions into corporate meeting budgets, or all the nonsense that happens with "company" cars.

There are also labor savings in phoning it in and taking a week to approve tickets. A protocol that permits same-day travel when you need to reschedule could be very expensive indeed: all those tickets approved at considerable labor expense the same day, just in case you need one once a year. Or paying you to sit around for a week waiting for tickets to arrive, etc.

A side note: small companies can treat control-fraud oversight as a small rounding error, but once you have an entire department to organize travel, you need oversight of an entire department, leading to strange inefficiencies.

Eh, you can't really reschedule them under a different name.

> On the other hand these organizations are immensely wealthy and successful.

Have you read "In Search of Stupidity"?

Basically, the NASDAQ of 1990 bears almost no relation to that of 2000, not because of competence but because of stupidity. Big lumbering behemoths shoot themselves in the foot and fall over.

> On the other hand these organizations are immensely wealthy and successful.

That's because economies of scale work. They make it easier to get customers, they give easier access to credit, they make it cheaper to employ people, they make it easier to have things go your way if there's a contract or legal dispute... The list goes on.

Economies of scale are why living in a poorly run community is better than trying to scratch out a living as a hermit in the woods. It's the same thing in workplaces.

I worked for a large investment bank a good few years ago. It was probably the crappiest job that I have had in terms of job satisfaction. There was a high staff turnover and the answer was to throw money at the problem and keep the churn going.

It seemed to work for them.

I also worked at a few IBs before I came to the realization that one was just worse than the next. I thought that they worked by underpaying people just out of school, but dangling the carrot of big bucks that always seemed just out of reach (maybe next year I will get the big bonus!) and then churning and burning.

I am back in finance at a private company, and I am part of our on-campus recruiting efforts. I frequently tell candidates: "Listen, take a job with one of our direct competitors, that's fine, but whatever you do, don't take an offer at an investment bank if you can avoid it. They will just work you to death, largely paying down tech debt, and you won't get the opportunity to work with the truly best and brightest..."

>I thought that they worked by underpaying people just out of school, but dangling the carrot of big bucks that always seemed just out of reach (maybe next year I will get the big bonus!) and then churning and burning.

How do they work? I'm currently at an IB and unsure about continuing; everyone seems apathetic.

I think there must be some kind of tradeoff between waste and decision paralysis in large organizations. In other words, there's some kind of "waste quota" that gets distributed between productivity and decision making.

It's a mix of things, to be honest. For instance, you're right: for many non-tech-centred companies the software stuff isn't all that important.

Hell, in many situations in general it isn't that important, since knowing the right problem to solve and who to market the solution to often matters a lot more than the technical quality of the solution in question. People and companies generally don't judge products and services by code quality or development practices, and unless these cause it to constantly break, don't really care about them at all.

But other factors do matter here. If a company is a known brand, then they can keep getting customers for terrible products for a long, long time, simply because they're the first name people go to. At some point you become so big, you get so much momentum, that you can basically publish/release anything and people will buy it. Apple could sell the most broken phones on the planet and still have 10 million+ people rush out and buy every one.

Do companies get away with it forever? No, but the bigger you are, the longer it takes for you to fail after your products/services become crappy.

And yeah, as you say, local optimums matter too. If most companies in your field are inefficient and put out terrible products, then there's not much pressure for you to be much better.

They usually were the first mover in their market or otherwise took it with some more marketable good. This sends piles of money their way. They might also invest in brand loyalty or have various forms of lock-in that prevent switching. As long as they keep advertising and selling, the piles of money keep coming in no matter what dumb stuff they do inside the company. They can usually move slowly on improving their offerings, too.

So, there is no contradiction. If anything, it might make more sense that adding people increases waste and stupidity. Remember that startup founders' ideas are failures over 90% of the time. These businesses started or grew with at least one good one. Then, they add lots of often-less-innovative people with their own ideas thrown into the mix. Of course many of them are going to be bad. Human nature, esp egos and image management, takes over from there adding the conflicts seen in the article.

I've thought about this too, and something occurs to me: Big corporations attract people who are really good at making money. Some of us are good at tech, others at music, and there's a certain group of people whose gift is setting aside all of those "interesting" things and focusing on making money. And not by doing spectacular things like inventing Facebook, but just by grinding away at it, year after year, making incremental improvements to things. I just can't bring myself to be interested in doing that.

I work at a F250, and I once asked my boss: "Why does the parent company allow our company to exist, when we seem to be steeped in incompetence?" He said to me: "Because we make a shit ton of money."

What I think is that it's the interaction between coherence and the number of people you have working towards a goal. The multiplier effect of more people isn't linear; it crucially depends on the amount of coherence management can impose on its workforce. It turns out the multiplier effect is strong even when the coherence is low, which is good, because it's hard to impose much coherence on large groups of people.

How that looks depends a lot on where you sit. On the inside you see the lack of coherence up close; on the outside you see the collective power of the organization. There is enough coherence that the multiplier effect is still very positive.

Consider the Soviet Union. It was powerful and wealthy, a political superpower, rival to USA in the cold war. And yet it continuously wasted its people and resources on all sorts of ridiculous (and on many occasions horrifying) bullshit.

So did (and does) its counterpart.

I think it's more likely that they stumbled into a profitable position in some market or another, and that became their bread-and-butter. Being successful for a long time breeds incompetence.

As an entrepreneur I realized most of my choices are wrong. There are just too many I am forced to make every day and there is no time to analyze the relevant data. What keeps the business alive is that one great choice I make once in a while which pays for all the mistakes. It's probably the same for big corporations: they can afford stupid mistakes. It's like a balance sheet with mistakes and good choices. Dilbert only sees the mistakes, the pointy haired boss is clueless but the company lives on.

I'm a technical person, and it so happened at my current company that I moved up a bit and eventually ended up being part of the management, with C-level meetings. A lot of things that used to seem plain stupid and idiotic no longer do, because now I know the background story as well. Also, for various reasons (lack of resources, time, etc.), a lot of great ideas coming from people below me in the company's hierarchy will never get anywhere, which is a shame.

From my observation, these companies become wealthy and successful first, then develop awful cultures as innovation becomes less important (and more difficult as you get away from your core business), and extracting maximum value from current capabilities (sales) becomes more important. Once you get to making billions of dollars, your organization can afford to be a lot less efficient and less innovative technologically.

The advantages of having a big customer base are tremendous. If you build an amazing exercise tracker iOS app, for example, you'll have to go convince every user from scratch to start using your product. Doesn't matter if it's the best exercise tracker possible, if Apple comes in with the shittiest version of it but pre-installs it on all iPhones, they'll have 10x your users in a week.

The reason it seems like a contradiction is that the behemoths that are incompetent at making software don't actually make their money directly from making software. Most of these lumbering, incompetent places probably just have dozens or hundreds of people spinning their wheels on things that could mostly be accomplished with a decent architecture around a database.

A lot of answers to this already - props on asking an important question!

My answer would be that while a startup seen in isolation can be very efficient, if we look at startups as a group there will be a lot of duplicated work. It's easier for a large company to solve a problem once and for all.

This is the primary advantage a large org has, and it compensates for many of its shortfalls.

There are some bad organizations, but overall most are not incompetent behemoths. Yes, a little website for a startup where a few thousand people do trivial things can be technologically much more advanced. But once you're dealing with millions of customers, there are lots of complexities that make everything complicated.

As illustrated by this story, big companies are very good at getting paid, and at getting work out of smaller companies for free. Whether they are good at anything else is beside the point as long as they continue to have the power necessary to keep getting paid.

To be successful in capitalism, you don’t need to be awesome; it’s sufficient to be no worse than the competition. And the competition is the same huge, Kafkaesque mess, primarily because no one has yet figured out how to organize thousands of people without creating an inert bureaucracy in the process.

You've seen the xkcd chart about automation and the cost/reward for varying numbers of repetitions per day? I think the notion that big businesses are behemoths stems from a fundamental misunderstanding of what each type of business does each day.

As an arbitrary example, I think if we compare the number of employees onboarded vs the number of new applications shipped, small companies/startups and enterprises are polar opposites. My experience with startups is that you end up with a new web service for every 2-4 people (wholly anecdotal) you onboard as developers. For that kind of company, it makes sense to optimize shipping new web services; you do it a lot! For an Enterprise, you might onboard 100 developers just to maintain and develop existing systems. Writing new frameworks to make building apps easier doesn't make much sense; many of your developers are working on an existing system that would be expensive to port to a new framework. Meantime, your HR department is drowning trying to onboard 20 people a week.

Wasting money shipping this listing is fine; they did it once. This was released in 2013, and I wouldn't be terribly surprised to find out this, or some small bastardization of this, is still running as a production page for Prudence. The money they wasted, amortized over 6 years, is nothing compared to what I've seen startups waste in productivity while new hires wait for laptops, credentials, access, someone to explain some legacy crap that everyone had to deal with, etc.

Not me, but one of my favourites would be when a guy joined our team. We were an agency and he’d been hired by the client to eventually take over. On the first morning I walked him through the codebase and showed him how it all worked, how to extend it etc. After a while I said “any questions?” He replied, “yeah, just one - am I meant to be a developer?”

Turned out that when the startup had brought him in as CTO they failed to mention that he was eventually going to be the sole developer.

So he was the OTO - Only Technical Officer

Gotta ask -- how did the next few weeks/months pan out?

To his credit he just kinda sucked it up and got on with it. I was happy because eventually I didn’t have to work on it anymore. Unsurprisingly, all the money they wasted on buying rack servers and Oracle enterprise licences didn’t net them more than 10 visitors a day, and it fell apart. Once it all went pop, he and I crossed paths on other projects too, and got to trade stories about the absurdity of it all.

He got catfished.

I'm feeling it, because I was recently catfished.

What does catfished mean in this context? Misled into a role you aren't suited for?

Accepting a job offer based on a false representation. Usually by offering you an inflated title and promising responsibilities and opportunities you won't actually have.

Like when you get the Big Data Analyst job, but you are actually supposed to create plots from Excel data and put them in some sales Powerpoint...

I've had the opposite happen! Early in my career, being brought on as a low-paid frontend developer, only to be stuck as de facto CTO on a spin-off company. Infra, development, hiring, travel, the whole bag.

Still accepting advice on how to negotiate a 100% raise.

Negotiation is only a thing if they say no. So, first ask for the 100% raise. If they say no, then you need to worry about negotiating :-)

Oh, that sounds so brutal. I had no idea this was a thing that happened to people. I would quit in the first week.

My buddy quit a senior developer role day 1 at Walmart labs because they sat him down and presented him with the task of doing nothing but technical writing.

I worked there, and I find this story hard to believe, though I left after Dion/Ben left, when the PayPal mafia came in with their button-downs tucked into their khakis, Jira became more important than actual technology, and the culture was taking a nosedive fast.

Still, I can't imagine the culture falling that far in two years. There was tons of meaty dev work being done. And day 1? Maybe they were just trying to ease him into the team? This might have been an effective Mr. Miyagi-type move to help him get a greater understanding of how all the pieces fit together.

This story smells bad, I would be interested to hear more details.

There was tons of exciting work being done, apparently. Lots of promise of opportunities to plug away on a hot react frontend, scores of talented teams that sort of thing. I'm sure it is happening like that.

But they were plugging holes as fast as they could with contract recruiters that were throwing whatever would stick. I didn't even get to meet my team before I had an offer in hand.

From mine and his experience, whatever department that was (an ecommerce frontend), seems chaotic, with massive turnover.

I know some Walmart people as well. Does not sound like an effective place to work based on their experiences.

I just left a job that amounted to this. I stuck around because I went to work with a good friend who's off-the-charts good at what we do. He lasted three months after I joined and I was gone the week after.

I think anyone who has done any consulting with a big organisation probably has one of these stories. As one of the other commenters says, if the end result is something that the client is happy with, and they actually pay you, you should basically consider this a success!

I spent six months building a single-page app (back when that was still an exciting-sounding thing) for a major US financial institution. By the time we actually delivered it, the result was so watered down from the initial proposal that I could've just taken some screenshots of what my designer had produced six months earlier, put them in a PowerPoint deck, and said "Done!" - because this was, in fact, what the VP at the company ended up doing, except with screenshots from the web app we built, which no one other than the VP himself ever used.

On the plus side, I became good friends with the VP and the project paid off a significant chunk of the mortgage on my first house.

As an aside, I sometimes have to play UX designer when there is no one qualified available, and I've used every wireframing tool in the market trying to find something that allows me to create UIs as quickly as I can think. A few weeks ago I gave up and, in frustration, put together a quick PowerPoint deck so the product team could work on it themselves. It turns out PowerPoint is exactly what I was looking for! All the other tools required a lot of setup and configuration that slowed me down, while PowerPoint is so simple that you can just start plopping squares down on the page and put a few links between slides. I think the real gain for me is that most products I work on end up being data-grid heavy, and creating tables in Balsamiq or whatever is always a nightmare.

I've had a lot of luck using Balsamiq as a wireframing or mockup tool. Quick to get into and edit, and clearly meant to be a UI mockup and not a pixel perfect representation of the final product. It eliminates a lot of wasted time on bikeshedding and lets people focus on the layout and flow.

Yeah, Powerpoint is a surprisingly good drawing tool. I remember in college a lot of my classmates used it to create illustrations for their thesis.

Good thing there are no real problems left to work on in this world or this would be a gross mis-allocation of macro resources.

(No judgement on you OP this is the situation we're in)

Ah but if someone paid for it, the problem must be exactly that valuable to them! And if it's valuable to someone, it must be a Real Problem!

When it comes down to it, we're not paid to code. We're paid to solve problems with code and if paying you $75/hr as a contractor to build me three webpages with CI/CD capabilities is it, then off you go.

$75? That's about half of what I would charge myself, and an agency is going to be $300 minimum.

They needed this developer to help them understand what they needed. Every project I’ve worked on (even personal projects) has a fair amount of this.

Yeah, if I paid a carpenter to build me a chair, then decided I wanted a stool, then ended up with a footrest, I would be really happy with him if he delivered all those things so I could try them and make up my mind. It could've been that the agents saw it and said "It's great but send me an email when they click the link", and then the development would have gone in that direction.

Out of interest, do you mean that seriously or are you parodying the sort of knee-jerk reply that an economist might produce?

Personally the capitalized "Real Problem" (along with my current mood) makes me lean towards sarcasm, but yeah, Poe's law...

This kind of misallocation is really the only thing keeping velocity of money high enough to sustain the economy

While I don't necessarily disagree with the conclusion, I think it's worth questioning the unspoken premise here:

Is there no other way to support human life than these accelerating grinding gears of money-shuffling?

Or are those losses the only thing preventing the creation of even greater things?

Well, of course. Problem is that, apparently, we're doing the best we can.

Could you expand that a little for us slowpokes?

Aside, I assume you mean "to sustain the [USA] economy"? Perhaps you mean World economy, or EU, or ...?

The whole point of all of these stories is funny/sad misallocations of resources. The "real problems" thing is a completely unrelated discussion. Not sure why you'd want to weave that in.

He's weaving it in because the parent called it "a success", which is only true from your individual perspective, but not from a societal one. And some people want more from what they're doing than just personal gratification.

I agree that our definition of success should have a strong societal component. However, if you expect societal benefit from every action, including actions heavily dependent on something like the whim of a consulting client, you may be faced with a lot of anxiety when you inevitably fall short, anxiety which may ultimately hamper your later individual capability to achieve societal benefit.

I think you might be overloading the word "success". A successful project doesn't necessarily imply anything beyond delivery of what was agreed upon.

Did he even get personal gratification? He got paid handsomely (by the sounds of it), but it didn't come across as though he felt fulfilled beyond that.

I can confirm that I both got paid handsomely (especially for the second phase of the project where we were on a retainer and literally did one day of work over the course of three months, barring a couple of phone calls) and that I felt very little satisfaction or gratification at the end of it, especially when the whole thing we had built was eventually rebuilt by a different offshoring company using a much clunkier technology stack without so much as a glance at our code.

I stand by my original assessment that, from the perspective of both the client (everyone was happy with the end result) and my consultancy (we delivered work and we got paid), it was a success.

This... this is just my life right now. Get a contract, spend all of my time doing anything other than actually accomplishing anything, and then, after wasting time adhering to project requirements that don't make sense, wind up delivering something that could have been done in 10% of the time if people listened to me at the start of the project.

>wind up delivering something that could have been done in 10% of the time if people listened to me at the start of the project.

I felt that way until I eventually got to a point where people did listen - and it turned out to be false. I underestimate the effort required on first glance and I oversimplify things in hindsight. Also I'm a developer - not a business guy - I can guess a lot of things on the requirement side but a lot of it is outside of my domain. This is something that I have to be aware of now when I'm in a position to give such estimates.

Don't get me wrong - I think there's a lot of room for improvement in almost everything I worked on, before and after me - but that gut feeling of how I could have done things was never on point for me. With the article in question: too many people involved in the decision, and no way to know the final result for sure until people see it, have a bunch of meetings, see the iterations, and decide. Rarely do you have the person with deciding power know/understand these things clearly from the start. And you need to account for bureaucracy BS time-waste - it's just a fact of life when dealing with such clients.

Your comment was something of an eye-opener for me.

Start a consultancy, name it (or if you have one, rename it to) Cassandra[0].

[0]: https://en.wikipedia.org/wiki/Cassandra "Cassandra was cursed to utter prophecies that were true but that no one believed."

And in keeping with the spirit of the article charge murderous fees to help people with their "Big Data" Cassandra installations, only to find out after weeks of wading through a management quagmire that the data in question can be stored in a few KB text file.

Incidentally, Cassandra is named that because she kills the /Oracle/...

Selenium being the cure for Mercury poisoning, but I guess that's neither here nor there. :-)

Devious! I love it.

"Big Data" doesn't mean "makes Excel slow down" or "needs to scroll in Excel" :) If you can fit it on a thumbdrive you can buy from a Walmart, it's probably not 'big data'.

This goes for so, so many things I've seen where it's not even big data, but crap like some small part of their total dataset being a graph (unlikely to ever go beyond a thousand-ish nodes for any given connected set, and even that'd be unusually high, and certain to be sparsely connected to boot), so of course we have to use a hipster-ass graph db, adding thousands in development costs and making the whole thing harder to work with and (demonstrably - this wasn't the first project they'd made this mistake on) less stable. *eyeroll*

Starting a new project? You almost certainly don't need something other than 1) files (yes, seriously), or 2) SQLite (yes, seriously), or 3) Postgresql or some other multi-paradigm, capable SQL DB. If the former two, please also consider whether you even need a f*cking server or are actually writing something that ought to be desktop/mobile software. That's another expensive, feature-delaying, and UX-harming mistake I've seen.
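To put a rough sketch behind the "SQLite first" option: here's what a minimal version of that choice looks like, using only Python's standard library. The schema (listings with addresses, prices, bathroom counts) is purely hypothetical, chosen to echo the article's real estate data; an in-memory database stands in for the single-file one.

```python
import sqlite3

# A single local file (or here, an in-memory DB) is the whole "database server".
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE listings ("
    " id INTEGER PRIMARY KEY,"
    " address TEXT NOT NULL,"
    " price_cents INTEGER,"   # store money as integer cents, not floats
    " bathrooms REAL"         # half-baths exist, hence REAL rather than INTEGER
    ")"
)
conn.execute(
    "INSERT INTO listings (address, price_cents, bathrooms) VALUES (?, ?, ?)",
    ("12 Main St", 25_000_000, 1.5),
)
conn.commit()

rows = conn.execute(
    "SELECT address, price_cents FROM listings WHERE price_cents < ?",
    (30_000_000,),
).fetchall()
print(rows)  # [('12 Main St', 25000000)]
```

The point being: for anything that fits on that Walmart thumbdrive, this gives you transactions, indexes, and SQL with zero servers to operate.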

I started a new job where I was not officially a programmer and my machine was locked down so that I could use basically nothing except Microsoft Office, and after a couple of months, I have taught myself to do nearly any sort of data processing I previously did with Linux, Oracle, Perl, Selenium, etc. using Excel, Access, and VBA. Obviously, this doesn't involve that much data, but my previous job was really not "big data" either and they paid for all sorts of expensive licenses.

I'm enjoying myself much more than I used to, because I've escaped the red tape of development and get to solve problems that matter to nontechnical people without intermediaries predigesting the requirements.

If I was starting a business, I would definitely always try to do almost anything with Excel and Access first before deciding to invest in an industrial strength solution whether Postgres, Oracle or "big data".

This really is true. I worked on a project where we were meant to be getting hourly files into a data lake, but the files were so small we couldn't reach the recommended size of 256 MB per file (compressed Parquet in Azure ADLS) - the files were like 1 MB each. A year's worth of data was tiny, and the processing overhead was ridiculous.

If your workflows don't bog down your servers, add big data technologies until they do!

I think whether you can fit a single data set on the biggest commonly used hard drive is another good metric.

It's pretty much what's happening on my assigned project. It's a legacy app created in the 90s... most of the time is spent on things that should take 1-3 days but, due to the constant tweaks and re-tweaks, can take months. In a personal sense it doesn't affect me. I do the work they require me to do and get paid for it. Professionally and personally, I'm a huge mess because I'm frustrated at how bad the decision-making is. I feel I'm just wasting, and even though I'm paid for it, it just doesn't feel like I'm doing anything with my career. Sigh.

> In a personal sense it doesn't affect me

> I'm a huge mess

> I feel I'm just wasting

> doesn't feel like I'm doing anything with my career

Sounds like it does affect you in every sense.

You should probably try to adjust that before you burn out, not just for that specific job but for your career in general.

I'm burnt out, and after 2 years servicing clients at the company I'm in, I've been questioning whether I'm good enough for other roles, like (as an example) those promoted on Hacker News or whatever job portal.

When I spend so much time idling by and letting that frustration build up... I just start questioning if I'll be able to do other roles and not fail the team or whatever company I'll be working for because of how dormant I am.

It's not like I knew a lot. I don't consider myself to be super talented, just the average guy trying to make it out there but yea... I need to look into moving on.

My advice is to interview a couple times (with companies you don't mind missing out on), find out what you are missing and work on side projects that interest you. Then interview again about 6 months later. There is nothing like interviews to set your mind straight. Competition and hard questions seem to energize my "survival" instinct.

...doing something hard and outside of your comfort zone I guess is what I am recommending.

I was in the same situation as you at my previous job, feeling like I wasn’t learning anything because I was working on a legacy system where every design and business decision made was pure insanity

After five years at this place I finally got my shit together and applied for a job

I spent the job interview asking pointed questions about their infrastructure and how they ran their business and then I contrasted their way of doing things with my old job and explained why their way was so much better

Turns out I was learning! And learning how NOT to do things carries some value

Invest time into thinking about and listing specific things you've learned (technical, but also organizational, managerial, etc.) from maintaining this horrible legacy system.

This list will show you that it hasn't been a complete waste of your time, and it will show future employers that you're thoughtful, observant, respectful of your own time, and healthily critical.

My estimate would be that about 70% of all IT/software projects on the planet are like that.

Edit: Clearly I am in an optimistic mood this morning.

I think 70% is quite optimistic.

Sometimes the desired result is not known ahead, at project start, and twisting and turning is part of the process of (hopefully) ending up with something that people are happy with. It's just part of human nature and needs to be embraced. You just ensure you are paid for it.

This is the optimistic take. :) But there's a difference between

- exploring the design space and iterating until you end up with a product that makes the customer happy, and

- endlessly reworking decisions every time the next "stakeholder" in the customer's org sees a part of the bikeshed that they have an opinion on.

One is a skill-engaging and -enhancing professional project. The other is the road to burnout.

Yeah, absolutely, agree on that :).

Reminds me of the Battlechess duck, point 5 here: https://blog.codinghorror.com/new-programming-jargon/

I think most of the impediments are intentionally created to hinder projects so that nothing looks too easy or successful. This may help explain why some PMs or product owners inflate the product features when internally advertising the effort: that way, if other people throw up roadblocks, they are less likely to impact a core feature.

I suspect something similar. I had a project which had a team of developers for six months; we didn't do anything the first four months (I spent a lot of time on HN during that time!). After those four months, we discovered that during the last month, being December, we were not supposed to release anything (as it might endanger the end-of-year runs). We did the entire thing in two weeks and then deployed it in the remaining two (deploying was a nightmare).

At some point, we discovered that the team lead wanted to be promoted to manager so he needed a reasonably-complex-but-actually-simple project as a "win".

This has been my experience, and it might be sane. The 90% went on project/requirements discovery. Remember that you are doing this through several people, so it's multi-dimensional, and once you go beyond 2-3 dimensions things start to get really complex.

The consolation prize is that you're paid by the hour.

>> if people listened to me at the start of the project.

The dream of all software developers.

That assumes that the problem is best solved by developing new software - which quite frequently isn't the case!

That assumes software developers think the problem is best solved by developing new software - which quite frequently isn't the case!

Oh how I wish we could stop reinventing this same stupid wheel badly. And change the "wants" framed as requirements into something more reasonable, so that we could get away developing less bumpy wheels, perhaps even use one off the shelf.

Well, in my experience a lot of developers do generally see the world in terms of opportunities to develop new software.

Edit: I don't think it's a bad thing, and I am prone to it myself!

I am very much in agreement with you.

At my previous place, after several times where the wheel was re-invented, I asked one of my senior guys what he thought he got paid for.

Perhaps unsurprisingly, the answer was to design features, write code, review code and advise support. Whether any of this delivered any value to the company was seemingly neither here nor there. He'd held my position before I joined the company, so it's not like he'd never had to think about these things - or at least he should have - but the code would testify that he hadn't.

Sadly, this is pretty common

I do think it's a bad thing. In my experience the best developers are those who view new software as a last resort (or rather a second-last resort, with manual work being the last resort).

> Oh how I wish we could stop reinventing this same stupid wheel badly. And change the "wants" framed as requirements into something more reasonable, so that we could get away developing less bumpy wheels, perhaps even use one off the shelf.

We'd be out of many jobs if this happened.

We'd be out of many jobs if people started breaking windows.

Or maybe that time & money could be spent on something else, it's not like we're running out of things to do.

My shittiest project was a hostile takeover from another development team.

In the interview, the manager told me a bunch of BS that turned out to be completely false: that it was a new project, that lots of new development was expected, that there were performance challenges to be tackled.

When I got there, I found out on day one that the goal of the project was to take over the code base from an existing development team, with which the customer had lost trust because, among other things, they refused to use the customer's shitty new trouble ticket system (which was awful) and instead insisted on using JIRA.

Also, they had a server running in Tomcat that they didn't want to migrate to Websphere.

Turns out that we didn't even have access to the code, until we accidentally found that the SVN repo was listed in an infrastructure powerpoint, and we did have a VPN connection set up.

We were at the taking-over consulting firm's offices, not at the customer's, reverse engineering the existing 7-year-old codebase without any assistance from the development team, which didn't even know that it was getting replaced.

After months of reverse engineering and producing useless documentation so that the client manager could say in some meeting that the system was well documented, we ended up moving in and mostly doing production support instead of development.

Our managers were afraid that we'd break something, so they did everything to make sure that we didn't spend our time coding.

After a few months, I asked to be taken off the project, because it was not development work and this type of work was actually harming my career. I even had my consulting manager shout at me on the phone and tell me that he would blacklist me at his company.

Software consulting is one of the shadiest businesses out there - beware.

The amount of lies that get told to candidates just to get them to sign on the dotted line of contracts without exit clauses (which was the case with me), the use of outdated job descriptions, the lack of information you have about the actual job even if you ask a lot of questions in interviews - you really never know what you are getting into until day one.

Here is a tip from what I have learned: ask a ton of questions about what the job will actually be. If they answer evasively, or seem like they want to avoid the questions and move on to other aspects of the interview, that is a huge red flag.

>> Our managers were afraid that we broke something, so they did everything to make sure that we didn't spend our time coding.

that's like hiring football players and not letting them on the field out of fear they'll score for the other team.

please write a book!

Yes, it's insane, LOL. The managers were non-coders themselves, and they looked at developers with suspicion.

This happens in a lot of projects, where developers are almost treated like children or assembly-line workers.

I realized through small hints and cues over time that the goal was to take over the maintenance of the project and make sure that we could fix things if something broke, but not add any new features.

They even had a management term for what we were doing. They called it KIR - Keep It Running! LOL

Yes, but it happens a lot - if you don't change it, it won't break (except it will, every time, because the reason you don't change it is that it's crap in the first place).

> they refused to use their new shitty trouble ticket system (which was awful), and instead insisted in using JIRA

I can't even fathom how bad it must have been if using JIRA was preferable...

JIRA was quite nice until v4.0

Could you elaborate on the negative effects on your career? Were you a contractor or an employee? And why would someone blacklist you specifically? How was that consulting manager related to the project?

I mean, as an employee, if I get put in a hard position, sometimes the only choice is to quit; in that case, the CV would have an entry that you would not look back at too happily. But other than getting out as fast as you could, why would being stuck in such a situation harm your career?

Yes, sure: after just a few months doing production support and reverse engineering instead of coding, you start getting questions in interviews about why you are not coding anymore.

This was a bad look, because hiring managers want active developers for coding positions. In later interviews I ended up not even mentioning that the position did not involve much coding; I left after 6 months.

You are put in a tough position because you will have an entry of 3 to 6 months on your profile, which is generally a red flag for hiring managers as it's too short.

An entry with a full year or more will not raise any eyebrows, but 3 months will. Also, because you leave against their will, they won't give you references to the next jobs.

I did end up leaving - it was a real nightmare - and never looked back, but to this day I feel that I was flat-out lied to in the interview about the job content.

In other interviews at other places, critical details were left out - for example, that the team was moving to another city in two months and you were expected to go with them.

So this is why I say you never know what you are getting into with software consulting; it's a shady business.

With the high turnover rates and the difficulty to find developers, hiring managers are incentivized to embellish otherwise mundane positions in the interview process, leading to unmotivated staff and further turnover.

Several 3-month stints are a problem. One, bookended by reasonable work, is a filter. If a company is overly concerned with one, they just want a skill hire, which may or may not be your target.

You can describe the stint in several ways, such as "the strategic direction the consultancy took was away from software and into project management early in the project cycle. My personal strengths lie in..." or some such.

> You are put in a tough position because you will have an entry of 3 to 6 months on your profile, which is generally a red flag for hiring managers as it's too short.

As an employee, probably. As a contractor, I prefer short contracts (1-3 months is short, 6 months is average) with a clear deliverable. If I wanted to just "sit around and do whatever" I'd get employed.

Agree, it's all a matter of how you look at it. In my CV I don't even make a distinction between contracting and employment. I have even been contracting with and employed by the same company at different times.

Or just avoid consulting. There are plenty of other opportunities in software that AFAICT have better hours and higher pay.

Higher pay than consulting is hard to find - a consulting job usually pays almost twice as much as a permanent job.

I prefer it to permanent because, besides the monetary aspect, you can also get into new projects just starting out more often, and that is where you learn more compared to maintenance projects.

What type of opportunities would you recommend other than consulting?

I believe maintaining software is great way to level up some of the most important parts of software engineering.

* How do you design software so that it is easier to modify over time?

* What will your software need that isn't specific to the business features requested when it is being operated in production?

You severely limit your experience when you keep yourself from being around for the aftermath of your software 3-5 years later.

Contracting is different from consulting. As far as I understand, when you are consulting you are a permanent employee of a consulting company that assigns you to various projects. When you are contracting, you work for a temporary period of time exclusively for a client, until your contract ends or you find better opportunities.

In the first case, the client pays the consulting company roughly double a permanent salary, but the consulting company's employee is lucky if he gets as much as a normal permanent. A contractor, on the other hand, gets paid much more than a permanent, especially counting the lower taxes he needs to pay.

There is a third option, when you actually own the consulting company, but then it's up to you to find clients and, if needed, additional resources to assign to the various projects. In that case you may be paid even more than contracting, if you have no downtime between projects.

That's not what I understand those terms to mean.

A contractor is effectively a non-permanent employee: someone you hire for a fixed period because you have irregular work load and you need some extra people temporarily. I say effectively a non-permanent employee, but it's really skirting the edge here: if the relationship is too much like an employee, then the contractor is an employee for tax purposes.

A consultant is someone you hire for their expertise. They're a specialist; they tell you what you should be doing, rather than doing work you specify.



IT contractors are also usually employed by contracting companies that take care of accounting and other overheads, and if they do things like marketing and referral, they'll eat a significant chunk of the fees too. I don't think there's a significant difference in organizational structure between contractors and consultants in this way. It's mostly that consultants address the problem at a bigger distance, and are more likely to be drawing up the plan, rather than working on the execution.

ok, it makes sense, but then the consultant is kind of the 3rd option in my previous post but his job is to advise rather than actually do all the work, right?

Contractors have to pay more in taxes, not less, since they owe both sides of social security and Medicare.

I guess it depends on the country and the type of contractor. In the UK, with a limited company, you pay less tax (at least until next April). If you are under an umbrella company you will pay more tax, because you'll be paying all the taxes of a normal employee plus the taxes of the employer. No idea if it is different in the US and other countries, honestly...

What happens next April?

It's a very good question, and a long story. As far as I know, no one has a certain answer yet.

The back story is that in April 2017, if I remember correctly, all the contractors that worked for public-sector bodies were forced under the IR35 rule. This rule, in short, says that those contractors are comparable to permanent employees, so they had to pay taxes as permanent employees, without the perks like paid holidays, sick days and so on. Understandably, this caused a quite massive loss of public-sector contractors.

From April next year they are implementing the same scheme in the private sector, with the difference that it will be up to the employer to declare who is inside IR35 and who is not. The mainstream theory is that employers will start putting everyone under IR35 to avoid problems with HMRC. Even if this isn't the case at the beginning, after the first couple of high-visibility cases in which HMRC punishes employers for a wrong IR35 classification, all the others will want to avoid the risk and headaches and will classify everyone inside IR35.

We'll see next year how it turns out, but I'm not very optimistic. Luckily, for now I'm shielded, but I'm paying an eye-watering amount of taxes...

But if you're self-employed (or operate a business that's a sole proprietorship) you can deduct your expenses (equipment, rent on your home office, etc.) from your taxable income. In the U.S., you attach "Schedule C" to your federal tax return to do this.


> The report I got about the demo was that the real estate people loved it, it was just what they wanted.

> “But,” they said, “how do we collect the referral fees?”

This is why you need to understand what problem a client wants solved and not just build what their suggested solution is. Even if you build their suggested solution perfectly, they're not going to be happy at the end if their original problem isn't solved. Their suggested solution should only be used as a starting point to understanding their requirements.

That's a confusion about who the client is. This was a web page, to be used by a public consumer. The real estate people were, in my view, obstructionist old-guard trying to preserve their paper empire. Not the client. They torpedoed an early example of the web removing the middleman and streamlining an industry. Prudential's agents weren't ready for the revolution.

Actually, you're very wrong, as I've worked in that trade.

That industry relies very heavily on referral fees to this day. Part of that is matching consumer with broker, and part is brand.

Today it's places like comparethemarket, back then it was Prudential.

So nothing's changed, the web hasn't made any efficiency gains.

A better solution than what they came up with would have been to put in a unique forwarding phone number per affiliate and charge the affiliates a referral fee per call (or perhaps per unique telephone number).

Whether that tech existed in the late 90s I don't know, but it's been available for over a decade now, and was probably available back then in a more analog form. What's nice is that now you get a webhook telling you about each call, so all the billing is automated.
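To make the webhook-driven billing concrete, here's a minimal sketch of the attribution step. The payload shape, field names, and flat fee are all hypothetical - real call-tracking providers each have their own webhook formats - but the core idea is just a lookup from tracking number to affiliate plus a running ledger.

```python
# Each affiliate gets a unique forwarding number; the call provider
# POSTs one webhook event per completed call, which we turn into a fee.
REFERRAL_FEE_CENTS = 2500  # hypothetical flat referral fee per call

# Hypothetical mapping: tracking (forwarding) number -> affiliate ID.
NUMBER_TO_AFFILIATE = {
    "+15550100": "affiliate-a",
    "+15550101": "affiliate-b",
}

def handle_call_webhook(event: dict, ledger: dict) -> None:
    """Attribute one call event to an affiliate and record the fee owed."""
    affiliate = NUMBER_TO_AFFILIATE.get(event["to"])
    if affiliate is None:
        return  # unknown tracking number; nothing to bill
    ledger[affiliate] = ledger.get(affiliate, 0) + REFERRAL_FEE_CENTS

ledger: dict = {}
handle_call_webhook({"to": "+15550100", "from": "+15550198"}, ledger)
handle_call_webhook({"to": "+15550100", "from": "+15550199"}, ledger)
print(ledger)  # {'affiliate-a': 5000}
```

In practice you'd deduplicate events, filter by call duration, and persist the ledger, but the billing really can be this mechanical once each affiliate has its own number.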

As for when you put your email address into a form on some comparison site? Mortgage brokers bid on those leads. It used to be in the region of £20-50 per lead, pre-2008, and I know someone charging a comparable rate now, plus a cut of commission for subsequent remortgages.

Ha! That just shows that web companies have figured out how to put themselves back in as middlemen. So sad.

Isn't that a lesson then that you should make sure you're talking to and getting buy-in from decision makers higher up the chain instead of people whose decisions are more likely to be overturned?

Hypothetically yes. But in some cases, you are hired by a department in a large division of a huge corp., with many layers of management above the department that hired you. And the end product is for a department in a totally different division of the corp. In such a case, you don't have access to the actual end customer.

The right move at the point where it got presented to and rejected by the other division was probably to say "Well, I built what we agreed on. I'm happy to revise the product based on this external feedback, but that's not included in the original fee".

But the article did say that this was at a point in his career where he might not have been experienced enough to make that call.

Why wouldn't you be able to get access to the end customer to validate what you're building?

Because he's not the one paying you.

I think the wider point here is made where the author mentions that the large company was really a collection of medium sized businesses.

I would rather interpret it such that the original department ordering the web page did not understand the business model of their overall company (with the GP having no chance to learn the real business mechanics from them) and that the real estate people actually saved the project in the end. Wouldn't want to work there myself though ofc.

Agreed. I can't remember the exact issue, but I found a great answer on Ubuntu Exchange one day about some firewall rules. The OP wanted to do something really arcane that may introduce some security vulnerabilities.

The "accepted answer" was someone asking "why"? He asked it five times. His best advice was to always ask "why" at least five times - that almost inevitably gets you to the real meat of the problem.

It's clients all the way down. Primarily you satisfy the one who gives you money today. Everything else is primarily their problem.

Great story!

Out of curiosity I wondered about the current state of the Prudential website. So I searched, and found Berkshire Hathaway bought the real estate arm in 2012 and now they are Berkshire Hathaway Home Services.

I did a local search on the website, received pictures and basic information (price) for each house... and for each one it lists the same toll-free number as a contact.

So the site he built basically works the same way!

As he said he was green, I think this really shows that early on, young developers do not always understand the importance of requirements gathering and problem understanding.

Just yesterday, a junior developer from my team came to me about a request to generate some output from historical data. I went back to the business side and determined it was just a simple switch of several columns in the input data to get the results.

My junior developer at first thought it was some monumental problem that was going to require a lot of work.

Similar story.

I'm a noob when it comes to coding but I worked a technical job with customers for 20 years before changing paths.

I was in a position where I joined a place as the "new old guy" along with several new college grads.

One day the president of the company came to me and said, "You know the questions to ask! Some of our experienced guys don't."

It takes a lot of effort for two people or groups to form an idea and be on the same page: understanding "all" doesn't mean *, thinking of roadblocks before you get there (seeing them coming), softly working around those roadblocks, helping frustrated people, offering alternatives without going off the rails, listening closely to find out what is REALLY important, etc. I think that stuff just comes with experience working with humans.

My own experience makes me think everyone should be required to work some form of technical support for a while ;)

Yup. Knowing when to go back and clearly define the requirements is a senior type of behavior.

"We don't have time to think, just do!" is a recipe for disaster.

As the saying goes, months of development work can save hours of planning.

They said they wanted everything so the DB displays in a table ... everything.

Hey, works for me, let me just get my web-scraper going so I can--hey wait a minute!

Oh dear God ... I just finished this exact project this week (ok, 90% similar)[0], everything he says is true.

The insanely messy data (bathroom integers are a mess!), the firm essentially being just a collection of smaller firms in one building, the massive issues with 'finder fees', the near endless list of VPs that have to sign off on anything, the contractors not knowing anything, the 'real' employees also not knowing anything, the hosting of websites, the brokers paying for useless (to them) web dev - it's all true! Nothing has changed in ~25 years!

[0] I know you're reading this Jeff.

After working in a web agency for years, my number 1 red flag that a project is going to go badly is the quality of the current website they're hiring you to fix/replace.

If it’s solid but just a bit old or needs more features, you’re fine. But, if it’s crap, look out.

The quality, or lack of it, is never down to the people who built it (unless it was the MD's nephew or something), especially if the client blames them for it.

It just shows they can’t run a project properly and whatever you end up with will be just a slightly newer pile of crap. And they’ll slag you off to the next lot.

Number two red flag is if they say at any point “You tell us, you’re the experts!” This can be translated to, “We don’t know what we want but we’re going to reject whatever you say anyway.”

> Number two red flag is if they say at any point “You tell us, you’re the experts!” This can be translated to, “We don’t know what we want but we’re going to reject whatever you say anyway.”

I disagree with this being a red flag. That's exactly why I hired someone to build me a small website. It's not because I don't know HTML / CSS / JS; it's because I'm bad at (visual) design. I wanted someone who knows what they're doing to use their expertise and give me a good product.

That’s great, you know to hire an expert and let them get on with it.

The problems start when the client either can’t or won’t explain what they want, then gets upset that what you produce doesn’t match what’s in their heads. At that point they start trying to undo every decision and things go south rapidly.


One company I worked for had a great website. One of my coworkers worked on it as his baby and took great care of it. Rest of the software stack was a dumpster fire.

One other coworker came on board because they thought "no bad company would ever have a bad website". The guy who hired him left a week after he joined. One developer left every week for four weeks afterwards.

Website quality is a weak signal. I'd say if a website has no errors or warnings in $BROWSER DevTools, warns users not to paste JavaScript into the console, or has an explicit "oh hey, you're a developer, we're hiring" message, then that's good. Otherwise keep asking questions.

If he had been paid by the hour, it would have been a typical, perfectly reasonable contractor project.

Or if you really want to play the fixed-price game, better add a _large_ buffer, which you may keep should the project miraculously complete on time. By doing fixed-price work you take on the risk from the client, and risk is expensive.

And of course, _every_ client change goes through a process that evaluates it for additional cost, which gets added to the bill. I think this change process is a considerable money-earner for consultancies.

But yeah, personally I always just charge a day rate.
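The risk-buffer and change-order arithmetic the comment describes can be sketched roughly as follows. All of the numbers, rates, and function names here are illustrative assumptions of mine, not anything from the thread:

```python
# Illustrative sketch of fixed-price quoting with a risk buffer,
# plus billable change orders. All figures are made up.

DAY_RATE = 800       # hypothetical day rate
RISK_BUFFER = 0.5    # 50% buffer: yours to keep if nothing goes wrong

def fixed_price_quote(estimated_days: float) -> float:
    """Quote = estimated effort at the day rate, plus a large risk buffer."""
    return estimated_days * DAY_RATE * (1 + RISK_BUFFER)

def change_order(extra_days: float) -> float:
    """Every client change is evaluated and billed on top of the fixed price."""
    return extra_days * DAY_RATE

# A 20-day project with two mid-project change requests:
total = fixed_price_quote(20) + change_order(3) + change_order(1.5)
print(total)  # 27600.0
```

The point of the buffer is that the contractor, not the client, now carries the schedule risk; the change-order function is why scope churn is a money-earner rather than a loss for consultancies.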

Consulting companies know this and they NEVER charge fixed price.

However there's a case for fixed pricing and it can work to your advantage: Think of a difficult task or problem that you already solved in similar cases - maybe you already have the heavy foundation of that work ready as a template, maybe you automated complex, repeating steps. Or just think of a simple task that has huge value for your customer. Pricing shouldn't always depend on how long it takes you to finish a project.

I beg to differ. I work at a consultancy and we had fixed-price projects for large clients. We just had to plan a risk buffer into the price.

But I can also tell from experience that fixed price, more often than not, doesn't work out so well.

At least he didn't have to install Solaris on Sun executive's workstations.


I feel like I'm missing something in that article. Probably some historical context. Is it really about execs at Sun pranking each other by installing Sun's own flagship OS on each other's computers? Isn't that like "pranking" Satya Nadella by making him use Windows 10?

Windows is meant for desktop use by managers, Solaris was never realistically meant for that sort of use-case.

So it's a bit more like replacing the BMW company car some VP at Caterpillar drives around with one of the 400 ton trucks Caterpillar itself makes. Yes it also has wheels, yes it's also a vehicle, but no, you can't really use it as a company car.

"On September 4, 1991, Sun announced that it would replace its existing BSD-derived Unix, SunOS 4, with one based on SVR4. This was identified internally as SunOS 5, but a new marketing name was introduced at the same time: Solaris 2."

I guess the execs didn't have a high opinion of their new product.

The Day SunOS Died

    "Bye, bye, SunOS 4.1.3!
    ATT System V has replaced BSD.
    You can cling to the standards of the industry
    But only if you pay the right fee -- 
    Only if you pay the right fee . . ."

For context, the guy who wrote "The Worst Job in the World" email was Michael Tiemann, one of "open source's great explainers." ;) Now he's pranking IBM executives by installing RedHat Enterprise Linux on their mainframes.


More like making him use Windows Millennium.

This seems weird to me. I was working for Sun at the end of the '90s, and everybody was running Solaris on their machines.

If McNealy or Zander was running something else, that would have been the exception.

That was a lot later. Michael wrote that story in the early 90's, probably 90 or 91 while I was working there, during the transition from SunOS 4.1.3 to Solaris, when they forced all the engineers to "upgrade".

He and Gumby and John Gilmore founded Cygnus Support ("We make free software affordable") in '89, and Michael was consulting at Sun, working on supporting gcc as an alternative to the shitty AT&T C++ compiler. Remember that Sun unbundled the C compiler from Solaris and started charging for it, and AT&T charged for their shitty C++ compiler too.

Maybe Gumby can provide some more context!

Free Software Report, Volume 1, Number 1, 1992


The Free Software Community Puts A Free Compiler Back In Solaris 2

Sun Microsystems, Inc. decided to unbundle the C compiler from their latest operating system, Solaris 2. Sun users were extremely upset to lose what they saw as an essential component of the system software. Faced with dramatic increases in licensing fees, early Solaris 2 users turned to free software for a reasonable alternative.

Spearheading the effort to port the Free Software Foundation’s GNU C compiler was Palo Alto based Cygnus Support, a company that specializes in providing commercial support for free software. To fund the development effort, Cygnus appealed to the early adopters of Solaris 2. They offered a year of technical support for up to 5 users, and a commitment that the compiler would ship with Solaris 2, in return for a prepaid fee of $2,000.

To insure wide distribution of the free compiler and debugger, Cygnus negotiated with SunSoft, Inc. to make the GNU C development tools available on CDware. CDware is a free CD-ROM available from Sun and shipped at no cost to over 90,000 Sun users.

The only thing that’s ever worked for me to avoid these scenarios is getting all the relevant stakeholders physically in a room for a couple days and drawing mockups, looking at their systems, learning the domain lingo, running reports over their data, etc as we go. Otherwise the domain knowledge for a proper solution just isn’t there, even for a ‘simple’ project.

Unfortunately I also hated every minute of doing this and have switched to a back-end job where I never have to interact with a customer again!

I can say, as a seasoned web apps outsourcing contractor, there are two sides of the coin.

The bright side: clients usually come to me with just an idea, plan or sketches, almost all of my projects were built from the ground up, I totally technically owned them. This is a delight for any coder (and I code a lot, not just manage the team).

The dark side: many startups and green business ventures fail; the failure rate is around 80%, even higher in the longer run. I build the apps from the ground up and nurture them, but sometimes I feel like a gravedigger: out of any ten projects, maybe two or three are still there after a couple of years.

It's an amusing story, I've seen this countless times as well. I think what finally sunk in for me was that if you're focused on saving money or efficiency on small projects then the upper limit of benefit to the company is less than the budget for the project. Instead if you focus on optimizing for the market, the profits can be unbounded essentially, and any wasted money to get to market is simply part of the learning process.

And if you're the contractor having to run around as the plans change, well just make sure you get paid enough :)

As a contractor I like jobs where I can basically recommend "Just don't build this at all." For example, the young startup that wanted credit card payments and automation for essentially a two-digit number of invoices per year, in a target market where bank transfer would be much more common. Replaced with an Excel sheet. Customer happy. I don't want to build crap just for the money that's in building crap. Let me actually add value.

Yeah, same here. What's strange is you have to explain that to clients too. They don't get that you're contracting because you like to actually do things, and helping them is part of the reward. Otherwise you'd just be an employee.

Umm... did you earn the kind of money the pointless project would have gotten you?

Probably not, but they surely built goodwill with that company, which helps them build a good reputation. This is often overlooked, but people will remember that.

Or they could just be satisfied with doing the right thing, rather than trying to extract as much money as they could.

Thanks for sharing. Good catch at the end. It's easy to forget this as a developer, but usually technical details are just that in the bigger picture: details. They're only important if the bigger picture is somewhat correct.

As an aside, "Tell me about a project that failed" is one of my favorite interview questions to ask. Failure is endemic in our industry. A candidate that only tells you about success isn't telling you their whole story.

And a candidate who talks about failures in terms of what everyone else did wrong, but never acknowledges any failures or learning experiences of their own, is telling you something important, too... they're telling you that either they never learn, or they don't tell you the whole story.

These are the instances where I feel it is definitely worth keeping some people around who act as glue (or old-timers).

In our company - we have a tech lead role which is supposed to be the technical point of contact for a given team. This person is expected to have answers or have a way to find the right answers about the nuances of the team.

The good ones are invaluable. Whenever new projects come up, the modus operandi is to write a proposal, review it with your current team for sanity (on whether it achieves its purpose) and then with this group of relevant stake holders / tech leads to ensure you're not getting some giant thing wrong that can nuke some other part of the business.

The problem I see with this role, though: we (I am the one for analytics and one of the TLs for Data Infrastructure at Snap) have to handle 20-25 hours of meetings for all this coordination AND still ensure some work gets done in our own area, so that we stay in touch with the code bases we're fiddling with. So as alluring a title as it is, you are giving up deep-dive development (like I was able to do building storage engines for DynamoDB). Tradeoffs :/

And there are cases where the tech leads don't have the answers (new or incompetent), OR they think of a weird case a couple of months down the line, but this works out overall.

I probably should write a blog post about this. Anyway, I don't know how this works out in the non-software world: why wouldn't you have a knowledge person/council around?

> The problem I see with this role, though: we (I am the one for analytics and one of the TLs for Data Infrastructure at Snap) have to handle 20-25 hours of meetings for all this coordination AND still ensure some work gets done in our own area, so that we stay in touch with the code bases we're fiddling with.

If you figure out the answer to this PLEASE share it with me. This describes my current role; whenever I focus on one aspect (eg actually writing a few lines of code) the other (keeping up with planning) suffers.

My suspicion is that “tech lead” in this context basically means “manager,” but employers know many devs are scared of the word manager, so we have the highly ambiguous “tech lead” title, which allows dev managers to be in denial about their actual role.

There's a lot wrong with this whole thing but the end solution is the wrong way around too - you don't require the client to then take a further step, you get their contact details and then reduce the friction by calling them.

The solution should surely be "enter your phone number and zip code and we'll contact you." Then the central referral agency contacts the person immediately to keep the momentum and check their details, rewards them for their effort in finding the company, and arranges the next step: ideally an at-home meeting with the particular agent they were booked with.

Just "right, now call this number" is a bit daft, isn't it?!
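The capture-then-call-back flow suggested above can be sketched as a minimal lead queue. This is a hypothetical illustration of the comment's idea, not any real system; all names and fields are assumptions of mine:

```python
# Hypothetical sketch of the "we call you" referral flow:
# capture the client's details and queue an immediate outbound call,
# instead of asking the client to take a further step themselves.
from dataclasses import dataclass
from typing import List

@dataclass
class Lead:
    phone: str
    zip_code: str
    referred_by: str  # which referring site sent them, for the referral fee

call_queue: List[Lead] = []

def submit_lead(phone: str, zip_code: str, referred_by: str) -> Lead:
    """Record the lead and queue it for an agent to call back right away."""
    lead = Lead(phone, zip_code, referred_by)
    call_queue.append(lead)  # a real system would notify an agent here
    return lead

lead = submit_lead("555-0100", "90210", "partner-site")
print(len(call_queue))  # 1
```

The design point is that momentum stays with the business: the only friction left for the client is typing two short fields, and attribution for the referral fee is captured at the same moment.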
