On the one hand, big corporations are presented as incompetent behemoths locked in petty internal office politics: managers are clueless about what is being (or should be) developed, tons of money is wasted, they are wooed by empty marketing, and so on.
On the other hand these organizations are immensely wealthy and successful.
It may actually be that they are right, and for non-tech-centered companies this whole software stuff is really peanuts and their mental energy is better spent on other business-related stuff. So they will come up with random low-effort comments on all sorts of details because ultimately they (rightly) don't care. What actually matters is negotiating deals like referral fees, or coming up with other contract ideas that will bring in a lot of revenue.
Or maybe it's just an equilibrium, a local optimum. You don't have to produce "good" stuff (in the developer's sense) without waste, precisely because the other company that you're targeting is also inefficient in similar ways.
I'm really not sure but I think it's important to look beyond just "haha, stupid managers, they can't make up their mind". There must be deeper reasons.
As someone who recently created yet another startup, I am reminded daily that literally everything I do has to be done for the first time. It's like molasses covering every part of the machine. When a client needs a proposal you have to write one from scratch, when someone wants payment terms or to negotiate a contract adjustment you have to figure out contract language. You don't have expense reimbursement forms, you don't have job descriptions, you don't have anything to use as a starting point or template for anything you do.
Big companies have all that. They have clients that are planning on using them again this year. They have meetings they held two years ago that are about to turn into a sale. They have a guy on the third floor who has seen the problem they're about to have and knows how to avoid it. People instinctively trust them, so they don't have to explain themselves as much. People instinctively fear them, so their contracts get upheld. People that used to work for them now work at potential customers and call them up.
And on and on and on. Having a slightly busted software development process is, in fact, peanuts compared to all this.
Or, perhaps another way to say it is that there's a lot of different kinds of "software" besides the digital ones-and-zeroes kind we think about here. A group of humans all organized around a business purpose with tons of experience and momentum is a high functioning machine as well.
A local restaurant becomes #1 by having the best food and/or location, and providing it at a price people are willing to pay.
A car dealership probably succeeds by having the best prices on cars and the best customer service.
Most banks compete on customer service and some amount of product differentiation.
Insurance is some combination of customer service, market share, and correctly pricing things.
To the extent that better software contributes to these core "capabilities", companies will invest in it. Investment means it gets management attention, willingness to spend, willingness to hire annoying top-shelf talent that demands flexible working arrangements and huge salaries.
Many -- maybe most -- businesses don't have competitive dynamics that software enhances. A hotel is a great example: it's location, price, service, and cost control. The software they use at the front desk has approximately zero bearing on whether the place stays in business or folds. Owners are right to invest as little as possible in it and move on to something more important, like building a brand.
I have to disagree with the broad statement that a busted software development process is peanuts. In highly regulated industries like finance, bad software can cost you billions in fines from regulators.
In these industries, they have to invest in technology and good processes if they want to stay in business.
In a sense, this lends credence to the theory that once you’re big and powerful and have lots of clients, you can afford to lose these opportunities. But I still think that you would have a huge competitive advantage without all the molasses.
You're assuming the regulators are low-level enough to identify a flawed system. A lot of regulations are applied at a policy level (aka process level), and auditors (internal or external) are not always able to identify severe gaps between what's documented in a policy and the reality of day-to-day operations. Good people and technology matter way more than process.
Sure, if we hire perfect people there's no reason to even have a process. In reality, when that breaks down you have to rely on your process to close those gaps. But, like you said, a policy/process is only as good as the paper it's printed on if nobody follows it.
100% agree - My point was more that regulators tend to use policy and process artifacts as proof of security compliance. The idea that more highly regulated/audited IT sectors are inherently better at security is probably false. The audits just aren't stringent/detailed enough and there aren't enough incentives for executives/boards to do more than the minimum. Which is why the values/opinions/ethics of those decision makers become key.
The most secure 'company' I worked for was a large research university because it was a priority from the director and we were constantly getting probed by overseas IP addresses. On the other hand, the worst company I worked at in terms of security was in the payment processing world. We were PCI DSS compliant and audited almost continuously by various interested parties (banks, card brands, third parties, etc.). I don't think any of our people were bad/inept but at the end of the day our policy artifacts made us sound/appear way more compliant than we were. The problem was upper management didn't care (wouldn't allow us time or $$) about closing the gaps in implementation unless we failed audits or pen testing. We didn't fail but no one in IT thought we were doing enough. We didn't have enough auditing and regularly had inexplicable 'bugs' reported that honestly could have been evidence of network intrusion but we had no proof either way.
The safest decision, 99% (but not 100%) of the time, is to cancel that software project before it goes to production, because if you've been successful up to now, by definition you have software that is Good Enough (if not good). So, in highly regulated industries, changing your software may very well be an unwise decision.
Not least because, if it's the same software the regulators were ok with last year, they probably won't fine you billions for using it this year.
Caveat to all of the above: "probably". But the chance of new software getting you into trouble is still higher.
Most businesses fail because they don't have customers, not because they're inefficient or badly-managed.
The key is right there in your statement: "have to". Good places to work don't build software because they "have to"; they do it because it's viewed as an important competitive differentiator and execs WANT to spend on it.
I think for software sales, this type of thing is slowly fading as it becomes more and more commoditized. For all but the largest of businesses, you go to AWS and pay the market price for compute and other services; there's not much need for contracts and negotiations and meetings, the price is the price. Going to a competitor or doing business the old way simply costs much more. Eventually, businesses align on consuming commodities as commodities, and the spending slowly moves from CapEx to OpEx.
AWS is the power company now. You agree to their terms, and you pay their bill. Unless you're doing a major industrial facility that requires megawatts of power, you don't need to interact with their sales staff; the same applies to AWS (and ironically, you can probably use power as a proxy for relative deal size here, too).
100 years from now, all the software that could conceivably exist will exist, at least for 'business processes.' If your business is selling potatoes as a farmer to a distributor, the software you need to participate efficiently in that market will already exist.
The market price isn't what most companies pay. If you pay significant amounts to Amazon (maybe one employee's salary worth), you're screwing yourself if you don't negotiate lower rates.
Edit: Just asked the corporate overlords. It's actually against our finance rules to pay for any recurring service charge without a negotiated contract.
Your company probably also employs a professional shark whose job it is to squeeze the blood out of every stone he is pointed towards. With even a modest number of sizeable contracts, such a person would pay for themselves very easily.
I disagree. Unless we're talking about Artificial General Intelligence or some other general purpose technology, I think bespoke software will always be better--if not more cost effective--than off-the-shelf.
On the other hand, there are entire markets, even quite large ones, that a big company can't pursue because they're way too small for it. That's what I'd look for if I were thinking about starting a new business.
End of the day, a big company or government agency is too big for any human to internalize. It's a machine that pumps out process, but the downside of consistent process is that change is more difficult.
When I worked at an agency that supported social services, some guy was bitching and complaining about how everyone who wasn't him was stupid, the agency was awful, etc. One of the more respected directors walked by and heard this. She popped in and said something like "There are x million people who depend on us to feed their children. And y million people who expect us to do so in a responsible and equitable way. We've never missed a payment, ever, because of this team."
Another example is the military, which is probably the dumbest possible bureaucratic institution ever devised. Yet it is capable of executing the mission with ruthless efficiency, and can survive in the face of adversity and loss that few organizations can.
Smaller organizations are always better at reacting to change, because there are fewer stakeholders. But, they suck at many things, including consistent delivery, paying things on time, and organizational resiliency. Software makes them better, but software is its own form of bureaucracy with its own set of problems.
Think about the difference in terminology between a coach and players. When a coach takes a team through a play, they usually don't tell the players which hand they need to dribble with or which foot to plant first when running, they use much broader strokes and leave the implementation details to the player.
Let's take the example of... perhaps a Jamaican bobsledding startup. In the early days, you don't know what you're doing so everything matters. You scrape to purchase airfare to get to key industry events so you can be where the players are. If you mess up almost anything, it's game over. But you make it there, and you perform like you knew you could. You pick up sponsors. Expenses like airfare are now no longer a problem. What used to be a mad scramble to organize and scrape together funds and make sure you have a good pitch for the big meeting is now a smoothly oiled, proven machine with accounting and assistant cogs. Your jury-rigged home workouts are now streamlined by an Olympic training facility. Your mind is no longer occupied by what has become minutiae to you. You don't really worry about grocery shopping because you have a team nutritionist. You don't care about how efficiently or inefficiently the process of grocery shopping and cooking goes down now, just that it does and is on time.
You no longer make music with an instrument. You conduct an orchestra. Your mind is now thinking in bigger blocks.
Software is just one section of most orchestras.
I still think that doing things the right way and getting rid of those famous inefficiencies could leverage success even further. With regards to software I think that's the difference between large companies with a known engineering culture (like Facebook, Google, Apple, Microsoft) as opposed to those known to be a bit more red-tapey (like IBM, Adobe).
But of course, "doing things the right way" is terribly hard to quantify, and so is "how much success would be possible". If numbers at the end of the year are already positive, good luck trying to convince C-level to introduce fundamental changes to existing processes.
I don’t think it is hard to quantify actually, shorter build times are better than longer ones, branch merges that take seconds are better than the ones that take up to a day, spending people’s time on higher level thought processes vs squandering their time going through pages of manual steps that could just be a script, etc. It’s just a lack of awareness of what the company is doing that turns them into molasses, and puts projects at risk of being canceled. The companies are still successful overall somehow, but individual employees learn and grow at a much slower pace than they’re capable of.
Software is viewed as a cost center at a lot of companies, and in many cases that is the correct decision. The company officers and managers understand this, which is why they act the way they do.
Their goal is not to make the best possible software -- it is to ensure the software is good enough that it does not hold back other areas of the business, and to achieve that within the budget they are given.
Put another way, a company's organization can be seen as its software, except it's running on a kind of wet, carbon-based hardware that does certain things much better than the silicon-based ones, and other things much worse. As the capabilities of the silicon-based computer increase relative to the capabilities of the carbon-based, there are few industries where it would be wise to continue purchasing software like one purchases pencils.
A lot of advances are gradual, and if you are a large company you will have time to react.
Tech is an afterthought for most of these companies. Most of them are natural monopolies. Insurance for instance is a business that works such that the larger you are the lower the risk and cost. Take the ACA: many nonprofit startups tried to join the ACA marketplace to compete with the Big Boys. Essentially all failed.
These companies have so much money that they can afford to do things like 10 million dollar POCs (which I've seen). Often they have literally 5+ different teams doing the same exact thing and will pick the best out of that list.
In some cases (I won't say which) they essentially launder profits through the "tech" parts of the company. They basically optimize for cheap people they bill out to the "real" company at some enormous fee.
How fast this change is coming for a particular company depends on how inert the customer base is, and how much of the customer "journey" is digital.
For example, insurance -- I've worked at one of the largest (if not the largest) insurance companies in the world. Obviously they make some money, or turn over at least, but the customer base is usually pretty static.
What I mean is, you get insurance, and you rarely change unless something drastic happens.
This allows a big enough company to completely manhandle IT/tech/dev: spend $100s of millions on projects spanning years, ending up with a lot of money in the pockets of "the usual suspects" and, often, software that does not work at all.
All legacy cruft still intact, and just another layer of expensive crap that business users hate.
Still, the business of insurance is heavily tech & software dependent. So much, in fact, that systems down means “no business”.
I've become somewhat cynical with regards to the enterprise tech business, but I've seen these patterns repeat at too many places. It's truly sad and disheartening.
Ugh. Reading stuff like this gets me deeply sad; what could have been done with that money and time instead? It makes me want to go escape to live in the hills, maybe even a full-on Wonko the Sane.
You're assuming that an organization can whip out the best proposal possible at a moment's notice, which is highly unlikely. What you perceive as waste is simply development costs, without which the organization could not come up with the most competitive option possible.
Complaining that having different teams working internally on a product is wasteful is just like complaining that a sports team hiring backup players and running development-league teams is wasteful.
A lot of us are being paid handsomely, but are we _really_ adding value to society by any means other than just being consumer machines?
I find it interesting to ponder these things. =)
Research grants also fit that description quite nicely.
Can you explain this differently? I don't understand how you would launder profits through yourself (or why you're laundering them at all - tax reasons?).
Same as licensing your name to "another" company that's actually your company, for (coincidentally I'm sure!) all of their profit, in order to shift money around.
Have umbrella company. Put your tech folks either directly in the umbrella company or in some child company. Make your other child companies (or even ones you don't have directly related to the umbrella corp) "pay" the one with the tech folks for their dev work, some arbitrary amount of money (however much you want to move). Now the umbrella company or tech child company has all the profit from your other companies, which you've done for tax or liability reasons or whatever. Mission accomplished. Whether actual tech work was done may or may not matter, depending on whether you were also trying to accomplish actual work at the same time, or just shifting money.
I think it's more fun to understand how much they can charge for what they sell.
Plus, small companies that make big mistakes and do survive are scarred by the experience and tend not to repeat that particular kind of mistake again. Companies that can make a blunder of the sort described in this story, repeatedly, have to have been big in order to survive long enough to repeat it.
IBM will continue to target their most expensive employees for layoffs while hiring cheaper new grads, but even after those alleged practices, their profit per employee fell by 5k. Seems reasonable to expect they won't be around in a decade, though I'm no financial expert and this is a projection from two datapoints.
You might take it as evidence that, of all the myriad aspects of a business, scale far outweighs all other factors for generating a profit.
You don't have to be "good" (whatever that might mean) to generate profit, production costs just need to be marginally cheaper than your prices.
The fact that a large company is rich does not necessarily mean that the inefficiencies don't exist.
I somewhat agree with you on "this whole software stuff is really peanuts and their mental energy is better spent on other business-related stuff". For most companies this software stuff can be mostly irrelevant, and it's our fault this is the case. As Erik Dietrich says, we are "efficiencers" -- we're supposed to make all kinds of processes more efficient, not just be code monkeys and copy code from StackOverflow.
This comment should be featured higher in the discussion. The "stupid" label is thrown around rather fast, although more often than not the problem lies in how critics fail to understand the problem and, more importantly, the restrictions applied to the decision-making process. Critics base their view on a personal, very over-simplified and under-informed picture of the problem, and don't understand how an outcome is a valid and reasonable solution. Instead of assuming that they have limited information and understanding... they assume they have perfect information and complete insight into the problem, and therefore any outcome they didn't approve or understand can only mean stupidity.
What I see is: most of this new cohort has _no_ idea what our business is and they are overwhelmingly concentrated in middle management.
If software is not your core competency, you don't need to be any good at it to make money.
That leads to ignoring simple observations about reality like this one, since this one implies that in most companies software doesn't matter much and the CEO is right to treat it as a cost center. Nobody wants to hear that about what they do with their life.
To be fair, you could run 'sed s/programmers/humans/' on my first sentence and leave its truth value unchanged.
These now big corporations were really good at something...so good that they beat the competitors, grew their market share, expanded their head count, etc.
However, operating at a higher and higher scale (or with more modern technology) brings new challenges that weren't part of the game before, ones that they are not necessarily good at. Eventually those challenges might grow big enough to threaten the company itself (AOL, AT&T, Yahoo, etc.).
The inefficiencies you see are characteristics that accrued due to the level of success and exposure to new challenges. The inefficiencies are not characteristics that got them to that success.
I think a fair summary is to say that there must be some value in having thousands of people who start each day with some vague idea of what the organization needs from them. And this value must be huge, because it has to outweigh all the value destroyed by corporate inertia, dead weight, infighting, etc. If it doesn't, the firm can't survive very long.
Only a wealthy company can spend $400,000,000 on an SAP project, for example, but in reality it would probably be better for the business, long term, to build something custom and embrace the fact that IT & tech are important to achieving its goals.
If it's not, why spend the money?
Doing it right probably means creating a "software factory" or tech hub within or outside of the company.
If we're talking a company with muscle and will to plow down $100s of millions on a massive tech/automation project, this should be feasible right?
For an example, see Lufthansa's tech initiative: https://lh-innovationhub.de
(I'm not in any way affiliated with LH, I just happened to read about it a while back, while trying to convince my then-CEO that something like that would probably be better than spending millions on proper crap from a 3rd party)
I'm sure there's many other examples.
Traditionally, the problem is that with money comes politics.
Everyone wants a piece of this pie, and a lot of non-tech/dev employees are used to turning to vendors.
The thing is, what you want to achieve is a cultural change, not simply an organisational update (which is why, in the LH case, the tech hub is not located at LH HQ. It is not a cultural match, according to the CEO).
Does this change even require money, except salaries for the "right people"?
Big businesses have done IT & tech a huge disservice by outsourcing and off-shoring things not considered "core business", such as tech & coding.
The pendulum is swinging, at least in my experience, and suddenly everyone needs dev expertise again. Of course they do, because most likely you will be disrupted by a company without the legacy, one sprung from tech.
Well, not a disservice to those of us still around, but it's obvious that there's a lack of expertise to just pick up off the streets.
How long you can keep a lot of people employed in a traditional IT dept. and buy a lot of half-crap vendor solutions that you glue together in half-assed ways depends, I guess, on the inertia of your customer base and, as pointed out below, on already-accumulated wealth.
The other side is that a large company often gets large contracts. If you sell a piece of software 10 million times, it doesn't really matter if you spent 100x as much on it as a small company would: with 1000x the sales at 100x the cost, you still end up with 10x the revenue per unit of work compared to the small company with 10,000 customers.
Could you make more profit? Sure, but lacking market pressure you're rarely concerned with efficiency.
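To make that arithmetic concrete, here's a tiny sketch in Python using only the figures from the comments above; the price and cost units are arbitrary placeholders, not real data:

    # Illustrative arithmetic only; all figures are hypothetical.
    small_dev_cost = 1.0    # small company's dev spend (arbitrary units)
    big_dev_cost = 100.0    # big company spends 100x as much

    small_sales = 10_000        # small company's customers
    big_sales = 10_000_000      # big company sells the software 10 million times

    price = 1.0                 # assume both charge the same per sale

    print(small_sales * price / small_dev_cost)  # 10000.0 revenue per unit of work
    print(big_sales * price / big_dev_cost)      # 100000.0 -- 10x the small company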
I just noticed my company always books flexible airline tickets. They cost about 2x as much as non-flexible ones. I've never missed a flight, and most of my colleagues haven't either. It makes sense if you're a busy manager who often ends up not taking the flight as booked, but for most engineers the flexibility costs more than it's worth.
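As a back-of-the-envelope check on that claim (a sketch with assumed numbers: the 2x premium is from the comment above; the fares, miss rate, and replacement-cost multiplier are guesses):

    # Back-of-the-envelope: when does a flexible fare actually pay off?
    # Assumes a missed non-flexible flight means buying a replacement ticket
    # that costs k times the base fare. All numbers are made up.
    base_fare = 300.0              # hypothetical non-flexible price
    flexible_fare = 2 * base_fare  # the ~2x premium mentioned above
    miss_rate = 0.02               # guess: an engineer misses ~1 in 50 flights
    k = 3.0                        # guess: last-minute replacement costs 3x base

    expected_nonflex = base_fare + miss_rate * k * base_fare
    print(expected_nonflex, flexible_fare)  # 318.0 vs 600.0

    # Flexible wins only when base*(1 + p*k) > 2*base, i.e. p > 1/k.
    # Even with k = 3, you'd have to miss a third of your flights.
    print(1 / k)  # break-even miss rate: ~0.33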
If they release the leash and let people book "extra" tickets when they miss a flight, next thing you know people will be taking their S.O. along on conferences using "rescheduled" tickets or having the company pay for vacation tickets and so on. An interesting unofficial benefit, not necessarily a bad idea if it can be controlled. However people always push the boundaries and next thing you know...
So it saves money in the long run to ensure that one employee traveling equals precisely one (admittedly expensive) ticket. They can put their oversight labor into preventing more elaborate forms of control fraud, such as using the sales meeting budget for dating purposes, padding wedding receptions into corporate meeting budgets, or all the nonsense that happens with "company" cars.
Also, consider the labor savings of phoning it in and taking a week to approve tickets: a protocol that permits same-day travel when you need to reschedule would mean approving tickets at considerable labor expense on the same day, just in case you need one once a year. Or paying you to sit there for a week waiting for tickets to arrive, etc.
A side note: small companies can treat control-fraud oversight as a small rounding error, but when you have an entire department to organize travel, you need oversight of an entire department, leading to strange inefficiencies.
Have you read "In Search of Stupidity"?
Basically, the NASDAQ of 1990 has almost no relation to that of 2000, not because of competence but because of stupidity. Big lumbering behemoths shoot themselves in the foot and fall over.
That's because economies of scale work. They make it easier to get customers, they have easier access to credit, they make it cheaper to employ employees, they make it easier to have things go your way if there's a contract or legal dispute... The list goes on.
Economies of scale are why living in a poorly-run community is better than trying to scratch out a living as a hermit in the woods. It's the same thing in workplaces.
It seemed to work for them.
I am back in finance at a private company, and I am part of our on-campus recruiting efforts, and I frequently tell candidates: "listen, take a job with one of our direct competitors, that's fine, but whatever you do don't take an offer at an investment bank if you can avoid it - they will just work you to death, largely paying down tech debt, and you won't get the opportunity to work with the truly best and brightest..."
How do they work? I'm currently at an IB and unsure about continuing; everyone seems apathetic.
Hell, in many situations in general it isn't that important, since knowing the right problem to solve and who to market the solution to often matters a lot more than the technical quality of the solution in question. People and companies generally don't judge products and services by code quality or development practices, and unless these cause it to constantly break, don't really care about them at all.
But other factors do matter here. If a company is a known brand, then they can keep getting customers for terrible products for a long, long time, simply because they're the first name people go to. At some point you become so big, you get so much momentum, that you can basically publish/release anything and people will buy it. Apple could sell the most broken phones on the planet, and have 10 million + people rush out and buy every one.
Do companies get away with it forever? No, but the bigger you are, the longer it takes for you to fail after your products/services become crappy.
And yeah, as you say, local optimums matter too. If most companies in your field are inefficient and put out terrible products, then there's not much pressure for you to be much better.
So, there is no contradiction. If anything, it might make more sense that adding people increases waste and stupidity. Remember that startup founders' ideas are failures over 90% of the time. These businesses started or grew with at least one good one. Then, they add lots of often-less-innovative people with their own ideas thrown into the mix. Of course many of them are going to be bad. Human nature, esp egos and image management, takes over from there adding the conflicts seen in the article.
I work at a F250, and I once asked my boss: "Why does the parent company allow our company to exist, when we seem to be steeped in incompetence?" He said to me: "Because we make a shit ton of money."
How that looks depends a lot on where you sit. If you are on the inside, you see the lack of coherence up close. On the outside, you see the collective power of the organization. There is enough coherence that the multiplier effect is still very positive.
My answer would be that while a startup seen in isolation can be very efficient, if we look at startups as a group there will be a lot of duplicated work. It's easier for a large company to solve a problem once and for all.
And this is the primary advantage a large org has, one that compensates for many of its shortfalls.
As an arbitrary example, I think if we compare the number of employees onboarded vs the number of new applications shipped, small companies/startups and enterprises are polar opposites. My experience with startups is that you end up with a new web service for every 2-4 people (wholly anecdotal) you onboard as developers. For that kind of company, it makes sense to optimize shipping new web services; you do it a lot! For an Enterprise, you might onboard 100 developers just to maintain and develop existing systems. Writing new frameworks to make building apps easier doesn't make much sense; many of your developers are working on an existing system that would be expensive to port to a new framework. Meantime, your HR department is drowning trying to onboard 20 people a week.
Wasting money shipping this listing is fine; they did it once. This was released in 2013, and I wouldn't be terribly surprised to find out this, or some small bastardization of this, is still running as a production page for Prudential. The money they wasted, amortized over 6 years, is nothing compared to what I've seen startups waste in productivity while new hires wait for laptops, credentials, access, someone to explain some legacy crap that everyone had to deal with, etc.
Turned out that when the startup had brought him in as CTO they failed to mention that he was eventually going to be the sole developer.
I'm feeling it, because I was recently catfished.
Still accepting advice on how to negotiate a 100% raise.
Still, I can't imagine the culture falling that far in two years. There was tons of meaty dev work being done. And day 1? Maybe they were just trying to ease him into the team? This might have been an effective Mr. Miyagi type move to help him get a greater understanding of how all the pieces work together.
This story smells bad, I would be interested to hear more details.
But they were plugging holes as fast as they could with contract recruiters that were throwing whatever would stick. I didn't even get to meet my team before I had an offer in hand.
From my experience and his, whatever department that was (an ecommerce frontend) seemed chaotic, with massive turnover.
I spent six months building a single-page app (back when that was still an exciting sounding thing) for a major US financial institution; by the time we actually delivered it, the result was so watered down from the initial proposal that I could've just taken some screenshots from what my designer had produced six months earlier and put them in a PowerPoint deck and said "Done!" - because this was, in fact, what the VP at the company ended up doing, except they were screenshots from the web app we built that no one other than the VP himself ever used.
On the plus side, I became good friends with the VP and the project paid off a significant chunk of the mortgage on my first house.
(No judgement on you OP this is the situation we're in)
Is there no other way to support human life than these accelerating grinding gears of money-shuffling?
Aside, I assume you mean "to sustain the [USA] economy"? Perhaps you mean World economy, or EU, or ...?
I stand by my original assessment that, from the perspective of both the client (everyone was happy with the end result) and my consultancy (we delivered work and we got paid), it was a success.
I felt that way until I eventually got to a point where people did listen - and it turned out to be false. I underestimate the effort required at first glance, and I oversimplify things in hindsight. Also, I'm a developer - not a business guy - so I can guess a lot of things on the requirements side, but a lot of it is outside of my domain. This is something that I have to be aware of now that I'm in a position to give such estimates.
Don't get me wrong - I think there's a lot of room for improvement in almost everything I worked on, before and after me - but that gut feeling of how I could have done things was never on point for me. With the article in question: too many people were involved in the decision, and there was no way to know the final result for sure until people saw it, had a bunch of meetings, saw the iterations, and decided. Rarely do you have a person with deciding power who knows/understands these things clearly from the start. And you need to account for bureaucracy BS time-waste - it's just a fact of life when dealing with such clients.
"Cassandra was cursed to utter prophecies that were true but that no one believed."
Starting a new project? You almost certainly don't need anything other than 1) files (yes, seriously), or 2) SQLite (yes, seriously), or 3) PostgreSQL or some other multi-paradigm, capable SQL DB. If one of the first two, please also consider whether you even need a f*cking server or are actually writing something that ought to be desktop/mobile software. That's another expensive, feature-delaying, and UX-harming mistake I've seen.
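For what it's worth, a minimal sketch of option 2: Python's standard library ships SQLite, so the "database" is a single file with no server process. The table and field names here are just an example:

    import sqlite3

    # One file on disk, no server process, full SQL.
    conn = sqlite3.connect("app.db")
    conn.execute(
        "CREATE TABLE IF NOT EXISTS customers ("
        " id INTEGER PRIMARY KEY,"
        " name TEXT NOT NULL,"
        " email TEXT UNIQUE)"
    )
    conn.execute(
        "INSERT OR IGNORE INTO customers (name, email) VALUES (?, ?)",
        ("Ada Lovelace", "ada@example.com"),
    )
    conn.commit()

    for row in conn.execute("SELECT id, name, email FROM customers"):
        print(row)
    conn.close()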
I'm enjoying myself much more than I used to, because I've escaped the red tape of development and get to solve problems that matter to nontechnical people without intermediaries predigesting the requirements.
If I was starting a business, I would definitely always try to do almost anything with Excel and Access first before deciding to invest in an industrial strength solution whether Postgres, Oracle or "big data".
> I'm a huge mess
> I feel I'm just wasting
> doesn't feel like I'm doing anything with my career
Sounds like it does affect you in every sense.
You should probably try to adjust that before you burn out, not just for that specific job but for your career in general.
When I spend so much time idling by and letting that frustration build up... I just start questioning if I'll be able to do other roles and not fail the team or whatever company I'll be working for because of how dormant I am.
It's not like I knew a lot. I don't consider myself to be super talented, just the average guy trying to make it out there but yea... I need to look into moving on.
...doing something hard and outside of your comfort zone I guess is what I am recommending.
After five years at this place I finally got my shit together and applied for a job
I spent the job interview asking pointed questions about their infrastructure and how they ran their business and then I contrasted their way of doing things with my old job and explained why their way was so much better
Turns out I was learning! And learning how NOT to do things carries some value
This list will show you that it hasn't been a complete waste of your time, and it will show future employers that you're thoughtful, observant, respectful of your own time, and healthily critical.
Edit: Clearly I am in an optimistic mood this morning.
exploring the design space and iterating until you end up with a product that makes the customer happy
endlessly reworking decisions every time the next "stakeholder" in the customer's org sees a part of the bikeshed that they have an opinion on
One is a skill-engaging and -enhancing professional project. The other is the road to burnout.
Reminds me of the Battlechess duck, point 5 here: https://blog.codinghorror.com/new-programming-jargon/
At some point, we discovered that the team lead wanted to be promoted to manager so he needed a reasonably-complex-but-actually-simple project as a "win".
The dream of all software developers.
Oh how I wish we could stop reinventing this same stupid wheel badly. And change the "wants" framed as requirements into something more reasonable, so that we could get away with developing less bumpy wheels, perhaps even use one off the shelf.
Edit: I don't think this is a bad thing, and I am prone to it myself!
At my previous place, after several times when the wheel was re-invented, I asked one of my senior guys what he thought he got paid for.
Perhaps unsurprisingly, the answer was: to design features, write code, review code, and advise support. Whether any of this delivered any value to the company was, seemingly, neither here nor there. He'd held my position before I joined the company, so it's not like he never had to think about these things; at the least he should have thought about them, but the code testified that he hadn't.
Sadly, this is pretty common
We'd be out of many jobs if this happened.
Or maybe that time & money could be spent on something else, it's not like we're running out of things to do.
In the interview, the manager told me a bunch of BS that turned out to be completely false: that it was a new project, that lots of new development was expected, that there were performance challenges to be tackled.
When I got there, I found out on day one that the goal of the project was to take over the code base from an existing development team, with which the customer had lost trust because, among other things, they refused to use the customer's shitty new trouble ticket system (which was awful) and instead insisted on using JIRA.
Also, they had a server running on Tomcat that they didn't want to migrate to WebSphere.
Turns out that we didn't even have access to the code until we accidentally found the SVN repo listed in an infrastructure PowerPoint, and we happened to have a VPN connection set up.
We were at the offices of the consulting firm taking over, not at the customer's, reverse engineering the existing 7-year-old codebase without any assistance from the development team, which didn't even know it was getting replaced.
After months of reverse engineering and producing useless documentation so that the client manager could say in some meeting that the system was well documented, we ended up moving in and mostly doing production support instead of development.
Our managers were afraid that we would break something, so they did everything to make sure that we didn't spend our time coding.
After a few months, I asked to be taken off the project because it was not development work and this type of work was actually harming my career. My consulting manager actually shouted at me on the phone and told me that he would blacklist me at his company.
Software consulting is one of the shadiest businesses out there, beware.
The amount of lies that gets told to candidates just to get them to sign on the dotted line of contracts without exit clauses (which was the case with me), the use of outdated job descriptions, the lack of information you have about the actual job even if you ask a lot of questions in interviews... you really never know what you are getting into until day one.
Here is a tip from what I have learned: ask a ton of questions about what the job will actually be. If they answer evasively, or seem like they want to avoid the questions and move on to other aspects of the interview, that is a huge red flag.
That's like hiring football players and not letting them on the field out of fear they'll score for the other team.
please write a book!
This happens in a lot of projects, where developers are almost treated like children or assembly-line workers.
I realized through small hints and cues over time that the goal was to take over the maintenance of the project and make sure that we could fix things if something broke, but not add any new features.
They even had a management term for what we were doing. They called it KIR - Keep It Running! LOL
I can't even fathom how bad it must have been if using JIRA was preferable...
I mean, as an employee, if I get put in a hard position, sometimes the only choice is to quit. So, in that case, the CV would have an entry that you would not look back on too happily, but other than getting out as fast as you could, why would being stuck in such a situation harm your career?
This was a bad look, because hiring managers want active developers for coding positions. In later interviews I ended up not even mentioning that the position did not involve much coding. I left after 6 months.
You are put in a tough position because you will have an entry of 3 to 6 months on your profile, which is generally a red flag for hiring managers as it's too short.
An entry with a full year or more will not raise any eyebrows, but 3 months will. Also, because you leave against their will, they won't give you references to the next jobs.
I did end up leaving (it was a real nightmare) and never looked back, but to this day I feel that I was flat-out lied to in the interview about the job content.
In other interviews at other places, critical details were left out, like for example that the team was moving to another city in two months and you were expected to go with it.
So this is why I say you never know what you are getting into with software consulting; it's a shady business.
With the high turnover rates and the difficulty to find developers, hiring managers are incentivized to embellish otherwise mundane positions in the interview process, leading to unmotivated staff and further turnover.
You can describe the stint in several ways, such as "the strategic direction the consultancy took was away from software and into project management early in the project cycle. My personal strengths lie in..." or some such.
As an employee, probably. As a contractor, I prefer short contracts (1-3 months is short, 6 months is average) with a clear deliverable. If I wanted to just "sit around and do whatever" I'd get employed.
I prefer it to permanent, because besides the monetary aspect you also get into new projects just starting out more often, and that is where you learn more compared to maintenance projects.
What type of opportunities would you recommend other than consulting?
* How do you design software so that it is easier to modify over time?
* What will your software need that isn't specific to the business features requested when it is being operated in production?
You severely limit your experience when you keep yourself from being around for the aftermath of your software 3-5 years later.
A contractor is effectively a non-permanent employee: someone you hire for a fixed period because you have irregular work load and you need some extra people temporarily. I say effectively a non-permanent employee, but it's really skirting the edge here: if the relationship is too much like an employee, then the contractor is an employee for tax purposes.
A consultant is someone you hire for their expertise. They're a specialist; they tell you what you should be doing, rather than doing work you specify.
IT contractors are also usually employed by contracting companies that take care of accounting and other overheads, and if they do things like marketing and referral, they'll eat a significant chunk of the fees too. I don't think there's a significant difference in organizational structure between contractors and consultants in this way. It's mostly that consultants address the problem at a bigger distance, and are more likely to be drawing up the plan, rather than working on the execution.
> “But,” they said, “how do we collect the referral fees?”
This is why you need to understand what problem a client wants solved and not just build what their suggested solution is. Even if you build their suggested solution perfectly, they're not going to be happy at the end if their original problem isn't solved. Their suggested solution should only be used as a starting point to understanding their requirements.
That industry relies very heavily on referral fees to this day. Part of that is matching consumer with broker, and part is brand.
Today it's places like comparethemarket, back then it was Prudential.
So nothing's changed, the web hasn't made any efficiency gains.
A better solution than what they came up with would have been to put in a unique forwarding phone number per affiliate and charge the affiliates a referral fee per call (or perhaps per unique telephone number).
Whether that tech existed in the late 90s I don't know, but it's been available for over a decade now, and was probably available back then in a more analog form. What's nice is that now you get a webhook telling you about each call, so all the billing is automated.
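To sketch the shape of that setup (hypothetical code: the payload fields, phone numbers, and fee are invented; telephony providers do deliver per-call webhooks roughly along these lines, but this is not any specific provider's API):

    # Hypothetical per-affiliate call billing driven by a telephony
    # provider's call webhook. Endpoint, payload fields, and fee are
    # illustrative assumptions.
    import json
    from http.server import BaseHTTPRequestHandler, HTTPServer

    REFERRAL_FEE = 25.00  # made-up fee per forwarded call

    # Each affiliate gets a unique forwarding number; calls to it are theirs.
    NUMBER_TO_AFFILIATE = {
        "+15551230001": "affiliate-a",
        "+15551230002": "affiliate-b",
    }

    class CallWebhook(BaseHTTPRequestHandler):
        def do_POST(self):
            length = int(self.headers.get("Content-Length", 0))
            call = json.loads(self.rfile.read(length))  # e.g. {"to": ..., "from": ...}
            affiliate = NUMBER_TO_AFFILIATE.get(call.get("to"))
            if affiliate:
                # A real system would append a billing record; print stands in here.
                print(f"bill {affiliate} ${REFERRAL_FEE:.2f} for call from {call.get('from')}")
            self.send_response(204)
            self.end_headers()

    if __name__ == "__main__":
        HTTPServer(("", 8080), CallWebhook).serve_forever()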
As for when you put your email address into a form on some comparison site? Mortgage brokers bid on those leads. It used to be in the region of £20-50 per lead, pre-2008, and I know someone charging a comparable rate now, plus a cut of commission for subsequent remortgages.
But the article did say that this was at a point in his career where he might not have been experienced enough to make that call.
The "accepted answer" was someone asking "why"? He asked it five times. His best advice was to always ask "why" at least five times - that almost inevitably gets you to the real meat of the problem.
Out of curiosity I wondered about the current state of the Prudential website. So I searched, and found Berkshire Hathaway bought the real estate arm in 2012 and now they are Berkshire Hathaway Home Services.
I did a local search on the website, received pictures and basic information (price) for each house... and for each one it lists the same toll-free number as a contact.
So the site he built basically works the same way!
Just yesterday, a junior developer from my team came to me about a request to generate some output from historical data. I went back to the business side and determined it was just a simple switch of several columns in the input data to get the results.
My junior developer at first thought it was some monumental problem that was going to require a lot of work.
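For illustration, a minimal sketch of what such a "simple switch" can look like, assuming CSV input; the column names are made up:

    import csv

    # Hypothetical column renames/switches; the real mapping came from the
    # business side, these names are invented for illustration.
    COLUMN_MAP = {
        "acct_no": "account_id",
        "amt": "amount",
        "dt": "transaction_date",
    }

    with open("historical.csv", newline="") as src, \
         open("output.csv", "w", newline="") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(
            dst, fieldnames=[COLUMN_MAP.get(f, f) for f in reader.fieldnames]
        )
        writer.writeheader()
        for row in reader:
            writer.writerow({COLUMN_MAP.get(k, k): v for k, v in row.items()})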
I'm a noob when it comes to coding but I worked a technical job with customers for 20 years before changing paths.
I was in a position where I joined a place as the "new old guy" alongside several new college grads.
One day the president of the company comes to me and says, "You know the questions to ask! Some of our experienced guys don't."
It takes a lot of effort for two people or groups to form an idea and be on the same page: understanding "all" doesn't mean *, thinking of roadblocks before you get there (seeing them coming), softly working around those roadblocks, helping frustrated people, offering alternatives without going off the rails, listening closely to find out what is REALLY important, etc. I think that stuff just comes with experience working with humans.
My own experience makes me think everyone should be required to work some form of technical support for a while ;)
"We don't have time to think, just do!" is a recipe for disaster.
The insanely messy data (bathroom integers are a mess!), the firm essentially being just a collection of smaller firms in one building, the massive issues with 'finder fees', the near-endless list of VPs that have to sign off on everything, the contractors not knowing anything, the 'real' employees also not knowing anything, the hosting of websites, the brokers paying for useless (to them) web-dev... it's all true! Nothing has changed in ~25 years!
I know you're reading this, Jeff.
If it’s solid but just a bit old or needs more features, you’re fine. But, if it’s crap, look out.
The quality, or lack thereof, is never down to the people who built it (unless it was by the MD's nephew or something), especially if the client blames them for it.
It just shows they can’t run a project properly and whatever you end up with will be just a slightly newer pile of crap. And they’ll slag you off to the next lot.
Number two red flag is if they say at any point “You tell us, you’re the experts!” This can be translated to, “We don’t know what we want but we’re going to reject whatever you say anyway.”
I disagree with this being a red flag. That's exactly why I hired someone to build me a small website. It's not because I don't know HTML / CSS / JS; it's because I'm bad at (visual) design. I wanted someone who knows what they're doing to use their expertise and give me a good product.
The problems start when the client either can’t or won’t explain what they want, then gets upset that what you produce doesn’t match what’s in their heads. At that point they start trying to undo every decision and things go south rapidly.
One company I worked for had a great website. One of my coworkers worked on it as his baby and took great care of it. Rest of the software stack was a dumpster fire.
One other coworker came on board because they thought "no bad company would ever have a good website". The guy who hired him left a week after he joined. One developer left every week for four weeks afterwards.
And of course, _every_ client change goes through a process that evaluates it for additional cost that gets added to the bill. I think this change process is a considerable money earner for consultancies.
But yeah, personally I always just charge a day rate.
But I can also tell from experience that fixed price, more often than not, doesn't work out so well.
So it's a bit more like replacing the BMW company car some VP at Caterpillar drives around with one of the 400 ton trucks Caterpillar itself makes. Yes it also has wheels, yes it's also a vehicle, but no, you can't really use it as a company car.
I guess the execs didn't have a high opinion of their new product.
"Bye, bye, SunOS 4.1.3!
ATT System V has replaced BSD.
You can cling to the standards of the industry
But only if you pay the right fee --
Only if you pay the right fee . . ."
For context, the guy who wrote "The Worst Job in the World" email was Michael Tiemann, one of "open source's great explainers." ;) Now he's pranking IBM executives by installing RedHat Enterprise Linux on their mainframes.
If McNealy or Zander was running something else, that would have been the exception.
He and Gumby and John Gilmore founded Cygnus Support ("We make free software affordable") in '89, and Michael was consulting at Sun, working on supporting gcc as an alternative to the shitty AT&T C++ compiler. Remember that Sun unbundled the C compiler from Solaris and started charging for it, and AT&T charged for their shitty C++ compiler too.
Maybe Gumby can provide some more context!
Free Software Report, Volume 1, Number 1, 1992
The Free Software Community Puts A Free Compiler Back In Solaris 2
Sun Microsystems, Inc. decided to unbundle the C compiler from their latest operating system, Solaris 2. Sun users were extremely upset to lose what they saw as an essential component of the system software. Faced with dramatic increases in licensing fees, early Solaris 2 users turned to free software for a reasonable alternative.
Spearheading the effort to port the Free Software Foundation’s GNU C compiler was Palo Alto based Cygnus Support, a company that specializes in providing commercial support for free software. To fund the development effort, Cygnus appealed to the early adopters of Solaris 2. They offered a year of technical support for up to 5 users, and a commitment that the compiler would ship with Solaris 2, in return for a prepaid fee of $2,000.
To insure wide distribution of the free compiler and debugger, Cygnus negotiated with SunSoft, Inc. to make the GNU C development tools available on CDware. CDware is a free CD-ROM available from Sun and shipped at no cost to over 90,000 Sun users.
Unfortunately I also hated every minute of doing this and have switched to a back-end job where I never have to interact with a customer again!
The bright side: clients usually come to me with just an idea, plan or sketches, almost all of my projects were built from the ground up, I totally technically owned them. This is a delight for any coder (and I code a lot, not just manage the team).
The dark side: many startups and green business ventures fail; the failure rate is around 80%, even higher in the longer run. And I do the apps from the ground up, nurture them... but sometimes I feel like a gravedigger. Out of any 10 projects, maybe 2 or 3 are still there after a couple of years.
And if you're the contractor having to run around as the plans change, well just make sure you get paid enough :)
Or they could just be satisfied with doing the right thing, rather than trying to extract as much money as they could.
And a candidate who talks about failures in terms of what everyone else did wrong, but never acknowledges any failures or learning experiences of their own, is telling you something important, too... they're telling you that either they never learn, or they don't tell you the whole story.
In our company - we have a tech lead role which is supposed to be the technical point of contact for a given team. This person is expected to have answers or have a way to find the right answers about the nuances of the team.
The good ones are invaluable. Whenever new projects come up, the modus operandi is to write a proposal, review it with your current team for sanity (on whether it achieves its purpose), and then with this group of relevant stakeholders / tech leads to ensure you're not getting some giant thing wrong that could nuke some other part of the business.
The problem I see with this role, though: we (I am the one for analytics and one of the TLs for Data Infrastructure at Snap) have to handle 20-25 hours of meetings around all this coordination AND ensure some work gets done in our own area so that we stay in touch with the code bases we are fiddling with. So as alluring a title as it is, you are giving up deep-dive development (like I was able to do building storage engines for DynamoDB). Tradeoffs :/
And there are cases where the tech leads don't have the answers (new or incompetent), OR they think of a weird case a couple of months down the line, but this works out overall.
I probably should write a blog about this - but anyway, I don't know how this works out in the non-software world: Why wouldn't you have a knowledge person/council around?
If you figure out the answer to this PLEASE share it with me. This describes my current role; whenever I focus on one aspect (eg actually writing a few lines of code) the other (keeping up with planning) suffers.
My suspicion is that “tech lead” in this context basically means “manager,” but employers know many devs are scared of the word manager, so we have the highly ambiguous “tech lead” title, which allows dev managers to be in denial about their actual role.
The solution should surely be "enter your phone number and zip code and we'll contact you". Then the central referral agency contacts the person immediately, to keep the momentum and check the person's details, rewards them for their effort in finding the company, and arranges the next step -- ideally an at-home meeting with the particular agent you booked them with.
Just "right, now call this number" is a bit daft, isn't it?!