First day: "Hey, welcome to the team! Hope you got everything set up. When you have a minute, do you mind going through your contacts? We can't find any good engineers lately."
Me: "Ookaay... thanks? Sure, here are 5 people I've studied and/or worked with. They're rockstars. I personally vouch for them, and I know most of them are on the market and interested."
"Hmm, they look too junior. Thanks anyway."
One month later: after spending plenty of time interviewing and rejecting some wildly overqualified candidates because they're not culture fits, the hiring manager brings in their "80% but can get the job done" (that is a quote, no joke) friend from a previous job. Or they bring on a far more expensive contractor.
Meanwhile my five recommendations all get hired at competing or peer companies.
This has happened in some form or another at almost every place I've worked. To date I'm extremely disillusioned with the "tech" hiring process. I've seen everything from perfectly qualified candidates getting rejected because the interviewer woke up on the wrong side of the bed and, in the span of less than an hour, developed a vendetta against the candidate, to plain old nepotism. I've even seen people vote to pass because they wanted the job the candidate came in for and thought they could do a better job, even though there were no plans for such an arrangement to ever happen. I've seen interviewers fail a candidate because they couldn't solve a problem the interviewer (who is in a marginally related domain) had been stuck on the week before because Google didn't have a copy-paste answer.
Interviewing is entirely subjective. Humans are humans. The only person who truly matters at the end of the day, WRT interviewing, is the hiring manager and maybe their boss.
Edit: I forgot my main point, excuse the rant. The point is I don't see how the industry is going to overcome this hurdle when it's nigh impossible for a front-line engineer to bring in good people who wouldn't already appeal to whatever processes/humans are in place--which clearly isn't working, because they can't find good engineers.
Agree. At all 3 of my previous companies, most hires were a buddy of the manager, and usually one who wasn't very good or a fit for the role. There's usually no process. One low-hanging-fruit improvement: take a day with your interviewers, come up with 5-10 quality questions, and set up a simple way to make sure every candidate gets the same set of questions and no two interviewers ask the same one. That alone will increase the quality of your interviews.
I actually think we've had really high-quality interviews everywhere I've been. That's the weird thing. It's not the interviews that are the problem. It's the pre- and post-filtering that just feels so arbitrary.
Nobody in this Twitter thread seems to know or understand what a commons problem is. Training up new devs is good for the industry, sure. But your own company doesn't capture all of that value.
If you are trying to solve this problem by yelling at companies to be more altruistic, then, as previously stated, you don't know or understand what a commons problem is.
> But your own company doesn't capture all of that value.
If they paid their employees properly, with adequate salary bumps as their experience grew, then the company would capture all of that value, most of the time.
People jumping every 2 years isn't exactly something most employees want to do; it's something companies force upon them.
No, the employers won't capture that value by following your recommendation.
What you are recommending is that employers pay for training, and then pay employees exactly what they would get for hiring someone with equivalent skills from the market. That leaves the employer behind by the cost of training, relative to hiring from the market. So the business is strictly worse off than it would have been.
But worse yet, a non-trivial fraction of the time when you train someone for a role that they aren't currently in, you find that by ability or inclination they are not actually suited for that role. Which means that you're now just out the cost of training and may well have also lost the employee. By contrast someone that you hire externally who is already in the role is more likely to be a fit. Furthermore businesses tend to run on lean margins. You're hiring because you need those skills now, not in a year. A lot can change in a year.
Now against this you have to balance the fact that hiring people is expensive, new hires take time to come up to speed, and not all new hires work out. Furthermore a loyal employee with institutional knowledge has value because of that. It isn't a one-sided deal. Companies do train people.
However it is more common for it to be someone that they know already. Such as an internal promotion, or someone who did an internship. Hiring strangers expecting to train them with no particular reason to believe that you will keep them is a losing proposition.
Businesses that consistently pay for losing propositions usually wind up going out of business.
> What you are recommending is that employers pay for training, and then pay employees exactly what they would get for hiring someone with equivalent skills from the market. That leaves the employer behind by the cost of training, relative to hiring from the market. So the business is strictly worse off than it would have been.
This is in the context of businesses complaining about a shortage of developers. If every mid or senior level position was getting filled without much grief or effort, then there wouldn't be any issue. But in a situation where the business can't seem to find anyone that fits the bill, the obvious solution is to look at the talent they CAN train, instead of hoping the exact right candidate walks through the door.
Furthermore, in any reasonable organization where things like positive/negative code review, respectful seniors, mentorship, and a general respect of all developers are part of the company culture, the cost of training a junior developer to be useful to the project is largely absorbed. Of course, a company where everyone is isolated and heads-down will consider training to be a huge burden, because it's completely orthogonal to their workflow.
> Businesses that consistently pay for losing propositions usually wind up going out of business.
Businesses that focus on the short term "we need this now, get the butt in the seat" aren't thinking about the workforce as people, which is just as insidious.
> Furthermore, in any reasonable organization where things like positive/negative code review, respectful seniors, mentorship, and a general respect of all developers are part of the company culture, the cost of training a junior developer to be useful to the project is largely absorbed.
I assume that by "absorbed" you mean "becomes less visible"? I prefer this style of organization as well, but even so the cost is still there, it is just not broken out as a line item.
>> Businesses that consistently pay for losing propositions usually wind up going out of business.
> Businesses that focus on the short term "we need this now, get the butt in the seat" aren't thinking about the workforce as people, which is just as insidious.
It really isn't that simple. You can think of people as people and care about them. But once you accept that you need to hire more, you've also got a very concrete need to get a butt in that seat NOW so that they can take a load off of the valued employees who are stretched too thin.
That said, here is an exercise that I highly recommend.
It is easy to look at businesses through the filter of how we think that they should be run. And every stupid thing that they do becomes confirmation that you are smart, they are stupid, and the world is a bizarre place. People who do this wind up with a lot of strongly held ideas about how the world should be run, and they are mostly not very good ideas.
Turn that around. Start with the assumption that people who are acting in ways that seem bizarre to you are actually competent and know something that you don't. Start trying to figure out what it is that they know which you haven't figured out. As you start figuring those things out, suddenly they will look a lot smarter, and YOU will actually be smarter.
In my experience, people who indulge in that exercise become more effective at their jobs, and if they are interested in climbing the corporate hierarchy, they are in a much better position to do so.
You also need to take into account the cost of not filling positions.
Eventually that cost eclipses the cost of training. Also more experienced developers have an easier time jumping ship for their next >20% bump in compensation. Even with senior developers that's more knowledge and training walking out the door.
It's also much easier to con a junior developer into staying in one place too long for too little money. You might spend two years skilling them up, but if you pay them well (or even don't) you might get them for 6 years or more. Seen it happen too often.
From what I understand, the issue you describe happens because the company doing the training doesn't re-evaluate the now-trained developer as more experienced, and therefore more valuable and deserving of better compensation.
A junior is a junior, until they're not. Pretending that the now trained new developer is still worth the same is foolish, but that's exactly what some employers do to their own detriment.
So, say you hire a junior developer and spend a year training them to the next level of seniority, at significant training / opportunity cost expense.
They now command a much higher market rate - you can pay the higher salary and retain them, or not and they'll leave. Either way, the company is out the training cost relative to hiring someone senior from the beginning.
(I say this as someone who hires 90% junior people and trains them. I think there are good reasons to do so, but the commons problem is real.)
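A toy break-even sketch makes this accounting concrete. All figures below are hypothetical (nothing in the thread gives real numbers): the employer comes out ahead only when the junior-year salary discount exceeds the training overhead, which is exactly the "out the training cost" point above.

```python
# Hypothetical figures only -- a sketch of the break-even condition,
# not real salary data.
senior_salary = 150_000   # assumed market rate for the skills needed now
junior_salary = 90_000    # assumed salary during the training year
training_cost = 80_000    # assumed mentor time + reduced output while ramping

# Year one: the employer saves the salary gap but pays the training overhead.
# From year two on, retaining the trained dev means paying market rate anyway.
discount = senior_salary - junior_salary   # 60,000 saved in year one
net = discount - training_cost             # negative: employer ends up behind
print(net)  # -20000
```

With these assumed numbers training leaves the employer 20k behind; flip `training_cost` below 60k and the junior hire wins instead, which is why the answer is so sensitive to how much mentoring actually costs.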
Right - as of recently, I would claim that the market is distorted in a weird way where hiring senior people is just a better deal for most startups than hiring juniors and training them, even factoring in all the associated costs, because of a weird effect where market rates for senior developers are maybe ~2-3x higher but they are 5-10x more effective.
Market rates for senior developers relative to juniors should go up in this scenario, but they haven't really, and I don't understand why.
The main selfish economic reason to hire junior developers is if the marginal return of hiring someone sooner swamps the relatively higher total cost, which is why you'll see companies like Google that have huge RoI on engineers hiring many more college grads.
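As a rough illustration of the parent's multipliers (the 2-3x cost and 5-10x effectiveness figures are the parent's guesses; taking midpoints is my own assumption), the output-per-dollar gap works out like this:

```python
# Normalized, illustrative numbers: output per unit of salary cost.
junior_cost, junior_output = 1.0, 1.0   # junior as the baseline
senior_cost, senior_output = 2.5, 7.0   # midpoints of "2-3x cost", "5-10x output"

junior_efficiency = junior_output / junior_cost   # 1.0 output per cost unit
senior_efficiency = senior_output / senior_cost   # 2.8 output per cost unit
print(senior_efficiency / junior_efficiency)      # seniors ~2.8x more output per dollar
```

Under those assumptions a senior delivers nearly 3x the output per salary dollar, which is the distortion the parent describes: if the market were pricing this in, the senior/junior pay ratio should widen until the two efficiencies converge.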
Yes. It's called an investment. I don't see it as being that different from companies expecting an employee to be doing the work of the next level for a while before they get the promotion/raise of that level.
Seems fairly rational. I reckon two of the most common reasons people leave are boredom and unsatisfying career progression. So all you have to do is ensure people have interesting work and pay/promote them at a rate that is competitive with their option of switching jobs to achieve that goal.
> I reckon two of the most common reasons people leave are boredom and unsatisfying career progression. So all you have to do is ensure people have interesting work
My current job is very interesting - I’m leading a team building a brand new green field system and building a modern software development shop simultaneously (CI/CD, automated testing, newest version of everything), I have almost complete autonomy in how things are architected and the resources (money, people) to make it happen.
But once I get done, I don’t see anything the company can do to keep my job “interesting”.
If you were on the iPhone team in 2006 at Apple, what could Apple possibly do to keep you from getting bored?
> So all you have to do is ensure people have interesting work and pay/promote them at a rate that is competitive with their option of switching jobs to achieve that goal.
I think similarly. But these are also the two things that require companies to restructure in order to be able to reliably offer them to employees.
I went through this two years ago with my last employer. When it came right down to it, I didn't really have a place there. When my job was made redundant, my departure was bittersweet.
There was a time when they were open to expanding my role and giving me a direct report. But they decided to move in another direction.
It's more than just "treating employees well." A lot more.
So one instance. Yes, it's not going to be 100% retention. Nothing ever will be, unless you decide to bring back slavery.
I asked how often does it happen. I can almost guarantee you that the retention rate of developers who feel they're being treated well is much, much higher than otherwise.
Let's rehash your argument here. You think that companies that need skilled devs should train up junior devs instead of looking for skilled devs on the job market. Your answer to the retention issue that creates is "treat employees better."
You don't understand why that's a bad deal for employers? You want them to take on all of the risk and get none of the reward.
Look, I like the current system where you have to build the skills yourself and find some way to get into the job market. I was able to navigate that system and build a career in it. I look at the current code-school infrastructure as basically "easy mode." That's what we need more of. Companies still snap up code school grads like nobody's business.
Asking companies to play the role of code schools is just way too much.
No, I'm saying that if they're not able to find senior devs, then they should train junior devs. And I reject the idea that it creates a retention issue, unless you have no intention of treating those people well.
"You don't understand why that's a bad deal for employers?"
No, I don't, at least not for employers that respect their employees. It may be a bad deal for employers that don't, but for them, I really don't care.
"You want them to take on all of the risk and get none of the reward."
As opposed to now, where the employee takes all of the risk and gets no reward?
"Look, I like the current system where you have to build the skills yourself and find some way to get into the job market."
Yes, people who have been through the hazing tend to like the system of hazing, and have no desire to stop it for the next class.
"Asking companies to play the role of code schools is just way too much."
I don't agree. But if that's the case, then they can stop complaining that there aren't enough skilled developers out there if they're not willing to help out.
It's easier to pay a higher salary, filter candidates, and then not invest in training.
Training must be done by someone, and that person's time is valuable too, so there are other things to consider as well.
I think it's more that, the companies are the ones complaining, so if they're not willing to be part of the solution, then we don't care about their problems.
If they don't want to train, then that's fine, but then they don't get to complain about the lack of developers.
A lot of companies think they need senior devs, when in reality what they really need is more junior or mid-level devs. Most developers aren't architecting whole new apps or building out massive feature requests. Most of the time, it's fixing bugs, maintenance, adjustments to existing features, and other work that mainly requires navigating existing code and infrastructure, with the occasional new feature here and there.
You do need one really good lead developer on the team, to help mentor people. This is the one area that is sometimes a bit lacking. Most leads are great engineers but aren't very good at leading, mentoring or any of the other people type work they need to do. But, all they really need is some training and a personality adjustment, in some cases.
You underestimate the cost of communication overhead and rework. As a flat team scales, the cost of communication scales with the number of people squared. If you solve this with process, everyone's productivity is dropped.
Data that you can find in Software Estimation: Demystifying The Black Art quantifies this. The overall productivity of a team increases until it has 5-8 people. Then it goes down. A team of 12 actually accomplishes LESS per month than it would if you fired half the people. However once you get to 20-25 people, productivity is back. And then increases fairly close to linearly.
Going from 5-8 people all the way to 20+ with no increase in throughput is really, really painful. And those aren't cheap people. This provides companies with a huge incentive to figure out how to get a small team to be as productive as possible. And you don't get the most out of a small team by filling it up with junior to mid-level devs.
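The quadratic communication cost mentioned above follows from simple counting: a flat team of n people has n(n-1)/2 pairwise channels. A quick sketch (the productivity figures cited are from the book; the pairwise-channel formula is standard combinatorics):

```python
def channels(n: int) -> int:
    """Pairwise communication channels in a flat team of n people."""
    return n * (n - 1) // 2

for n in (5, 8, 12, 20):
    print(f"{n:2d} people -> {channels(n):3d} channels")
# Going from 5 to 12 people is 2.4x the headcount but 6.6x the
# coordination surface (10 -> 66 channels), which is one way to see
# why mid-size flat teams can produce less than smaller ones.
```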
It's okay for front-line devs to think this, but when the people in charge of salary and hiring believe it, your team ends up a crew of fresh graduates herded by a senior dev who is exclusively occupied with making sure the junior devs don't make things worse.
And most failures of junior devs (breaking stuff) are considered the responsibility of the senior dev managing them, so the senior dev bears just as much responsibility as before but now with less control and more unpredictability. It is certainly good to have juniors on the team, but managing them should not take the greatest part of your senior devs' time, and your product should not be written exclusively by juniors.
Some CEOs deflect accusations of cheapness by claiming they don't want "rockstars/divas," but that's a personality trait, not a skill, and the crappiest of crappy junior devs are certainly capable of embodying rockstar arrogance.
Most work is tweaking an existing codebase, and quick fluency in arbitrary code written by strangers is definitely not a junior-level skill. In fact, CS-heavy stuff with little external integration is perfect for fresh graduates, as they did learn it in school.
> It's okay for front-line devs to think this, but when the people in charge of salary and hiring believe it, your team ends up a crew of fresh graduates herded by a senior dev who is exclusively occupied with making sure the junior devs don't make things worse.
Why is that a bad thing? I'm in that position now, more or less - senior dev (official title: architect) working mostly with inexperienced but smart junior developers and legacy developers. I'm just not expected to do too much actual coding.
Whenever people ask for fullstack do-it-all seniors with part time devops responsibilities, I assume that means "We have no idea how to create or manage a functioning technical team, and instead want a tech-wish granting genie."
You don't need 10 rockstar ninja genius senior engineers. You need five devs, three Ops, one PM, and an intern. Ideally you can divide it up further where certain devs are more back-end and others front-end, one dev is more of a database pro, one ops is metrics focused etc...
Then you have to manage this team of people. Which means solving difficult trade-offs, mediating disputes, finding gaps, removing obstacles, etc...
Those ads make me want a 10x real ultimate power ninja to silently enter their offices and dose every last executive and HR employee with a combination laxative-emetic, video record the aftermath, then silently slip away.
But I'm cheap too, so if that ninja could do it at 60% of the going rate for mass poisonings, that would be great. Their clan is also going to need 200 years of experience in undetectable infiltration, 300 years in poisons and venoms, and 500 years in absolute client confidentiality. Thanks.
I've seen one company asking for PhDs in Math for a non-Math/Data/Science position. They said they put that on there because they had a bad experience in the past with someone who couldn't code but looked good on a resume. They wanted to pay $75-85k for that position. But then they took the interview anyway, so it clearly wasn't an actual requirement.
That's really the issue: job postings drive what developers "ought" to know because they should indicate some kind of business demand for individual skills. But there's no feedback to punish companies who scribble whatever they want on their job postings, so they continue to add new line items without verifying that what they're doing right now is working. It just leads to confusion and wasted time finding out what they actually want. If they can't find someone, they obviously go "Developer shortage! No one has the 50 skills w/ 5 years of experience on our job posting!"
Compound that with resume prose (pushed by people charging for resume review) that sometimes goes overboard in its effort to make something out of very little and we have a system where no actual communication takes place until you're in front of someone.
I've seen the same thing multiple times, especially with NodeJS: all those jobs require 10 years of NodeJS experience, even though NodeJS is only about 8.5 years old at this point.
It amazed me how quickly the recruiter spam started flooding my LinkedIn after I passed the one-year mark. I am always grateful to the company that hired me fresh out of college based on my education and aptitude, not experience.
I advise current students that the first job is hardest to get. After that you’ll have a wealth of opportunities knocking at your door.
This mentality is also related to the insane idea that junior developers are just glomming onto companies to get the training and then leave, taking advantage of the hard work these companies put into them. This is not a strawman; this is a true fear that I've heard several executives utter out loud with no self-awareness.
Treat the junior developers like anyone else - someone who wants the best for their career and knowledge base - and provide it for them. Incentivize them to stay. Train them properly and give them resources to succeed. Maintain an internal culture that properly cultivates their talents and gives them mentors to work with and aspire to. Compensate them when they've reached proper thresholds in their usefulness to the team, and provide honest and valuable feedback (positive and negative) as much as you can.

Plus avoid the obvious pitfalls: don't abuse them or belittle them for basic questions or silly mistakes. Don't throw them into the deep end with little to no chance of success. Don't talk down to them or treat them as a substandard part of the team. Don't start firing the juniors the minute the workload starts winding down. These are all signs of disrespect for your own workforce, and everyone can see the obvious red flags, even juniors.
Given the griping about tech interviews these days, does anyone really think that people actually like job hopping? They do it either to avoid poisonous situations, or out of obvious and necessary self interest, one that can be provided by a company willing to treat their junior hires with respect. There's no reason hiring a junior cannot be a win-win situation for everyone involved.
These are what I call "training companies". Engineers join them and find a miserable state of affairs. Being productive individuals, these people try to improve the state of the company - in technology, in process, or in other areas. But the business, in its infinite waterfall, resists the change.
After about a year, most engineers realize the futility of their efforts, and start to either look for a better company, or try to build a better company. This often takes about a year, thus the 2-year churn cycle in many modern resumes.
If hiring were truly an issue for companies, then salaries would be going up, or at the very least you'd see them make location decisions based on where the supply of labor is greater than the demand. You almost never see companies doing that. It's always: let's move the company to wherever is most convenient for the founders or the VCs.
I know this is targeted more towards students, but I see similar issues when someone with some work experience wants to switch away from a dwindling tech towards a new tech. People say, well, you don't have "relevant" experience.
A company I worked for was always looking for senior devs. Not because they needed them, but because if one happened to stumble in, they would hire the programmer anyway (and fire one of us? I never saw this pan out!).
Yup. 45 applications, several face-to-face interviews and homework assignments since June. No offers. This is with over 6 years of development and 11 years of technical work and leadership.
Hiring practices suck when the only advice my father can give me is "make sure to smile and be positive!"
I don’t know anything about you, so don’t take this as a personal insult, but have you considered that after 45 applications and several face-to-face interviews over four months, it may not be the hiring practices that suck?
In over 20 years of working, it has never taken me more than two weeks to get a job. My fastest turn around was quitting a job on a Monday at noon, with no job prospects and no applications submitted and getting an offer on Thursday from what was then a Fortune 10 (non tech) company.
Could it be skill set, interview skills, location, etc?
Most companies have forgotten the power of training. Hiring a junior dev to shadow a mid-level or senior engineer almost always beats the trouble of finding the unicorn candidate.
Companies want to hire a silver bullet and forget all about the perfectly capable people that don't have "15 years of Java/C#/Python/whatever experience".
Anyway. I run a largeish software team and we have a mix of very experienced devs along with some newer devs who definitely have holes in their skills. We run training in certain areas with the goal of upskilling the team. Especially for security and secure coding practices which aren't taught in universities. There's also a lot of training around good engineering practices and the like. I'm pretty sure most software companies are the same.
Yes, the industry does not train people from absolute scratch like apprentice schemes do. It's very reliant on universities. But I don't see legions of unemployed students: they do get jobs and companies do take over their training to bring them from basic to advanced.
Do you have any links to good resources on "secure coding practices" or more generally "good engineering practices"?
I feel like there are lots of resources about specific programming skills (language and library details) but it is difficult to find good resources about more higher level concerns; security, maintenance, tooling, operations, availability, authentication/authorization, scaling, automation.
Google's Site Reliability Engineering is one example of a good resource in this area.
Yes, I'd suggest starting there (I used to be an SRE).
I don't know of any books that handle all those topics off hand. I was thinking more of secure coding for engineers, rather than system architecture in general.
Junior devs finding an initial job and receiving mentoring is certainly an issue, but I think that the issue raised in the tweet is way more endemic to start-ups than more established companies.
I always thought that this was a supply problem? There are a lot of junior developers out there and that has flooded the market, reducing demand for junior engineers.
Neither juniors nor experienced devs. Recruitment gatekeepers always want something else. Imagine a situation in which you let go of your entire tech workforce and tried to rehire them: would they get the job? Might be eye-opening.
This is true for developers applying through job ads. patio11 has written millions of words of advice, many of which talk about how to land work (hint: don't apply through job ads), so I'm not going to try to rephrase that. But if you're a developer (even without experience) and you're good at what you do, you really shouldn't have any issues finding work.
(Shameless plug) I'm working with hyperiondev.com. We offer 6-month coding bootcamps, and our grads have companies fighting over them even though they have zero "real world" experience and only 6 months of training (yes, I've read the bootcamp vs. theoretical CS degree flamewars, to preempt anyone who believes that everyone who writes a line of code should have 3 years of CS theory to back it up).
Know your stuff. Have a portfolio (minimal is fine). Talk to people (not recruiters or HR staff). There is almost no easier way to find work at the moment than by saying "hey I know what a for loop is", so I have limited sympathy for the people who claim they can't get jobs because of lack of experience in dev.
For Boston at least that is patently untrue. Every company wants to run you through the whiteboard hazing. Knowing what a for loop is, is going to get you shit.
Also, don't apply to companies? That may be fine for the small percentage of people who can get ahead of the pack or luck out, but it's not doable for the whole industry. Is every junior dev supposed to constantly troll meetups and cold-email more senior devs to get into the industry? Networking like that is good advice now because it makes you stick out, but if everyone does it, then the senior devs are going to start treating newcomers like we treat recruiters and just blanket-ignore them until we need something from them. It's not feasible for the entire industry to keep growing in demand while only being willing to grow the supply of devs through old-boys-club-style networking.
I am assuming you are making an overstatement. Even then, you are really misled if you think someone who knows the basics of control structures is going to get a job in the industry.