The #1 technical lever I have found worth exploiting, worth far more than anything a university can provide, is the willingness to get the job done.
It hasn't mattered whether I wrote CP/M ISRs, PROGRESS 4GL modules, Unix daemons, Pascal, Java, C, C++, Objective-C, Lua, Swift, etc. - through the thick of it all, the fact that I was willing to get the job done, no matter what, is what made the difference.
On the other end of this scale is a bit of decadence.
From my particularly tainted point of view, the ideological basis of the current cultural norm of "university==well-paid job" is entirely decadent. People seem to expect that, because they're in a certain class, they don't have to work hard at dealing with real-life problems, and too many times I've seen the well-educated, well-connected, nevertheless incompetent developer fall to pieces under pressure. I am grateful for these guys, because they make my life easier.
Yet, those who are in this group, who haven't got a decadent ideal of their own worth, but rather get things done in computing, whatever it takes: great.
So it's really not a matter of educated-enough. It's whether the will to perform is, through whatever means, inculcated - and then manifested by taking the actions needed to get things done.
Some of the best developers I've ever worked with, have come from utterly dire circumstances. Some of the worst, too. Likewise for the privileged, candied elite.
The one differentiator, truly, is the will of the individual. Computers, being machines of will, equalise us all that way.
I'm in this situation right now, where I've been always highly appreciated by the place I was working at.
I've also built a few systems used by a few companies in a certain niche, and always strived for excellence in what I was doing. That resulted in a quick series of promotions and raises in my career. It has turned out to be of very little help when trying to look for a new job, though. For various reasons I'm looking for a new job, and it is proving impossible to find one.
I'm probably not very good at "selling myself" as I always thought my work should speak for me. My CV looks "average" - nothing special in there, just working here and there for x years doing y.
I'm finding myself at a pay level and seniority level that other companies just cannot justify by judging me from my CV, so in the end they decide to pass on me. Maybe it sounds like a good problem to have, but it's a very big problem to me, with significant negative consequences for my life at the moment.
What I'm trying to do now is exactly that: put up some impressive things on GitHub, speak at conferences, and in general "market myself" a bit more. I expect this to take quite some time before it bears fruit, but I see no other way out of this situation.
My resume style used to highlight the former because I thought I had to prove that I used technology x to solve problem y. I do mention technology where applicable but it's to highlight the business need and solution. I'm trying to show that I can pick up anything at full speed to solve real problems. That's what both of us do day in and day out or else we wouldn't have been employed at all until now. Having to prove that we're not imposters when someone clearly believes we aren't is really the most annoying part about all of it.
I'm in the same boat though around needing a solid GitHub profile even with what I feel is a very concise CV. At some point we do have to mark things as 'done' and just keep shopping out job opportunities until someone else believes us like our current employer.
It's easy to say coming from a generation where higher education was not the norm.
A 20 year old saying he dropped out of high school will not be received well in any job interview. 98% of the population can do better than that.
I got a job doing tech support for a web hosting company at 19 (requirements for getting hired included listing 3 FTP clients, listing 3 e-mail clients, what is whois, and things you could teach to anyone). I worked with many high school dropouts as well.
After getting that tech support job, learning about open source (the LAMP stack) changed my life. I didn't have much money but I was able to self-study Linux, Apache, MySQL, and PHP/Perl/Python with minimal equipment. Over the years open source has provided an avenue that allowed me to better myself to the point that I've worked as an SRE for Google/YouTube, a Staff SRE for Dropbox, and am starting soon as a Staff SWE at Discord. Open source has given me so many opportunities, made me a better person, and allowed me to provide for my family.
There are a lot of people like me in the industry who care about socioeconomic diversity. I will absolutely interview and hire dropouts if they are driven and capable.
Now, let's keep in mind that we are approaching 2020. High school dropouts are competing with support staff, online MOOC students, and boot camp graduates who are all searching for the same developer job with no experience. The competition is more intense than ever.
Also, your text implies that you work in the bay area. It's an order of magnitude easier to find a tech job there than in the rest of the world.
I don't think everyone should take the path I did. If you can afford it, college is a really good option and I would never discourage someone from going that path. But there are other paths, if you work hard, and have some luck.
I'm now in a position that interviews and helps make hiring decisions. I still know a lot of people from South Florida that have spread out all over the country. There are many of us who don't come from traditional backgrounds that have progressed into leadership roles where we impact hiring who do not require college.
Yes it is very competitive, yet so many companies are constantly turning people away, people with and without college degrees. I'm so happy that there are so many more resources today compared to what I had when I started. MOOCs are awesome and I use them to this day for continuing education. I've been part of apprenticeship programs to train bootcamp graduates as SREs. And I've helped mentor people internally transferring from support into SRE.
College is definitely the most typical path, but there are still paths for humble people willing to start at the bottom and work hard. College just isn't an option for everyone, unfortunately.
I agree with pretty much everything you said. I will just make a cultural parenthesis on that last sentence.
In the USA, degrees are expensive, which is the main reason they are not for everyone. That limits the supply of degrees and makes them more valuable.
In other parts of the world, mostly thinking of western Europe here, higher education is free or affordable. Degrees are much more common and of higher level, while the job market is pretty bad. It's really shooting oneself in the foot not to get one.
If you limit it to 25-34 year olds with at least a 4 year degree, the US falls to be in the middle of the pack compared to western Europe- behind Switzerland, the Netherlands, Belgium, Denmark, and the UK, but ahead of Norway, France, Spain, and Germany.
So you are saying that the amount of bachelor+ in USA is on par with the amount of master+ in Europe. That's my point, Europeans have higher education because it's free.
So, know what I did in my interviews? I focused on what I _had_ done. Projects I'd driven to completion on my own merit. I put my heart into my interviews and worked to let my passion and depth show through.
The first place I got a foothold, I was nearly passed over, my resume was so bad. I know, because my boss told me, years later how they almost threw it out. But my interview got me in. Barely. Doesn't matter, it was enough. I pushed my way up from there.
It's not easy to say I'm a dropout. Nor has it been easy to work my way up to where I am. But, who cares? I'm not _owed_ a software development job. I _earn_ a job, by being willing to put in the work, and by not making excuses for myself, and showing I can do it.
Obviously YMMV, but after a couple years of experience, employers simply stop caring about education level.
Going to college definitely makes things easier, and I don't advise people to take the path I did but there are paths other than college that still work well for this industry.
> I've never been asked about any education in developer interviews, only experience.
> ... employers simply stop caring about education level.
The reason he was not asked about education is not necessarily because 'employers by and large don't care', but rather that 'employers that do care about education would've already filtered him out after seeing the resume'. The interviews from employers that do care about education would never materialize in the first place.
This is not a judgment about the role of education and pedigree, but rather pointing out that any effect would have likely played out in the resume stage.
Thankfully for me I consider it a positive selection bias as I wouldn't want to work for a company/manager with such insular hiring criteria. Unfortunately it does make it harder for people just getting started in their career.
I generally like to work with people, and at places, that value your contributions and experience rather than your credentials. I've been lucky enough to find that in every place I've worked. I can't say my experience has been one of only working with people without degrees. I've worked with amazing engineers with and without degrees and don't put much stock in the degree itself.
I wouldn't call it a bias; companies have different expectations and needs. At a big-name company, there are a hundred people applying every day, so the bar is higher than no degree and/or little experience.
There is also the fact that the lack of education puts one at a disadvantage. For instance, I know of co workers who ask questions about complexity, compilers and graphs. It's common for candidates with no formal education to be completely oblivious to these.
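To make the point concrete, here is a hypothetical example of the kind of graph-and-complexity question meant above - a shortest-path search over an unweighted graph, the sort of exercise that formally educated candidates tend to recognize immediately (the function name and the sample graph are mine, not from any particular interview):

```python
from collections import deque

def shortest_path_length(graph, start, goal):
    """Number of edges on a shortest path in an unweighted directed
    graph given as an adjacency dict, or -1 if goal is unreachable.
    Breadth-first search: O(V + E) time, O(V) extra space."""
    if start == goal:
        return 0
    seen = {start}
    queue = deque([(start, 0)])
    while queue:
        node, dist = queue.popleft()
        for neighbor in graph.get(node, []):
            if neighbor == goal:
                return dist + 1
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append((neighbor, dist + 1))
    return -1

g = {"a": ["b", "c"], "b": ["d"], "c": ["d"], "d": []}
print(shortest_path_length(g, "a", "d"))  # → 2
```

A candidate who has never seen BFS or big-O notation has little chance of producing, or even discussing, something like this under interview pressure, which is the disadvantage being described.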
I definitely agree that lack of education is a disadvantage, it just means you have to start a different way. that could mean your first few years is coming in through a side door before you start getting relevant experience.
Plus there's the uncomfortable fact for academics that the state of the art is often pushed forward in industry and written about in academia afterwards. So, sometimes, the job is really, really good. You're doing stuff now that kids will be learning about in 5 years...
Added: It was something like "education means a lot in this town, as much as people don't say it.".
If the person in question can't be bothered with something so inexpensive and easy to obtain, that meaningfully improves their job opportunities, there's a very fair argument to be made about that being a negative statement on their character and work ethic.
But precisely because basically all other candidates have graduated from high school and university, it would certainly help the dropout candidate stand out. If the candidate seemed bright and impressed me in other ways, I would say it was an edge, as it is evidence of self-motivation and independent thinking. Of course the candidate would have to impress in other ways, for example by having a substantial side project.
Just me perhaps, but I think it would be fun to work at a place where coworkers had a variety of educational backgrounds.
That you can succeed even as a high-school dropout by being in the right places at the right time or spending effort on the right things isn't a good argument against degrees. Because a degree increases your likelihood of doing those things.
There are certainly downsides to recognizable degrees, but I don't think one's chances of success in the tech industry are one of them.
Yes and no. Perhaps the degree is simply a correlation? That is, to get a degree you need at least a certain amount of focus, ambition, the ability to start and finish something, etc.
The point is, using the degree here is a false signal. And it can be, as the OP notes, deceptive. Just because so many fall for it doesn't mean it's right.
Just the same, success is possible without a degree. How? All / most of the positive traits listed above.
Say you join a large tech company, get a fair amount of stock, enjoy the stock increase, and as a result make an additional $100k. That is not an unlikely scenario even in the early stages of a career for someone with a recognizable degree.
If you don't have a degree it is certainly possible to do the same thing, it just isn't probable. For every step where you have to be aware of something, understand it and make a decision the probability of getting it right decreases. And the same is true for housing, career, startups and even relationships.
It doesn't make sense to say that you can succeed if certain things happen. Because it is the if that is the challenge.
It's _complicated_. I mean, almost everything in society is complicated: it's like spaghetti code written by millions of people with no single coherent goal.
There's a degree to which what you say is true, and a degree to which it is false. The most obvious way in which it is false is that learning how to use a computer well requires a) access to a computer (obviously), b) time you can dedicate to learning how to use it, c) resources to assist with learning (this one is mostly solved by ubiquitous internet connections _today_, but that was much less solved even fifteen to twenty years back, when many of the current self-taught mid/senior experienced people around today were learning), and so on.
Tech is not a pure meritocracy, because not everyone has had access to the same resources or opportunities. I think that's getting a lot better - as this article flags up, the modern open source community is extremely accessible, and sufficient equipment to be competitive with the best is cheaper than it has ever been. With just a cheap laptop and a relatively inexpensive internet connection you're not _that_ far away from having the same tooling available as a rich professional, today. That doesn't mean that problems a and c are _solved_ by any means, but they're certainly better than they used to be.
Point b may not be something we can solve with improving technology, though, nor the thousands of tiny factors involved in whether people unconsciously gain an attraction towards programming or pass it by. These are really hard problems to solve.
So, like, generally, I think we're _doing better_ than we were twenty years ago, but we still have a long way to go before we can unironically call tech a meritocracy where those with the will to thrive succeed and those without fail. Life is just more complicated than that.
You are correct that access to technology and learning resources has greatly increased. People who wanted it bad enough could make it work though. I couldn’t afford a computer growing up so I would find ways to access the crappy Apple IIes my school had just to get some programming time. The library had books on Basic and magazines (which I would read for free in the grocery) used to have code in them.
I don’t buy the lack of time argument though. Life is about prioritizing. Millions of poor people have substantial time to invest in football, basketball, TV, video games.. Learning to program just takes putting it above those less productive activities. The real issue (imho) is that very few people are cut out to be developers. This push to turn everyone into an engineer makes about as much sense to me as encouraging everyone to be a pro-basketball player. It’s great money, everyone should do it!
As for time, I think the suggestion that the choice is between "learning to program" or "watching sports" is extremely reductive and not accurate to a lot of people's experiences.
> I think the suggestion that the choice is between "learning to program" or "watching sports" is extremely reductive and not accurate to a lot of people's experiences.
I strongly disagree. That’s exactly what other people were doing while I was learning to master technology. We were all very poor. I’m not anymore, the vast majority of people I knew who didn’t use their time wisely still are.
Besides the will to get things done, other important differentiators are knowledge, talent, and intelligence.
You’ve said it.
I won’t go into details now, but there is something to be said for sincere passion and interest in what you do. I’ve worked with a not insignificant number of CS and digital photography students who by default think they’re above practical work and (insert more snobby stuff here). I’ve been responsible for reminding people why they started those degree programs in the first place. Not due to ranting like this, but due to [foolish] added work I put on myself that came out of sheer interest that has bloomed into something really worthwhile— sometimes unbeknownst to them at the outset. Don’t get me started on the application of science and the scientific method and [oh god, the] politics.
There’s a lot of rambling I could do on this subject, and it’s too early on a Sunday to formulate my thoughts correctly. I do like the subject and I enjoy hearing from a variety of positions on it. I at the very least wish it was easier for kids to acquire higher education outside of the bindings of finances and legalism that have long been lord of institutions that claim that kind of authority.
I have always felt conflicted about our current education system because it undervalues creativity and disingenuously puts forth the notion that school is for learning when clearly it is not. Teachers constantly remind us that attending school is imperative to our growth yet fail to explain that we are merely fulfilling a social obligation to become stable adults. Education has always been about the indoctrination of social values, and the apparent lack of effort by teachers to convey this speaks volumes about their system. Mind you, I'm not necessarily blaming the teachers nor those who have constructed such educational pillars; rather, I'm antagonizing our modern perception of the intended purpose of school.
All I really hear about from students these days is which prestigious school people want to attend, but whenever I ask why they wish to attend such institutions they respond with "to get a better job" or "to learn more and then get a better job." Is there not something inherently flawed with this kind of logic? We wish to obtain better jobs, so we mindlessly waste hours cramming an entire textbook before an examination only to retain almost no knowledge of its contents afterwards.
There are cultural nuances regarding this matter and my anecdotal argument is probably not very strong, but I do believe there is a problem with incessantly encouraging young students to pursue academia for the sake of acquiring a _better_ job. As someone who has always been interested in the philosophy and intrinsic beauty of learning itself, I find our current implementation repulsive. My aversion to school has led me to study various types of -- or perhaps attempts at -- artificial intelligence in the hopes that we may one day supplant this flawed approach to education with a more personalized one that can take my preferences into account.
That being said, I'm still relatively young and by the duty of such a description, naive. There is still so much that I do not know about and I'm wondering now how I should proceed. Being a well informed and tech literate person, could you possibly offer me some quick guidance? Thank you.
There is a 3rd option (if you want to work as a programmer):
You practice programming interview questions in your favourite programming language for a few weeks and start applying for jobs. Companies often hire young talented programmers without university degree in entry level positions because they are cheaper.
After having worked in the industry for 5 years no one cares about your degree anymore. The good thing about this 3rd option is that you are learning on the job what the other students learn at university and you don't have to pay for your software engineering education.
Students often tend to overestimate the importance and the benefits of a higher education. Personally I don't like your second option: "self-educate via the internet" because it won't help you getting a job and you will eventually forget what you have learned anyways. For this reason I prefer your first option and my 3rd option.
My take: go back and finish. The signal you are sending is that you can put your feelings aside and deliver even under less than favorable conditions. You'll also need that inner strength if you decide to go your own way.
Also, if you want to go the non-CS route, make sure you learn math, especially discrete mathematics and linear algebra. Besides the ability to finish, that's the other foundation that will make you truly great at software, especially if you are interested in machine learning.
Beyond that good luck! We need more outsiders in the field.
I recommend at least finishing high school, but it's not absolutely crucial. It will just make landing a job that much easier.
Your question and situation is a tough one. If you were young and I knew you, I would tell you:
The world is large. You need to find your own way through it.
Continue to look for advice from accomplished people and ignore advice from people who are seeking to validate their own ego. It can be tough to decode that in real time and real life, but from the sounds of it you'll be fine.
Separate from my sloppy attempts at encouragement - you should never stop learning whatever might help the direction you decide to move in. By that I mean, even if you had a high-level degree in some field, you couldn't stop there.
That is the road less travelled you're going down presently, though. If you want to talk some more (from someone with similar—but not the same—experience), check my profile for a link to my contact.
I'd be happy to talk it out.
Get a job in tech.
Go to school and get a degree in the Humanities.
That assumes there is an ideological basis. It's not hard to rationally connect education with talent and skill, even if we debate the strength of the effect. It also assumes it is a 'norm', not a clear market signal - businesses unquestionably pay more to people with college degrees; you can argue the reasoning, but the fact is indisputable AFAIK. Yes, there are exceptions, but we are talking about vast trends; YMMV.
People have complained since probably the first written rant that others don't work hard enough, or as hard as they did in my day, or as hard as I do, etc. I'm not sure it tells us much besides the consistency of humanity, its capacity for work, and its perception of self and others.
Even if it means building something unmaintainable and generally against your principles of good engineering? Many people will do that, but I hope there are some out there who would value a reluctance to work like that.
Given the rest of the tone of the parent post, I believe this sentence was more intended to emphasize that they did not make excuses for themselves - for example, that they did not know language X, or technology Y. Instead, they rolled up their sleeves and dug in, and figured it out, and got it done.
> So its really not a matter of educated-enough. Its whether the will to perform is, through whatever means, inculcated - and then manifest by taking the actions to get the state of things, done.
> Some of the best developers I've ever worked with, have come from utterly dire circumstances. Some of the worst, too. Likewise for the privileged, candied elite.
A rich GitHub profile is like a college degree:
∙ It's a positive signal, one of dozens possible
∙ Not everybody has the time and resources to get one
∙ Having one does not necessarily mean you're good at your job
∙ Ignoring one would be silly
∙ Requiring one would be silly
There is an infinity of choices we make daily that are completely binary, 100% extreme, both on the individual and the social-group level. They go unnoticed because there's nothing to gain by pontificating "Shall I go 100% in with 'I won't cut off my legs today', or find some middle ground that's not as extreme?" (apologies for the gross example, but articulating inarticulate choices is hard, by definition).
Our cognitive capacity is limited, so naturally we've evolved to consider only the choices that matter. We look for the "middle ground" in only a tiny sliver of decisions.
Which is obvious, but there's an interesting thought in there: To what degree is the presence of "middle ground" in our minds indicative of some potential gain, an arbitrage opportunity? Turning the causality around, can we assume the distinctions that bubble up to our conscious thought are useful hints at profit? There's something primal about spotting gradients: the only place where it makes sense to talk about a "middle".
Sharing my work got me, as an Aussie working mediocre jobs, on the right person’s radar to help me land a job at Facebook. This shaved probably 10 years off my career path. That was a great return on investment and a big leveller for someone who didn’t study at a brand name US college and have an obvious path to the door of a major tech company.
I’m now in a position to hire others. I certainly don’t mind if someone doesn’t have open source code shared. It’s clearly not a priority for many people. But I do appreciate when people have a good backlog of work to show, as it can give a seemingly lacklustre candidate a second opportunity to shine.
There are two critical assumptions you are making here:
1) The kind of development you do is conducive to sharing via small projects on GitHub. For the first four years of my full-time programming career I was working on firm real-time signal processing algorithms designed to run on an MPI cluster of POWER or Xeon machines. My skills were focused on low-level optimizations for that environment and hardware, design and implementation of grid-distributed algorithms, and understanding how to translate radar concepts and mathematical models into code. Anything along these lines that I could have thrown together for GitHub (had it existed back then) would have been trivial and not really demonstrated anything.
2) Everyone operates under a regime wherein they do not need permission from their current employer to release things on the side. Needing to do so adds work and can be a real impediment to even wanting to put things out for public consumption.
I don’t see why you’d bother trying to force your work into that model. If it’s not common in your industry then there is no one to compete against in GitHub stakes. It sounds like publishing a paper or being awarded a patent would have the same effect of signalling that you’re good at what you do.
Github profiles are advertisement for programmers.
Advertising requires resources - time, money, creativity, and commitment - and not every business is ready to pay the price. Sometimes advertising pays off, sometimes it doesn't.
Nobody arrests a business owner for not advertising. Same with GitHub profiles. Decide to have one, contribute a little, much, or nothing at all. It's a matter of preference.
Just like you shouldn't exclude any companies just because they don't advertise.
And no I didn't have a lot of free time. I was working and going to school but when you're young you frankly don't need a whole lot of sleep.
You do if you have health issues, or mental health issues, or kids (some young people have them), or a long commute, or a second job, or if you're not young...
It's hard to recognize privilege when you have it.
If you happen to have rich parents who pay for the best college, that's an awesome opportunity. Nobody should have to feel bad about taking it!
If you are poor, but happen to have a lot of spare time, that's also an opportunity.
Complaining about privilege is not actionable. Focussing on opportunities is a lot more constructive: We can look for ways to give someone an opportunity!
It is actionable though. People can make hiring decisions based on a candidate's work rather than what they do in their spare time, just as they do in every other industry besides tech.
Of which Github and Open Source are but subsets.
The open internet is the substrate on which innovation runs.
They also weren't as visible and easily browsable.
So, you are perpetuating ageism then. Let’s see how cocky you are in 10, 20, 30 years when you can’t get a job because of all the bloggers crowing about their Github profiles...
That's not actually the case! Blind programmers are not rare, and as for the other: https://venturebeat.com/2014/10/01/max-strzelecki-warlocks/
You don't need to be young to code professionally, nor do you need free time. A good job allows you to learn on the job. You need eyes and hands.
You're constructing the viewpoint you want others to see with no data or facts to back it up.
OSS activity is a useful low-effort signal. As usual with implications, denying the antecedent remains a fallacy.
That is an assumption made without evidence. An equally qualified assumption would be that it's a sign of privilege not to need to contribute to open source: you get an awesome education and go straight into a well-paying entry-level position.
Committing to open source is not something casually done at one's leisure. Anything with traction becomes a significant responsibility, and it takes an incredible amount of discipline and sacrifice to stay committed.
I think it's better to consider privilege as multifaceted rather than a single global quantity. And in that case, I don't see such a strong disagreement between you and the comment you're replying to.
I very much agree with your second paragraph.
Or maybe there are both friendly and unfriendly projects and a lot depends on where the underprivileged is trying to make it.
And if they already are an excellent programmer and hitting all expectations, why should they do this? In fact, an employee who’s spending 5-6 hours/day on side projects might not be the best person for the company.
I don’t think recruiters look all that deeply either, I think you could game this by writing some docs or readme corrections to all of the top projects on github and get the PRs accepted.
Like most people I know, I use open source software in my day-to-day job even though I work in a company who provides proprietary software and services.
That means that when I do encounter errors or bugs in those open source components and libraries I can make the choice to either write workarounds locally in my code, or I can choose to submit patches back upstream.
That’s not a matter of privilege. That’s a matter of choice.
I also think it’s reflective of a developer’s attitude: does he prefer things done “properly”, or will he just hack together something which works? Will he share his solutions or not?
Then that’s a company with some pretty serious problems, and it’s going to show in all of their products. If you as a developer decide to stay at that company, then that will also reflect on you.
I'm just telling you how perception by association works. Your choices are entirely your own.
For that to work, the people you go to interview with have to have some idea of what your previous company is like, which is very unlikely.
Also, "I had bills to pay" should erase 99% of your "perception by association" concerns. And guess what? You can't verify that. And if you do try to verify that, you're the problem.
Look at the candidate, whatever they choose to present as their experience and resume, and make a decision based on that. Don’t ignore anything they send you and don’t discount them because they didn’t send you something that someone else did.
I keep hearing this. Do you have any data to back it up?
Also, larger companies internally operate much like an open source community, having multiple projects that accept contributions from anyone in the company.
Reporting and fixing public issues as part of your employment is a sign that you care about the ecosystem that you are part of, giving back rather than just benefiting from it.
It's not necessarily a showcase of your coding skills, because these contributions are usually small, but it shows that you will get your hands dirty if needed and fix the problem at the root, instead of hacking a workaround in your own code base. It also shows that you have no problem with learning a new code base and delivering code according to that project's quality standards.
If you don't have any public activity it may imply the opposite: that you are just freeloading off your community, that you are prone to doing workarounds instead of fixing the problem upstream, or that you are unable or reluctant to contribute to other projects.
If you can tell your employer you'll spend the next sprint or two fixing a bug upstream, good for you!
From my experience working with complex libraries, when you hit a bug or a corner case, it is probably not going to be an easy fix. There's a reason many open source projects are triaging low-hanging fruits for first-time contributors.
As for the complexity, from my experience it's a mix. There are indeed many complex issues, but often times the fixes are relatively easy and take little time to fix.
For the hard ones it's often enough if you can contribute to the conversation to better understand the problem, providing valuable information on how to reproduce, so that someone familiar with the code base can solve it easier.
Some companies will hire only people from top schools; other companies put a lot of focus on understanding algorithms; others just look at your previous jobs; and some may just look at your GitHub profile.
If you are looking for a job, you need to understand that and apply to the right companies. Depending on your background, not every company will be a great fit.
If a CS degree is a hard requirement, and you don't have one, then applying at that job is a waste of time.
If you're applying to a company that works on Open Source and they have popular, public Github repos, and you have a blank Github account without any contributions, you're not going to make a good impression.
The majority of employed developers don't have regular open source contributions, especially those in high-pressure jobs that already take all they have, both in time and in how much effort they can spend per day. If the company rewards contributions even a little bit, the people who give the company more (or come in better rested) get rewarded more than those who give the company less. That drives employees' incentives and behavior.
Yes, there are junior jobs available to people with two months of coding experience. That is how it should be, and there is nothing wrong with a company seeing your OSS code as one way to prove you can do it. The moment it is expected from everyone, guess what, people like me will have tons of advantages over the homeless former "big scholarship to go to university" student who had to drop out due to personal reasons.
The senior developer market is not flooded with people with large open source contributions, though. It is just not the case. Why should open source on GitHub be privileged over a random portfolio in the case of juniors? Or over open source contributions that are not on GitHub?
Lastly, I don't think large open source projects can handle an influx of juniors using them to prove themselves - which is what would happen if companies really required OSS contributions for hiring. OSS projects often have a hard time staying on top of existing pull requests.
That's a new take. As someone who lived through that "war" in the 90s, the typical refrain from the "closed source" crowd was that open source developers were merely hobbyists, whereas closed source was built 100% by professional software developers.
I remember the thing you said too, but only in the context of "it is obviously stupid" and obvious manipulation from evil Microsoft. I still think that part was true (that it was manipulation from MS, which was remarkably unethical in this war).
He would be helped the most if you hired him as a junior, with a salary, and provided mentoring the same way other juniors receive it, instead of expecting him to spend months homeless in a library hoping to run into a project that's a good fit for him.
The overwhelming majority of companies don't have 100 CVs per position; our salaries would drop quickly if that were the case.
I've never written a cover letter. Is that still done for tech positions? I haven't heard of it.
That's true, but a major industry shift has already happened. Literally everyone is building on top of OSS.
> The overwhelming majority of companies don't have 100 CVs per position.
You get around that many CVs unless your company sucks (Glassdoor reviews) or you don't want to pay (startup).
Pre-screening is done by HR/agency. In some cases, you have even more leads that your recruiter is cold calling. The engineer running interviews will get 10-30 CVs per position.
They are building closed source on top of OSS. They are producing closed source while using open source libraries. I find it odd that you would limit your pool of applicants to those from a few OSS projects and reject the rest of MS employees.
That is my point: the majority of employed programmers produce closed-source applications at companies that expect them not to slack off. That OSS is used internally does not matter.
> You get around that many CVs unless your company sucks (Glassdoor reviews) or you don't want to pay (startup).
Uhm. Again, why are salaries not falling? Why is everyone crying that they can't find people? Why doesn't that company specify in more concrete detail who they are looking for, so that people who won't like the position self-select out?
> Pre-screening is done by HR/agency. In some cases, you have even more leads that your recruiter is cold calling. The engineer running interviews will get 10-30 CVs per position.
That would explain a lot, mostly why there is a simultaneous cry about a lack of people, too many applicants, and recruiters doing cold calling, all at once.
All in all, it sounds like broken hiring process.
In my day-to-day, I depend on OSS to get my job done. If there is a bug, I am allowed to take the time to build a patchset and push it upstream. It's good for everyone: the company benefits from not having to maintain a fork, the developer gets to build his GitHub profile, and the rest of the world can also use the fix.
Not having a GitHub profile is also a signal: either the person doesn't take the time during normal work hours, or they are prevented from contributing back by company policy.
I like the rest of the article but this statement is disturbing. There are plenty of great open source projects which never got off the ground in terms of popularity.
Quality is not necessarily correlated with popularity.
a) it doesn't require a lot of resources to make meaningful contributions (just access to a computer and the internet) that could land you a really good job.
b) it's probably the only industry that doesn't require formal education. Case in point, only one person at my company has a CS degree, and that's me.
There are cases. But most probably are survivorship bias.
I would bet that most current IT professionals have at least a lower-middle-class background.
If I take a look at my company, that at least rings true. Not sure about the US, though.
I don't quite understand why people seem to think success doesn't require sacrifice anymore. You can either sacrifice your time and money for a more streamlined education and get a college degree or you can sacrifice a lot of your time and take a risk to self-educate.
There's really no middle ground and both strategies can land you a job if you're good, but becoming good always requires effort and some sacrifice.
I think this is because the large majority of tasks don't require formal education (the proverbial CRUD apps, etc.). And that's OK.
For the record, I've got a master's degree in CS, but I can't remember the last time I had to utilize that knowledge. Hobby projects? Sure. But not daily work.
Great article, by the way! I was also on the side of "it's hard to do PRs, and GitHub-as-resume is a bad idea", but it turns out that it's surprisingly easy to start, and the experience of working with people who have similar interests is really fun!
The line about "survivorship bias" also jumped out at me as out of place. It's probably a phrasing thing, but I think granting that survivorship bias could be a factor, even as you assert that it is not a dominant factor, would have made the piece stronger. The way it was phrased seemed to deny the possibility, causing me to raise an eyebrow in skepticism, but that doesn't seem to be what you believe.
I absolutely agree with your central thesis that open source can provide a ladder for the underprivileged. At the same time, it's also undeniable that some people are constrained from contributing to open source due to economic and social circumstances, and I don't think those two ideas conflict. Does that make sense?
Popularity isn't the real issue - it's, "What can you do?" or, "Show me what you've done."
When interviewers say, "Tell me about yourself," from my own experience, they mean - what can you do? Not hobbies.
Photographers and designers create portfolio websites to get clients (Instagram too). GitHub or OSS provides the same for programmers.
Doctors and other professionals do the same by placing their diplomas, awards, and certifications on their walls for clients to see. And to gain trust.
In summary, the open source project does not have to be popular. It's simply proof of ability.
Eh, it definitely is correlated. Maybe you wanted to say the correlation is not perfect?
One of my open source projects has almost 5K stars on GitHub, so I should be preaching to the choir here, but that would be disingenuous.
I think that a lot of it comes down to having perfect timing... The kind of timing that only dumb luck can deliver.
Building a decent quality project is just the baseline requirement...
Saying that project quality or usefulness is the determining factor to achieving popularity in open source is like saying that having a pair of legs is the determining factor to being qualified for the 100m sprint in the olympics.
It is also a bit like the problem of academic publishing; you might prefer to publish in an Open Access journal and on arXiv, but feel that you have no choice for your academic career other than to pay dearly for publishing in an "influential"/"reputable" journal owned by Elsevier, giving up all rights to publish elsewhere (and whatever other rights they feel they can push).
Many developers in the West have higher disposable income than their less lucky counterparts, who might rival them in absolute skill.
But those developers want to spend more time with their families.
Remote work has not taken off, even if we assume it's more profitable for a company to hire cheaper staff in low-COL locations. This might mean that a lot of what happens in a startup has to do with maintaining the perception that smart geeks are working on a difficult global problem.
There is an arbitrage opportunity here in hiring ghost developers from low-COL locations (similar to ghostwriters). Why not support developers from low-COL areas? That's what creates the global economic equality an egalitarian society always strives for, right?
So, if anyone comes to your office you'll have people looking seriously absorbed by the difficult problem but the real work will be done elsewhere, in return, you get peace of mind and more time with your family. Instagram pages and blogs will be lush with happy tech employee faces and "long stories of slaying tech demons".
Is this happening?
I've personally seen people make one-liner commits deliberately, to get a good graph.
On the other hand, some people write code because they actually created something real, which solves a real problem.
And the difference is quite clear if you're looking for it.
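One way to make "looking for it" concrete: a naive heuristic sketch of my own (not any real tool) that, given per-commit line-change counts (as you might parse from `git log --numstat`), measures how dominated a history is by trivial one-line commits:

```python
def trivial_commit_ratio(lines_changed_per_commit: list[int]) -> float:
    """Fraction of commits touching at most one line.

    A ratio near 1.0 suggests a contribution graph padded with
    one-liners; a low ratio suggests substantive work. This is only
    a rough signal, not proof either way.
    """
    if not lines_changed_per_commit:
        return 0.0
    trivial = sum(1 for n in lines_changed_per_commit if n <= 1)
    return trivial / len(lines_changed_per_commit)


graph_gamer = [1, 1, 1, 1, 1, 1, 0, 1]      # a streak of one-liners
real_project = [120, 45, 3, 88, 12, 1, 30]  # mixed, substantive commits

print(trivial_commit_ratio(graph_gamer))   # → 1.0
print(trivial_commit_ratio(real_project))  # ≈ 0.14
```

Of course, plenty of legitimate commits are one-liners (typo fixes, version bumps), so a heuristic like this only flags profiles worth a closer manual look.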
Arguably, the github community suffers from the "celebrity" effect that facebook/Instagram have.
But I digress.
Comparing a Github account to a college degree is crazy.
One is a piece of paper; the other is a full history of the person's work, available at a mouse click.
I am about to clear the first year of my univ, and honestly couldn't care less about what the univ taught me. I can't even recall most of it.
What I can show you is my GitHub profile, where I did real work and built real software that actually works (including some IoT projects I personally use every day).
Not to mention the other projects that didn't make it onto GitHub, but did make it into a private Bitbucket repo.
Github is to this generation of programmers, what View Source was to the early web generation of programmers.
Without View Source in 1996, and with no access to education or work at that time, I would not have learned how to make complete websites and then moved on to generating them automatically (HTML generation via a CMS).
Github is important and critical, but not for hiring those already in the industry... for learning and to make this industry accessible to those who are under privileged in terms of access to education, mentoring, code clubs, etc. And then for recognising those people during hiring and giving them the break they deserve.
Today, you can look directly at someone's craft on GitHub. This has given a lot of confidence to those from the lower rungs of society: hey, I can create better things than this person even though I'm not from such an elite university. Along with the meritocracy in tech, this levels the playing field, where those from better universities should found startups using their connections instead of working for companies.
The harm comes when people deny the existence of privilege and other forms of luck, and attribute success to their own moral superiority and the moral inferiority of others.
The most "fair" system can't survive a single generation of you allow parents to invest their resources in competitive advantage for their children – which is one of the most basic human desires.
Some companies have put this clause in contracts for shorter-term project-based contract work, but I've just crossed it out, generally with no pushback.
What you're willing to give the company is probably the same as what's really important to them: rights to IP you create in the course of your work for the company or during work hours or using company resources.
There are ways to phrase this such that - if you're a reasonably desirable hire - the vast majority of HR departments will be willing to substitute for the blanket terms.
If you're not sure how to negotiate this, explain that as an example you want to be able to continue to manage the website for your rotary club / kids scout program / whatever; and that the current phrasing would give the company the rights to that website, which there's no reason they'd want, and prevent you from giving back to your community.
I'm a former network guy who is learning web development. The amount of software I've installed is kinda crazy compared to what I used with networking. Everything I've used is free and crazy high quality, I haven't felt like I'm missing anything. In the networking world it is a lot of custom software and proprietary software. There are free options for some things but they're not nearly as polished or fully functional out of the gate.
This is the same argument we make against closed access research papers. The goal isn't to make it easier to hire poor people. The goal is to give poor people knowledge and resources so they can build themselves up.
Pretty much all designers have some kind of public portfolio such as a website with example of their previous work. I see GitHub as a centralised (and, for the most part, respected) place to put your work for developers, even if you don't want to use it for collaboration but simply as an advertising platform.
However, it does lead to the same kind of problem you get with employers not looking at a candidate who doesn't have a LinkedIn profile. You will inevitably miss out on some excellent people who don't wish to, or can't, use such a platform.
But that wasn't the main driver for me working on open source. I think it is a nice side effect, but it shouldn't be a requirement; and GH is definitely not your CV (especially when there's so much open source not on GH!).
That said, when I've been in a position that I could influence the hiring policy, I gave value to open source contributions because it did align with the ethos of the company.
I guess it may not be important depending on who's hiring, but it is always useful if it is there and what you see is good stuff.
That's one heck of a premise.
Open source contributions can't be faked (except by real name collision), and in any case, there is much more detail available than a few lines or bullet points in a resume.
They can perform similar tactics on github: like mirroring an obscure but technically in-depth project without attribution, paying someone to write code and then committing it in your name (I've seen this done!), or making sure to get lots of extra commit density by doing one feature or bugfix after successfully landing a series of reformatting PRs.
An applicant planning to fake their way past you is genuinely hard to catch unless your process is tuned for it. The idea that there is a one-size-fits-all interview process and that the process is as simple as "check their GitHub" is lazy to the point of malpractice. It's a goofy premise, and it's playing right into the hands of people who want to abuse you.
And this is just addressing the problematic nature of modern tooling for positive signals. Given the pervasive nature of harassment culture in the open source world, you should ALWAYS excuse the absence of open source work. It cannot be a requirement because the open source world can often become extremely hostile and/or political.
I'll share a model that's worked for me in the past.
Before you even write the job description, define the 3-7 traits you need to hire against. These can be as fuzzy as "gets shit done" or as specific as "5+ years marketing SaaS to $100k+ accounts" or "is effective at resolving incidents while on-call."
You need to be able to rate people on this. Binary is fine ("yes they can get shit done"). 3-4 levels is good (bad / good / total superstar).
Don't go overboard with granularity: you need to fit this all in your head for a couple candidates at a time when you compare them.
As you go through your process, try to get good information about at least one of these areas at each step.
My favorite thing to do is, between in-person interviews, be able to tee up the next interviewer with "I got mixed signals about communication style. Can you dig in?"
When you wrap up, make sure you cover all the boxes. Don't get smitten with a candidate with amazing experience who can code circles around everybody else who's never worked in a 25+ person team (assuming that's what you agreed was important).
When you're ready to make a decision, you now have a reasonable set of trade-offs to make. I tend to create rubrics where solid scores in 3 of 5 categories is a Hire, to allow us to compare different types of candidates.
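Made concrete, a rubric like that might look like the following sketch (the trait names, level labels, and the 3-of-5 threshold are all hypothetical examples, not a prescription):

```python
# Map rubric levels to numeric scores: bad / good / total superstar.
LEVELS = {"bad": 0, "good": 1, "superstar": 2}


def is_hire(scores: dict[str, str], required_good: int = 3) -> bool:
    """Return True (Hire) if at least `required_good` traits
    score 'good' or better."""
    good_or_better = sum(1 for level in scores.values() if LEVELS[level] >= 1)
    return good_or_better >= required_good


candidate = {
    "gets shit done": "superstar",
    "communication style": "good",
    "on-call incident handling": "bad",
    "domain experience": "good",
    "works in 25+ person teams": "bad",
}
print(is_hire(candidate))  # → True: 3 of 5 traits are good or better
```

The point of keeping it this small is exactly the one above: you can hold the whole rubric in your head while comparing a couple of candidates, and trade off different candidate shapes explicitly instead of by gut feel.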
Ok, back to GitHub.
GitHub can give you VERY STRONG SIGNALS for some of these issues. It's great to be able to know up-front that somebody is comfortable working with teams they don't personally know well (typical in big open-source projects).
The converse to that is that I generally expect some strong signal up front from any candidate. If not GitHub, give me a respected company on your resume. A recommendation from a mutual friend. A well-written cover letter.
When folks complain about 100s of resumes, it's because the problem is this lack of signal.
An aside: it's on you to allow folks with non-traditional backgrounds to succeed at this step. Write a better job req, or go email likely candidates. Diversity requires multiple ways in.
Once you're a bit further, this cuts the other way. If "good communication in pull requests" is important to you, don't spend any time here with candidates who have strong Githubs.
You already have all the data you need. Spend your time elsewhere. There is no extra credit.
GitHub can help you a bunch when evaluating candidates, but it's crucial to stay away from the notion that _anything_ during an evaluation is a silver bullet. You always have to do the work of interviewing well, and it's always specific to your company.
Finally: if you have a process that allows you to fully evaluate a candidate by reading their public GitHub, you're almost certainly hiring for the wrong things.
To suggest that pull requests are a good hiring signal (as the author does in explicitly endorsing the "screening" of open source maintainers) ignores this bias. That can be positive or negative, depending on your commitment to diversity in our field.
The relevant bits are two factors:
* Is the person sending the pull request someone already known to the project, or an outsider?
* Is the gender of the pull request's author easily determinable?
It would be unsurprising to find that "outsider" pull requests get merged at a lower rate. But "outsider" pull requests from women merge at different rates depending on the second factor: when it's easy to determine the gender of the author, a pull request from an "outsider" woman is significantly less likely to be accepted than a pull request from an "outsider" woman whose gender cannot easily be determined. Also, "insider" women and women whose gender is not easily determinable have higher rates of merged PRs than their male counterparts.
There's no reason to suspect that whether someone's gender is easily determinable by a PR reviewer (usually, via username, profile picture, other clues like linked blogs or social-media profiles) has an effect on the quality of their code. So while this of course doesn't prove gender bias (we'd need telepathy for that), it does very very very strongly suggest it.
This is also in line with a lot of prior research and reporting on blind interview and auditioning practices, so shouldn't be too surprising, but for some reason people love to insist that tech is somehow different from all those other fields where hiding someone's gender changed how they were evaluated.
Did you read your link?
> The hypothesis is not only false, but it is in the opposite direction than expected; women tend to have their pull requests accepted at a higher rate than men! This difference is statistically significant