For a split second, I misread the name of the author of the article as "David Gelernter,"[1] a professor of computer science at Yale who is perhaps so famous that he needs no introduction here. On closer reading I see that the author is Daniel Gelernter,[2] who I think must be David Gelernter's son.[3] Daniel is part of three generations of the Gelernter family who have thought deeply about computer science and its implications for society, so I am all for reading the article closely and grappling with its argument, and I am grateful that it was submitted here on Hacker News (a friend just told me about it on Facebook).
The thrust of Daniel Gelernter's argument here is that there is a shortage of programmers with a certain kind of skill set suitable for working in startups, because large businesses that grew up as successful startups have bid up the price of programmers with those skill sets. "Part of the problem is that startups have to compete with hegemons like Google and Facebook that offer extraordinary salaries for the best talent." My oldest son began working at Google a month ago, so for the last couple of months family conversation has included discussion of Google's hiring process and how working for Google compares to working at a startup (my son's previous employer). The programmer job market has a lot of interesting features, but one of those features is indeed a fairly high premium on software engineers who have a love of coding (who are self-motivated, whether or not the job market rewards the motivation, in learning more and more about coding every day) and who have good problem-solving skills, a business orientation to doing work that meets customer needs and builds a profitable company, and good written and oral communication skills. There is a severe shortage at all levels of employment of software engineers who meet those other requirements.
The author's other comment to note is "The thing I don’t look for in a developer is a degree in computer science." It's empirically true that young people can get great jobs as software engineers at the most selective employers even if they don't have degrees in computer science--as long as they can handle a tough series of technical interviews. I've seen it done. Many other commenters on Hacker News have noted that software engineering, more than most occupations in the United States, has processes for finding people with actual technical skill irrespective of whether those people have college degrees. If you are looking for work, having a college degree may provide many benefits (among those the ability to gain a visa to move to another country where you would rather live), but the main thing to gain is the knowledge and experience that helps you solve actual problems in industry with code that works. Some people gain that knowledge and experience through a computer science degree program (or at least DURING a computer science degree program), but others gain it through other channels, and the smartest companies look for worker performance more than they look for degrees.
I would add that, having stripped out the domain-specific part preceding these words of yours:
"[... people] who have good problem-solving skills, a business orientation to doing work that meets customer needs and builds a profitable company, and good written and oral communication skills."
Your following sentence is also true with the modification I've made in brackets:
"There is a severe shortage at all levels of employment of [people] who meet those other requirements."
With programming skills or not, people with these characteristics have always been rare. We can gather lots of them for a while for massive national efforts like the Manhattan Project (something I've been studying in detail for the last few months), but in normal times all organizations, from Google to just another startup, have trouble finding these people who will make or break their ventures.
Yet another example of someone mixing the concepts of scientist, engineer, and technician in the computing world.
You wouldn't hire a food scientist to control and feed the machines that produce the bread in an industrial bakery, or even to design and build that production line. Producing code is not as closely related as one might think to the science that leads to the algorithms and design patterns that underpin the languages and frameworks that enable the tools and techniques ultimately used by those who produce code.
The sooner all concerned (me, you, academics, entrepreneurs, employers) settle on a suitable terminology for what it is that computer scientists, software engineers, and developers need to learn to do what they do, the sooner we can get past this type of article.
Absolutely. This is what I posted on the other copy of this article:
The problem is that even if you understand those concepts well, you're not necessarily able to apply them after most degree programs. If your startup is in the business of creating new algorithms to solve various problems, that's one thing, but what most want is someone who knows what existing algorithm should be used to solve a problem in the best (or at least an acceptable) way, and can do so in a manner that's both expandable and maintainable. Ideally, they're looking for a craftsman, not a scientist. These roles can occasionally be found in the same person, but in my experience that's extremely rare.
> A junior developer fresh out of college can expect to earn around $10,000 monthly, plus benefits, a $100,000 signing bonus and $200,000 in stock options.
Maybe I'm out of touch with the realities of the world, being an ancient 31-year-old PHP/Java developer who's never written a line of something cool like Go in his life, but $10k post-tax right out of college doesn't seem like a realistic expectation for every graduate.
In the midwest, a dev with no professional experience straight out of college can expect to earn around $3,450 monthly (post-tax), plus benefits, no signing bonus, and no stock options.
It's really pretty irrational (in terms of pay and career progression) to be a dev in the Midwest. Even if you don't like Bay Area cost of living, there's Portland, Seattle, Boston, etc. The only reasons I can think of to stay would be a serious desire to own a suburban home or to live near your parents.
Is it really? Bankrate.com's cost-of-living calculator tells me the equivalent income in San Francisco if you're moving from Columbus, Ohio earning $110K is $213K. That would be an approximate salary of a senior developer. And these are jobs where 40 hrs is the norm, in addition to medical, dental, disability, life insurance, 401k match, and bonus.
Personally, if I were to choose the Midwest, it'd be Chicago.
Glassdoor has an average software engineer in Chicago making $75k. Bankrate says the equivalent income in SF would be $113k, and Glassdoor has median salary in SF as $103k.
$10k is a lot of money to be sure, and you are paying a premium for SF, but it's not always so dramatic.
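The arithmetic behind that comparison can be sketched out. This uses the Glassdoor and Bankrate figures cited above; the cost-of-living multiplier is derived from those numbers, not an official rate:

```python
# Back-of-the-envelope cost-of-living comparison using the figures
# cited in the thread (Glassdoor salaries plus a Bankrate-style
# equivalent-income figure). The multiplier is an implied value.

CHICAGO_AVG = 75_000      # Glassdoor: average SWE salary, Chicago
SF_MEDIAN = 103_000       # Glassdoor: median SWE salary, San Francisco
SF_EQUIVALENT = 113_000   # Bankrate: SF income matching $75k in Chicago

# Implied Chicago -> SF cost-of-living multiplier
multiplier = SF_EQUIVALENT / CHICAGO_AVG   # ~1.51

# The SF median expressed in Chicago purchasing power
sf_in_chicago_terms = SF_MEDIAN / multiplier

print(f"implied multiplier: {multiplier:.2f}")
print(f"SF median in Chicago terms: ${sf_in_chicago_terms:,.0f}")
```

On these numbers the SF median actually buys slightly less than the Chicago average, which is the point: the SF premium isn't always so dramatic once cost of living is factored in.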
Employers seem to be pretty good at adjusting salaries by geography according to cost of living. You can't really "win" unless you can convince someone to pay you a Bay Area salary to live in the Midwest. And maybe you can, but if they're smart and the market is efficient, it won't last long.
None of these are exactly cheap cities (Portland might be the closest), and not everyone wants to move to a city just to do a job they could have done from home.
There's attractive urban living in the Midwest (and the South) too. Coast = urban cool stuff, Midwest = suburban lame stuff is a huge oversimplification. I'd actually say the opposite: if you desire to own an urban home, you have a much better shot at it in the Midwest than you would on the coasts. An average dev income will push you to the burbs on the coasts, but not necessarily in the Midwest.
Well, the big exception is Chicago. ThoughtWorks is not held to be a great employer and requires a lot of travel; other jobs are often very corporate, working on enterprise software in drab low-rise office parks in the north suburbs. The only thing that really resembles the SF office culture (AFAIK) is in quant trading firms. I guess I'm discounting those out of personal distaste for HFT.
This is the WSJ, their audience is going to be a bit more niche than all software developers. For a grad of Harvard or Stanford CS or MIT C6? I'd say that's a reasonable starting salary.
Of course not. He doesn't need some computer science majors. He has a little app which allows you to search your mailbox.[1] Well, actually he doesn't have an app. The site says "Coming next spring". You can "preorder" the app. So he's pre-announced a product to be written by people he hasn't hired yet.
And, of course, it's "cloud based". It sucks up all your email into their servers, where they get to look at it. Privacy policy? What privacy policy?
Wow what a ridiculous article. For starters, only an extremely small slice of fresh-out-of-college developers would get a 120k salary and 300k in bonus and options. This is still absurdly high, if it's even true (and not exaggerated for the purpose of getting an article into the WSJ). I've noticed a sharp uptick in CS grads, and it seems starting salaries are actually nosing downwards. Or at least that's the case here in Boston.
This article is nonsense, and whoever the author is, I guarantee you he has no idea what he's doing.
The article identifies who the author is. He is the leader of a startup. People who recognize his name know that he is the son of a very famous computer scientist (and the grandson of another computer scientist).
For those who don't want to register, just delete the "http://" from the URL and search for the rest. Click the link from Google or whatever and the page will load without the registration popup.
It's interesting how the author laments that CS graduates are not up to date with the latest technologies while admitting himself that the technologies "change every 10 minutes". By this logic, students starting a CS education should have no idea what they just signed up for, since the up-to-date courses desired by industry do not even exist yet - heck, the technology does not exist yet.
That said, I personally know several academics who argue that it is simply not a university's responsibility to follow the demands of industry. While a bit extreme, I see the point they are trying to make.
It's a full-time job to keep up with the industry, let alone change it with research. I think a little lag is OK so long as it doesn't teach BAD practice (e.g. thank god academia has moved from CVS to git).
Because many of the topics covered in a typical CS curriculum have direct applicability in industry. For instance, many courses in the area of software engineering make it a goal to try to simulate the "real life" conditions of a software project (e.g. working in a team, having stakeholders with different needs). So it would be hypocritical to then claim to not pay attention to what industry says it needs.
Of course, this should not mean that whatever industry asks for is law - sometimes they simply get it wrong (it is usually not very difficult to realize when), which is exactly the point where universities can help by not educating students to repeat the same mistakes. Something like saving companies from themselves, I guess.
College/University software tracks are very poor in teaching you how to develop SaaS type programs. A bachelor's for CompSci will introduce you to different types of programming, the majority of which you will never use in professional development.
There is a real disconnect between the skills taught in college and the skills needed in the work environment. If you're going to college to learn coding so you can get a job, then you're being done a disservice by being taught things you'll never need.
I wonder how effective these "hacker" schools are. The ones I investigated didn't seem interested in best practices; the focus was really on getting code on the screen. That can be a hugely damaging mindset when working on large, complicated software.
The fact that he first dismisses coding bootcamps and then concludes by describing what sounds like a bootcamp as the solution, makes his entire argument hard to take seriously.
What the author is saying rings true. I've been interested in electronics since I was in kindergarten. I worked as an EE without a BSEE for 30 years after working as an electronics technician right out of high school. I did get a Bachelor's in Computer Science ten years in, with the help of tuition assistance from the company I was working for at the time.
I enjoy building hardware using microcontrollers and FPGAs, and have several open source projects on GitHub.
Companies put too much emphasis on credentials, and not enough emphasis on recruiting and mentoring people who build electronics and write software for fun. It is these people who build things which change the world.
A computer science degree in "iPhone or Android development" isn't a degree in computer science. I've worked with people ranging from self-taught dropout coders to CS and CompE PhDs, and I would say that the very best of the bunch had solid backgrounds in computing fundamentals and theory. Personally I favor CompEs with a love of both coding and embedded work. Working within a constrained environment close to or at the bare metal gives people an understanding of how real computer architectures and software interact, how code will perform, and how to write simple clean code that delivers features without succumbing to enterprise-grade bloat. If you can find a candidate like this, snap them up and give them a leadership position.
I was unaware that there was such a thing as a "CS degree in iPhone or Android development", and if there is - it's a horrible misappropriation of the term "CS".
On a related note I strongly believe that taking a 12-week course or a bootcamp in [insert latest flavor of the day here] does not allow you to call yourself a software engineer or a computer scientist. Anecdotally, I've noticed a marked increase in that attitude (and another term I intensely dislike - IT) since moving to New York. This attitude grossly misrepresents what it means to be a software engineer or a computer scientist, which is much more about an approach to analyzing and solving problems using a combination of experience and first principles in face of tradeoffs, than using specific tools or languages.
I think this is an AND, not an XOR. Someone graduating with a degree in a technical discipline should have both up-to-the-minute skills in that discipline and also a sophisticated understanding of the theory of the field.
If you have only the former, you are a technician, who understands what to do and how but without a deep understanding of why. (And there are plenty of programs that turn out such people in the software field, typically through two-year degrees in community colleges.)
If you have only the latter, you are a theorist with a deep, sophisticated understanding of hard problems in your field, but without the skills to build much of anything right now. You know the why, but not the how. But if that's really what you want, try a degree in math or maybe physics.
Technical undergraduate degrees are supposed to bridge this gap by teaching both the how and the why, producing people who both understand the subtleties and can do real work right now. That's really useful, which is why these graduates are so sought-after. A university that neglects either side of this span is producing people who are a whole lot less useful, and that's what Gelernter is pushing back against in his article.
> Someone graduating with a degree in a technical discipline should have both up-to-the-minute skills in that discipline and also a sophisticated understanding of the theory of the field.
This falls apart at graduation + n years, unless you accept that up-to-the-minute skills must be maintained and regathered as the minutes progress.
If it's possible for an educated mind to continue to keep up-to-date, then one must accept that an educated mind, freshly graduated with a sophisticated understanding of the field can, and probably should, pick up the up-to-the-minute skills as part of the on-the-job training any new graduate acquires.
There's always a ramp-up in any new role, especially so for someone who's never worked professionally before. Given a choice, I'd rather aim for an education providing the depth, and the up-to-the-minute ramp-up happening on the job rather than the other way around, because I've never seen a developer's day job provide any significant theoretical depth beyond what is required to get the job done. If you don't learn how compilers and operating systems and low-level memory management and design patterns and concurrency and big-O notation work before you hit the workforce, you're not very likely to pick all those up at work.
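As a tiny illustration of the kind of depth meant here (a hypothetical example, not from the article or the parent comment): knowing that a membership test is O(n) on a list but O(1) on average for a set is exactly the sort of thing a day job rarely stops to teach.

```python
# Worst-case membership check: an O(n) linear scan on a list vs. an
# O(1)-average hash lookup on a set holding the same elements.
import timeit

n = 100_000
as_list = list(range(n))
as_set = set(as_list)

needle = n - 1  # last element: worst case for the linear scan

t_list = timeit.timeit(lambda: needle in as_list, number=100)
t_set = timeit.timeit(lambda: needle in as_set, number=100)

print(f"list: {t_list:.4f}s  set: {t_set:.4f}s")
```

The set lookup wins by orders of magnitude at this size, and a developer who learned the theory first doesn't need to run the benchmark to know which container to reach for.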
I think he's making a common mistake: a university is not there to teach specific skills but to teach you the broad elements of a field.
It should be food for thought as we discuss this article that the author is the son of a famous computer scientist who is a professor of computer science at Yale. I think the nature of a university education and what a university education is for was probably dinner-table conversation at his home as he was growing up.
Another hypothesis would be that some factions would like to blur the line between technical colleges and universities. A third hypothesis is that this has already happened.
So the germ of a good idea in the article is that there's a distinction between Computer Science and the techniques used in the actual production of software (call it Software Engineering, but obviously that's an overused term).
I think you can make a very solid argument that universities would do well to offer a Software Engineering track alongside CS, one that focuses less on formal mathematical analyses of algorithms and reticulating splines, and more on practical concerns of software teams -- different approaches to the SDLC, ways to work in a team, how to evolve existing code without breaking things.
But the argument he actually makes -- that universities should be teaching "iPhone or Android development" and spend multiple semesters on that... yeah, no. If you want a trade school that's narrowly focused on teaching people a job skill that's in demand this very second, I'd argue you're misguided, but whatever; either way, that's not what universities do, nor what they should do. The point of a university is to teach you things that won't be worthless a few years after you graduate.
"A serious alternative to the $100,000 four-year college degree wouldn’t even need to be accredited—it would merely need to teach students the skills that startups are desperate for, and that universities couldn’t care less about."
- because everyone should be training to work for a startup?
- why not hire passionate people and train them if university trains them badly? Sounds like people are trying to find perfect developers and of course the competition for those is huge.
CS and SD aren't the same subject, though it would be great if CS degrees had an "intro to software development" class required in the first year or two.
It would teach such things as the SDLC, waterfall/agile, a DVCS, code maintainability, a documentation system, working with others, writing tests, QA, project management, etc., etc.
Computer Science teaches useful stuff that helps with problem solving. It's not useful APIs or revision control. Those things you can learn on your first job.
Did you find the one at kamusnias.info? That's an exact copy of the WSJ article.
I'm not sure why you couldn't get it through Google, since searching for the title works OK for me - it shows up as the featured "In the news" link at the very top of the results page. You can also try searching via news.google.com; that should also get you through the paywall.
[1] https://en.wikipedia.org/wiki/David_Gelernter
[2] https://www.linkedin.com/pub/daniel-gelernter/11/a57/251
[3] http://www.wsj.com/articles/SB100014240527023032815045792222...