This anecdote neatly encapsulates the impression I've had of a lot of them.
I'd really like to meet the best of the crop, because I can't imagine PG and Jessica (she's usually the expert on character and personality in YC interviews) are intentionally picking petty people.
I'm not the only person in the startup industry who's had this impression either. They remind me a lot of underqualified legacy Harvard kids who think they're god's gift because they snagged a status symbol.
Maybe the better founders are too busy shipping product to run into me :)
Using management skills as a primary selection criterion would be premature optimization in YC's process. At some point, hopefully HR is screening rather than the founders, but early on putting up with the founders' bullshit is part of a necessary skill set.
When I was at my startup, I hired a guy with no Django experience and minimal Python. He made his first bug fix within 24 hours of showing up and was productive within a week.
Maybe RoR has a much steeper learning curve, or maybe the people doing the hiring just don't get it.
I had the same experience with Django walking in with the same repertoire (minimal Python, no Django) back when I lived in NYC.
I don't do Django anymore, but it taught me a lot about trusting people to pick things up on the fly. Especially with the right person and environment.
That aside, both of my interviews at YC companies (two, although I've talked to a lot of YC'ers besides) were with profitable companies, so I would think the typical runway concerns weren't dominating their thinking.
Seems like it would be so drastically high that hiring an MVC (Minimal Viable Candidate) might be the right choice.
Familiarity with Active Record, and making it reasonably performant, handling Rails security issues, experience with the gem ecosystem, understanding the intricacies of the routing DSL, getting a handle on reasonable rspec/capybara or cucumber practices, Ruby metaprogramming, knowing the Rails conventions and what it looks for automatically in controllers, views, and partials, and which methods are available where, knowing all the various helper methods Rails provides for text processing, working with dates, and HTML tags, best practices for integrating with AJAX, migrations, Rack middleware, the asset pipeline, etc., etc. I've been working on our Rails app non-stop for 9 months now and every day I learn new stuff.
Long story short, for a startup which needs to iterate fast before it dies, acclimating a competent programmer to the "Rails Way of Life" just might not be worth it. Now, it's a completely different story if you have important domain-specific knowledge and the company is working on something especially innovative, but it seems like a lot of startups are scrambling for a pretty low-tech piece of the pie where "execution wins". My point is, Rails is surprisingly big, and while you can become useful in a matter of weeks, you can't become extremely productive with it for a while longer.
Particularly with Rails, where it's convention over configuration, knowing the conventions is a great head start.
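To make "convention over configuration" concrete for anyone who hasn't used Rails: the framework derives table names, routes, and partial names from your class names, so someone who knows the conventions writes almost no glue code. Here's a toy Python sketch of the idea (not Rails itself — the helper names and the naive add-an-"s" pluralization are made up for illustration):

```python
import re

# Hypothetical mini-framework: derive names from the class name by
# convention, the way Rails derives "blog_posts" from a BlogPost model.
def underscore(name):
    # "BlogPost" -> "blog_post"
    return re.sub(r'(?<!^)(?=[A-Z])', '_', name).lower()

def conventions(model_cls):
    base = underscore(model_cls.__name__)
    return {
        "table":   base + "s",            # BlogPost -> blog_posts
        "route":   "/" + base + "s",      # BlogPost -> /blog_posts
        "partial": "_" + base + ".html",  # BlogPost -> _blog_post.html
    }

class BlogPost:
    pass

print(conventions(BlogPost))
```

Real Rails uses a proper inflector (e.g. "Person" becomes "people"), and that is exactly the kind of invisible convention a newcomer has to absorb before anything "just works".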
Unless you need a language guru, the languages known don't matter much. If someone can write a LISP interpreter in Haskell in 2 days, I would hire her for a Java position even if she doesn't know what a loop is.
Either you have another skill that they need, or they just want a "generic (good) Rails programmer".
Of course skills are transferable, within a timeframe. So if you're trying to ship something in 6 months, this may not be acceptable.
EDIT: Actually, I want to tell more of this story to emphasize what you can do with a reasonably competent employee willing to learn.
At the time, I was hired as a temporary contractor for 3-4 months. I had basically most of 1 month to get to basic competence with Rails, and it was supposed to be less. Then I started getting real work thrown at me. By the end of 3 months, I had justified my pay and my employers had no more temporary work to throw at me. They offered to hire me as a full-time salaried employee, but by then I had a plan for grad school and research.
That was at the end of June. Midway through November, my friend who originally recommended me to that employer messages me on Facebook. He tells a bit of an ironic story, one point of which is that he himself is now using and expanding-on the code I wrote in those two-point-something months. Apparently it's holding up pretty decently, and the team is able to ship product.
All this because the proprietor of the firm figured he could hire someone smart with a good work-ethic for a while and give him time to get competent with Rails rather than go unicorn hunting.
- Knowing and using the basic underlying technologies (languages, but also concepts, like machine learning, databases, UI skills or so on)
- Knowing the specific codebase and process for those projects (how the build works, what are the components, what they do, etc).
True, someone with the first skillset has an advantage, but a talented person will have no problem picking up the second one much faster. For a hire, I would rather factor in the long term. It's about sane financial management and ROI, actually. For any given pool of candidates, you considerably reduce the chance of hitting a talented person if you interview only 10% of all candidates based solely on their previous base skills.
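A back-of-the-envelope sketch of that last point, with invented numbers: if talent is roughly independent of prior exposure to your exact stack, a 10% skills-based screen throws away about 90% of the talented people in the pool.

```python
# Made-up illustrative numbers: 100 applicants, 10 genuinely talented.
# Assume talent is independent of prior-framework experience, and the
# skills filter passes 10% of applicants regardless of talent.
applicants = 100
talented = 10
pass_rate = 0.10

interviewed = round(applicants * pass_rate)          # 10 people interviewed
talented_interviewed = round(talented * pass_rate)   # ~1 talented person seen
talented_missed = talented - talented_interviewed    # ~9 never get a look

print(interviewed, talented_interviewed, talented_missed)  # 10 1 9
```

The independence assumption is the crux: the filter only pays off if base skills strongly predict talent, which is exactly what the commenters above dispute.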
The only thing stopping me from calling out HR departments in general as a sign of organisational cancer, since I am yet to encounter one that pulls even a tenth of its weight, is that they tend to be composed of the only cute girls in the whole company. Even on Friday (2 days ago) I was out drinking with a bunch of tech guys and a single cute HR girl. That is pretty much the sole reason I would oppose eliminating HR as we know it from all technology companies.
I had a phone interview once where the person on the other side (the hiring side) was trying to suss out my skills.
"Do you know real-time object-oriented programming?" they asked.
"Well, I know object-oriented programming, and I've done a lot of real time stuff, so: Sure."
"No, I think you're trying to fool me. Please answer truthfully, do you know _real-time object-oriented programming_?"
"I don't believe you."
Things went downhill from there. The interviewer clearly had no idea what they were actually asking, and it was impossible to express any competence other than to repeat buzzwords.
Bottom-line is: That company missed out on a great employee. They utterly blew it.
[Yeah, I know you only have my word for this. Suffice to say that I have enough objective evidence from subsequent projects and employment that I know I'm not Dunning-Krugering myself.]
I would be so tempted to reply "Do you?"
I can see that no-one likes what I said but I assure you it's my honest feeling. HR departments' best contribution to an organisation is cute HR girls. You can deny this truth, or perhaps try to change the way of things. Clicking to downvote me, of course, changes absolutely nothing.
let's look at famous examples:
Carly Fiorina? Marketing.
Brigitte Ederer (Siemens)? HR.
Meg Whitman? Marketing (started as a Brand Manager)
the cute moniker is unnecessary, I agree. the overall observation is correct.
i agree in particular with the fact that once HR takes over hiring from the team-leaders, the company has jumped the shark. one of the core tasks for a manager is hiring, ensuring his/her team is correctly staffed and has enough fresh blood to ensure senior people can move upward or away without disruption. if you don't, the Peter principle is in play.
overzealous HR is the ruin of a company because by definition HR has no clue about the core competencies of a company, no feeling for its customers or its products/services.
this attitude, to me, is unbelievable. Of course they are human beings. What else would they be?
If you're going to make a claim about HR, make it on its merits. I have said they're nothing but a bunch of good looking girls. They have zero understanding about any of the roles they're hiring for, and no incentive to learn. Care to contradict me?
And I never claimed otherwise. But I really resent your accusation of sexism, since I believe I am extremely fair to both sexes.
It is a plain fact that the attitude towards HR departments here is mostly negative. It is another plain fact that those departments are mostly female. If these facts are sexist, then reality is sexist.
Oh, and the all-male "IT department" is also richly deserving of criticism, though not by me since I jumped that ship long ago. I look forward to your devout claims of sexism when such a criticism does occur, however.
It boils down to this: if you put the wrong people in charge of very important stuff, then (long term) you are fucked.
I believe you when you say that's what you've observed. But, whether you like it or not, it's not okay (in public, at least) to make statements which link gender, race, etc., with competence.
No fair, you (or some other reader) might say. I'm just reporting my observations. But I claim that you have a responsibility to do more than simply report your observations, and it's because people are stupid. When we read "A and B go together," we humans immediately and instinctively (system 1) draw causal conclusions about A and B. It doesn't matter if it's anecdotal evidence; it doesn't matter that correlation does not equal causation. Not for System 1, anyway. And those causal conclusions lead to making bad subconscious assumptions, and later lead to actual sexism down the road. All because of comments like yours -- "A and B go together".
So don't do it. Suppress your urge to report your observations when they could lead to others making sexist/racist assumptions in the future. Bad HR department? Say "bad HR department" and leave the part about gender out.
(I would say that this only applies to speaking/writing in public. In private, you know who your audience is, and what conclusions they will draw from your words, so you can afford to be freer.)
You seem to be fundamentally misunderstanding the original complaint. The idea was that HR itself is a problem. It's not about people being bad at HR. And your suggestion to 'Suppress your urge to report your observations when they could lead to others making sexist/racist assumptions in the future.' deeply bothers me. If we pretend things don't exist because they look sexist/racist/etc, we get in even deeper problems of trying to figure out what is real.
(Although personally, I don't see what the gender of the HR department adds to the discussion anyway.)
No doubt, but this doesn't scale. Right now you and I both are suffering from bad hiring decisions from local government up. I cannot wait for the great wheel to turn and bad hirers to be punished in the afterlife.
> I don't see what the gender of the HR department adds to the discussion
HR is a girl's club, just like IT is a boy's club, and deserves to be called out as such. The gender is relevant, indeed central.
I think this about sums up the comment. You mentioned the cute women in HR as people who are cute but don't know enough about the jobs to hire for them properly.
The next avenue of attack is "denigrating" both 'people' because we are not allowed to highlight the obvious sexuality of the (shudder) women.
I'm sure women all around the world are extremely interested in Ryan Gosling's personality.
His comment pertaining to not canning all of HR because they are the only cute GIRLS is sexist.
There are a lot more women in HR than men, and there are a lot more men in programming than women. There are many reasons why this is the case, many of which relate to sexism, but the fact that your overall result is gender-skewed is not in any way indicative of sexism in the hiring process.
So if you're going to send a spaceship to Mars for a couple of years, you'd better mix some women in there, whether you consider it sexist or not. Likewise if you're going to put some people in a high-pressure startup for a couple of years, where they're spending almost all of their time and energy at the office, you'll probably have better and more consistent outcomes with a diverse group. That means both men and women, junior and senior, deep thinkers and fast movers.
Why's that a reason not to fire the lot of them?
You have a harridan making your life miserable with stupid policies but she looks cute? You call that a good trade?
You know companies could employ competent ladies in place of those cute ladies.
Like a lady I knew who could write a whole network stack in assembler.
It's the basis of my marriage. :)
"Data science" means many things to many people. Some shops want BI folks who can code. Others want people who can keep up in deep learning threads (like the one on HN's front page) and know about Hinton's work. Others still want HDFS badasses who can distinguish between Cascading and Cascalog (lol).
Shops classify all of these things as "data science" when they're really very different (respectively: Excel/scripting BI analyst; ML researcher with a focus on deep learning and neural networks; data infrastructure developer with broad exposure to MR frameworks)
This is stupid. I worried about checking off these boxes until I realized recruiters are doing the 1990s Java thing all over again. This time, though, the buzzwords are Hadoop, information retrieval, and Andrew Ng. I think it's best to ignore this, keep working hard, and grow your employment options.
There are a lot of advertisements offering jobs only to people with "10+ years experience in X" (sometimes "X" hasn't even been a thing for that long), and assuming that means professional experience, that rules out almost everyone younger than 32. And for specific "X", it can rule out people even older if the "X" isn't what a recent graduate started getting their professional experience in right away. Some of those companies might also offer junior jobs with junior pay, but there are all these startups elsewhere offering higher-than-junior pay plus equity plus (depending on the size) a feeling of larger influence and control without the stress of running a company yourself.
I'm not sure I agree with you on ignoring buzzword-laden fields since there's usually a lot of money to be made, even if only in the short term. Though it's probably best not to greedily expand your employment options in the direction of the latest buzz.
On the other hand, I have never found the listed requirements for a role to be anything but mostly bullshit, so they don't really help you much. Your best bet, if you are not going through a referral, is to get to a phone screen where the engineer/data scientist will hopefully be kind enough to spare some time at the end of the interview to tell you what on earth they are actually looking for. You could get lucky and learn just enough to terminate the interview process right there. The worst situation is where you go onsite and realize that either a) you are not what they are looking for, which is half a day wasted, or, even worse, b) they haven't completely figured out what they are looking for.
Me? I have made a decision that I will either apply for software engineering positions where I will maybe be one of the Machine Learning folks, or apply to data science positions only if I know someone on the team who can explain to me what on earth that company is actually looking for.
Presumably the assumption is that some good people will be banned and leave, and some other good people will just leave out of sympathy, but it's better to have good people leave than have bad people enter the community. It's an interesting hypothesis, and HN has been going strong for 5+ years now with minimal moderation effort, though I have to admit that it seems to violate basic rules of fairness and empathy. Then again, a lot of businesses are built on being unfair to people.
The result? They don't hire anybody despite a looooooooooong line of overqualified greybeards.
(I am young myself, too!)
1) Some people, under the mistaken belief that correlation implies causation, think that because some programmers have let their skillsets atrophy, one should stay away from older programmers altogether.
2) Older people, by virtue of experience and skillsets, are pricing themselves out of the job market. I have an older friend who is a phenomenal hacker but doesn't like going into management. He found himself either being offered VP-of-Engineering-type roles which required way more management than he was willing to do, or being offered engineering roles with way less money than he was prepared to accept. Also, once you have a family, the amount of leeway you have in accepting a tiny fraction of what you are worth in exchange for money and stocks becomes quite limited.
- 25 years of HBase experience
- candidate must be 22 years old or younger
News stories have been filled with reports of managers of manufacturing companies insisting that they have jobs open that they can't fill because there are no qualified workers. Adam Davidson at the NYT looked at this more closely and found that the real problem is that the managers don't seem to be interested in paying for the high level of skills that they claim they need.
Many of the positions that are going unfilled pay in the range of $15-$20 an hour. This is not a pay level that would be associated with a job that requires a high degree of skill. As Davidson points out, low level managers at a fast-food restaurant can make comparable pay.
It should not be surprising that the workers who have these skills expect higher pay and workers without the skills will not invest the time and money to acquire them for such a small reward. If these factories want to get highly skilled workers, they will have to offer a wage that is in line with the skill level that they expect.
Contrast this to the skilled manufacturing jobs which require up front experience. Though many blue collar fields offer entry-level positions with on the job training, apprenticeships, and opportunity for advancement, this doesn't appear to be common practice in manufacturing. Why not? I think the main reason is that it is very hard for a low skill worker to add value to a manufacturing company. There aren't any comparable entry-level positions that allow the employee to learn while still being productive.
Because of this, hiring an unskilled employee for the purpose of training them is a huge risk, since it requires a significant investment. And since this industry is already very unstable with razor-thin margins, it's not something many employers seem willing to do, which is unfortunate.
So maybe the solution is coming up with better training programs, so that manufacturers can hire new employees without taking on such large risks?
And that is an artefact of industry killing on-the-job training and apprenticeships in any meaningful way.
My dad went from sweeping the floor to designing satellite test rigs with on-the-job training. My partner's dad went from painting ships to designing nuclear reactors with on-the-job training.
My dad never went to university. Started off helping out at my granddad's shop. Moved from there to an apprenticeship scheme at a local engineering firm. From there went into the drafting room. From there learned more engineering. When I was in my teens he was designing satellite test rigs for BA. Ended up a very expensive contractor specialising in conveyor systems of all things.
My partner's dad never went to university. Entered an apprenticeship scheme at Chatham Dockyard. Started off painting navy ships during construction. Moved into drafting office. Started doing more engineering work. Became a member of the Institution of Mechanical Engineers as a Chartered Mechanical Engineer in 1966. Worked on the design and building of early UK marine nuclear reactors, nuclear containment facilities, etc. He ended up working in nuclear medicine before he retired.
On the job training is possible. Just nobody in the US and UK seems to want to do it any more ;-(
(Favourite "I wish I knew that at the time and kept it" moment from my youth: my dad always brought home used A1/A0 paper from his design work for me and my brothers to scrawl on. I remember one when I was about seven or eight that was of this massive array of tiny ovals in a mesh of wires. I thought the pattern was cool and stared at it for some time, before turning over to try to improve my drawing of Spider-Man. I now know what I was looking at was an A0 schematic of some magnetic-core memory (http://en.wikipedia.org/wiki/Magnetic-core_memory). So wish I had that now so I could frame it for the wall ;-))
Y'all have probably read some of Prof Peter Cappelli's editorials (he's been making the rounds to promote his book "Why Good People Can't Get Jobs"). One of the things he's pointed out is that HR's use of resume databases encourages people with poor search habits to believe there is no choice. Basically, it's a failure to consider the Bayesian math of nested filters.
tl;dr: the absence of time travelers in the applicant pool is not evidence of a skills shortage.
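The nested-filter math is worth spelling out (the pass rates below are invented for illustration): each keyword filter looks individually reasonable, but their pass rates multiply, so a modest stack of them rejects nearly every capable candidate, and the empty shortlist then gets reported as a skills shortage.

```python
# Invented pass rates: the fraction of genuinely capable candidates who
# happen to tick each box in a typical job listing.
filters = {
    "exact degree title":      0.5,
    "10+ years in X":          0.3,
    "current salary in range": 0.6,
    "knows tool Y by name":    0.4,
}

# Filters are applied in sequence, so survival probabilities multiply.
survival = 1.0
for name, p in filters.items():
    survival *= p

# Out of 1000 capable candidates, how many survive every filter?
print(round(1000 * survival))  # 36
```

Four filters, each passing 30-60% of good candidates, leave under 4% of them visible to the recruiter. The candidates didn't vanish; the search did.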
That closing line doesn't really sound like the gist of the piece at all. It sounds like manufacturing employers just don't want to, or don't feel they can afford to, pay workers a fair wage.
If laborers are leaving manufacturers for better pay at fast food companies, workers need higher pay. It's not that complicated. If the company can't afford higher wages, maybe the demand for this skilled labor isn't what we think it is.
If these skills are actually in high demand, they will pay the bills.
I'd say our education system is working out fine if we're skipping the high-barrier jobs that don't pay a premium wage.
Several commenters have basically claimed that the manufacturers are behaving irrationally; a claim like that requires some powerful arguments to credibly support. I haven't seen much of that yet here.
I suspect that the truth is that manufacturers would be willing to pay to train workers if they could be reasonably sure that the investment would have time to pay off. But thanks to Moore's Law and everything that comes with it, manufacturing technology is changing far more rapidly than it used to when everything was more stable.
It seems likely that it's more cost-effective for the manufacturer to lower that risk by training lower-paid labor elsewhere and letting those workers go (or retraining them, if it's cheap to do so) when they're no longer suitable, than it is to do the same thing in first-world countries for much higher cost.
The pricing error argument can be made based on a definition of "value" that is in fairly common use around here (market price) and no more data than is included in the article: they offer a certain price for the labor and don't get as much of it as they want, therefore the price they offer is lower than the value.
Who are we supposed to understand is under-educated? The employers? Or was this a vain attempt to pull a fast one and shift blame to public schools for...? What? Pumping out kids with the mathematical literacy to understand that $14 is greater than $10 therefore McDonalds pays better than manufacturing?
I found a long-running thread where they discussed average wages for machinists / CNC operators, and it actually made me sad. It's here:
When I compare this to what even mediocre programmers get, I can't help but feel that there's real blue-collar discrimination going on.
One thing though - there's real variance in the responsibilities (and abilities) of a "CNC operator"...
It's not just about "fair wage" at that point - enough to have a place to live, a car, food in your belly... but that "fair rate" also includes being paid enough to afford the $500-1,000/month student loan payment that is the outcome of getting such a degree.
A high demand for scarce, highly skilled workers must drive salaries up through competition.
Unless the skills are not actually high-level skills, but obscure ones. I predict that "high-skilled manufacturing" is a morass of proprietary solutions to highly specific process needs - that is, the metal fabrication plant alluded to at 10 USD per hour has highly specific machinery doing a fixed task, and the skill is mostly one of repairing the proprietary code.
From a software perspective, that mostly means the machinery is "legacy" - and impossible to refactor. So it can never be improved upon, only replaced.
I am not a 3d-printer fanboi - the advantages of additive manufacturing are hugely overhyped (at the moment), and likely to remain elusive for, let's say, a generation, before it becomes obvious to all that we should throw away factories and their jobs and build millions of thing-o-matics.
However in that generation there will be huge opportunity for semi-general manufacturing - robots with sufficient flexibility in parts and software that they can be re-purposed easily as part of a (virtual) conveyor belt.
This sounds like truly skilled work - flexibly adapting as processes and customers change.
if providing semi-general manufacturing machines is uneconomic compared to proprietary, simpler, but "obscure skill" machines, then we cannot expect a productivity premium for actual highly skilled workers and should expect 10 USD per hour jobs to limp along till Shanghai takes their lunch.
If however semi-general machines can be made to adapt to different manufacturing requirements, then our whole manufacturing base may be in want of replacing. Again, something emerging economies will have an advantage in.
So, overall, the West should view itself as Great Britain was at the end of the 19th Century - a pioneer whose advantages had run out, and without wholesale massive investment will simply enter a managed decline.
Disruption opportunities - development of semi-general robotic manufacturing solutions that can quickly be re-purposed, and proving that they are economic both for a greenfield site and for an installed base.
It's not just "repairing code"... Oftentimes, small job shops will take a dimensioned drawing and write their own G code to actually generate the objects.
This will probably involve some back-and-forth with the machine stepping through the part manufacture, and must be done carefully to guarantee that both the machining is done efficiently by the machine and that the program doesn't result in a catastrophic failure damaging the workpiece, the tool, the machine, the operator, or some combination of the above.
This is, in fact, skilled work. Moreover, experienced operators know how to check parts for proper dimensioning (QC at the bench), perform maintenance on their machines and change tools, and watch a part in progress to make sure something really dumb isn't about to happen.
Moreover, the job doesn't stop at the machining center--you may still have to go and clean things up manually, do welding, do polishing, or any number of finishing touches.
Keeping this all working together is not something to laugh at.
I haven't even touched on the logistical challenges of retooling or upgrading equipment, or operating with machines that (unironically) still accept paper tape.
In short, the manager of a McDonald's can move to a local KFC or Wendy's with very little effort. An operator of machine tools X has no chance of just picking it up on tools Y.
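For readers who've never seen G code, here's roughly the flavor of what those operators write by hand. This is a hedged Python sketch that emits a naive rectangular contour; the feed rate, safe height, and dialect details are invented for illustration and would differ per machine and controller:

```python
def rectangle_gcode(width, height, depth, feed=200):
    """Emit naive G code tracing a width x height rectangle at -depth.

    Purely illustrative: real programs also handle tool compensation,
    spindle control, multiple passes, and machine-specific dialects.
    """
    lines = [
        "G21",                     # units: millimeters
        "G90",                     # absolute coordinates
        "G0 Z5.0",                 # rapid to a safe height (invented value)
        "G0 X0 Y0",                # rapid to the start corner
        f"G1 Z-{depth} F{feed}",   # plunge into the material
        f"G1 X{width} F{feed}",    # cut along the bottom edge
        f"G1 Y{height}",           # right edge
        "G1 X0",                   # top edge
        "G1 Y0",                   # left edge, back to start
        "G0 Z5.0",                 # retract
        "M2",                      # program end
    ]
    return "\n".join(lines)

print(rectangle_gcode(40, 20, 1.5))
```

Even this toy version hints at the point above: every plunge depth, feed rate, and move order is a decision the operator must get right for a specific machine, which is why that knowledge doesn't transfer for free.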
What I think we are seeing is the result of automation destroying jobs - as ChuckMcM says up page. And if so I am interested suddenly in rereading the Communist Manifesto - because in 20 years, we cannot continue having money flow to the owners of the means of production as defined now - if almost all production is automated, what happens then?
I am too tired to do much more than realise there is an interesting path of thought leading away from here - one I suspect having been trodden by many other thinkers before me. Any travel guides welcome.
edit: too vitriolic
but then if the skills were transferable (i.e. not proprietary), the market would adjust - but it seems that there are very, very few people with Machine X experience in each locale, and they are desperate for jobs.
The problem is the entire market for CNC machine operators is very small.
Edit: I see you've edited your comment.
opendna below suggests there is research that small (isolated) pools of labour fail to benefit from the intuitive supply/demand wage growth.
My antennae are wavering - this feels like a disruption happening. If I ever pick up my research MSc, I think this will be the perfect subject.
You might try reading up on the Heckscher–Ohlin model and Stolper-Samuelson theorem. There are a great number of criticisms of each but, to over-simplify, the counter-intuitive take-away is that returns will be highest for the most abundant factor of production. i.e. the larger the pool of unskilled labor, the more laborers will be paid. Scarcity actually causes the value of a factor of production to fall. (Yes, I know that's the opposite of everything you know about economics.)
For a more qualitative approach, try "The Coming Post-Industrial Society" by Daniel Bell (1973), available at a library near you [http://www.worldcat.org/oclc/473951691]. For a more culture-theory approach, try Neil Postman's "Technopoly" (1992) [http://www.worldcat.org/oclc/24694343]. If you're into old-school tech determinism and political economy, both Lewis Mumford's "Technics and Civilization" (1934) [http://www.worldcat.org/oclc/560667] and Harold Innis' "Communication and Empire" (1972) [http://www.worldcat.org/oclc/281110] are more intelligible than Marshall McLuhan's work.
Come to think about it, it isn't obscure skills, it is obscure knowledge. The problems are (a) there is really no skill involved, (b) the knowledge involved is specific to a proprietary machine and proprietary program, and (c) the training courses are training specific knowledge, not the principles and cross-training that would apply more universally.
Mapping to our (geek) world, training obscure knowledge vs. actual skills is all the "office productivity training" that goes on which really is "which buttons to click in Excel"... leaving the trainee helpless in LibreOffice because the button is in a different spot and possibly labeled differently. Note that the trainer is not entirely at fault - I've been amazed at the number of trainees that cannot generalize.
I'm not a fan of unions or work rules, but those are terrible wages. How does the guy expect to get 21st century skills for 20th century wages?
There is nothing wrong with that. That is the economic system at work.
If no one is willing to pay for his product/service at the price he needs to charge, it is because either people do not need it, or they can get it cheaper somewhere else. In business, if you can't compete, you shouldn't.
In this specific case, these jobs are getting outsourced to China or replaced by machines. This is wonderful. Not for the small number of people who lose their jobs, but for the huge number of people who will be able to afford that product/service more cheaply.
I can't set up an ice-cream kiosk in Antarctica and whine about how business is hard and what will become of the Antarctic ice-cream industry.
Absolutely nothing, welcome to one of the big problems of capitalism.
Now, assuming blue collar is done, as robots can take care of most of the tasks, could someone explain why we need workers to oversee the computers?
The way I see it, you need a really expensive robot, and you need a really expensive programmer to make it work; everything in the middle is cheap. Software should be able to distill anything that is happening on the robotic side and present it in such a way that a trained mechanic would understand and be able to fix it. If that's not the case, then we need better developers (and in this case, perhaps, better designers and human-factors engineers) to write better software.
Now, a few folks mentioned that what you need is really a mixture of experience on the mechanical side of things and understanding of the software - presumably that allows you to react and solve problems whenever something goes astray. Well, that could probably be solved by having a few engineers on staff who would help when needed.
That leaves us with, yet again, fairly simple mechanical labor. Perhaps then the article is right: we have a serious education problem. Those who attain enough education leap ahead and are presumably too qualified to subject themselves to mundane mechanical tasks, while those who would be well suited to the jobs don't have enough education to understand even the basics.
Someone's suggested comparing the math needed at McDonald's with the math needed at one of these factories. I'd be curious to know too; although I suspect that in McD's all the calculations are done by a computer, and all humans need to do is simply not f-up. Even then, when humans fail to add 2+2, all they lose is an occasional McFlurry, while at a factory they could impact tens of thousands of dollars at once.
So, here we have a conundrum. We need labor to work the $10 jobs, but the pool of employees is simply atrocious. At the same time, qualified labor has better things to do with their time. Now, back to my questions - what's the solution?
The solution is to adapt or close the factories down, and go work at McDonalds (or get a better skilled job elsewhere).
The candidates smart enough to realize that they could earn 2x-5x as much after college won't take these jobs, and most of the people who can't earn those wages elsewhere aren't qualified to do these jobs.
Agreed. That is the basic problem. If a worker can produce hundreds of widgets a day after specialized training with new computer-controlled machinery, but the worker could make just as much money per hour right after high school flipping hamburgers at the local fast-food restaurant, there is no reason for the worker to go through two years or more of specialized training, especially at the worker's own expense.
Much of the rest of the article discusses the overall rationality of workers seeking jobs that they can obtain with the least investment of their own time in training for a given income. Of course. Part of the problem is that if companies hire on the basis of course completion certificates rather than on the basis of demonstrated competence, they will miss out on good workers, and yet hire some lousy workers, and thus be reluctant to offer competitive starting wages. (It's expensive to hire a worker who can't do the job, and it's also expensive to let go workers who don't learn on the job and to hire their replacements.)
In what I think has become my best-liked comment on HN, I've collected references other participants here helped me find about company hiring procedures. Companies need to hire on the basis of actually being able to do the job, not on the basis of what classes workers have attended. The review article by Frank L. Schmidt and John E. Hunter, "The Validity and Utility of Selection Methods in Personnel Psychology: Practical and Theoretical Implications of 85 Years of Research Findings," Psychological Bulletin, Vol. 124, No. 2, 262-274,
sums up, current to 1998, a meta-analysis of much of the HUGE peer-reviewed literature on the industrial and organizational psychology of business hiring procedures. There are many kinds of hiring criteria, such as in-person interviews, telephone interviews, resume reviews for job experience, checks for academic credentials, personality tests, and so on. There is much published research on how job applicants perform after they are hired, in a wide variety of occupations.
EXECUTIVE SUMMARY: If you are hiring for any kind of job in the United States, prefer a work-sample test as your hiring procedure. If you are hiring in most other parts of the world, use a work-sample test in combination with a general mental ability test.
The overall summary of the industrial psychology research in reliable secondary sources is that two kinds of job screening procedures work reasonably well. One is a general mental ability (GMA) test (an IQ-like test, such as the Wonderlic personnel screening test). Another is a work-sample test, where the applicant does an actual task or group of tasks like what the applicant will do on the job if hired. (But the calculated validity of each of the two best kinds of procedures, standing alone, is only 0.54 for work sample tests and 0.51 for general mental ability tests.)

Each of these kinds of tests has about the same validity in screening applicants for jobs, with the general mental ability test better predicting success for applicants who will be trained into a new job. Neither is perfect (both miss some good performers on the job, and select some bad performers on the job), but both are better than any other single-factor hiring procedure that has been tested in rigorous research, across a wide variety of occupations.

So if you are hiring for your company, it's a good idea to think about how to build a work-sample test into all of your hiring processes. If the job you are hiring for involves use of a computer-controlled machine tool, have the candidate put the machine to use making sample parts (advertise the job in a way that makes clear a work-sample test is required, to screen out people who have no clue how to operate such machines). Hire the able, and pay them what they are worth.
Ask yourself about any hiring process you have ever been in, as boss or as applicant: did the applicant have to do a work-sample test based on actual work results expected in the company? Why not?
The key point being that there is a skill 'valuation' gap which is being forced on the manufacturers by competition in distant countries. So a piecework manufacturer in Minnesota has to compete with a piecework manufacturer in Shenzhen, and the former pays employees more and uses more automation while the latter has traditionally been able to throw many more humans at the task. Between the slowly rising demands in these countries, and the challenge of working with a factory that far away, we're just about to the point where it's going to flip. I would have called it for this year but the great recession seemed to push everything back 2 - 3 years. So perhaps in 2015. Small manufacturers will either go out of business or command a higher price per good delivered, which supports effectively doubling the wage paid to their line workers. And those line workers will be essentially programmers programming tools, not doing the actual tooling.
The switch will be enabled by "high resolution" workstation automation. Today the last unassailable benefit of the 'tool and die' shop is that you've got a machinist and a machine tool which can be dynamically scheduled for any kind of widget. They are 'soft tooling' in that all of the gear that makes stuff doesn't need a dedicated fabrication line. The benefit is realized because setting up 'hard tooling' has a huge upfront cost and while the dedicated line is much more efficient, it also takes space away from other lines.
The calculus goes something like this: for 0 - 1,000 widgets a 'precision' shop can make them; from 1,000 to 50,000 widgets a 'sweat shop' (primarily labor-driven shop) can make them; and for 50,000+ widgets you start to get the benefits of hard tooling amortized over your widgets.
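That breakeven calculus is just a fixed setup cost amortized against a per-unit cost. A toy model of it (every dollar figure below is a made-up illustration, not from the comment) shows how the cheapest shop type flips as volume grows:

```python
def cost_per_widget(n, setup, unit):
    """Per-widget cost of a run of n widgets, amortizing the fixed setup."""
    return setup / n + unit

# Illustrative assumptions: precision shops have low setup but high per-unit
# cost; hard tooling is the reverse; labor-driven shops sit in between.
precision = lambda n: cost_per_widget(n, setup=500, unit=50.0)
sweatshop = lambda n: cost_per_widget(n, setup=30_000, unit=8.0)
hard_tool = lambda n: cost_per_widget(n, setup=400_000, unit=1.0)

for n in (500, 10_000, 100_000):
    best = min(("precision", precision(n)), ("sweat shop", sweatshop(n)),
               ("hard tooling", hard_tool(n)), key=lambda t: t[1])
    print(n, best)
```

With these particular numbers the precision shop wins at 500 units, the labor-driven shop at 10,000, and hard tooling at 100,000; the crossover points slide around with the setup and unit costs, which is exactly the calculus described above.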
As we get to the point where we have machines with 'high level' sorts of capabilities (some early machines have been demonstrated by the Japanese and others), where your machine can take a dumped-out pile of parts and assemble the widget, or 'partial hard tooling', where a machine creates intermediate tools to maximize efficiency "automatically", then it flips back to the place where a well-equipped precision shop can make 10,000 widgets a month of arbitrary design and amortize over all 10,000, which switches the economics question back to transportation and re-spins (interaction with the original designers).
When you see robots that can pick up a cell phone, toss it, and then catch it again at an arbitrary angle, you realize you can do material transfer by having workstations literally toss partial assemblies from one station to the next. That kind of flexibility is going to make for a very different kind of factory, and of course factory worker.
Why would you use a mental ability test outside of the US, but not in the US? Is it illegal in the US or something?
Are those the same protected groups which tend to not have degrees or other certifications? Because "ban the IQ test, ask for a degree" might have a similar effect.
For example, how much math does somebody need to know for this kind of job? Is it trigonometry to find hypotenuses? Are they writing programs or entering in data to notepad?
This is an important distinction because we need to compare these skills to those of a shift manager at McDonald's.
For example, at a hotel the shift manager must supervise the employees, handle money, and keep journal entries. The latter two involve rudimentary mathematics and computer skills. There is a good chance that this is a fair wage when compared to similar jobs.
Although a lot of this is getting automated, and there is a huge push towards figuring out how you can feed a CAD design directly to a CNC machine and let it figure out the best way to implement it.
PS: I don't expect you can ever hire enough good people in this category for $10 an hour. They would probably be writing VB scripts and earning more money. Working in a fast food chain would be a total waste of the potential of decently skilled workers.
Moving on: with CNC automation, you need experience in both industries to understand how to program a CNC machine. With full automation becoming the trend, I see a huge lack of skilled workers who can fit this world. After all, why would you go into metal manufacturing if you understand computers?
Thinking meta for a moment here: how far does this concept of machine-run processes go? Could we program machines that run the computers that run the machines? And what about machines that program those machines?
One of my favorite Twilight Zone episodes: http://en.wikipedia.org/wiki/The_Brain_Center_at_Whipple%27s
I maintain that being a programmer will still be a highly paid, in-demand skill for decades to come, for one simple reason: you can't bullshit a computer. You can't sweet-talk a machine into accepting your algorithm because of your great personality, looks, or any other intangible except for knowing what you're doing.
If you want a parallel, look no further than the sorry state of machine translation, and that despite the enormous datasets they've been playing with. Human intelligence and judgement, even on a very, very narrow use case, is not something computers are close to replicating.
That IS programming. Don't confuse programming with typing. Programming means understanding a problem and coming up with a precise, logically valid solution.
In a way it's only as "meta" as it is to have a compiler generate your assembler code.
And that's probably at least for decades to come. But if someone develops a strong AI, this goes out the window. Probably most familiar forms of employment along with it, possibly the entire idea of a society where the labor market is a primary form of economic participation.
Personally, I don't see any technology on the current landscape or the horizon that I think is likely to produce strong AI within decades, maybe even a century. Wouldn't rule it out, though.
I was given a good piece of advice when I started work: "A process engineer's job is to make themselves obsolete." I've striven to make sure that I did everything I could to make sure I wasn't needed anymore.
And I'm happy to say that the processes will keep me employed for as long as I'm willing. Not from any self-serving malice, but rather because, in the end, it's terribly messy. The sheer inelegance of industrial manufacturing seems to defy strong automation at the meta-level. And not just because the machines are all only halfway obsolete, new, deprecated, upgraded, documented, up-kept, cleaned, and run (halfway to spec).
It's because to a computer, this mess quickly degrades into special cases. Lots of them. There actually are some shockingly clever software packages out there capable of the early stages of what you're describing, and - this is the kicker - they all require specialized engineering support to create, install, train, and use. The end product can be handled by a floor worker or management, but at some point you've got to explain the situation to a computer somehow, and that ends up being as complicated as programming.
For a much better written punchline, I'll defer to the short story Profession by Asimov. I think it's cynical to think we'll never be able to solve the 'original thought' problem, but I also think it's easy to underestimate it. Things really are different today than 30 years ago, so perhaps a few more iterations will lead to tape-machines that can program the tape-machine programmer tape-machines.
Perhaps the worst part is we sorta already have this. There was just a post on LLVM which discussed how a computer translated and optimized a division command into machine code. That used to take a lot of work, and it took Real Programmers to optimize that code before this sort of automation. So it's not inconceivable that your ideal comes true. But I am cynical enough to think sami36 is right: Doodleware is past the horizon. But computers that can talk to computers is probably in sight.
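That division optimization is a representative example: compilers replace integer division by a constant with a multiply and a shift, something that once had to be hand-derived. Here is a toy, unsigned-only sketch of the idea; real compilers like LLVM handle signed values and many more edge cases:

```python
def magic(d, bits=32):
    """Find (m, p) such that x // d == (x * m) >> p for all 0 <= x < 2**bits.
    Works by growing p until the rounding error in m is provably too small
    to disturb the quotient for any x in range."""
    p = bits
    while True:
        m = (1 << p) // d + 1                      # roughly ceil(2**p / d)
        if m * d - (1 << p) <= (1 << (p - bits)):  # error small enough
            return m, p
        p += 1

m, p = magic(7)
x = 1234567
print(x // 7 == (x * m) >> p)  # prints True: multiply-shift matches division
```

The payoff is that the divide instruction, which is slow on most hardware, becomes a multiply and a shift; deriving safe (m, p) pairs by hand was exactly the kind of tedious, error-prone expert work this automation absorbed.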
Given the history of progressive rounds of optimism and failure on this idea, so far: no. Before you can replace humans, you need to ... um ... replace humans.
However, the general principle of progressively more specialised machinery and skills is not new. It's called "the division of labour"; it really ought to be called "the specialisation of process" or perhaps "the specialisation of outputs", given that capital too becomes progressively more specialised to particular tasks.
The hiring process that most companies use is backwards, and I think that's mostly due to people who go to school to learn HR and are taught this impractical approach.
In the IT industry, especially IT support and break-fix development, the same thing happens. Local people don't want to accept low salaries? Let's look at what India or the Philippines are charging. Problem solved.