There are two conflicting stories in the job market.
One is "we need more developers and engineers", usually said
by top-tier companies such as Apple, Intel, Google, MS. They really do need well-educated graduates to push the envelope.
The other is "there are many unemployed IT people". These are individuals who worked where IT is a cost center, developing CRUD apps and maintaining in-house systems. Humdrum, but it pays the bills. Their jobs are being outsourced.
Now, when deciding on a course of study, if you don't know for certain that you are top-notch, have a passion for development, and upon graduation will be desired by startups and established concerns alike, will you want to risk time and money studying CS/software engineering? Especially if you are smart enough to recognize that this situation exists.
If the humdrum jobs are gone, you have fewer fallback positions available. It becomes akin to working hard on your football skills in high school, going to university on an athletic scholarship, and hoping to be picked up by the NFL. There are no humdrum, non-NFL football jobs available; miss the NFL draft and you're just another communications major on monster.com.
This negative feedback cannot be good. We want to encourage brains into STEM, but if there's no fallback position, those smart people are going to stay away and study to work in FIRE (finance, insurance, and real estate) or, if we're lucky, medicine.
I have no solution to this problem; it is a massive multi-player prisoners' dilemma and everybody is snitching.
What a pathetic article. I guess even the tech press doesn't recognize the difference between generic "IT" and upper-level developer/software engineer.
Sure IT support is being outsourced. When the employee doesn't need to actually know anything you might as well pay them as little as possible.
"""What a pathetic article. I guess even the tech press doesn't recognize the difference between generic "IT" and upper-level developer/software engineer."""
Why does it have to?
First of all, the article talks specifically about "IT jobs", not "upper-level developer/software engineer" jobs, and it says so in its title.
Thus, your complaint amounts to: "The article is pathetic. It sets out to discuss the loss of IT jobs, and it indeed discusses the loss of 95% of IT jobs, just not the higher-end developer jobs I had in mind". WTF?
Second, generic IT jobs are always orders of magnitude more numerous than "upper-level developer" jobs. Hence, the loss of generic IT jobs has much more impact and matters to far more readers.
"""Sure IT support is being outsourced. When the employee doesn't need to actually know anything you might as well pay them as little as possible."""
And who said that "paying as little as possible" does not also hold true when the employee has to know "a lot"?
If physical presence is not needed (i.e., in all cases where telecommuting could be used), what makes you think that the highest-level jobs cannot be outsourced somewhere else, where the employer will pay "as little as possible"?
You seriously believe the USA has the best engineers --or the best by a margin that matters to most projects, anyway? Perhaps that could be argued in the past, but this is the 21st century. Oh, and the 20th century called; it wants its prejudices back.
The former is full of grounded folks who went back to school to get MCSE/Oracle certifications and dutifully do their work every day, and the latter is full of self-entitled jerks who think their CRUD applications are somehow original because they're written in Python/Ruby?
Outsourcing is and has been a reality. Instead of complaining about injustices, accept that the market is always right; you just have to adapt.
Or one could say the former is full of Joe Sixpack clock-punchers who spend 50% of the day on ESPN.com, whereas the latter are creating the infrastructure for the jobs the IT types will be going to DeVry for in the future.
Dude, there are many people doing things that are not web-related, or simple CRUD, or a simple iPhone app, and that are more interesting and complex.
Please. Your assumption that if you are a software engineer you are probably just doing some CRUD app is flatly wrong.
And I agree with Locke1689: anybody who equates IT with actual software (or systems) engineering is ignorant of the space.
IT is a very broad term that can potentially cover most anything primarily related to computing. Rather than arguing what it does or doesn't mean, it's generally preferable to recognize the relevant meaning of the term in context.
Though in this case he's right: it appears the authors are ignorant about much of the industry and are using the term "IT" to mean traditional corporate "IT departments" responsible for internal infrastructure (PCs/email/network).
I disagree; most software engineering is CRUD. Bioinformatics is CRUD with biology. Trading systems are CRUD with FIX. High-performance computing is CRUD across clusters. Game programming is CRUD with DirectX/OpenGL. Even compiler design and the Linux kernel are CRUD, except low-level CRUD with assembly or device drivers.
Anytime you start doing something novel, you are no longer a software engineer; you are a computational biologist, quant, game designer, or computer scientist. It's not about splitting hairs but about having the domain knowledge in addition to coding skills. (Because let's not kid ourselves, programming is easy.) If people insist on elevating themselves, then it's their prerogative. But since the market is right and programming skills are becoming a commodity, I would rather be rich than popular.
I think you're stretching the meaning of "CRUD" far beyond its useful range. CRUD is a fairly specific term that refers to a database with a user interface. There are a great many applications built around this basic idea. Software like the Linux kernel superficially resembles a CRUD application at a high level, and you could identify elements that are like a CRUD application, but most of the hard problems solved by the Linux kernel bear little resemblance to the class of web applications called to mind by the term "CRUD."
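For concreteness, here is a minimal sketch of what "CRUD" literally denotes: the four storage operations behind a typical form-driven app. This is only an illustration (Python, with an in-memory dict standing in for a real database; every name in it is made up):

    # Minimal sketch of literal CRUD: four operations over a data store.
    # The dict and all names here are hypothetical stand-ins for a real
    # database table plus the UI that drives it.
    class CrudStore:
        def __init__(self):
            self._rows = {}      # id -> record
            self._next_id = 1

        def create(self, record):            # INSERT
            row_id = self._next_id
            self._rows[row_id] = dict(record)
            self._next_id += 1
            return row_id

        def read(self, row_id):              # SELECT
            return self._rows.get(row_id)

        def update(self, row_id, changes):   # UPDATE
            self._rows[row_id].update(changes)

        def delete(self, row_id):            # DELETE
            self._rows.pop(row_id, None)

    store = CrudStore()
    uid = store.create({"name": "Ada", "role": "developer"})
    store.update(uid, {"role": "engineer"})
    print(store.read(uid))  # {'name': 'Ada', 'role': 'engineer'}
    store.delete(uid)

Scheduling processes in a kernel or allocating registers in a compiler simply doesn't decompose into those four verbs, which is the point.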
Fine, I guess I'm referring to computer scientists then. You're playing semantics.
Edit: Actually, there are no CRUD jobs in kernel development. There aren't CRUD jobs in a lot of fields. There aren't any CRUD jobs in database design and development. There aren't any CRUD jobs in compiler design. Any argument to the contrary betrays a fundamental ignorance of what CRUD means[1].
Oversimplify even more and we have a generic verb... Not only software engineering; everything is CRUD!
A baker does CRUD with bread, a surgeon does CRUD with people's organs, a dentist does CRUD with teeth, and a physicist simply does low-level CRUD at an atomic level.
Stated this way it sounds simple, but, you know, the devil is always in the details...
Huh, I don't really feel defensive at all. I simply don't feel the need to justify computer science as a discipline or software development as a profession. I originally came into college as a physics major and am doing theoretical computer science (math) as a depth focus in my major. These are generally thought of as the "hard" fields -- but I don't find any of them harder than computer science. Computer science just doesn't strike me as a field that needs defending. If it were easy there wouldn't be so few good people practicing it.
He does have a pretty straightforward tone and in writing comes off as a little condescending; regardless of that, there is one gem in his message that I agree with:
"""It's not about splitting hairs but about having the domain knowledge in addition to coding skills."""
That one, I would have to say, is pretty spot on: domain knowledge adds a lot of value to a developer.
As for "it's all CRUD", I think that is an oversimplification at best. Software development can be rolled up into patterns and best practices, but reducing it to "it's all CRUD" goes a little beyond the reality of the duties of the role.
When was I complaining about injustices? I like to think of myself as a fairly good developer for my experience level and age group, and, thus far, employment opportunities have not been scant.
My point was that if you don't actually know how to do anything you probably won't have a job. This is true of both of the examples you put out but not true of the people I'm talking about.
No, the former usually works at a company whose profit doesn't come from software or computers. They usually work on internal-facing systems that other members of the company are forced/required to use to get their work done.
The latter is usually the exact opposite of this.
Disclosure: I am in the former group and really want to be part of the latter group.
a) There has never been a "perfect market" left to its own devices.
b) There can never be one in any pragmatic way.
c) Who has measured scientifically, or proven abstractly, that it is "always right"?
It's more like: "we are used to working and bending to the market, so we won't question it. Oh, and those other folks in the USSR had a terrible fate when they fought the market, so we won't try anything stupid".
Yeah, this was the rhetoric of 2000-2001. To me the whole article seems like a time capsule from that period. Not that there aren't plenty of true observations in there, but if anything I think the last five years have seen a revival in the recognition of the value of local, on-the-ground talent.
The article seems oriented towards services companies that supply warm bodies to big companies -- hardly surprising given the target audience of Computerworld. What is rather surprising is that it seems to take everybody who makes a computer do something for a living and group them all together -- as if there is no differentiation between the skill sets of data-entry personnel and the folks who wrote Hadoop, for example.
What is even more surprising is that it views better software and the resulting automation as a great way of reducing (labour) costs while systematically ignoring the processes and people that created the better software in the first place. The same article could have been written by the same people forty years ago, lamenting the decline of jobs for punch-card clerks because keyboards were letting programmers enter their own programs faster, while still ignoring the improvements that would come from this.
But those "warm bodies" were lots of jobs that paid good wages and consulting rates. Within an hour's drive of my home, in previous employment gigs alone, I can tally thousands of application developer jobs outsourced or staffed now by imported non-immigrant visa workers. Sure, lots were COBOL positions (yes, there is still a lot of COBOL, and old-school DBMSes like DB2 and, gasp, IMS running on the backend of major functions like reservations, claims adjudication, utility company billing and metering, charge card transactions and billing, etc.), but a lot were Java and newfangled client/web application positions too.
The entrepreneurial startups create jobs, but mostly in one-off fashion, in contrast to the bulk of positions (and "IT" is not the sole category for such a metamorphosis) that big corporate Fortune 500-style companies had on the payroll (or supported by funneling funds to domestic consulting firms).
Yes, the world changes, and it is incumbent on knowledge workers to re-equip, refresh, and retool their skill sets.
The article talks about losing IT jobs to other countries, but for many of the reasons mentioned in the article, several states in this country are beginning to consolidate their IT operations as well: http://washingtontechnology.com/articles/2007/05/08/state-it...
My state is allegedly planning to transfer all state IT personnel (including the ones working for the universities) to the state capitol within the next four years. I can't say that I blame them. In addition to the wastefulness that I've seen, state IT workers are sometimes worse than DMV employees when it comes to customer service.
Well, if the US makes it hard for people to come to the US, then what should it expect?
The USA should lose the "they took our jobs" syndrome and make it easy for foreign people to work there, because then it would receive tax revenue; currently it receives nothing.