Literally every company needs a web presence, and it's quickly getting to the point where the usual crappy UX just isn't cutting it.
It also happens to be as hard as or harder than most other types of software engineering. You have to stay on top of trends and keep building your skills. You don't deal with algorithms much, but your OOP needs to be on point if you hope to build something maintainable for the web.
Most of the potential talent has a subtle disdain for web work; everyone wants to be a game dev or do stuff that's math-heavy or algorithmic.
So giant shortage of good web programmers.
I would disagree with this point, as evidenced by the fact that people coming out of code schools are still getting hired. To do software engineering in other fields (financial, embedded systems, game dev, operating systems, enterprise LOB, cloud platform, crypto, big data/distributed systems, etc.) takes a lot more experience and training. A competent programmer can crank out a Rails-based mom-and-pop small-business site in less than a day. You can't really crank out Bigtable, or Unreal Engine. However, the market for small-business web sites vastly exceeds that for Bigtable.
I've never written any code in C, but given a tutorial I'm pretty sure I could create a client-server application to exchange the time in milliseconds between two hosts. Would you use it instead of your ntpd on a production server? Of course you wouldn't.
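For what it's worth, that toy really is small; here's a rough sketch in Ruby (rather than C, for brevity), with the port number and function names purely illustrative:

```ruby
require "socket"

# Toy time server: each connection gets the current time in milliseconds.
def start_time_server(port)
  server = TCPServer.new("127.0.0.1", port)
  Thread.new do
    loop do
      client = server.accept
      client.puts((Time.now.to_f * 1000).round)
      client.close
    end
  end
end

# Toy client: connect, read the server's time, hang up.
def fetch_remote_millis(port)
  socket = TCPSocket.new("127.0.0.1", port)
  millis = socket.gets.to_i
  socket.close
  millis
end
```

Which is exactly the point: it works as a demo, but it has none of the clock discipline, drift correction, or failure handling that makes ntpd production-grade.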
Web development is not considered programming by friends of mine who indulge in (Linux) kernel development, write (or try to write) driver code, do reverse engineering, and whatnot.
Ruby and Rails are easy to learn but hard to master; the same goes for JS (with all the bells and whistles the language comes with, which the various frameworks try to hide one way or another). There's also Go, Python, PHP... The same goes for all these languages: easy to learn, hard to master.
So, any language can be hard or easy depending on a wide variety of things, but web-dev is not easy these days. It's easy to be mediocre, it's not easy to be good.
To do either well, you have to have a model in your head of what the entire stack is doing.
With Linux driver development, you have to have a much more accurate model of a much smaller stack compared to web dev.
Broad or deep? Specialist or Generalist? Some people are better at one than the other, but others will be the opposite.
There's also a much larger number of technologies you have to work with. With financial, embedded, game dev, OS, crypto, big data, you have a finite number of languages / tools to learn, once you learn them, you spend the rest of your career perfecting them. With web dev, you have to not only perfect your tools, but constantly integrate new tools and languages and paradigms into your workflow. It's a lot harder than it looks to do well.
And just because code school grads are getting hired doesn't mean they're making useful contributions from day one. Guaranteed it'll be some time before they're really worth their salary.
I thought the initial statement was absurd, but doubling down here isn't helping.
Anecdotally, as a non-web developer I don't find myself fiddling w/ new stuff all that much less than my web dev friends.
edit: Ack, replied to wrong post. This was to Mandatum
The kicker is that, in my opinion, almost all of the complexity in webdev is superficial and to a high degree artificially introduced (by choosing an inappropriate application data transport protocol, HTTP), precisely because of the perception that developer talent in the field is cheap (cheaper than the alternatives, at least).
Edit: and your other comment, suggesting you can make a $60k offer to somebody, implies there is not a real shortage of talent (good or otherwise) and that you consider webdev a commodity, not actually "hard".
- web development is a monstrous sub-field of software engineering. As others have pointed out, just about everything has some sort of web presence somewhere. There's a pretty big difference between someone doing something groundbreaking and novel and someone making the 920375235236th standard CRUD-oriented Java enterprise site. The latter camp grossly outnumbers the former by multiple orders of magnitude, but I think it's clear that what vinceguidry is referring to is the former (at least I hope he is).
- People tend to think that their sub-field in the software world is some sort of special snowflake that makes it harder to deal with than the others. They're aware of the tricky bits of their sub-field due to having a ton of specialized knowledge, but then look at the other ones and mainly see the surface scratching bits and declare, "Oh, that's easy!".
- Everyone has different strengths and weaknesses and will tend to gravitate towards the areas where they'll excel. People also like to think that they're smart and that their coworkers are smart, so it's easy to assume that all the smart people are gravitating to wherever you happen to be heading. To loop back to the OP a bit here, HN has a strong bias towards bay area startup culture, which has a strong bias towards web development (generally of the former category I described above; at least they'd all self-report themselves in that category), so everything is going to be seen through that bias.
In my circle of software friends we span a pretty broad range of the industry. Over time the view we've taken when discussing each other's world is that we all could rapidly learn to do the jobs that the others do, however most of us wouldn't want to. Personally my point of pain is when I start hearing about UI oriented stuff, but when I start talking about reworking algorithms to scale from GB to PB that's when they start tuning out. Neither's harder, they're just different.
People always think this when they think CRUD. The reality is, even CRUD can get quite complex. Even just laying out an HTML page properly takes a lot of knowledge. Understanding the 'zen' of CSS. The zen of HTML5. The zen of jQuery. What looks simple on the surface can get very complicated quickly. If you don't respect that complexity, then you'll find it tedious, and won't apply your problem-solving mind to it, won't abstract that difficulty away, and will find it unbelievably painful to maintain and add features to later on down the line.
I'm making what looks on the surface to be standard CRUD: a product info management app. Well, the product data consists of key-value pairs, and the design calls for users to be able to edit which keys can be used. So there's no set database schema, and I don't want to do anything like a two-column table of values with a zillion rows, so I'm using Postgres Hstore. Now I have to maintain an abstraction layer over that, which has gone through a few iterations and might have to withstand a few more.
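A hypothetical sketch of that kind of abstraction layer (class and method names are mine, not from the actual app; a plain hash stands in for the Hstore column):

```ruby
# The set of allowed keys is itself editable data, not a fixed schema.
class AttributeSchema
  def initialize(keys = [])
    @keys = keys.map(&:to_s)
  end

  def allow(key)
    @keys << key.to_s
  end

  def allowed?(key)
    @keys.include?(key.to_s)
  end
end

# Product attributes are free-form key-value pairs, validated against
# the schema. @attributes would map to a Postgres hstore column.
class Product
  def initialize(schema)
    @schema = schema
    @attributes = {}
  end

  def []=(key, value)
    raise ArgumentError, "unknown attribute #{key}" unless @schema.allowed?(key)
    @attributes[key.to_s] = value.to_s  # hstore stores only strings
  end

  def [](key)
    @attributes[key.to_s]
  end
end
```

The iteration pain lives in layers like this: every change to what the schema allows ripples into validation, storage, and the views that render it.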
It makes creating views difficult because you don't know just how big everything's going to be. So I might spend a whole day laying out the list page, diving into the minutiae of CSS rules, thinking carefully about all the different types of data it's supposed to display and deciding where to put certain types of logic so that I can reuse it later if needed.
The thing about CRUD is that to really do it right, you have to also think carefully about the data domain. If you're not building flexibly, which takes up-front time, then when you have to move stuff it's going to feel really painful. I realized that I don't want the ability to add new SKUs to be accessible to normal users, so I moved it into the admin section.
It took five minutes and was as straightforward as it sounds. But if you're not all that great at web dev and/or don't respect the power tools at your disposal, such a move could take all day and introduce subtle bugs in your UX that you won't catch until next week when you're working on something else.
If you really are trying to re-solve a solved problem, like blogging, then don't do any programming at all, just fire up Wordpress. Absolutely solving solved problems again is going to be tedious. But if you're building a CRUD app, and you can't find something interesting about it, then you're not really applying your full mind to it. There's a reason you're being asked to build it, because the functionality they need isn't being offered elsewhere.
I built a vehicle reservation system that was "standard CRUD" but they wanted this dashboard style view that took up half the dev time that I thought was really interesting and turned out great. The rest of the CRUD came out easy, we have lots of tools at our disposal for generating that sort of thing.
I agree w/ everything you said, but I'd argue that oodles of software jobs out there are solving solved problems and doing it over and over and over again and these were the sorts of jobs I was trying to evoke.
There's lots of things out there that are half-solved, like the more general case of CRUD. Here there are frameworks that allow you to solve specific CRUD problems, but each specific instance of CRUD, for it to really be solved, needs its own software package just like Wordpress with a huge community and easy installer. The vast majority of them don't, so you can't call them solved. That means there's still interesting things to learn about them.
Remember 2048? I remember a lot of people saying it wasn't a terribly interesting game. Bullshit it's not! I'm building a Ruby library to run the game, (didn't like any of the ones I found) then I'm going to start building an AI engine to solve it using heuristics. When I play 2048, I've gotten up to 8192 a few times, and would love to be able to program an AI to play like I play. To build tooling to help me visualize my AI programming and watch it operate. 2048 is plenty interesting, if you don't think so, it's you that's the problem, not 2048.
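The core mechanic of 2048 is small enough to sketch; this is an illustrative slide-and-merge for one row (moving left), not code from the library I'm describing:

```ruby
# Slide all non-zero tiles left, merging equal neighbors once per move.
def merge_row(row)
  tiles = row.reject(&:zero?)
  merged = []
  i = 0
  while i < tiles.length
    if tiles[i] == tiles[i + 1]
      merged << tiles[i] * 2  # equal neighbors combine into one tile
      i += 2
    else
      merged << tiles[i]
      i += 1
    end
  end
  merged + [0] * (row.length - merged.length)  # pad back to full width
end

merge_row([2, 2, 4, 0])  # => [4, 4, 0, 0]
```

The interesting part is everything around this: the heuristics, and the tooling to watch them play.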
I solve problems once. Once solved, if I need the solution again, I refactor or extract the needed solution to a library and reuse it. If I find myself writing the same kind of code enough times to where if I continued it'd become tedious, I switch gears and start looking for an abstraction.
Web applications 'grow'. Their growth needs to be managed. If you don't manage it well, your business needs suffer horribly.
> and from your other comment suggesting you can get a $60k offer to somebody implies there is not a real shortage of talent (good or otherwise) and that you consider webdev to be a commodity, and not actually "hard".
No it means that we've tried the corporate approach and found it a non-starter, so I have lots of leeway to drive the hiring process. There just aren't any good candidates willing to work for the money we're willing to pay. I have to lead up front with our salary because we've done it too many times where once salary comes up the whole process breaks down.
...so pay more money? Or take a chance on the not so good candidates and have a great training program? Or start an internship/co-op program at a local university and find cheaper talent. Get creative.
Many of us have a not-so-subtle disdain for it, too. (I'm not saying I don't respect web devs--quite the contrary--but I hate writing JS/PHP)
However, I wouldn't touch webdev with a 10 foot pole, at least not the front end stuff. While not intellectually complicated, there's something about that stuff which makes my eyes cross, my head hurt and my brain melt.
I recall a female friend of mine who sighed and obviously thought I had sold out when I switched from a purely technical role (on campus at CIT) to a more commercial job working for a big civil engineering consultancy.
So what's left is that those who work in web development are either self-taught or go through one of those dev bootcamps. But people who go into these programs don't necessarily have a CS background, and to be a "good web programmer", you do need that kind of background.
Databases? Network protocols? 3D graphics? I tend to put these in the same league as web programming: they're job skills. Yet there are plenty of university courses on these subjects. So why not web programming?
I'm not sure what a good definition for "fundamentals" would be, but I'll speak to your examples one by one, which I think gives a decent sense. My databases class talked about what databases are, why they came to be historically, an overview of the different approaches, and a discussion of their trade-offs. My networks class talked about why we started creating networks of computers historically, gave an overview of a bunch of different approaches, talked about trade-offs, discussed why TCP/IP+ethernet/wifi has become the most common end-user deployment, and talked about how it all works. My 3D graphics class was very mathematical, discussing how and why the 3D transformations work, with mostly toy algorithm implementations. This wasn't one of your examples but my programming languages class followed a similar schematic of "history, survey, trade-offs" as databases and networking (ditto for operating systems, and some others). I imagine a web programming class following that pattern to give a sense of how and why the web works. But that is pretty different than learning how to make applications using rails/nodejs/ember/angular/react/whatever like we do in industry.
Actually, when I was in school, this model really frustrated me; it seemed out of touch with industry, and like I wasn't getting the specific skill buzzwords I needed to put on my resume to get my first job. But looking back on it, I'm really glad to have gone through a program focused more on history and concepts than on technology and details. It's harder (though of course not impossible) to self-learn the broader subject matter, and the details are ever-changing and need to be kept up with constantly anyway.
I think a really good model for universities is broad coursework in combination with an aggressive internship program and industry-sponsored project classes.
Wait, what?? :)
I once got a lowball (cough) offer that was so bad, I wouldn't have been able to pay my (unusually low) rent, buy gas for my (paid off) high-mileage car, and make my (unusually low) monthly student loan payment. Health care was not included.
My experience is that employers with that attitude aren't worth working for. These employers don't respect my career, nor do they have a realistic understanding of how to make a profit.
OOP != maintainable codebase
As Dijkstra said,
"Dont't trade simplicity for familiarity."
Why is everything trying to grab at state? I call information that everything wants access to 'data', and I manage that accordingly. Each application interested in it grabs the data, preferably stored in a database selected specifically for the needs of the data and how it's accessed, parses it into objects, does the operation it needs to on it, and then perhaps writes a new record of data. The objects can go away as soon as they go out of scope, leaving the data available to construct a new object when it's needed.
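That cycle (grab the data, build a short-lived object, operate on it, write a new record) can be sketched like this; the hash stands in for a real datastore and every name is hypothetical:

```ruby
ORDERS = {}  # stands in for a database table chosen for the data's needs

Order = Struct.new(:id, :total) do
  # Operations return new values rather than mutating shared state.
  def with_discount(pct)
    Order.new(id, (total * (1 - pct / 100.0)).round(2))
  end
end

def load_order(id)
  Order.new(id, ORDERS.fetch(id))  # data -> short-lived object
end

def save_order(order)
  ORDERS[order.id] = order.total   # object -> new data record
end
```

The Order object exists only for the duration of the operation; the data outlives it, ready to construct a fresh object next time it's needed.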
People say to store state in a RDBMS, I think that's ridiculous and a perversion of OOP. Program state belongs in memory, not on the network. It's not intended to be tabular, an object's state often consists of references to other objects. I sure hope you're not storing these in a database.
An object's state is only supposed to be accessible through its interface. It's bad OOP to have other parts of the program interrogating its state directly. It's bad OOP to have an object interact with more than a few other objects. If you find yourself violating that, then you're treating data as state, and you need to start managing that data separately, through a persistence layer.
Maintainability means being able to alter a program's behavior without having to understand the whole thing or make drastic edits. If you follow the rules of OOP, and don't just say you're doing OOP because you're using classes and such, then you'll have earned maintainability: you can change a class's internal behavior without affecting the rest of the program, because the rest of the program uses the class's interface rather than needing deep knowledge of its internals. And you can change the interfaces too, touching only the two or so other classes that use them.
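A tiny illustration of that claim (names are mine, not from any real codebase): callers of the class below depend only on its interface, so its internal representation can change freely.

```ruby
class Cart
  def initialize
    @items = []  # internal detail: could become a running sum instead
  end

  def add(price)
    @items << price
    self
  end

  # The interface callers rely on. Swapping the Array for a running
  # total would change nothing outside this class.
  def total
    @items.sum
  end
end
```

As long as `total` keeps its contract, no caller of `Cart.new.add(3).add(4).total` ever needs to know how the items are stored.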
I did a Ruby on Rails bootcamp in Chicago a while back. At the end, I decided to take a mid-six-figure job in another industry, and now I want to go back.
It's been two years, and while I have been able to save a couple years' reserve, I am basically starting over. I lean towards Ruby on Rails because at one time I was offered an entry-level position, and I know maybe a dozen developers who went through the same program.
I have money/time and the determination (in a previous job I taught myself chemical engineering, so Ruby on Rails was much easier).
ANY ADVICE WOULD BE APPRECIATED!
You definitely won't find that kind of salary as a rails dev.
This remarkable statement is probably true if we correct difficulty levels to the average ability of the population of "web developers".
Doing stuff that's "math-heavy or algorithmic" often involves years of training before you are even remotely effective. People are not snapping up the equivalent of "code school grads with little experience" to design and build optimizing compilers, computer vision systems, high performance distributed systems, operating systems, etc.
You need years of training before you're any good at web dev too. For the fields you listed, people build toy solutions to all of those problems all the time. Just for fun.
So much unreasonable elitism in this industry.
My experience is that as soon as I mention that I'd soon move back to my home country (in Eastern Europe), the conversation dries up, as 99.9% of the SV companies that contacted me are only looking to move talent to SV and don't want to offer remote positions.
Write me, if you're in really deep trouble. ;)
More seriously, it's because web developers are in high demand, so there are going to be more postings for them; a proper web presence is absolutely vital to modern businesses, and that requires developers to establish it. Hacker News is also run by Y Combinator, which specializes in funding startups - a market that leans very heavily on web development, since many of those startups are based on web apps - and will therefore have an inflated number of web development job postings by that virtue alone.
The former will need to do several hundred writing jobs per year, while the latter may only need two. Thus, 99.5% of writing jobs may be for journalist stories. Any novelist looking for commissioned work will not want to sift through 200 posts to find even one relevant listing. So people posting such jobs would probably get better results on a site that explicitly excludes the noise.
In short, there are more "lightweight" postings because the people who solicit them and do them need to secure new work more often. If you can make 20 websites in the same time that you could build one enterprise application, you will probably see that job advertisements are 95% for websites and 5% for business software.
The demand for web-devs might be artificially high right now, as most normal people have not yet discovered that they do not really need a programmer to build their website. I think Squarespace / Wix / WP etc, maybe with a custom design, should cover most websites.
I've been doing back-end web programming as a freelancer for a few months, but it's really hard to find good clients, so I'm moving to something else.
There are plenty of other tech jobs, just not necessarily in the startups that YC focuses on.
All those extra features that you can cram on a web page need that much more effort on mobile. People, for some reason, seem to be understanding that mobile costs a LOT of money, so they wait until later. But they expect to be able to build the web side really cheap, so they stick with that and argue price.
Just my little anecdote....