
> I think programmers tend to underestimate the difficulties involved with becoming a good programmer because once you're good, you only see the even steeper learning curve ahead of you.

Good programmers I know also overestimate the skill needed to earn a high salary in this job. You don't have to go up the learning curve much; these days, you just teach yourself a little JS and go for a webdev job, making shit code and still earning more than most people in a given country.

> But wondering why the wealthiest corporations in the world pay their workers high salaries is perhaps like wondering why water is wet. Software has a low marginal cost, and the rest is basic incentives for the corporations.

Nah, that's like wondering why this ice block sitting on a hot plate is still solid. The answer is: because it just got put there, and it'll melt in a moment. So too will high salaries end, as most of the low-hanging fruit gets eaten by software made by a mass-produced cohort of programmers.




> Nah, that's like wondering why this ice block sitting on a hot plate is still solid. The answer is: because it just got put there, and it'll melt in a moment. So too will high salaries end, as most of the low-hanging fruit gets eaten by software made by a mass-produced cohort of programmers.

Our industry has its share of cycles, but this, in my view, is largely wishful thinking. Nothing wrong with optimism, but...

Every 5-10 years there's a "technical shift" that forces everyone to reevaluate how they build software, or more importantly what they build, and the race starts all over again. The ice block is removed from the hot plate and replaced by a bigger, colder block of ice. And when these technical shifts aren't taking place, the bar for what constitutes "good software" inches upward.

If your standards for acceptable software were frozen in time in 1985, then using modern hardware and software toolchains you could accomplish in one day what used to take a small team an entire month. But if I delivered today what passed for a "good enough" database in 1985, it would resemble someone's 200-level CS final project rather than a commercially viable piece of software.


I have not noticed these technical shifts per se. What I have noticed is that mature engineers move on and do other things, and new ones reinvent the wheel with some new fancy language or term which then becomes the new way of doing things, and the cycle repeats. Sometimes there is a great deal of value when a new level of abstraction happens, but I wouldn't call this a shift; it's just progression.

Many of the underlying problems and solutions have existed for decades. The database systems you mention are a good example of this.


When I talk about shifts, I'm referring to things like the proliferation of smartphones and tablets, which increased the net demand for software and brought with them entirely new specializations of knowledge.

While there are some key concepts to things like databases, the fact remains that your 1985 database would not be considered sufficient in today's world: it would have too many limitations, lack features we now take for granted, would not scale to modern data requirements, etc. Supporting all that "modern" functionality is non-trivial and requires a huge amount of effort. You can't just say, "Well, we figured out space- and computationally-efficient hashing, so relational databases are well on their way to being feature-complete."

There's a reason we haven't stuck with 1.0 on our platforms, and it's not just security or a desire for a bigger version number: New demands required new functionality and new ways of building things.


"When I talk about shifts, I'm referring to things like the proliferation of smartphones and tablets, which increase the net-demand for software and along with an entirely new specializations of knowledge."

iOS was basically AppKit, so anyone already developing for the Mac knew most of what they needed to know to develop for iPhone.

Pretty much every programming innovation is incremental, and doesn't require throwing out all of your previous knowledge and starting over.


> iOS was basically AppKit, so anyone already developing for the Mac knew most of what they needed to know to develop for iPhone.

Maybe. But AppKit was not the Mac Toolbox.

When my career began, being good at memory management was a skill to be proud of. I would say that now, being good at concurrency is a skill to be proud of.

I don't really have to worry about memory management any longer, but I didn't have to worry about threading when I started my career either.

As I see the younger generation entering the programming field I wonder in what ways the craft will be different when they've had a few decades under their belt.

Will tuning parameters and datasets for machine learning be the coveted skill? Who knows.


> so anyone already developing for the Mac knew most of what they needed to know to develop for iPhone

But the demand for developing in AppKit suddenly increased by orders of magnitude.



MongoDB didn't become a public company through innovations in fundamental distributed database technology or even through good engineering. They became a public company because once Javascript became adequate for building client software, there was a strong incentive to build MVPs using the same data structures from client to server to DB, and once you build an MVP that gets adoption there's a strong incentive not to switch databases.

That's the sort of shift in the environment that the grandparent is talking about. Fundamental CS tech was arguably better in the 1970s and 1980s, because it moved more slowly and you had time to get the details right. That doesn't matter if you're building, say, a mobile Ethereum wallet in 2018, because you're building for the user expectations of today; they don't care about data integrity or security as long as it doesn't fail during the period when they're deciding which tech to use, and software that solves the problem (poorly) now is better than software that doesn't exist.


> Fundamental CS tech was arguably better in the 1970s and 1980s, because it moved more slowly and you had time to get the details right. That doesn't matter if you're building, say, a mobile Ethereum wallet in 2018, because you're building for the user expectations of today; they don't care about data integrity or security as long as it doesn't fail during the period when they're deciding which tech to use, and software that solves the problem (poorly) now is better than software that doesn't exist.

I believe you are a victim of survivorship bias.

There was plenty of shitty software in the 70s and 80s. The difference between then and now is that we haven't yet had four decades to see which software of 2018 stands the test of time.


I agree; back then there was not a mindset of "move fast and break things". It was foundational research, and a lot of the results and learnings from then are still applicable today.


> It was foundational research

In the 1980s there was a lot of "foundational research" (poorly re-inventing the wheel) for microcomputer people who did not know about the work done on large computers in the 1960s. Move fast and break things was also very much a thing for microcomputer manufacturers and most microcomputer software vendors. Look at how many releases software packages went through, and at what rate.


I think you're agreeing to disagree. His viewpoint, to me at least, is opposite to yours. Or at least parallel. He never said that today there's no foundational research.


> software that solves the problem (poorly) now is better than software that doesn't exist.

That has always been the case and is about as good a one-line summary of the software industry as I can think of.


It could be a one-line summary of Richard Gabriel's famous essay "The Rise of 'Worse is Better'":

https://www.jwz.org/doc/worse-is-better.html


Done is better than good!


But on the flip side of that slow movement, you wind up with things like Oracle... for good or bad.


And the mainframes that run our banks, transportation systems, healthcare, public safety, etc. Use the right tool for the job, and price it against what the market will bear. Pacemakers and insulin pumps driven by npm updates (shudder).


This predates the formulation of the CAP theorem by ~15 years, so I doubt it contains everything relevant to a modern distributed database.


It's before my time, but I'm pretty sure people had at least an intuitive understanding of what partitioning does to a datastore before Eric Brewer wrote it down.


I'm not sure why you're getting downvoted; I'll upvote you.

The "modern" database systems are now going back to exactly the design principles that the books you refer to solved a long time ago. There are tons of research papers and dissertations from decades ago that focus on this.

It's only now that the new systems are realizing that these problems actually exist.

If you don't know the history of a field and what came out of it, you repeat the same mistakes. This seems to apply to software engineering too.


I can assure you that the major theories behind modern big distributed databases were not written in decades old books.


Yeah, distributed database systems from the late '70s and early '80s actually had certain transactional guarantees that some of these "modern big distributed systems" you refer to still don't have.


Yes, some of those theories were also applied in decades old systems. Look up Tandem Computers and Teradata.


In maturing parts of the software industry you'll often see a desire to stay with the times in order to maintain a competitive edge; re-inventing the wheel often looks like full or partial rewrites of a system for minor marginal gains.

A great example of this is the evolution of FB/Google/Amazon. Portions of their core tech have been completely rewritten over the years for marginal gain, but there is a large premium on being the best in tech.

In other parts of the industry every new cycle enables some new area of tech, and those marginal gains become the minimum bar for entry. e.g. Deep Learning and Computer Vision, distributed systems and cloud computing/SaaS.


You overestimate the quality, reliability, and expandability of those old systems. The 1998 "ten blue links" tech couldn't support the functionality and scale of Google today.


> your standards for acceptable software were frozen in time in 1985

Haha, in 1985 the Amiga had a multitasking GUI desktop in 512 KB of RAM at 7 MHz. Now we have millions of times more computational power and struggle to deliver an experience as crisp and satisfying.

I wish people still made software like it was 1985, that actually used the full power of the hardware for something useful...


Yeah, but Exec/Kickstart/Workbench was missing some basic niceties that are table stakes today:

- No process or kernel security (processes could rewrite kernel code)

- Processes could effectively disable the scheduler

- Supported only a single, fixed address space: both a massive limitation and a performance hack that made the system bearable to use

- Single-user

- No security model

There are embedded applications these days where not having these features is a deal-breaker. Let me assure you: if you re-implemented the original Amiga OS feature set, it too would be screaming fast. The tricky part is keeping it fast once you start adding protection and additional functionality on top of it.

And largely what happened when you tried to implement more complicated applications on top of these primitive systems was that they would crash the entire system, constantly.


That, and the fact that the Amiga was a clean-room design. The IBM PC was already old, and even x86-64 won over Itanium. Backwards compatibility has a cost but also gains. The Amiga wasn't even compatible with the C64.


> That, and the fact that the Amiga was a clean-room design. The IBM PC was already...

Well, the PC wasn't backward compatible with CP/M, so that's an odd critique to level at the Amiga.


Sure, but the "PC" at the time of the Amiga already meant PC compatibles, backward compatible with the XT and the original IBM PC. Besides, the Amiga was heavily optimized for 2D graphics; the 3D story has been less rosy. But I would seriously welcome a new Amiga. That wouldn't mean a specced-up AmigaOS running on PC hardware; it would mean a radical new mobile architecture, or a VR machine. Some fresh air on a radical, alien architecture. A belt CPU? A GPU/CPU à la Xeon Phi? RAM only? Dunno. Something crazy.


Itanium only lost because Intel made the mistake of allowing AMD to design x86 chips.


That would mean Itanium could only have won through a monopoly, and that with open competition, backwards compatibility won.


Itanium also had an emulation mode; while not perfect, they could eventually have improved it.

Backwards compatibility didn't matter in the mobile world, or for those screaming for ARM laptops.

Sometimes we just can't have nice things.


> more complicated applications on top of these primitive systems was that they would crash the entire system, constantly

That's chicken-and-egg. Why do modern apps with actually quite simple functionality need all this vast power, the GHz and the GB? Because it's there. Why does software crash? Because the OS lets it, with nothing more than inconvenience to the user.

Amigas were actually quite usable; they were stable enough for complex software to be developed and real work to be done. Same for STs.


Your whole point is extremely backwards! Software in 1985 was more "complete" than today. Even CD-ROMs shipped with magazines had to support more Windows variations than "good software" today supports browsers! Not to mention that every corporation with a software team also had usability and QA teams. Software quality and resiliency were much better than today. Then in the '90s it took a dive, because being online made it "first to market wins", and it has been downhill from there.

So your software concepts from '85 would be overkill today, not lacking.


In 1985 you targeted one machine with one thread and one level of memory.

Today I'm working on systems with memory latencies ranging from L0 cache all the way to tape storage. And it's getting worse.


Not when coding machines like the Amiga.


Agreed, and the games industry may be the clearest example of this. In 1985 developers had to actually ship a finished game; there wasn't the opportunity to release a half-finished product and update it later. Compare that to Fallout 76, which was very clearly unfinished at launch.



Games in 1985 shipped with bugs baked in that people were just stuck with. Civilization had a broken Gandhi, and still shipped 4 different bugfix versions.


No game (or any large software project) is ever bug-free, but the standards were higher than they are now. Fallout 76 was literally unplayable at launch: a fairly early mainline quest was broken, so it wasn't possible to reach the endgame. Was Civ unbeatable on release? Sure, Gandhi was bugged, but that was entertaining enough that they kept the bug in the sequels.


There were plenty of bad and buggy games in the '80s; it even crashed the market [1]. We just don't remember them.

Also, modern non-indie games are orders of magnitude larger and more intricate than '80s games, and have about the same distribution of quality and bugs. They're now made by teams of 10-1000 rather than 1-10, though.

[1] https://en.wikipedia.org/wiki/Video_game_crash_of_1983


Neither Civilization nor Windows existed in 1985.


Take a guess when Windows 1.0 was released.


> "software in 1985 was more "complete" than today. Even CD roms shipped with magazines had to support more windows variations than "good software"

There was one variation of Windows in 1985. (And I remember installing it!)


Yes, but in your previous comment, you said it didn't exist. It may as well not have! Almost nobody used Windows 1.0...

Still, in 1985, even DOS apps had to target varied environments: mono, CGA, EGA, Tandy graphics, different memory configurations, 8088 or 286, printer drivers...


The anecdote is that they deliberately left nuclear Gandhi in for the lulz.


> inches upward

I wish it inched upward... So much software nowadays is bloated crap.


Preach, brother/sister! My career started in the early '80s and I say this all the time in comments here on HN. You had less to work with but less was expected of you.


I think your analysis is fair, but the Glassdoor data is a bit off. At my work, I filled out salary data on Glassdoor and it never showed up. Maybe they thought it was too high?


End-user tasks might be automated into dumb programming jobs, but companies have tried for decades to down-source their programming, and it always falls flat on its face.

Even if you have library support to hand-hold your budget coders, even if you use a lot of them, even if you give them all the time in the world, they will produce results that are more complicated, less coherent, less stable, buggier, and harder to modify, improve, or iterate on than better coders who understand the problem would.

That means that no matter how little you pay up front, you end up paying more in the long run, throwing more man-hours and money at fixing the mess. A good mess is easier to maintain and improve and costs less over time; a mediocre or bad mess takes substantial effort to maintain and iterate on.

It's also probably an impossible problem to remove any coder's ability to write bad code, if for no other reason than that in any programming environment you can never stop someone from iterating by index over a binary-searchable tree looking for a specific element, or from turning a float into a string to truncate the decimal part and then reinterpreting the result as an int. But if you don't give them the tools (the integer types, the data structures, access to the bytes in some form or another) you aren't really programming; someone else did the programming and you are just trying to compose the result. A lot of businesses, like I said, can be sated by that, but it's still not programming unless you are in a Turing-complete environment, and anyone in such an environment can footgun themselves.
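
To make that concrete, here is a minimal Python sketch of both anti-patterns, using made-up data and a sorted list (via bisect) standing in for the binary-searchable tree; nothing here is from any real codebase.

    import bisect

    # Anti-pattern 1: scanning a sorted structure element by element
    # instead of using the binary search it was built for.
    sorted_ids = [3, 8, 15, 42, 99, 107]  # hypothetical example data

    def find_linear(target):
        # O(n): inspects every element.
        for i, value in enumerate(sorted_ids):
            if value == target:
                return i
        return -1

    def find_binary(target):
        # O(log n): what the sorted structure is for.
        i = bisect.bisect_left(sorted_ids, target)
        return i if i < len(sorted_ids) and sorted_ids[i] == target else -1

    # Anti-pattern 2: float -> string -> int just to drop the decimal part.
    price = 19.95
    the_hard_way = int(str(price).split(".")[0])  # breaks once the float prints in scientific notation, e.g. 1e20
    the_simple_way = int(price)                   # int() already truncates toward zero

    assert find_linear(42) == find_binary(42) == 3
    assert the_hard_way == the_simple_way == 19

No language stops you from writing the first version of either; only the programmer does.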


> and it always falls flat on its face.

How so? What about outsourcing in general? The place where I work wanted to save money, and now we have 4x as many engineers in India. Most of them suck, and we have to shepherd them and correct their mistakes, but the business views it as a success. They are creating features at a higher rate than we were before. Every other large company also has teams in second-world countries because they're cheap. At some point this labor will be both cheaper and qualitatively comparable to US talent.

"any programming environment you can never stop someone from iterating by index over a binary searchable tree for a specific element in it."

SQL solved that problem a long time ago. You write the SELECT; the query optimizer figures out how to execute it, using any available indices.
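
For illustration, a small self-contained sketch using Python's sqlite3 module, with a hypothetical orders table and index: the query only says which rows are wanted, and EXPLAIN QUERY PLAN shows the planner choosing the index on its own.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
    conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
    conn.executemany("INSERT INTO orders (customer_id, total) VALUES (?, ?)",
                     [(i % 100, i * 0.5) for i in range(1000)])

    # The SELECT says nothing about the index; the optimizer picks it up by itself.
    plan = conn.execute(
        "EXPLAIN QUERY PLAN SELECT total FROM orders WHERE customer_id = ?", (42,)
    ).fetchall()
    print(plan)  # should mention something like "SEARCH orders USING INDEX idx_orders_customer"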


I think that misses the point. SQL does not prevent a junior dev from selecting many more rows than he needs and then proceeding to iterate over them naively.
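
As a sketch of that failure mode, here is the same kind of hypothetical sqlite3 setup, repeated so it runs on its own: both versions return the same rows, but the first hauls the whole table into the application and filters it by hand, so the index never gets used.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
    conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
    conn.executemany("INSERT INTO orders (customer_id, total) VALUES (?, ?)",
                     [(i % 100, i * 0.5) for i in range(1000)])

    # Naive: fetch everything, then filter client-side.
    everything = conn.execute("SELECT id, customer_id, total FROM orders").fetchall()
    naive = [row for row in everything if row[1] == 42]

    # Letting the database do the work: the WHERE clause can use the index.
    filtered = conn.execute(
        "SELECT id, customer_id, total FROM orders WHERE customer_id = ?", (42,)
    ).fetchall()

    assert sorted(naive) == sorted(filtered)  # same result, very different cost at scale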


just use Rust


I know it's a troll comment, but it really illustrates my point. Rust feels great when you know all the paradigms and complexity involved in writing Rust code, because for a seasoned veteran it does everything sensibly and right. But for your average person, trying to get into Rust is a nightmare of fighting the borrow checker over all your unintended mistakes that would, at best, be silent undefined behavior in other languages. Top it all off with how expressive Rust is relative to most C-like languages (remember, we had an entire generation who thought function objects were alien because they were taught Java) and it's fantastic if you know what you are doing and a totally lost cause when you don't.


Ironically, knowing that I can get a good salary with relatively shit skills makes me want to up my game. Because it seems like a situation that is inherently unstable.

Eventually it has to change (imo), either through companies becoming more scrupulous in their hiring, or through a massive flood of new devs.


If you take a look at most schools, CS is now one of the most popular majors. When I graduated ~3 years ago the class was ~100 per grade. I went back to recruit and the class size was ~600 per grade. Talking to profs I know all over the country, this seems to be a common theme. The supply curve is about to shift.


I think it will be a massive flood of new devs (maybe we're in it now?) -> overvalued tech stocks -> crash -> stricter hiring.

It's not that tech, even done by shitty devs, isn't valuable. It's a question of whether the market can control itself, and I'm pretty sure the answer is no.


> stricter hiring

Stricter hiring on the low-end perhaps, because in FAANG and the companies that copy them, hiring could hardly get any stricter...


You will get a massive flood of wannabe new devs. But economic theory would tell us that high salaries attract more talent, and even though companies will have more people to choose from, they also become more selective, so the riff-raff still won't get a job. So only the good ones will get the plum roles; of course a few bad ones fall through the cracks, but overall devs are smart people.


I remember back in the early aughts, after the .com implosion and the resultant uptake of outsourcing, there was a pervasive attitude that the high-paying software jobs were not going to come back in any meaningful way. Yeah, well...

There will be a day, but when is hard to say. Thinking it's right around the corner is akin to the belief that we're on the cusp of true AI. We're more pessimistic about that today than we were in the mid '80s. And non-programmers were already programming with HyperCard, FileMaker Pro, VBA, etc., back in the '90s.

There are of course formerly well-paying jobs, such as old-school front-end dev (HTML/CSS/a sprinkle of JS), that are largely commoditized, but that's a given considering the low barrier to entry.


Except the high salaries won't end, given the level of productivity found within the technology industry. Tech companies today are some of the most profitable companies out there, and it's this profit that enables them to keep competing for talent and paying people more.

The same high productivity per employee is found in virtually every other high-paying industry, and most of those industries have not seen pay at the higher end fall over time.


> these days, you just teach yourself a little JS and go for a webdev job, making shit code and still earning more than most people in a given country.

With these jobs in particular, what I see is that the definition of seniority has shifted to 'knowing the latest tech'.

So a junior dev who's just got to grips with React becomes a React Developer, and they are now relatively senior in that field. The experience isn't transferable to other parts of the software stack, though; it's too heavily tied up in the browser. So they end up as a super-specialised frontend dev.

It'll pay pretty well until the tech becomes obsolete, unless that kind of person enjoys maintaining legacy code.


I don't see that mass-produced cohort arriving any time soon. My city has 1 developer for every 3 jobs and it's getting worse.


Universities are already producing them like there's no tomorrow, and we now have bootcamps contributing even more. The only reason there isn't a visible glut of programmers is that the industry is growing even faster. But I can't imagine that growth lasting much longer.


What city is that?


Sydney. This statistic was given to me this week; I actually figure it's 1 person for every 3 vacancies, but who knows.


So then when are doctors going to start making less? And investment bankers, too?


Doctors are already seeing commoditization as healthcare systems consolidate and adopt standardized workflows (aided by EHR implementations). And nurse practitioners and physician assistants are doing much of the work once done exclusively by doctors, but with less training and lower pay.

It wouldn't surprise me if index funds have caused a decline in earnings for investment bankers.


Yeah, but the relatively poor performance of index funds in 2018 (no gainz) will lead to capital flowing back to ibanks and hedge funds.



