The deskilling of web dev is harming us all (baldurbjarnason.com)
83 points by loop22 3 months ago | 58 comments



I dunno. On the one hand I hate “web dev” more than anyone. I think it has led to such an astronomical decline in software quality that if you described it to someone from the days when computers were 1000x slower, they straight up wouldn’t believe you.

That said… the article doesn’t really ring true to me. What he is saying about the complexity of each part of the stack (HTTP, HTML/DOM, CSS) is technically true, but that’s not really how it washes out in practice. This whole “CSS is a complex graphics engine!” “HTTP is a protocol you could write a whole dissertation about!” sounds like an argument being made by someone trying to make a rhetorical point about the web. In practice, for most of web dev you don’t need to understand the deep nuances of CSS or HTTP or whatever. Yes, there is a large breadth of material you have to learn, but the depth you actually need in any one area is much less than the author is trying to imply.

And yes, web is trash, but for different reasons. In fact some of those reasons are the opposite of what the author is saying. He says that each part of the stack is so complex it should be a separate specialty. But the real problem is the very fact that things are so complex. Rather than accept that complexity and subdivide the field into different disciplines, we should get rid of all this unneeded complexity to begin with.


He also points out that CSS Grid and HTML tables haven't changed. The web still mostly works the same.

You are yet another perfect example of raw antagonism against the web, a body of hate. You are legion. But, if we look at the arguments here, look at where complexity dwells, the things that are hard and changing aren't the fundamentals, aren't the essentials. They are not so complex.

What is hard/changing is state management. What is hard/changing is handling state in client-server or other connected architectures. What is hard/changing is being smart about offloading work to threads. And it's not like anyone else has conquered this complexity. None of the other ecosystems are particularly far in advance. The complexity of these cases seems to be inherent, not accidental.

The reason for so much complexity is because we change & improve & progress. This makes some people very upset. People drastically over-ascribe the woes of the software development world to the web, when really it's just that the web is now the default place for making software & most companies would bungle up these concerns no matter what platform they were building atop.


> On the one hand I hate “web dev” more than anyone. I think it has led to such an astronomical decline in software quality that if you described it to someone from the days when computers were 1000x slower, they straight up wouldn’t believe you.

Nearly all of this, IMO, can be explained by a lack of passion.

I grew up on computers, starting in the 90s. I didn't have internet access until near the end of the decade, and it was slow dial-up. If you broke the family computer, you had to figure out not only how you had broken it, but how to fix it. When I found Linux (Gentoo, obviously, because it's way more fun to spend days tweaking CFLAGS than to use the software), I was also thrust into forum culture, which was rife with RTFM. You quickly learn either to search and read docs and demonstrate a modicum of capability and effort, or you lose interest and do something else.

This is not the case now. Even before the advent of LLMs, it wasn't that hard to find the answer to most of your questions on SO. The rise of cloud computing means that you don't have to know how to run a computer, you just have to know how to talk to an API – and even then, only at a surface level. You can pretend that TCP doesn't exist, have no idea how disks magically appear on-demand (let alone what a filesystem is), etc. Databases, which used to be magical black boxes that you accessed via queries carefully written by wizened graybeards, are now magical black boxes that you abuse with horrifying schema and terrible queries. Worse, you don't even have to know their lingua franca, and can commit your crimes via an ORM, which probably sucks.

And for all of this abstraction, you are paid handsomely. The difficulty of landing your first job is tremendously high, sure, but the payoff is enormous. Once you're in, you'll find that what most businesses demand is not that you upskill, but that you push features out faster. Grow the user base, beat others to market, and dazzle VCs. No one has time for doing things right, because that slows down velocity.

This is aided and abetted by Agile, specifically Scrum. Aside from maintaining the cottage industry of Agile consultancies, it's designed to turn software production into a factory, where no one person really needs to know how to do anything tremendously complex. Instead of insisting that people learn how to do difficult things, we spend hours per week breaking down tasks into bite-sized increments with absurd abstractions of time.

"Thought leaders" deserve a callout here as well for their contributions to this mess. Microservices are a great example. A potentially useful architecture in some circumstances has been turned into gospel that people buy into without question, and apply everywhere regardless of its utility or fit. If you're lucky, someone eventually notices that rendering a page seems to take a lot longer than it should, but more often than not this is met with a shrug, or at best blaming the DB and paying AWS for a larger instance. Multiple network calls that each have to traverse N layers of routing and queues is slower than calls within a single process? Color me surprised.

When you combine the allure of a high-paying job that has little barrier to entry with no business incentives to do things differently, you get what we have today.


I think the “full stack” trend is harmful.

Yes, it sounds nice in theory, but there are so many things to learn on both the front-end and back-end sides that I’m yet to meet a true “full stack web dev”, even though they all claim to be one.

Yes, a guy who’s done backend all his career can write some basic HTML and CSS and some JS. And some frontend guy can write some simple server-side code that writes to and reads from a datastore. But they’re not “full stack” in my eyes; there has to be some balance in terms of knowledge in both areas. 60-40 would be OK, but 90-10 is not.

When I joined the company I work for, over a decade ago, there were backend guys doing frontend; yes, they delivered something, but the frontend quality was poor. Now it’s the opposite: frontend guys doing backend (and of course they don’t want to deal with SQL, so NoSQL it is); same thing.

Nothing beats a frontend and a backend dev (or multiple) working in tandem, IMO.


It's not remotely a "trend" though. Full-stack is how it has always been for many web developers, since 1994. I have almost never been anything other than a full-stack developer, one way or another.

I did work at a dot com (well, a dot co dot uk) back in the 90s where I had varying jobs, and arguably I was the front end guy for the two most successful ones (but these were successful because of who they were for, not the development; the back-end was a nightmare in one of them). We had to invent things.

Apart from that, I've always just done everything. And I'm good at it all. Slightly conservative or risk-averse after almost three decades, maybe. But still good, and still up to date and learning.

And I'm burned out and want to quit, or get away from the Web, or at least teach (which I've also done).

I don't think newer web developers necessarily understand the luxury of specialising in one part or another. A lot of us didn't get offered the choice.

(But then again, I'm shocked by how many newer developers lack basic competence that I think only comes from deeper understanding of the full stack. There are non-idempotent GET requests on this very website where I am typing.)

ETA: I think in a lot of small shops, developers still end up getting dragged across this divide through circumstance. The web does not really have a front-end/back-end divide, no matter how much recruitment managers, engineering team leads and tech bloggers would like it to have.


Like another commenter here, I've been full stack since my first working day, and to this day, 1.5 decades in the industry. I've always touched on infrastructure, on the database, on the server side and on the client side. Vertical implementations of features.

I can't begin to imagine life as just a ... database guy, or a backend guy, or just a frontend guy. Perhaps everyone needs to accept that we cannot be as good at databases as someone who spends most of their time doing databases. But there are pros and cons to our kind of knowledge.

I can argue pro and con SQL vs. NoSQL, to the limits of my ability, or argue for this frontend framework or that, or consider various languages or architectures for the backend, or discuss how we'll deploy the production version of whatever it is that we're building, or how we'll do development-side CI/CD, and so on and so forth. What am I? I'm open to the idea that I'm a fraud, but I like to consider myself a full stack.


I was this person for the first ~17 years of my career. With the caveat that I was developing features on the bottom of the stack in C but would frequently implement features all the way up to a web interface running on an embedded device.

I'm in games now and I have a specific focus and I really, really enjoy it. Maybe it's my old brain, maybe it's because I have young kids now, I don't know, but I really like that I don't have to context switch so often now.


this is fine when you're a junior learning but it's asking to get shit dumped on you later on. I think it's a good idea to pick a lane later on in your career and settle into it.


I have been saying this for years. The opinion has become even firmer since moving into DB specialization – the horror show of schemata and queries that even dedicated backend teams dream up is unreal. I don’t even really blame them; relational databases are hard. Yes, anyone can install Postgres (or spin up a managed service) and get decent results for quite some time, but at scale you absolutely have to know what you’re doing. There is a reason that DBAs were a thing, and the SaaS industry is slowly realizing that they shouldn’t have abandoned that role.


> at scale you absolutely have to know what you’re doing

There's a lot of runway before you hit that magic scale: https://news.ycombinator.com/item?id=39276069


Yes, but the runway is a good deal shorter if you’re doing things as most dev teams without RDBMS knowledge do. The biggest three issues unfortunately build on each other:

Using JSON when you really should be properly normalizing your schema (or using a DB better suited for JSON).

Using UUIDs – especially UUIDv4 – for any indexed columns, especially as PKs.

Using Postgres for no actual reason other than it’s popular.

MySQL / InnoDB is a clustered-index RDBMS, which can give massive performance improvements over others IFF you build your schema around it. If you use anything non-k-sortable (e.g. UUIDv4) as your PK, it suffers massive performance degradation instead. So you use Postgres instead, and think you’re immune. Wrong – its MVCC implementation means you’ll have approximately 7x read amplification for Visibility Map lookups under the same circumstances, and huge amounts of WAL bloat, which can also lead to increased network latency.
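Roughly, the k-sortable point looks like this (a quick sketch in Node, my own illustration rather than anything from a real schema):

    // UUIDv4 is entirely random, so each insert lands at a random spot in a
    // clustered index (InnoDB) and writes get scattered across pages.
    const { randomUUID, randomBytes } = require('crypto');
    const v4Key = randomUUID();

    // A k-sortable id puts a timestamp in the most significant bits (roughly
    // what UUIDv7 / ULID do), so new rows append near the end of the index.
    function kSortableId() {
      const ts = Date.now().toString(16).padStart(12, '0'); // 48-bit ms timestamp
      const rand = randomBytes(10).toString('hex');          // 80 bits of entropy
      return ts + rand;
    }

    console.log(v4Key, kSortableId());

Same uniqueness in practice, wildly different index locality.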

JSON[B] in Postgres is frequently TOASTed due to its size. The overhead from TOAST/DETOAST isn’t small, and you also have to deal with difficulties indexing the column, and you can’t as of yet do a partial update to a JSON column. MySQL can do partial updates, doesn’t have TOAST, and has better (still annoying) strategies for extracting scalars from them to be indexed as a quasi-sidecar lookup, but it lacks GIN/GiST index types.

I’ve personally run a single MySQL instance at over 100K QPS, and it had plenty of runway and a ton of vertical scaling left. This was with very careful tuning and a suboptimal schema. I’m sure it could go far higher if everything were done correctly from the start.


Fullstack is not much of a trend. The word has been around for a while, but when you look at the job market you can see that the industry doesn't buy these terms very much, and that it prices them that way too. A fullstack salary might only eventually catch up to a frontend or backend one.

Also, backend is a far more ambiguous term than frontend. With frontend I'd almost want to ask if the next thing you're going to say is React. On the backend? No idea. Do you deal with distributed synchronization as your only specialty? Do you do billion-per-second event logging, warehousing, and querying? Do you write Kafka glue all day?

So when people say fullstack they really mean app making, and however much frontend, backend, or even janitorial work is required to make an app exist.


The frontend is just an eventually consistent node in a distributed system. Fullstack means you have a basic understanding of how the entire system works which is invaluable.


At the risk of sticking my head above the parapet, I've made a career doing both to a pretty high standard. Not just that, but also server admin, DBA, desktop app development, and a pile more skills I've picked up from a quarter century doing this stuff. When you're not working for big companies, you have to fill the gaps.

Sorry you feel the people you've worked with are so inadequate, but these are muscles you need to exercise. If they have no opportunities to learn at work, of course they're not punching at the same weight. But there are plenty of us that do.

I don't often announce myself as a full stack dev though. Maybe that's the difference between me and your experience.


> Sorry you feel the people you've worked with are so inadequate, but these are muscles you need to exercise.

I'm as surprised by this as you are.

Perhaps we actually are unusual.


I disagree completely; the best teams are those who are good at both. Separate front-end and back-end teams are significantly slower and always seem to be at odds with each other. Teams should be empowered to build out a working feature without needing to coordinate with a second team's roadmap.


> I think the “full stack” trend is harmful.

Please tell us why you think it's harmful. I don't see the harm in "too much to learn." That just sounds like software work to me.

When people specialize, the different specialties need to show their value, so they find greater complexity in their niche. The different specialties eventually independently "discover" many of the same concepts but explain them differently and their languages even diverge. I think losing the ability to share is a kind of harm.


I don't fully disagree, but this might be subjective.

For me as a full stack developer working with small teams/startup, I actually don't consider myself full stack. I just want to be able to do whatever it takes to make a product and ship features. Does it need a websocket server? I'll learn how to do that. Does it need advanced client side caching? I can do that, etc.

To some extent "product development" is both art and engineering. On the engineering side, you can think of HTML and CSS and HTTP and testing as different things requiring a multidisciplinary team... but if you think just in terms of "building the thing", I like to feel that I can get it done with whatever technology is needed. That's why I got into programming in the first place. Not to write code and be an "engineer", but to make the computer do cool things.

AI does expand the capabilities of someone that wants to get things done. I have written in languages that I don't have experience with, and recently was exploring a neo4j db with cypher queries written with ChatGPT (I have only MongoDB experience), something that would've taken me hours to learn.

Still, having experts in specific areas in a team can be very helpful (both to get technically difficult things done but also to have others learn from them).

I just don't want to be 1 part of a team specializing in my limited domain, where I start to care more about the technical part than the actual value delivered or user experience..

I think what might cause me to burn out is too much specializing, too much bureaucracy, people telling me I can't do this, that we need to hire a staff-level person to do this thing, etc.


> Framework skills are perishable, but are easily just as complicated as the foundation layers of the web platform and it takes just as much – if not more – effort to keep them up to date.

That's so true. For my own projects I try to not use any framework at all (or sometimes still use Backbone, which is entirely deprecated but simple, and that I know well enough).

But of course, as an employee many times you don't have a choice. I was recently part of an Angular team (Angular 2). That was one of my most unpleasant experiences. Angular seems to revel in complexity for complexity's sake. And it's often not needed at all. In that case it was used for displaying information that lives server side and is constantly updated there (live inventory). Why would they need a big client for that?


> I try not to use any framework at all

Aka “I like living in 1999”

Lol


I feel like the real deskilling of web dev is that the web dev in this article doesn't feel competent enough to learn HTML, CSS _and_ JavaScript at the same time.


"The ongoing labour arbitrage – the deskilling of developers – can only be addressed with collective bargaining and union action."

web dev does feel more like cheap labor work tbh. aside from a few exceptions. mostly tedious grunt work.


I'm curious about what the author of this article expects? That all APIs remain frozen in time forever? That new APIs are only something that happens in the frontend?

> The framework knowledge itself is also perishable. Not because your memory or physical coordination deteriorates (though that happens too), but because frameworks change more and faster than the underlying platform.

> But the React skills I have are all out of date and obsolete. I would effectively have to start from scratch even if I wanted to get back into React work. Everything React has changed in ways that are fundamentally incompatible.

I'm lost as to what the author is talking about. I've been using React for... 9-10 years now. There have been two API changes to React in that time: functional components and hooks. Neither of these is rocket science, and they can also be completely ignored; you can stick to your existing knowledge if you so choose. I don't feel like 2 API changes in 10 years is that radical.
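For reference, the sum total of those two changes is roughly this (a toy sketch, not anything from the article):

    import { useState } from 'react';

    // Before: class components with this.state / this.setState.
    // After: a plain function plus the useState hook. Same concept,
    // different syntax.
    function Counter() {
      const [count, setCount] = useState(0);
      return (
        <button onClick={() => setCount(count + 1)}>
          Clicked {count} times
        </button>
      );
    }

    export default Counter;

Hardly a rewrite-the-world level of churn.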


Find a few good tools, understand why they're good, use them extensively and pick up others when needed (i.e. when you switch jobs or face a new problem).

It's really not that hard...


>CSS

>HTML

>JavaScript

> in a sensible industry, would each be a dedicated field.

I get that webdev is a maze of frameworks, but that's just ridiculous.


HTML has been stagnant for so long that as a document-oriented language... it doesn't even have a table of contents element! Or a bibliography!


HTML Custom Elements would allow you to define and provide support for your own <table-of-contents> element, so you could use it, and all browsers would understand it, and dev tools and CSS and JavaScript and all other HTML would work fine with it.
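A minimal sketch of what that could look like (the element name and behaviour here are illustrative, not from any spec):

    // Custom element names must contain a hyphen, so <table-of-contents> qualifies.
    class TableOfContents extends HTMLElement {
      connectedCallback() {
        // Build a plain list of links to every <h2 id="..."> on the page.
        const list = document.createElement('ul');
        for (const heading of document.querySelectorAll('h2[id]')) {
          const item = document.createElement('li');
          const link = document.createElement('a');
          link.href = '#' + heading.id;
          link.textContent = heading.textContent;
          item.append(link);
          list.append(item);
        }
        this.append(list);
      }
    }

    customElements.define('table-of-contents', TableOfContents);
    // In markup: <table-of-contents></table-of-contents>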


All that can be done without semantics. But HTML is also a document-oriented semantic language. Your way would just have idiosyncratic elements, which is little different than a div from the perspective of the rest of the world.


I think what you're calling "my way" is just what HTML is designed to be in its own specification. Custom elements have been in the spec for over a decade and are supported everywhere, they're as legit as any other part of standard HTML.


To descriptively move the needle on the semantics of custom elements you must first move the world. Any semantics you attach to your custom elements really are just your semantics. If you want this to change, then start persuading everyone that your custom elements ought to be semantically observed.

Until then, a custom element is just a div. Semantically opaque.

When we have actual consensus on elements and semantics, that enables very rich clients that can provide an alternative view over the same information, regardless of prior styling. For example, imagine if hovering over any time element also showed you a visual ticking clock or calendar. We wouldn't be able to do that if we didn't have consensus on the semantic meaning of time as something more than a valid token for a parser.
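For a concrete example of the kind of generic tooling that consensus enables (a sketch, assuming pages actually mark things up with <time datetime="...">):

    // Because <time datetime="..."> has agreed-upon semantics, a reader mode
    // or extension can enhance any page without knowing its markup or styling.
    for (const el of document.querySelectorAll('time[datetime]')) {
      const when = new Date(el.dateTime);
      if (!Number.isNaN(when.getTime())) {
        el.title = when.toLocaleString(); // show the reader's local rendering on hover
      }
    }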


As a backend guy I uhh.. have long been of the opinion that web dev was always on the slippery slope of deskilling.

I remember working on a project where I was the lone backend guy doing data storage/retrieval/aggregation/caching/entitlement/etc., all behind a discoverable API that fed the UI everything it needed in a handful of calls.

Meanwhile the web dev guy took 6 weeks of iteration to create a date selector that wasn't awful.

Felt like they were too busy getting stuck gluing together frameworks they didn't understand and thus couldn't integrate well, and not simply writing a little code.

When he finally got it working, it was still awful. You had to select start date, start time, end date, end time.. and if you didn't proceed in the prescribed order the other selector boxes would reset and go wonky, lol.


Sounds like you worked at squarespace... The date picker of their events creation feature works just like that, among other problems with their date selection support like not handling recurring dates...


My take on this is that we have multiplied the number of potential ways to build a web application continuously for decades. And now there are a combined 10,000 (pick a number) viable (but not trendy) ways to make a web application.

The interesting thing to me is that ultimately businesses don't care how it works. They just want it to work.

Which means that you can pick a small framework or two across the frontend and backend and configure or train an AI system on only a tiny fraction of the sum total of web development knowledge and have an effective automated web development system.

Looking at the trajectory of gpt4 to gpt-4o and Llama2 to llama3, the prevalence of multimodality, improved reasoning ability as models get bigger, strong investment in hardware research, etc.

I've been doing web development in some capacity since the late 90s and focused on leveraging generative AI for the last two years. I don't see how any reasonable person can follow this stuff closely and not anticipate AI systems that can literally do the entire job of a small web development team, within just a few years. It was actually possible to build a version of that two years ago, and some of the latest attempts are very polished, if lacking in some level of functionality. But that is coming.

Every single job that we have today will be automated. I assume this means that people will be left basically herding swarms of AI agents. For a few years. But it won't be very long before you really need an AI to control your agent swarms or even understand what they are doing.


> Testing, irrespective of the language or platform, is yet another complex specialisation we all just pretend doesn’t exist.

Speak for yourself, me and my team use a CI and always add or fix tests right before merging any PR.

(this is a joke; a test suite is most useful when it’s part of the dev process and not an afterthought, as I’m implying above)


I would absolutely hate working in an environment where I had to wait for someone to write the css for me before I could ship code I wrote.

> These are all distinct specialities and web dev teams should be composed of cross-functional specialists.

I completely disagree and also thankfully this will never happen because it's completely impractical.


An honest question: could someone tell me why CSS is getting more and more complex (other than Google and Co. wanting to protect their browsers' market share and hold control of it)?


CSS is easier than it's ever been


It's not getting more complex, because the easy stuff you have always been able to do is still there, exactly the same, as easy as it's always been. There is more of CSS now, so you have more power to express styles for more media in better ways, but you only need to use what you want, so you don't need to make things complicated at all.


because people got tired of slicing up drop shadows into 8 meticulously placed transparent PNGs


Judging by many websites, I think a lot of new web developers choose to tackle the problem of having too much to learn by not bothering.


Easy to fix.

Every client should have the delivered site run through pagespeed.web.dev, and when there are 4 green circles around the four 100s, the web designer gets paid, provided the client likes the site. This is not an OR gate.


Hard to sympathise. Generalizes like mad. Stop being a full stack. Have something you do well and you should have little problem in this industry. There's so much work and so many opportunities.


> There's so much work and so many opportunities.

Many of them suck though. I usually get hired as a troubleshooter: that pays the most and it’s nice, you get to see many companies, etc. I work with people who are hired for frontend, backend, etc.; a month or so in, they will be asked things like ‘so, devops, is that something you can do?’. And that is if it’s not in the actual job offer: ‘frontend expert wanted with 4000 years experience in everything’. I usually leave after 3-6 months as I ask for a lot of money, but I have been asked to advise and help on Windows upgrades, printer emergencies and more.


Well then you're not selling yourself well. Stop taking shit jobs. Brand yourself better.


I have always just done exactly what I want; I just see people around me who get roped into crap.


It starts off articulate, but then I felt the arguments became less clear. It seems perhaps ungrounded. It's nice that they have energy, in a way, but it also seems vaguely conspiratorial rather than emergent. I've nothing against people seeking a better lot in life and collaborating, or pointing out typical flaws and rent-seeking. I wish the perspective were easier to follow; I'm curious.


This article hit close to home (maybe not the last part). In my last few jobs, I've been horrified at how (mis)treated CSS and HTML are.

<rant mode on>

Very recently, I had the displeasure of coming back from a 3-week-long vacation and discovering that the rest of the team had added Angular Material to the project, and started to sprinkle it everywhere in the code.

This was despite me (the only front-end guy in a team otherwise constituted only of fullstack devs who are really more backend devs dabbling in frontend) asking them not to add Angular Material or any 'UI library', because this project is very simple: a mobile-only classic layout with a few simple forms.

Nothing in the designer's mockup for this project looked like Material Design, nothing. But since Angular Material is marketed as basically being part of Angular, they told me that it was "good practice" and "a way to save time". As a result, instead of having a clean DOM with a few classes and basically just a border-radius on the buttons, they wrote tons of ugly CSS to override the Material styling that they themselves added, all with the explicit aim of avoiding doing any CSS at all, ironically.

Just to reiterate, it's really a super simple app with a sticky header and footer, some forms with a few borders, and some rounded buttons. I had already done the main layout for them (it was basically a dozen or so CSS lines), but they ignored it all and added tons of stupid Material components underneath.

I mean look at this crap:

https://material.angular.io/components/toolbar/overview

<mat-toolbar> is a component that was actually imported in the project, which:

- does not have any JS behavior

- does not handle positioning (you can do it yourself with flex! How powerful!)

- does not have any ARIA role attached.

- but DOES have a default color and background, thanks to the customizable Material palette!

In other words, it's a fucking <div> with a default background, but props to the Angular Material marketing and the mountain of content marketing out there for brainwashing hordes of webdevs into thinking that it's simpler to import an external dependency than to specify a background in one class.
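For comparison, the whole hand-rolled equivalent is roughly this (a sketch of the kind of thing I mean, not the actual project code):

    <style>
      /* A sticky toolbar is a div with a class; no component library required. */
      .toolbar {
        position: sticky;
        top: 0;
        display: flex;
        align-items: center;
        gap: 1rem;
        min-height: 3.5rem;
        padding: 0 1rem;
        background: #3f51b5;
        color: #fff;
      }
    </style>

    <header class="toolbar">My App</header>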

Another example, here is the doc for the '<mat-divider>' component: https://material.angular.io/components/divider/overview

I understand why a Google designer would specify such an element in a platform-agnostic way, but for everyone else working on a web project that only used Material to 'save time', why the hell would you import and use this component instead of just using the native <hr> tag, available since the nineties?

It's easier, I guess. Oh, except for the planned migration to Material M3 that you will have to do in a few years if you want to be able to upgrade to the latest version of Angular. But I guess adding some borders and some margins is just incredibly hard.

<rant mode off/>

The saddest part of all is that I love doing frontend dev in general, and as the article points out, the fundamentals of HTML/CSS/JS are incredibly stable and backwards-compatible, yet the whole ecosystem keeps piling on layers and layers of crap, all in the name of 'DX' and 'good practice' (which seem to change every 6 months).

What I do believe about the CSS case is that it looks too simple, deceptively so, and one can dabble in it without knowing the fundamentals. So a junior dev decides CSS is broken the first time something with "position: absolute" isn't positioned the way they expect, or when they find themselves coding in JS something that could be done with a handy CSS selector, out of ignorance of the available selectors.
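A few examples of the kind of thing I mean, all plain CSS that juniors often re-implement in JS (illustrative, not from any specific codebase):

    /* show a hint only while its field is focused and invalid */
    input:invalid:focus + .hint { color: #b00020; }

    /* zebra-stripe a table */
    tbody tr:nth-child(odd) { background: #f5f5f5; }

    /* strike through a label when its checkbox is checked */
    input[type="checkbox"]:checked + label { text-decoration: line-through; }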

As a result of this apparent simplicity of CSS, I think there is a general reluctance to 'respect' the language enough to really learn it, which makes for more piles of broken CSS riddled with !important, and thus more people believing that CSS and HTML are fundamentally crap.

EDIT: I'm not criticizing Angular Material in particular. In the React ecosystem I actually saw a 'UI component library' offering elements for making text bold or italic. Yeesh.


> Service workers, which require a completely different mode of thinking from the rest of front-end development and have more in common with web server programming than DOM manipulation.

This is spoken like a true front end developer who's never used a thread before.


This went from "CSS framework bad" to "capitalism is bad" really quick.


The article doesn't refer to capitalism. What in the article are you identifying as references to capitalism?


I do agree with the grandparent; I was a bit surprised too, in the last paragraphs, that the article's solution to frontend overengineering is to unionize.

Though, true, the article doesn't go into 'capitalism = bad'.


That's not really what the article is about; it's a theme, but the argument is about power in the workplace and how it erodes the conditions for accumulating professional skill. It's on the nose about this from the outset; I don't understand how you could miss it.

Since shareholders and management in large corporations control much of what software developers do all day, every day, if one considers this a problem and wants a solution, what can you think about except unions? I can think of one thing, guilds. Like the lawyers and doctors and accountants we could form guilds, only take in people we know are decent and honest and we kick them out if they turn bad or don't live up to our professional standards. And then we could use that as a base for collective bargaining, or if necessary, extortion.

If you don't have anything else, then I think union is likely the more practical suggestion. Many know what a union is, there's recent history of success, striking is well understood.


I would not even qualify the gripes about big tech in the article as being capitalism. What we have today is something closer to corporatism.


> And nothing coming out of either the US or Europe comes close to addressing the true problem, which is that these companies are simply too big.

The tech industry will never be a genuinely free market as long as big tech companies are allowed to be as big as they are today.

What we have today is a centrally-planned economy by MBA sociopaths, operated as a looting ground for the rich.

It will never function on normal competitive, supply-and-demand market principles.

Because, even though a healthier market is the only thing that has a hope of a return to the fast-growing tech industry of prior decades, it would also require big tech companies to accept a smaller slice of the overall pie and allow new competitors to grow.

Why do that when you can strangle the market and keep the entire corpse for yourself?


OK, and you consider oligopolies like these capitalistic, as well as assume that the author does too?


This is sarcasm, right?


"De-skilling" would be a better way to write this. I was halfway through before realizing it wasn't about "desk-killing" or the pretending that workers are incorporeal gossamer entities that do not have to physically interact with the real world. (a/k/a "remote work")


There’s more to life than web dev. Backends for instance.




