Signaling in tech is some fucked up shit (2016) (daiyi.co)
286 points by luu 6 months ago | 194 comments

As someone running a technical recruitment agency, I can assure you that it is true that someone having built stuff in Clojure is more likely to be considered a good engineer than someone who did jQuery all the time.

Recruiters use an intuitive approach to conditional probability. Given only that one piece of information (whether someone did jQuery or Clojure), you are looking at two bell curves, with the X-axis being "programming ability". The bell curve containing all the jQuery people sits more to the left, and the one containing all the Clojurians sits more to the right. The Clojure bell curve still has shitty engineers in it, and some really good people are part of the jQuery one.

Prejudice is sort of rational. If a recruiter looks at one data point, it's unlikely to be at the tails of the bell curve, so the Clojurian applicant will get more attention. If you are treated unfairly because of the group you "belong to", I am sorry - but this is reality.
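The two-bell-curve picture is easy to sketch numerically. Here is a toy Python simulation; every number in it (means, spreads, pool sizes, the "good" cutoff) is invented purely to illustrate the shape of the argument, not taken from any real data:

```python
import random

random.seed(0)

# Entirely made-up parameters: "ability" on a rough 0-100 scale, with the
# Clojure curve centered to the right of the jQuery one but with the same spread.
jquery = [random.gauss(50, 15) for _ in range(100_000)]  # big pool, lower mean
clojure = [random.gauss(65, 15) for _ in range(1_000)]   # small pool, higher mean

GOOD = 70  # arbitrary cutoff for a "good" engineer

# A single random draw from the Clojure curve is more likely to clear the bar...
p_clojure = sum(x > GOOD for x in clojure) / len(clojure)
p_jquery = sum(x > GOOD for x in jquery) / len(jquery)
print(p_clojure > p_jquery)  # True: the prior favors the Clojure applicant

# ...but the curves overlap: the jQuery pool still contains far more good
# people in absolute terms, and the Clojure pool has weak ones too.
print(sum(x > GOOD for x in jquery) > sum(x > GOOD for x in clojure))  # True
print(any(x < GOOD for x in clojure))  # True
```

Both claims in the comment hold at once: the single-draw odds favor the Clojure curve, while plenty of strong people sit in the jQuery one.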

I have decades of experience hiring people, and what I find is this:

If I randomly select a programmer from all the programmers who write Clojure, working or not, interviewing or not, they are likely to be a “better” programmer than if I randomly select a programmer from all the programmers who write JavaScript, working or not, interviewing or not.

But of course, that’s not what happens when I interview. I meet one person in the morning with three years experience working on projects of types A, B, and C. In the afternoon I meet another with four years of experience working on projects of types B, C, and D.

If one has Clojure and the other JavaScript, I’m much more likely to be delighted to discuss Clojure than JavaScript, but as far as HIRE/NO HIRE is concerned, the things that matter to me are the projects of type A, B, C, or D, and the candidate’s role/contribution.

If the JavaScript programmer’s experience is solid, I do not want a recruiter in-house or otherwise trying to save me an interview and filtering them out.

The bottom line is, there are like 100x as many JS folks as CJ folks. So let’s say there are 100 great CJ people out of 1,000 CJ people. 10% ham!

But that means there are 100,000 JS people. If even 1% of them are great, that means that there are 1,000 great JS people out there, and I do not want to pass on meeting any of them just because there are also 99,000 not-so-great JS people out there.

In my imaginary distribution, a random CJ person is 10x more likely to be good. But there are 10x as many good JS people as CJ people, and my experience is that THEIR experience is a far better signal than their programming language.
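The head-count arithmetic above is easy to check in a few lines, using the comment's own imaginary numbers (1,000 Clojure folks at a 10% hit rate vs 100,000 JS folks at 1%):

```python
# The comment's hypothetical numbers, not real survey data.
cj_total, cj_great_rate = 1_000, 0.10
js_total, js_great_rate = 100_000, 0.01

great_cj = round(cj_total * cj_great_rate)  # 100 great Clojure people
great_js = round(js_total * js_great_rate)  # 1,000 great JS people

# A random CJ person is 10x more likely to be great...
print(round(cj_great_rate / js_great_rate))  # 10
# ...but there are 10x as many great JS people in absolute terms.
print(great_js // great_cj)                  # 10
```

The per-person odds and the absolute counts point in opposite directions, which is the whole point of the argument.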


p.s. I am competing with people who only want to hire programmers with shiny signals like ClojureScript or the ability to quote “JavaScript Allongé.” Good! Let them ignore the 1,000 good JS people, easier for me to interview them and make a successful offer.

Now that I think about it, please ignore what I just said. Only interview people who build systems out of Lisp, Prolog, and Malbolge.

We're currently interviewing for positions, which I haven't done in a while. There are many more developers, and even more competition for them. The number of resumes read and interviews conducted for a Sr. position has gone up. I have found that selecting candidates with more diverse exposure/interests greatly increases the s/n ratio, and I gladly do this knowing I'm missing some. It's just more effective in terms of time taken out of team capacity to do interviews.

I have 0 years hiring people; my experience is this. The language(s) you write code in don't mean a thing. It's the ecosystem. I'm a JS programmer, but that is not why you should hire me; it's my experience with the DOM, HTTP, IndexedDB, the File API, documentFragment, the Fetch API, TreeWalker, NPM, WebGL, interactive SVG, the pros and cons of React, when to use Redux, Basic login, NGINX, Docker, etc, etc, etc. My point is this: knowledge of Clojure(Script) is a great tool in any programmer's tool belt, but it is definitely not the most important one for a solid frontend engineer.

Knowledge of APIs is not a good indicator of ability in my experience (I have many years of hiring people). So, the 'ecosystem' isn't particularly interesting and neither are the devs that just put endless lists of acronyms on their CVs. It's much more useful to listen to a dev tell you how they solved various problems, dig into their CS knowledge, see what they like working on in their spare time (code or not code related).

When I see an interviewee list out a bunch of things they have "experience" with like that, I assume they actually mean "I used this once".

It's hard to avoid touching the DOM, HTTP, NPM, React, NGINX, Docker, etc. on a daily basis if you're doing any kind of serious work; it's not just some disjoint list of obscure buzzwords.

Yes, compare and contrast the above to my own partial list of technologies from 1977 to 1997...

“Nova 1220 BASIC, Acius 4th Dimension, Hypercard, MetroWerks CodeWarrior, Lightspeed C, Turbo Pascal, MP/M, DigiTalk Smalltalk/V, J2EE, ...”

What does any of that have to do with each other?

There's one thing: Hypercard and DigiTalk represent directions personal computing should have gone, but didn't (although that's not relevant to the thread topic)

It's not so much that someone has touched <acronym soup>; rather, the kind of people who think just listing all that out is useful are typically not the developers I want.

You just named a lot of things. Why should I hire you?

Ok, the list is distracting. I mean, experience with web dev's ecosystem is more important than the languages you can write code in. Of course a list on its own is useless; it should be supported with actual projects (hopefully some in production).

It's true that, based on my interviewing experience, candidates with only a jQuery background are unlikely to have the skills to cut it developing large-scale software at a large-scale company, which is what I am hiring for. At the moment I interview mostly front-end candidates, and it takes a lot of sifting to find qualified hires.

However, the implicit companion assumption that a resume with lots of clojure experience will have those skills is not something I've seen any evidence for. Many Clojure-focused candidates are unproductively dogmatic and are at risk of wasting everyone's time by trying to rearchitect our codebase because it doesn't meet their particular taste. In general, if a candidate is an adherent to a niche programming language they need to be able to explain to me when its use would be appropriate and inappropriate. If their answer is "this language and/or dogma is always better" then you're adorable but you need some more experience before I'm going to hire you.

It's also important to point out that, while my core requirements are the same for both jQuery and Clojurey candidates (ability to function in a large, complex codebase & ability to manage your own code's complexity), the specific skillsets are very different. Most Clojure enthusiasts are actively bad front-end engineers. Which is fine! But many positions require engineers to at least be able to dabble in UI work, and if I get the sense that you would find it beneath you to do so, it's probably going to mean better luck somewhere else unless I have a very specific position to fill.

Oh yes. Based on my experience hiring on behalf of different firms, arrogance among engineers correlates with the degree of mathy-ness of a stack.

This is my experience hiring Clojure engineers in a small startup: things became dogmatic and opinionated at a time when we were existentially challenged with building a product.

That is definitely a good strategy for a recruiter. However, the context of the author's experience seems to be how the author's coworkers treat them. If I'm working with someone and I get to see their code, or I can take a look if I'm curious, IMO it becomes a vice to use Bayesian probability when you can look at the person's actual code and judge their skill. A different way to put it: Bayesian probability is good in a situation of uncertainty, but if you are not in a situation of uncertainty, it just becomes lazy/prejudicial.

> As someone running a technical recruitment agency, I can assure you that - statistically speaking - it is true that someone having built stuff in Clojure is more likely to be a good engineer than someone who did JQuery all the time.

Would you care to share those statistics? While we're at it let's also be sure to define "good engineer."

Experience looking at thousands of CVs and talking to people who are behind the CVs.

That's not a statistic, nor is it quantifiable. I'm not taking issue with your experience, I'm taking issue with the fact that you've implied your claim has quantitative rigor to it, because I'm extremely skeptical that's the case.

For what it's worth, I'm challenging you as someone who is neither a JavaScript developer nor a Clojure developer. I'm of the opinion that biases like the one the author laments are perpetuated (in part) by these leaky arguments from "statistics" that never appear.

Don't get hung up on the language. People sometimes say "statistically speaking" to mean "Based on my experience over a large number of tests". It doesn't mean "I applied a test of statistical significance to arrive at my conclusion".

You're then meant to combine their stated experience with their stated credentials and your assessment of their trustworthiness to arrive at some conclusion.

This imprecision may annoy you but it is a common way of speaking English so it's probably not in your interest to fight over the choice of wording.

Sure, I understand what you're saying. But I'll continue to challenge the use of language like that. In my opinion, saying something is true "statistically speaking" is more grievous than saying something like, "I'm 99% sure." Both are artifacts of the English language, but I think the former carries much more quantification behind it, implicitly speaking. In that sense it's not just about informal use of language.

This is because saying something is true "statistically" is a claim made in both scientific and casual contexts. It's mentally taxing to figure out which context is in play - someone could have come into this thread with actual statistics and said the same phrase, verbatim. Therefore (as I see it), it's realistically plausible that someone could read the original comment and come away believing recruiters have done actual statistical analysis a la TripleByte to arrive at their conclusion.

I hear you. I removed the wording "statistically speaking" from the initial comment.

In English, "statistically speaking" can mean you are making a claim about the odds of something rather than a 1:1 correspondence. It is extremely common and absolutely correct.

For example I have not run a study but I am very confident that, statistically speaking, tall people are better at basketball than short people. I am saying that I have an opinion that if you were to try and choose good basketball players, biasing yourself towards the tall people will increase your odds of making a good choice. I use the phrase "statistically speaking" to make it clear I am not saying "All tall people are better at basketball than short people".

In a lot of contexts you will drop the phrase completely because you are talking to other smart people and they will know that the context of your statement is in the realm of populations and odds, but online there are a lot of pedants with poor communication skills, so you have to be extra clear.

If you believe this is about pedantry and the literal meaning of words, you’ve missed my point. However, you’re speaking to it: the commenter used the phrase “statistically speaking” to indicate their confidence in their assertion. I’m taking issue with the fact that they used the phrase - which, as you say, is strong enough to describe the way taller people are better at basketball - to generalize that Clojure developers are more likely to be qualified than JavaScript developers.

Idiomatically, any way you use the phrase implies great confidence in the corresponding claim. But I have not seen any substantial explanation of why this effect makes sense, which makes me extremely suspicious that the effect exists at all. I don’t particularly care about whether they’ve conducted a peer reviewed study. I care about whether or not their heuristic is backed by anything empirical. “I’ve noticed this...” doesn’t cut it, no matter how fast and loose you want to play with the literal meaning of the word in context.

It doesn't imply great confidence, it is literally saying a different thing. You seem to think it means the same thing as "scientifically speaking" or "I have evidence that". It very often means only "related to statistics". That's it. No epistemological claims needed.

"I'm so going to win the lottery"

"On average ... no you aren't" / "Statistically speaking ... no you aren't" / "If you work out the odds ... no you aren't" / etc.

If I ask you a question about some gambling game, you are going to think about it in a statistical way. I mean you will think in terms of populations and samples and odds, not that you will go perform experiments and then do a statistical analysis of that evidence in order to come to a scientific conclusion.

I think you're mistaken.

I, like you, am riled up by people using puff words to inflate the apparent credibility of claims, often claims that have little or no basis for actual credibility.

But I think that "statistically speaking", as it is used by nearly all English speakers, means "speaking about these groups in aggregate", not "speaking as a statistician". It doesn't even mean something about an empirical basis, it means talking about groups. I agree pretty much entirely with freshhawk's analysis.

Now, aside from _word choice_, your skepticism about whether the original poster has any empirical basis at all for their claim... well, fair enough.

He just said it was holistic.

The commenter said "statistically speaking" to justify part of their point. I am directly challenging their assertion and requesting evidence that they have 1) actually quantified something in more than a hand wavey manner, and 2) found the (falsifiable) conclusion they stated, based on that analysis.

I'm deeply skeptical. People like to throw around percentages and references to statistics to justify their biases, but I don't think the commenter has done nearly enough to motivate the conclusion that the modal Clojure developer is more likely to be a "good engineer" than the modal JavaScript developer.

Next time you have a blocked toilet, and the plumber says it's a problem with 'ABC' why don't you just tell him that's not quantifiable! and then tell him you think it's 'XYZ' that's the problem.

If someone who interviews thousands of people finds a correlation between those who have some Clojure experience and quality, I believe him.

For the simple reason that most programmers don't try very hard and anyone taking the time to even learn a rarely used language is demonstrating a positive quality that most don't.

If the plumber can explain to me in coherent reasoning why the problem is likely with 'ABC', I'm not going to challenge it. Especially if they don't handwave about statistics they almost assuredly don't have and I have no domain experience in plumbing. That analogy doesn't work.

I stand by my point: using words like "statistically" and "correlation" perpetuate biases which may not have a rational basis. In this entire thread no one has explained why Clojure programmers are more likely to be "good" than JavaScript programmers (nor has "good" been defined!). I could just as plausibly state that someone tried to learn Clojure because they perceive it to be a buzzword, just as JavaScript programmers are often accused of playing buzzword bingo with web development frameworks.

Empirically speaking, we haven't ended up anywhere. No matter how sure you are of this phenomenon, unless you try to make its observation more robust, it will continue to be a microcosm of the tech hiring industry. Heuristics often belie subtle biases that do not actually have a foundation in truth.

most programmers don't try very hard and anyone taking the time to even learn a rarely used language is demonstrating a positive quality that most don't.

Once known as the Python Paradox http://www.paulgraham.com/pypar.html

Nowadays I suppose it would be the Clojure Conundrum or the Haskell Happenstance

I have never talked to a recruiter with any technical expertise in my whole 12-year career.

Funny, but presumably just selecting for recruiters with deep knowledge of pokemon, not programming languages.

And you're assuming pokemon trainer abilities aren't congruous with tech recruiting.

Which is krebase? Google seems to only return references to this image.

Beats me, not my image :)

It's really sad, because I get called by 2 of them a week :/

Their websites sound like they are the pinnacle of the industry, but on phone/email they sound like a bunch of HS dropouts :(

I think it’s pure hubris to say that anyone can distinguish overall skill at a statistically significant level between these different populations.

Heck, I’d even say if you only look at the population of engineers who you have directly observed doing a good job on some interview trivia or code test, you still cannot reliably conclude that draws from that population are statistically significantly more likely to be acceptably good at a job once hired than draws from a different population.

People and processes capable of detecting skilled engineers are extremely rare. Skilled engineers are also somewhat rare.

What we set up as tech hiring processes are mostly just crap-shoot just-so stories we tell ourselves, mixed with a ton of hubris.

Presumably people are paying you to find good engineers, not just people who match particular profiles. If "rational prejudice" doesn't always find the good engineers, why justify it like this? Wouldn't it be better to take this as evidence that your approach could use improvement?

> people are paying you to find good engineers

No, they're paying them to find engineers that will pass the interview process. It's a subtle, but important distinction since prejudices that exist among engineers will naturally develop in recruiters. It's the recruiting equivalent of teaching to the test.

And like it or not, the kind of signaling the article describes is real. When an engineer says they can write Clojure, it indicates a curiosity that the ability to program JavaScript does not. Almost no one is forced to learn Clojure because of some job, project, or even job prospects. It means you likely cared enough to devote your own time to learning for the sake of learning. And a lot of engineers look at this kind of thing as important and screen for it in interviews.

A recruiter operating any differently would be like a machine learning algorithm that ignores its training data.

It means you likely cared enough to devote your own time to learning for the sake of learning. And a lot of engineers look at this kind of thing as important and screen for it in interviews.

The curious thing is, I think the number who regard it as important is likely far higher than the number who actually do it. Which is understandable; once you reach a level of seniority you also have many other demands on your time and may not be able to be a hobbyist anymore on top of day job, family commitments and so on. So maybe once you learnt every new language but now you don’t have the opportunity, or you have decided to “go deep” in just a few, established languages. So the power of the signal is highly variable depending both on where the interviewer and interviewee are in their lives.

Also, knowing JS in the late-90s did indicate a curious and ambitious developer. It only went mainstream later...

No. You have limited resources to sift through people. Heuristics like this are an efficient way to reduce your search space. That is sort of the whole point of statistics.

Yep. Funny that HackerNews seems to be unable to wrap its head around the fact that prejudice is just applied Bayesian statistics.

Please don't take HN threads on generic ideological tangents.


I'm gonna say prejudice is actually not applied Bayesian statistics

and you would be right. Say I have a broken phone, which is an Android. A prejudice would be that Androids are likely to break. Mathematically, we could represent that as

  P(broken|android) =~ P(android|broken)

We see from Bayes' rule that this is false: since

  P(broken|android) = P(android|broken) * P(broken) / P(android)

and

  P(android) >> P(broken)

P(broken|android) is actually much smaller than P(android|broken).
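To make the gap concrete, here is a tiny Python sketch with invented base rates (the 0.70 and 0.05 figures are placeholders for illustration, not real market-share or failure data):

```python
# Hypothetical base rates, invented purely for illustration:
p_android = 0.70               # P(android): most phones in the pool are Androids
p_broken = 0.05                # P(broken): few phones are broken at any moment
p_android_given_broken = 0.70  # broken phones just mirror overall market share

# Bayes' rule: P(broken|android) = P(android|broken) * P(broken) / P(android)
p_broken_given_android = p_android_given_broken * p_broken / p_android

print(round(p_broken_given_android, 4))  # 0.05
# The prejudice P(broken|android) =~ P(android|broken) is off by a factor
# of P(android)/P(broken), which is 14 with these numbers.
print(round(p_android_given_broken / p_broken_given_android))  # 14
```

With these numbers, seeing "Android" tells you almost nothing about "broken", even though most broken phones you encounter are Androids.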

Even worse, what actually happens with prejudice is that if I think that Androids are shitty phones, when I see a broken Android, I add it to my collection of “Shitty Android” data points.

But when my iPhone breaks, I take it into Apple for service and add it to my collection of “Apple gives great service” data points.

Saying that prejudice is just applied statistics completely mischaracterizes prejudice.

I’d argue that prejudice and bias apply to almost all statistics, but most often we just don’t notice or care. E.g. the biases inherent to a specific domain, of those who captured the data, of those who selected the data, the selection of statistical techniques, real-world outcomes reinforced by usage of these statistics, etc.


Please don't take HN threads on generic flamewar tangents. Especially not this, for the god knows how manyth time.

Your question assumes that the programming talent bell curves for men and women are different, which I wouldn’t expect to be the case.

Almost nothing in nature matches that statement.

The "Bell Curve" for females of a species generally has a significantly narrower standard deviation for practically any genetically driven characteristic.

The problem is that none of this applies to an individual.

The "Bell Curve" for females of a species generally has a significantly narrower standard deviation for practically any genetically driven characteristic.

Isn't this limited to traits which are influenced by XY chromosomes? (Usually due to males having only one X and thus being more likely to express recessive genes?)

I think there is much more to the dynamics than just genetic distributions. For instance, not every male and female is opting into a tech career; only those w/ interest and/or talent are pursuing tech careers--after this filter, for example, who knows what remains of any genetic distribution?

Seems like an interesting assumption. I’m not convinced there is a difference, but neither am I confident that there isn’t one.

Yeah, I have no instincts in this because in my experience there have always been far fewer women than men on the engineering team. So this is totally anecdotal. But the few women I’ve worked with have always been in the upper part of the talent scale, while there has been much variance among the men. I.e., I’ve encountered many low-talent (and high-talent) men but not one low-talent woman. Maybe there are enough hurdles for women (even if the only hurdle is that there are so few _other_ women) that you kinda have to be interested & reasonably good at the job to stick around. If this is true, the bell curves may be better for women applicants. However, my anecdotal experience could simply be wrong, and it somehow is even or worse for women, or something else like overlapping curves of different shapes.

From what we understand about the difference in white-collared intelligence in males vs females, there's a higher variance in intelligence among males than in females. So either women really are in general better at coding than men, or the mid-to-low skilled women aren't making it professionally as much as they should, and only the highest-skilled women are being selected. If your anecdote generalizes, that is. (FWIW, that's been my experience as well.)

This logic comes up all the time on HN. I always wonder whether the people repeating it really believe that commercial software development is so intellectually challenging that it is only practiced by people at the very top end of the cognitive spectrum.

No, I was just wondering whether the mean for women is significantly to the right of the mean for men, or whether mid-to-low skilled women aren't being hired for some other reason (hiring bias, self-selection, or whatever). Or maybe my experiences don't generalize, and there actually are a bunch of mid-to-low skilled women whom I've simply never encountered.

My subtext is that commercial software development is so cognitively unremarkable and routinized that you're not going to see any indication of male/female cognitive differences in the demographics of our field.

It's an argument that doesn't play well on message boards, because software developers love to believe that they're part of a cognitive elite "inventing the future" and "eating the world", when in fact really the overwhelming majority of us are performing a bog-standard white collar symbol manipulation job no more challenging than the job done by an accountant or actuary.

Nerds like to pretend that we're all working on the graph isomorphism problem alongside Babai, but really most of our days involve basic wiring of data from one place to another; form fields to database rows to packet fields to page markup. We want to believe we're brain surgeons, but most of the time we're barely even dentists.

I love the deprecation bc I think we sw engineers deserve it to a degree.

But I will also say that various software shops I know of large and small are always desperate to find good talent. I’ve hired quite a bit myself and finding capable talent is very difficult. So our work is not as totally deprecatable as you make it out to be.

Believe me, I don't doubt there's a skill to doing this stuff; I've dedicated a big part of my career to finding and assessing that skill. But there's skill and aptitude to all sorts of things, from legal work to baking. We don't often talk about how there are lots of men or women bakers due to the distribution of intelligence.

I imagine being a doctor is more selective for high IQ than development and there are just as many female doctors as male ones.

If this is a true effect, I imagine it's a selection effect. Women who felt lukewarm about coding never entered, or left.

If there was a difference, are you allowed to note it or will you get fired for it? Programming skill is hard to measure. Chess is an interesting area that is a bit easier to quantify.

Some say "men are more interested in things and women are more interested in people" https://youtu.be/78ZfKUUsNcc?t=256

Given a woman in tech, you probably have someone who is at the tails, someone really good.

Women in tech would presumably have the same signalling as Clojure users - greater probability of being competent ? Is that the point you were making?

> when I told other software people I’m a web developer, I got treated like shit. I was a lower class of coder because I wasn’t “solving cool problems” because making web sites is easy.

Of all the stupid arrogant awfulness in tech that has crossed my path this one is definitely high on the list of the most baffling. Web dev is the hardest and most intimidating dev work I've ever done, by a country mile.

(Whether that complexity is 'justified' is an entirely different discussion that, while no doubt interesting, has nothing to do with this person's post)

The inflated sense of ego associated with “saving the world” and “solving tough problems” has almost completely turned me off from tech, in the Bay Area in particular. My problem isn’t with the sentiment—it’s that 99 out of 100 times it’s pure bullshit. There are a few people out there doing really amazing things, but the vast majority of people who have big egos aren’t the people working on that stuff.

Tech is the new Wall Street in a lot of ways. But instead of being honest about where the paycheck comes from, we gloss it up so it’s less morally reprehensible. I actually have more respect for people on Wall Street these days. At least they are honest with themselves about the game. I think I have more respect for Jordan Belfort and Lou Pai than the exec teams at Facebook and Uber.

>The inflated sense of ego associated with “saving the world” and “solving tough problems” has almost completely turned me off from tech, in the Bay Area in particular. My problem isn’t with the sentiment—it’s that 99 out of 100 times it’s pure bullshit. There are a few people out there doing really amazing things, but the vast majority of people who have big egos aren’t the people working on that stuff.

I see this mindset in Seattle very often. A lot of my friends have recently been quitting their jobs to do something other than tech because of the overly inflated egos and the need for software engineers to constantly "do something new" instead of just maintaining & fixing what they already have.

Or they get tired/can't keep up with having to continuously learn and adapt, which cognitive dissonance defensively turns into disdain for 'something new'.

As a web developer I've even come across other web devs who are chasing "hard problems" and claim CRUD apps are easy and boring. Yet many of them still manage to write nearly-unmaintainable and rigid APIs etc. They haven't even figured out the "basics" but want to go straight for the "hard problems".

One doesn't need to save the world. But not making it worse through ads and ubiquitous tracking would be nice.

So is not pushing negative externalities inherent to web apps to customers just because it makes web devs more productive.

Today's web is a monster.

> So is not pushing negative externalities [...] to customers

Same could be said of bailouts, recycling, MLM, and other sucker games...

The attitude has changed as people have realized the web isn't going away and isn't qualitatively changing. Fifteen years ago you could ask, "Why on earth would a programmer put up with that incoherent mishmash of shitty technologies? For God's sake, if you have any taste at all, work in a different domain." Now the why is clear, and anyone who had a snobbish attitude (I won't hold my breath while the rest of us out themselves) probably feels a little dumb.

I peeked inside the web UI domain a long time ago and said "noooooo thank you, I will come back when this shit gets sorted out." I figured in five or ten years there would be something entirely different (something that made sense) and I would start from scratch on an even playing field, and HTML/CSS and Javascript would end up like Cobol. Now instead of feeling smart for skipping it, I just feel behind, and the people who slogged through that era have earned (really, really earned) a deeper understanding of the technology behind our generation's most influential way of reaching people with information and ideas.

As someone who has slogged through it for 20 years, I wouldn't say I've earned much, other than realizing that it's all an even bigger mess than it used to be. The solutions people come up with to combat it (frameworks, JSX, CSS-in-JS) only exacerbate the complexity, in my opinion. I truly feel like we're headed in the wrong direction when it comes to web development. I haven't figured out what the right direction is yet, short of going back to the drawing board with HTML.

> I haven't figured out what the right direction is yet, short of going back to the drawing board with HTML.

That's almost certainly what's needed. The fundamental structure of the HTML-CSS-JS triad is the problem that all this complexity tries to abstract around.

Separation of concerns is only good if you're separating the right concerns. Back when web pages were really PAGES (ie documents), the Content-Styling-Interactivity division made sense. But now that web pages are applications that division makes very little sense.

All the different solutions represent different approaches to redefining the separation of concerns, but what they all agree on is that Content-Styling-Interactivity is the wrong separation.

There are some sites that are applications, and there are some sites that are documents.

I'll go as far as to say that will never change. Applications are only one part of the web; the other parts matter too.

I care more about reading documents on the web, and I'm glad it has a nice simple way of presenting them (HTML) and styling them (CSS).

CSS works OK as long as you aren't trying to control things pixel-for-pixel, which is not what it's meant for. It obviously could be better, but the problem is not easy by any means.

>There are some sites that are applications, and there are some sites that are documents.

>I'll go as far as to say that will never change. Applications are only one part of the web; the other parts matter too.

That's true, but it's also not the problem domain.

No one is having trouble creating document websites on the internet. That problem is, more or less, solved to the point that almost anyone with no technical background can do it.

Application-type sites are where all the pain points lie.

I think part of the attitude comes from watching this incoherent mishmash kill the World-Wide Web and wear its skin. There used to be flamewars about abusing non-semantic markup around content, but now it's common for a document to lack any content at all, just a bootloader for a javascript app that never needed to be an app.

Given the model that the web has committed to, I also don’t think it is going to get much better.

As someone at the "other extreme" end from web developers, and having briefly done some "real web dev" myself, I can easily see why it's so loathed. There are immense amounts of churn. The continuing "appification" and bloating of websites with user- and privacy-hostile crap. Reinventing ever-more-complex solutions to non-existent problems (remember the left-pad debacle?) Dogmatic cargo-cult thinking. Sadly, it's rare that a website "redesign" is an actual improvement instead of pure fashion-chasing.

(Whether that complexity is 'justified' is an entirely different discussion that, while no doubt interesting, has nothing to do with this person's post)

IMHO that has everything to do with it. It's hard to have any other opinion of web developers when you constantly see the large amounts of unnecessary complexity they foist upon everyone else. As I said, I worked in that space briefly, with others who had severe "framework envy", and largely see this complexity as being self-inflicted --- in some ways it was like Enterprise Java (another thing I did for a similarly brief time), except with much more churn. On more than one occasion I've had to clean up the mess they made by rewriting large web apps with simpler means, and the result was better by every metric (speed, memory, lines of code, etc.) except perhaps fashionability and job security.

I mentioned to some people I was learning clojure. Everyone was super into this. People started telling me I was really cool.

The author of this article must certainly not be in the same "tech bubble" I'm in, where telling someone you're interested in Clojure is going to be met more with neutral "what?" and "why?" reactions. However, I certainly agree with the title of the article: "signaling in tech is some fucked up shit".

Whoever says the web is "easy" has never had to build or maintain a large web application. The ecosystem is incredibly deep and constantly evolving, to the point where best practices change every six months.

Not to mention, web developers are constantly having to context-switch between HTML, CSS, and JavaScript. In the old-school days they also had a backend language like PHP or Perl. Many still do. Then there was/is SQL. By my count, that's at least five different languages to know all at once.

“Look at how many plates I can juggle at once!”

“Aren’t you just trying to walk to the store?”

“Look! I added a fifth! Everyone do this now!”

I don't understand your analogy. What is walking to the store without juggling any plates in web development? I'm not aware of any stack that allows you to write styles, markup, business logic, and database queries in a single language.

I'm doing that today with Clojure/EDN: Garden, Reagent, Clojure/ClojureScript, HoneySQL.

(You might think it cheating somewhat because HoneySQL is just a thin syntactic wrapper on SQL, but then again, if I weren't cheap/poor, I could use Datomic, where it really is the same language.)

I don't understand the analogy either, but I do know that when I'm doing iOS development it's almost always all in Swift. Sometimes there's some C or Objective-C or even C++ sprinkled in, but that actually kind of proves my point, because when I have to deal with one of those languages in a Swift project I slow down.

Arc does, fwiw. And that’s an important design goal. There is a lot of progress left to be made in this space.

In C#, you can do almost everything in a single language (LINQ to SQL, model binding). You still need XAML (Extensible Application Markup Language) for the view.

Non-trivial websites are more complicated than walking to the store.

Isn’t using Java, PyTorch, Elasticsearch, and neural networks for a TODO iOS app the same?

Why would you use any of those technologies to build a TODO app?

My point is that overengineering and CV- or grant- driven development is not exclusive to the web.

Well, instead of you telling the TODO app what you have to do, the TODO app tells you what you have to do, and that frees you up to think about doing it. Or it just does it for you and you go to the beach.

This made me burst out laughing. I am indeed using Java and Elasticsearch (no PyTorch, but Spark) for an app I am working on. Way more complex than it should be.

It's a little weird to count a markup language, a presentation language, and a query language on the same axis as Turing-complete programming languages. If we go by that metric, the average sysadmin knows dozens of languages all at once. Yet sysadmin work is also often looked down on for some reason.

I don't have a good explanation for either of those phenomena, but I think language-counting isn't a good way to combat them.

I've done sysadmin work as well and the context switching didn't feel as prohibitive. Generally I was using something like Perl, Python, Bash, etc... but rarely switching between more than one in any given day.

With web development, I have to know HTML, CSS, and JS at least. I probably really need to know SQL. And I switch between those four (at least) every day, many times a day.

I think those markup and styling languages and even SQL have all been used in Turing-completeness examples. Everything is bad, nothing is easy, and we might as well stop using the internet. We're all going really fast, but nobody knows where ¯\_(ツ)_/¯

Before "full stack" was invented, that work used to be spread out across two or more jobs: you'd have frontend people, backend people, maybe database people, maybe even designers in addition to the frontend people.

The context switching existed but not at the same frequency.

The problem is that you're both right.

Putting up a simple web site that does something basic, that you then walk away from is easy, and is what a lot of people who call themselves web developers do.

Building something like Gmail or Facebook, OTOH, is an endeavor only slightly below the moon landing in complexity and “solving cool problems”.

And it is equally legitimately "web development"...

I would imagine some of the problems people have with web developers is this inflated sense of ... something. Comparing GMail with the moon landing is hilariously out of touch. If you hear these sorts of statements enough times it's easy to associate most web developers with delusions of grandeur.

I am not sure I would class either of Gmail or Facebook with the Apollo Program in terms of headcount (est. 400k), total cost (estimates from $100-200 Bn), complexity (hundreds of contractors, millions of moving parts, dozens of directly-operated sites, thousands of projects), or novelty (landing on the fricking moon).

I think Google's total cost is more than $100-200 billion. It's a bit hard to pin down what "total cost" means in this context, though. But their 2017 operating expenses were $82 billion. Each year is higher than the last, but if you add up the last four years, they're over $250 billion.

Facebook is about an order of magnitude less, so the Apollo program fits nicely between them.

> It's a bit hard to pin down what "total cost" means in this context, though.

I realise looking back that comparing a company to a project/program isn't really correct. One kind ends, the other tries very hard not to end.

Well, I did say they were smaller projects :)

But you're of course right that I exaggerated a bit for effect. I do that sometimes.

My point still stands if you slice a few orders of magnitude off my hyperbole.

I don't disagree that Google and Facebook do enormous amounts of difficult engineering (I recently saw a USAF slide which compared Google's R&D spending on software to the entire defence industry's).

But the Apollo program and the Manhattan project were something else. Arguably they represent the historical highwater marks in terms of megaprojects which had effectively unlimited resources in the face of wide-open, unsolved problems.

If rockets were built like software ... (insert joke)

400k - wow, that puts things in perspective. This sort of comparison really should have been in my schooling textbooks.

Can you explain what you mean when you compare Gmail and Facebook to the moon landing? On its face that comment looks way out of whack to someone who is outside the tech industry.

Obviously software was totally different back then, and Apollo 11 was much more than a software project, but FWIW the Apollo guidance computer software was 130416 lines of assembly[1][2], while Facebook is probably at least 100 million lines of code at this point[3]. Also note that Facebook uses languages that are much more expressive than assembly, so "1000x as much code" probably indicates much more than 1000x as much logic being expressed. The reason Facebook and Gmail seem relatively simple is because they're designed to feel simple so they can be used by regular people, but the full implementation details are very, very complex.

[1] https://qz.com/726338/the-code-that-took-america-to-the-moon...

[2] https://github.com/chrislgarry/Apollo-11

[3] https://www.quora.com/How-many-lines-of-code-is-Facebook

> while Facebook is probably at least 100 million lines of code at this point

...which, if I were to guess, consists of a significant amount of "do nothing" code or otherwise useless abstractions that either were put there dogmatically or once served a purpose but no longer do. This is reasonably prevalent in Enterprise software, but I imagine the much higher churn in Facebook's code also increases the amount of cruft that gets thrown in or otherwise left behind.

Search HN or the Internet in general for mentions of Facebook's app being bloated and you'll find that even non-technical users are at least somewhat surprised by this gross inefficiency.

Excepting the moon landing comparison which has already been thoroughly decried, this post is spot on. The phenomenon really shows the importance of connotations, associations, and learning to present yourself effectively.

My personal experience has been that the label "web developer" is typically used by less experienced people if only because after a little while in the industry, even if your focus is primarily web-related, you start to develop skills that are more widely applicable and don't want to be "pigeonholed". It doesn't have anything to do with the difficulty of the problem set, the quality of the ecosystem, or any other specific value judgment.

It's the difference between a package delivery service characterizing their employees as "box movers" v. "logistics coordinators".

What other work have you done though?

From my experience, it seems like a lot of web dev people struggle with OS/Compilers/Embedded/Robotics work, but people who don't struggle with OS/Compilers/Embedded/Robotics work tend to handle web dev just fine.

Why is that? The only explanation I can think of is because the work is easier (easier, not trivial).

(Note: I don't think a web developer (or anyone) should be considered a "lower class" of coder, but saying web dev is the hardest and most intimidating dev work feels untrue).

I think the real reason for this is related to the availability of easy-to-use and up-to-date learning materials, more than anything else.

Those people who are good at OS/Compilers etc. would still need to use a tutorial to pick up HTML/CSS/JS... and due to the popularity and inclusiveness of the web dev community, they will find it relatively easy to find high-quality interactive tutorials that can get them up and running quite quickly. Additionally, a lot of the web is designed to be easy to use... it's the reason why we're all using HTML5 and not XHTML right now. I think of the current competition between Vue, React, Angular, and others as basically an ease-of-use competition. It's genuinely quite cool that web developers are making fairly tricky concepts easy to use with these libraries.

On the other hand, a lot of material about OSes, compilers, embedded systems is relatively obscure, frequently out of date, possibly difficult to find outside textbooks and research papers, and open source libraries in these domains might even require looking through source code just to figure out how to use a library. A lot of these communities are also indirectly elitist, leading to various sorts of bootstrapping problems for newcomers (I've heard "if you can't build the Linux kernel, why are you even trying to learn about it" several times on different mailing lists :/)

As a personal anecdote, I work on compiler and OS things as part of my job, and most people on my team have a high degree of respect towards web dev work. There are only 2 people on my team who think that we could easily do some light web dev: me and another colleague, both of us who've done actual web dev jobs for several years before transferring to our current team. So I don't think web dev is easier, but just far more accessible than a lot of other fields.

It's a little less baffling when one realizes that 'front end' work often involves dealing with more arbitrary and byzantine rubbish than in other areas (I'm looking at you, HTML5).

The lack of respect for how difficult it can be to make a smooth user experience is in a way understandable. But the truth is front-end people spend at least 1/2 their time in Chinese finger traps ...

Honestly I don't think web developers will ever be respected, even if every other type of developer disappears from the face of the Earth.

That's because of:

* an over-inflated sense of importance (see the comparison with the moon landing)

* aggressiveness (see the endless statements about killing the desktop, mobile, etc)

* low barrier to entry and high population numbers, resulting in lots of ignorance

* attention deficit (see the endless churn in frameworks, libraries, solutions)

* and poor technical craftsmanship (at the end of the day, the solutions offered have poor performance and poor platform integration). This is justified by the fact that these poor technologies are making webdevs more productive.

Not all of the above apply to back-end webdev, those devs are more relaxed. Unfortunately they're being assimilated by the front-end.

I’m there and cannot stand it. In my company, for the native apps each problem has its own dedicated team, including an ML team which doesn’t have any results yet. I’m the only „cloud” person; everyone assumes that anything „web” takes 0 time and effort to complete. I truly hate these people. BTW it’s Germany - when they hear „internet” their brains turn into a pile of rotten cabbage.

This is nuts. The reason I avoid web dev is that it's complicated and hectic.

Hard and intimidating doesn’t necessarily mean interesting; I think a lot of the bias against web dev hinges on the latter and not the former. Note, I’m not saying that web dev isn’t interesting, just that the common perception against it is rooted in interest rather than challenge (like it was for databases in the decades before).

More to the point, why would people want to work in a hard and intimidating area anyway? If it were fun and interesting, it might be challenging but not intimidating. Again, I’m not making judgements about web dev, just about how biased perceptions go.

The worst part for me is trying to find work as a self-taught web developer and people somehow magically hear the word 'designer' instead of 'developer'.

Of all the stupid arrogant awfulness in tech that has crossed my path this one is definitely high on the list of the most baffling.

The reason is how a lot of companies do web: sloppily, and they pay peanuts.

The web is 'easy' in the sense that it is well (at least in quantity) documented. If there is something you need to know, it is unlikely that you will have much trouble finding it. Justified or not, I am unsurprised that people with more esoteric knowledge, that is not easily found through a Google search, want to try and boast about it. That's pretty standard human behaviour.

> Web dev is the hardest and most intimidating dev work I've ever done, by a country mile.

I think Web Dev and App Dev get such a bad rap because the half-life of the knowledge there is so short that length of experience isn't the signal that it is in other areas (and sometimes it can be an anti-signal).

It does, though. Isn't it possible that other programmers disparage web development not for being easy, but because it's unnecessarily complex?

Even of people who look down on "web development", I don't see anyone claiming that writing Google.com is easy.

There's lots of types of projects that I don't think very highly of, even if technically challenging and competently implemented, simply for being not worthwhile.

Where are these horrible places that people work and hang out? I’ve worked at places as small as a tiny ISP where I was basically the only tech employee, startups with a dozen employees to companies as large as HP and everything in between. I’ve worked on the east coast and west coast. I’ve worked for private companies and public ones. I’ve been on small teams and large. I’ve had dozens of managers. I’ve been an SA and a programmer and sometimes the jack of all trades who’s responsible for IT and the web farm. I’ve racked and stacked gear. I’ve punched down Ethernet. I’ve got a CS degree from a state school. I’ve coded in Basic, assembly, C, C++, scheme, Perl 4 and 5, python 1, 2 and 3, Java, JavaScript, PHP, Shell, TCL, Pascal, Fortran, Lua, and on and on. I’ve contributed to open source small (chpasswd) and large (Apache, git).

I don’t ever recall running into someone who would treat me badly based on what I do. I’m not arguing it doesn’t exist. I certainly believe the author. I just haven’t experienced it myself. Have I just been lucky?

I haven't experienced this personally either. What I have noticed is the following:

- Signaling in tech is a real thing. I believe this stems from being (understandably) picky about our peers, combined with a fundamental difficulty of evaluating them. The problem is complex enough that our strong tendency is to fall back on stereotyping and other heuristics.

- Many programmers (myself included) have had encounters with "Web developers" whose primary interaction with JS is copying and pasting. Of course, it's hard to say what percentage of "Web developers" fall into this category -- certainly it's just a small proportion of the whole -- but enough of this type exist that "Web developer" is not always a positive signal -- indeed, it can be negative for some people.

- The above issue appears to be self-reinforcing, because the more the tech culture becomes aware of the problem, the more the stigma grows for existing Web developers that have not changed their title. Posts like the OP's are valuable because they work to correct our cultural narrative.

I'd expect the most likely people to experience this are "Web developers" who haven't yet accrued enough other signals of their competency (like experience, charisma, or similarity with their peers). e.g. a soft-spoken female Web developer without a lot of experience sounds like an archetype where this could be an issue.

Of course, it's a game of chance. But the odds are incredibly different for each person.

I think the author was exaggerating a bit, or perhaps is more sensitive to other peoples' opinions than most (e.g. the bit about crying in the bathroom). In the past I have definitely worked at companies where "web developer" was a completely different job title than "software engineer" (back-end developer), and paid less money. Definitely second-class citizens. The industry seems to have turned a corner with the advent of single page app frameworks, npm, etc, and everyone these days seems to be "full stack", but I'm sure there are still plenty of companies out there still doing it this way.

Also, from what I can tell people in academia and certain open source circles are brutal to each other (e.g. Linus Torvalds). I'm sure they would shit on the work I do every day... but I'm too busy to care.

Weird (to me). I don’t care what a coworker makes or what their title is. I don’t even care what my own title is. When a former employer let me choose the title for my business card I went with “problem solver.” All I really care about is having competent coworkers with integrity who are respectful to each other. It’s not my job to judge the value of a coworker’s work.

> It’s not my job to judge the value of a coworker’s work.

It most certainly is, when you have to work with and clean up the mess.

Much like there is the notion of "10x" programmers, there are also "-10x" ones. Unfortunately they can stay around for a surprisingly long time.

What I was trying to say by “value” is that it’s not my job to determine the worth of a particular position to a company. i.e. it’s above my pay grade to decide what a web developer is worth vs a backend engineer, etc.

Judging the quality of someone’s work _may_ be my job if I’m asked for peer feedback, but even there I disagree. I feel it should be up to a manager to determine whether his or her reports are doing good work. (I’m not a fan of peer feedback for a variety of reasons.)

Of course if someone is incompetent, that’s a different matter. How I’d deal with that is too circumstance specific to outline here. I’ve been fortunate in that I can only recall a few such people.

Shouldn't everyone be judging (and improving) the quality of everyone's work in a team, if you want the project to succeed? How is a manager to judge the quality of technical output other than by peer feedback?

Sure, code reviews, architectural review, etc. First line managers should be technical enough to know whether their reports are doing a good job.

You've either been lucky or oblivious. My experience with office people has been that any minuscule thing that can be used to alpha-dog somebody, will be used to alpha-dog somebody.

I suspect it's because the mothership goes out of its way to keep them in a constant state of anxiety and confusion--open offices, stack ranking, endless meetings, etc...

It is easier to assert your dominance over them when they are at each others' throats.

I was (am, maybe) one of those douches that looked down on webdevs. I was so, so wrong. It came from fetishizing complexity and "smartness".

Even setting aside the issue of complexity, from a business perspective, web dev is immensely valuable and only becoming more so.

And as for complexity, I always thought I could learn it easily because it's "not real CS", but man, it's a huge, complex, ever-changing ecosystem and not at all easy. And it takes a much broader set of skills (both technical and non-technical) than pure backend programming or whatever.

Mad respect to all web devs, and all devs. The sooner we get rid of the caste system within software the better.

There's also something to be said for us working hard at getting rid of the complexity in web development.

> It’s fucked up that being interested in this random programming language, not even for the reasons the fangirls love it

I'd argue that the real signaling here is that the OP showed genuine curiosity. This is hard to fake, and the "fangirls" in this example could be people pretending to like Clojure because that's the hip language/framework of the day.

Maybe people respect you more for being genuine.

Often when people perceive signalling it is a confirmation bias -- they feel something and think it exists in the behaviors of others. e.g. I'm doing something I think is cool, therefore positive interactions are because of it. I'm insecure therefore everyone is dismissive of me.

Having said that, I don't know what timeframe this person is referring to, but web development went from extraordinarily hackish and ill-considered (the WebForms/PHP/intermixed-render-blocks era) to being a quite disciplined portion of the industry. I remember when the overwhelming bulk of web developers didn't grok that JavaScript had closures, and that they didn't have to use globals to store state during an event handler.
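For anyone who missed that era, a minimal sketch of the two styles in plain JavaScript (the handler and counter names here are made up for illustration; real code would wire these up with addEventListener):

```javascript
// Global-state style: every handler shares (and can clobber) this variable.
let clickCount = 0;
function handleClickGlobal() {
  clickCount += 1;
  return clickCount;
}

// Closure style: each handler captures its own private count.
function makeClickHandler() {
  let count = 0; // lives on in the closure between calls
  return function handleClick() {
    count += 1;
    return count;
  };
}

const handlerA = makeClickHandler();
const handlerB = makeClickHandler();
handlerA(); // 1
handlerA(); // 2
handlerB(); // 1 -- independent of handlerA, no globals involved
```

The same idea covers timers, debouncing, and so on: the state rides along with the function instead of polluting the global scope.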

Rewriting your frontend every six months using the current trend isn't my idea of disciplined.

Some general advice: don't be, or position yourself as, a "programmer in ${language}". Your value proposition as an engineer is the ability to comprehend and solve problems.

Learn enough concepts from a few different schools of thought to be able to make sense of a language in a reasonable time. Master a few, eventually.

An ability to use a semi-obscure language to solve a problem, and an ability to explain why it is superior for the task is indeed an important signal of the above-mentioned abilities, and a costly one. Costly also for the employer (in terms of your negotiated salary).

The other day I was (lightheartedly, but still) asked: "what went wrong?" that I had chosen front-end development even though I graduated from university - because apparently these two things are related somehow.

I've found that while I can't change peoples' biases, waving the flag of "I do stuff in Rust in my spare time" is enough to scare off at least the Java folk.

There's an interesting power struggle I've been caught in more than once: Back-end devs describe their work as very hard so that they're left in peace, and the easiest way to strengthen that is to describe someone else's work - front-end developers' in this case - as easy.

This way whenever a change in requirements occurs there's pressure to avoid doing changes on the back-end in favor of changing things on the front-end. At the same time interesting and "hard" things are meant to be done on the back-end even though nowadays it's possible and sometimes easier to do them on the front-end.

The fact that back-end devs are paid approximately 20% more naturally only worsens this.

This type of signaling is very real and present in functional programming languages specifically. Java programmers feel insecure because they're not writing Scala, the Scala people wish they were as functional as the Haskell people, the Haskell people wish they were as advanced as the academic Idris people.

It's a never ending comparison contest. And despite functional programming having an objectively higher barrier to entry, it is entirely possible to write absolute garbage code in an FP language as well.

> it is entirely possible to write absolute garbage code in an FP language as well.

I can attest to that, as in college I had a course on functional programming with Haskell, and the way I did my assignment was awful, to say the least.

This article was written in 2016. A couple years later, in 2018, the last couple of times I've heard Clojure mentioned, it was to sarcastically mock the trendiness of the industry. E.g. "My side project is a social media platform that leverages big data, machine learning, and blockchain... and it's written in Clojure".

Has Clojure already lost its cool mystique in the 2 years since this was written? If so I think that further proves the point that assuming good/bad things about a person's talents based on the tools they have used in the past is foolish.

Maybe it's my age, and a youth of grunge, metal, hardcore and punk, but whatever happened to doing what you love and not giving a shit what naysayers think?

>but whatever happened to doing what you love and not giving a shit what naysayers think?

I'm in my 30's and have used a lot of different languages and have worked in a lot of different aspects of software development. As long as I'm being compensated at my skill level, I couldn't care less what language I'm using. I'm currently a web dev and the company I work for primarily uses PHP. To make a long story short, a good friend of mine works for a huge university and if I hang out with him during the week, I'm inevitably going to be around some college kids. The typical icebreaker conversations of "what are you studying" often times leads to chats with CS students. The amount of trash talking I get from kids who haven't even graduated and have no professional experience once they find out I primarily use PHP is astounding. Golang and Python seem to be cool amongst college students, so once they find out I also have more experience with those languages than they do with programming in any language, I suddenly become cool and worth their time.

What happened: the need to earn subsistence.

Because in a capitalist dystopia or an anarchist commune, I need other people to want to help me.

It's just probabilistic: someone who's really into SICP and Clojure is showing signs of being interested in more than your typical minimum 9-5 job requirements. It's not "THE" sign, just one of many. SICP teaches many things, such as how to write compilers, how to think about abstractions, or the benefits of immutability. Is it so weird to think that someone who is willing to get better and learn more is more likely to get more job offers?!

Here's one way to look at it: all else being equal, if you were an employer, would you pick someone who knows X (where X is jQuery or whatever requirements you need) or someone who knows X + Y (where Y is some interesting technology that may eventually turn out to be useful)?

I think everybody who writes HTML, CSS, and JS for a living experiences this. For me it's the never-ending jokes about "JS was written in 10 days" and remarks on how easy it is to fix a problem, because "it's just CSS". I compare it to subtle discrimination. I am a white male and I won't pretend to know how it feels to be discriminated against, but I do know how it feels to fight the uphill battle of condescending remarks. And I also play my part: I downplay what I do ("I build websites"), because I don't want to hear myself pounding my chest.

Javascript being sub optimal makes what you do harder, which implies you are smarter. You aren't being made fun of, the language is.

And CSS is harder than programming!

I think the assumption is: he's willing to code in a suboptimal language, so he must be a suboptimal engineer.

PS. I use TypeScript, love it and don't believe it to be suboptimal.

Can you elaborate on how javascript is sub-optimal?

This was entertaining to read, I like your style.

I also find the low status of webdev pretty unjustified. It's true that it's less mathy than other disciplines, but at scale, "webdev" is every bit as architecturally challenging as anything else, and in some cases more so because of the sheer number of different technologies and stacks you have to be fluent in to make it all work together.

I presume the hatred comes from the purists looking down on the entire mess of the web stack: PHP gets a bad rap, JavaScript is the butt of every joke from embedded C developers, and the front end of the stack is a total nightmare - CSS, W3C and all.

Web Dev is hard __precisely__ because of this mess!

I think it's the fact that it's a mess that makes people who strive to write correct code and keep it all neat and tidy look down on webdev. It's sort of a variation of the "clean" vs. "messy" roommate conflict. Messy loses status in an industry that aspires to be "scientific." Not that it should be that way, just my take on similar observations.

You're definitely onto something here. "Messy" makes money. An obsession with "correctness" can kill a business, especially in its early stages.

Fail early and fail often. Web dev has very low (technical) overhead and aligns with "startup" mentality. The good news is that the customer has different expectations every 6 months so it's not the worst thing in the world to have a slightly different set of technologies to meet those needs.

I know this is a bit off topic, but I'm curious what stack you're using for your blog. I like the minimalism of it, and I've been looking for a lower-effort way to maintain something like that.

RE: your actual post, I've also personally noticed this, and a similar concept that when you reveal that the bulk of your experience is on a "dinosaur" language, you are automatically lumped into the set of less competent programmers. It pretty explicitly informed my choice of programming language for interviews.

Fortunately, I've found that there's a fairly direct correlation between people you would want to work with and learn from and how much this signalling affects them. Many of the programmers I respect most openly don't care what your language background is, just that you can wrestle your tech stack into good software. That's been my experience too: the good ones can ship great software on pretty much any (reasonable) stack. The okay ones ship okay software on pretty much any stack.

Coming from embedded, the web is such an awful platform. No versioning. Polyfills everywhere! XML (Satan's markup). JavaScript... Every line of code is a rabbit trail into "Can I use" BS.

Web devs didn't make it that way. They're just the ones who have to live with it.

We should be buying them lunch and thanking them for their service, not crapping on them.

Embedded has plenty of its own ugliness, though. It just doesn't change at such a rapid clip. In fact, once you ship a thing, it's unlikely to change at all. I appreciate this "do once and forget about it" approach, but at the same time the tooling for embedded can be ugly AF as well.

Depends on your chipset. Last project I worked on was for the Z80 (only a few months ago, strangely). Like buttah.

Z80 was like a brief breath of fresh air in the late '80s: hands down the most intuitive assembly syntax I have ever experienced. I bet I could whip up something decent in it even now, in a couple of days. What would one use to compile assembly for it on Linux?

I used Zilog Developer Studio and C for that particular project, and there was very little inline assembly required.

I don't think I've had to write anything 100% in assembly in 20 years, so no idea what you'd use on Linux.


Nice article, the self-deprecating/funny tone is rather sparingly found in tech blogs (well, apart from Julia Evans).

As a sysadmin at the bottom of the signaling hierarchy, I find this hilarious. Am I managing a complex integration and operations task? Nah, clearly I'm just someone who clicks through installers in Windows.

I've worked as a sysadmin in the past. When I moved from a Linux shop to the first Windows environment I administered, I found that a lot of the Windows admins were "next, next, install" types. I'm sure that's not the case everywhere, but as long as those people are called sysadmins, they're going to dilute the value of actual professionals.

> I instantly get elevated to a respect-worthy status for free...Coming upon the Correct Signal by accident

I have worked at companies with open positions, and I have had to sort through a lot of resumes. I also apply an informal Bayesian heuristic: I see the resume, then I see the person who wrote it. People who are on the ball, who are a standard deviation above the rest, know what to put on a resume to make it look impressive. People who are a standard deviation below the rest not only don't know how to put together an impressive resume that stands out; they don't even know how to put together a median, average resume.

If I were hiring programmers with little experience, and everyone had the same skillset (JavaScript, ES7/8/9..., V8, the JavaScript framework du jour: React, Angular, Vue, Node.js), they would all look the same. If someone has Clojure on their resume as well, it sends a certain signal. You still want to talk to that person and verify that they are a standard deviation above the rest, but it sets them apart from the pack. If we don't go by such signals from resumes and the like, I don't know what else to go by. The OP is correct, though: once you send the signal that you are a standard deviation above, you have to follow through, and you start feeling compelled to produce at a level a standard deviation above.
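The informal Bayesian heuristic above can be made concrete with Bayes' rule. A minimal sketch, where every number (the prior, the two likelihoods) is invented purely for illustration, not measured from any real applicant pool:

```python
# Toy Bayes-rule model of the "Clojure on the resume" signal.
# All probabilities are made-up illustrative assumptions.
p_strong = 0.2                 # prior: fraction of applicants who are strong
p_signal_given_strong = 0.30   # strong devs who happen to list Clojure
p_signal_given_weak = 0.05     # weak devs who list Clojure

# Total probability of seeing the signal at all.
p_signal = (p_signal_given_strong * p_strong
            + p_signal_given_weak * (1 - p_strong))

# Posterior: how strong does the signal make the candidate look?
p_strong_given_signal = p_signal_given_strong * p_strong / p_signal
print(f"P(strong | lists Clojure) = {p_strong_given_signal:.2f}")
```

With these made-up numbers the signal lifts the posterior from 0.2 to 0.6, which is why it gets a resume read; but a large share of signalers are still not strong, which is why you still interview and verify.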

It's really not that complicated. Technical people recognize technical ability. Throwing together a CRUD site in some framework is really not that complicated and can be achieved without really understanding what's going on. When you started learning Clojure you got a little more respect because you were broadening your horizons. It's not some conspiracy systems programmers have against web-only devs: the web dev learning curve really is not that steep.

Why are you assuming s/he is putting up a "CRUD website in some framework"? I work with HTML, CSS and JS, but that does not tell you anything about what I'm building or how I'm building it, so you should not "recognize my technical ability" based on your false assumption.

"Throwing together" anything using anything is not really that hard, but understanding the problem, making trade-offs with limited time and budget, writing maintainable and well-tested code, making something easy to use, accessible, efficient... these are all very difficult to pull off regardless of whether you're building a CRUD site or not.

I would also disagree that the web dev learning curve is not that steep... sure, getting something to "compile" (well, show up on the screen) is stupid easy (<p>Hello world</p>)... but actually making a functional, usable, performant, efficient, accessible, beautiful site or app requires managing a lot of complexity across a bewildering number of environments and tools.

I'm not saying all web software is simple and all web devs suck. The fact is that most web apps are CRUD, not well designed or "beautiful", and written by devs who really don't care about SWE or broadening their skill set, just collecting a paycheck, and that's fine. The ecosystem of operating systems, compilers, distributed systems, embedded, etc. is a bit more rigorous because the bar to entry is higher.

As a Clojure dev, I find this attitude absolutely baffling as well. I can assure you that nothing at all about Clojure makes you a good dev. Nothing about any language makes you a good dev.

Want to fast-track a new project into a convoluted legacy codebase few humans can grok? Hire a bad dev and have them write in any language, it doesn't matter. I've seen Clojure code bases passed through a dozen hands that no one could figure out.

I think the attitude comes from the fact that Clojure is almost never someone's first language, and someone that went out of their way to learn it probably learned it to solve some problem they had with another platform.

Those two factors right there weed out a certain bottom percentile of developers.

There was an apocryphal meme on Reddit where a carrier apparently sent a text asking how the user would rate the reliability of the network on a scale from 1 to 10. The user replied, "9." The user was then notified that there was an error and the text didn't get delivered properly, to which the user replied, "8."

Signaling of the kind the author describes is information. If it truly is as widespread as the author suggests, it is valuable: just take note of the few people not signaling in this manner and then try to forge relationships with them.

For me this is another lesson about biases and ambiguity.

1 - Learn to recognize biases.

2 - Scrutinize your thoughts to limit the influence of your biases.

3 - Work on processes to limit unconscious biases.

4 - Since you cannot eliminate biases, use biases to your advantage.

The same is mostly true of two candidates, one with Go experience and one with Node. The one with Go will either get the position or receive a higher offer.

The exception might be demand-based. PHP, for example, is generally sneered at and looked down upon, but if a successful startup is built using PHP (there are lots), then demand will be high for that specific expertise.

The deeper the tech you know, the more underground and probably the more experienced you are. The more languages you know, the better you probably are in all of them, because knowing multiple languages helps you see the core parts: standards, systems, and patterns. If you know a functional programming language, that also probably means you have more experience in the deep end. It doesn't mean you are better, but you have been around, and devs who have been around probably are better.

It is the same with any subject; music and movies have this too. The more underground music and the more original/classic movies you have seen, the more you appear to know. It is snobby, but it is a signal that you are probabilistically more experienced.

Even the OP posted how he came into programming with JavaScript and webdev, then tried Clojure and went into SICP. The latter two aren't gateway languages/areas. The gateway languages (JavaScript, Java, C, C++, .NET, Ruby, and now maybe Python) are usually the first languages people know. Learning Clojure means it was at least your second language; rarely do people start with Clojure, so it signals you have gone deeper in knowledge and have somewhat leveled up, even though day to day you seem the same.

Not a he.

A number of years ago, one of the co-founders at work asked various people, "how would you feel if I said you looked like a $LANGUAGE programmer?" And he substituted Perl, Ruby, Python, and PHP for $LANGUAGE.

We often mock up in our minds what it means to work in a language or framework and bucket people based on these poor signals.

Well, this person "missed computer science" and picked up SICP to scratch that itch. Then they did the examples in Clojure instead of Scheme! It would seem to me that the colleague's signaling was correct. Would your average frontend engineer engage in either of those two activities?

What does "signalling" mean in this context?

I tried to look it up and failed to find a definition that made sense.

This is signalling in the economics sense: https://en.wikipedia.org/wiki/Signalling_(economics)

Thank you

In this context, the word “signaling” means ... what?

You know, sometimes I wish HN would do a bit more editorializing/subtitling and less slavishly copying the damn stupidest subject lines that the original author thought would be cool....

> ack in the day (as in a couple years ago) when I told other software people I’m a web developer, I got treated like shit.

I see this a lot in our field unfortunately. "You program in X? A real programmer would program in Y because...".

I think the negative comments on webdev come from the PHP era, when websites were not much more than information portals. Nowadays webdev is more about web apps, which are considerably harder to create and maintain. In addition, usage expanded, so scale became a thing. Over time, the apps traditionally seen as "true programming" moved to the web, but the stigma is only starting to fade.

On the writing style: hope you are more fun over lunch talks :). I hope the article is written jokingly, because the dark style can easily be confused with a negative attitude by people who can only judge by the article. Cheer up!

P.s. the fangirl comment is a little out of place in an article on signaling written in this tone. It's like you think less of people who love the language for reasons that qualify them as "fangirl" in your eyes. Just saying.

The writing style is a stylistic choice. I like it

The author seems to be female, so "fangirl" maybe isn't so out of place.

Even so, "fanboys" and "fangirls" are both said in a demeaning manner in this context. I don't think anybody would feel proud to be categorized in that group of the author's collection.

Wait until you see all other markets.

Seriously, what's your experience out of "tech"? "tech" is the best signaling environment ever created.

I dunno, to me the whole 'abc coders are xyz' is a heuristic. I value curiosity, and I find that devs who are solid at a wide variety of paradigms often get there through curiosity. So if someone tells me they are learning Clojure/Haskell/Scheme/Racket/OCaml/Brainfuck/Scala/Smalltalk/etc., I'll think that's interesting and wonder what made them decide to learn the language. Without further information it's a safe guess that they are curious and enjoy learning new things and challenging themselves, things that I think are cool to do. And people who do cool things are often cool people.

But Daiyi's main point isn't that abc coders aren't xyz, but rather that other people are also xyz, and that if an efg coder becomes an abc coder and is found to be xyz, then they were probably already xyz back when they were an efg coder.

And this point should be well taken. There is a common trend of analytic subjects being seen as contributing more value than artistic subjects. This, in part, is due to the phenomenon that analytic subjects have values that are easier to calculate (...analytically, whence this is somewhat circular) while artistic subjects have effects that must be evaluated more subjectively. A coder works for a week and adds a new feature which increases marketability. A painter works for a week and produces a painting that may eventually sell for $500 in a couple of years. But this undervalues the painter---the effect of the arts on a society is more than their retail value.

I think this in part describes why web devs/front end engineers are socially valued less than coders in the development community. Their contributions are harder to quantify and the problems that they solve are more diffuse and subjective. This leads to "it's hard to quantify an efg coder's contribution" being conflated with "efg coders contribute less".

But this is BS---I have yet to come across anything that isn't both an art and a science if done correctly. In fact, this thought led me to my own answer to one of the great philosophical questions of the ages, "what is art?": I contend that art is anything done well.

And as a computer scientist with a prior life as a musician I can assure you that there is plenty of 'calculation' that goes into the arts. Sometimes this is explicit. For instance, say that I have a closed voicing CFA (closed voicing means everything is close together) and they are moving to B?G that will then move to CEG. What do I want to replace `?` with? Well we don't want F to move down (we try to avoid parallel motion, this can be thought of as a sort of axiom) so F must either remain fixed (oblique motion) or go up (contrary motion). We also don't want voices to cross (while voice crossing is less taboo than parallel motion it is still often avoided, and our adherence to this restriction makes our problem much easier). Since we are working with a closed voice we have two (diatonic) choices: we can double the G or stay fixed on F. Doubling the G is boring but staying on F creates dissonance (B to F is a b5 and F to G is a M2). Luckily this dissonance is nicely resolved by the subsequent voicing and we win music! Yay!

Notice something? This is just a constraint system! But rather than solving a SAT formula we are adding in some subjective data to consider as well. I like to phrase this as "In math, `1 + 2 + 3 = 6` while in music, `C + E + G = happy`".
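The voice-leading reasoning above really can be phrased as a tiny constraint search. A minimal sketch of that idea (a deliberately simplified model of the rules named in the comment: the pitch numbers, the "don't move down with the outer voices" rule, and the no-crossing bound are all encodings I chose for illustration, not real counterpoint):

```python
# Toy constraint search for the middle voice of the voicing above:
# closed voicing C4-F4-A4 moving to B3-?-G4. Pitches are MIDI numbers.
DIATONIC = {0, 2, 4, 5, 7, 9, 11}   # C-major pitch classes (C D E F G A B)

def candidates(prev_mid, low, high):
    """Middle-voice notes satisfying the comment's constraints."""
    out = []
    for note in range(low, high + 1):   # stay between the outer voices
        if note % 12 not in DIATONIC:   # diatonic notes only
            continue
        if note < prev_mid:             # both outer voices descend, so the
            continue                    # middle must not (parallel motion)
        out.append(note)
    return out

C4, F4, A4, B3, G4 = 60, 65, 69, 59, 67
print(candidates(F4, B3, G4))   # -> [65, 67]
```

The search finds exactly the two options the comment enumerates: stay on F (65, the dissonant-but-resolved choice) or double the G (67, the boring one).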

These calculations can also be done implicitly: say I'm taking a break over some jazz tune and I'm hitting a turn around, a ii-V7-i. I have a vague notion that I want to hit the "Billie Holiday special" (https://youtu.be/KUCyjDOlnPU?t=2m59s) at the end of my break (a melodic 5-2-1 with a slight scoop up to the 2 which falls back to the 1). I'm in the key of Bb and I want to play this around the tonic at the 8th fret of my D string. Right now I'm in the upper area of the neck and about to change to the ii chord which holds for two beats. I want to leave a pause over the V chord to make the tag more interesting, so I have exactly two beats to get from where I am to where I want to go, and I have to quickly 'calculate' this transition in real time. This is, of course, very natural since I've been playing for most of my life, and the 'calculation' is more of a feeling than an explicit mental exercise. But underneath the hood there is a shitload of precomputation that I did, practicing similar harmonic and melodic situations, honing my instincts, expanding my ear, studying theory, etc. I have just, in real time, solved a complicated constraint problem in front of a room full of people. And they are all cheering for me! See? People love math!

I'm sure I don't have to argue the reverse direction on this site (namely that coding/engineering/mathematics/etc are all forms of art) so I'll omit this.

All this is to say that the distinction between the artists and scientists is very blurry. I won't go so far as to argue that it doesn't exist since it clearly does. But when I try to figure out what that distinction is I find a number of qualitative differences but nothing quantitative.


Maybe so, but please don't post unsubstantive comments to HN.

There are many such unsubstantive comments in this thread, but thanks for pointing that out.

I was a hiring manager once. The mere mention of "off the beaten path" languages on the candidate's resume signaled one thing to me: that the guy is willing to put in the effort and learn something new. That gives you a leg up over the horde of other folks who, all other things being somewhat equal, don't really give a shit about their craft.
