A tech stack should be a well-balanced portfolio (staysaasy.com)
93 points by chesterarthur on June 11, 2020 | 85 comments



> If you’re too boring it gets hard to recruit. You don’t want trend chasers joining, but good candidates have a healthy want and interest in new technology.

I think this is worth examining in a lot more detail.

I accept that candidates are going to be put off by obsolete or terrible technology. I wouldn't take a job working on a Java 1.7 codebase, for example. Or ATG (now Oracle Commerce) - I did three years on that, and will never touch it again.

But if a candidate is motivated to take a job because they get to use shiny new technology, I would suggest that they are not, in fact, a good candidate. In my experience, magpieism is associated with weaker programmers; not the very worst, but second quartile, say - people who get very engaged with superficial details, but don't think about deeper things.


I really, really want to double down on this, because it's not expressed often enough or nearly well enough.

There is a huge contingent of devs (I work in the consulting space, maybe this isn't everywhere) who have this idea that 'upskilling' is all about using new libraries in some toy project to familiarise themselves with the API of the library. They finish a project using React, then ask to be put on a Vue or Angular one next so they can 'widen their skillset'.

You can pretty much split learning up into three kinds: Learning how to solve new problems, learning how to solve the same problems in a different way, or learning how to solve the same problems in the same way, but with a different representation.

Learning the API of new libraries is almost always the third kind, and people constantly mistake it for the second one and think they're making progress. Worse, people do this shit long before they've exhausted their options for the first kind of learning (which is the best).

React, Angular and Vue all do the same thing. They all do it roughly in the same way. Even when libraries do things a different way, learning the API is often still the third kind of learning unless the library's abstraction is leaky as shit and you have to learn all the intricacies of their approach anyway.

Focusing on the wrong kind of learning makes a worse dev than you otherwise would have been. I've worked with a handful of great devs, and a fuckton of devs that know 20x as many libraries but churn out nothing but overengineered unmaintainable shit.

If you want to distinguish yourself from the crowd, learn a boring technology in an area that you're not as proficient in. Don't chase the fancy shit just to build stuff you could already fucking build in half as many lines of code.


I disagree with respect to frameworks like React, Angular and Vue. Yes, they all "do the same thing" when viewed from a distance, but up close the details of, say, managing state with hooks in React are very different from managing state with services and RxJS observables in Angular.
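
To make that concrete, here's a rough sketch of the two styles (purely illustrative; it assumes the react and rxjs packages, and the component and service names are invented):

    // React: component-local state via the useState hook.
    import { useState } from "react";
    // Angular-style: shared state in a service, exposed as an RxJS stream.
    import { BehaviorSubject } from "rxjs";

    function Counter() {
      const [count, setCount] = useState(0); // re-render driven by local state
      return <button onClick={() => setCount(count + 1)}>{count}</button>;
    }

    class CounterService {
      private readonly subject = new BehaviorSubject(0);
      readonly count$ = this.subject.asObservable(); // components subscribe to this

      increment(): void {
        this.subject.next(this.subject.value + 1);
      }
    }

Same "state management" from a distance, but the mental models up close - re-rendering on local state versus push-based streams shared via dependency injection - are quite different.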

These differences aren't always sexy, but understanding them is usually the difference between a developer who can deliver a high-quality solution and solve difficult problems and one who can hack together something that sort-of works with enough Googling. In my experience, a lot of "overengineering" in these kinds of frameworks is really devs with insufficient understanding of their details inventing unnecessary workarounds and abstractions.

So, yes, devs flitting between frameworks is a problem, but not because they're doing the wrong kind of learning; it's because they never learn enough to be truly effective. And I don't think this is the result of some pervasive attentional deficit, but rather of hiring practices, especially in consultancy, that favour direct commercial experience above everything, with devs economically incentivised to maximise their earning potential and pool of job opportunities by working with as many technologies as possible.


Understanding the differences is only important if you're hopping jobs to a different tech stack, though. Which is why the 'following the market' reason is mostly bullshit if you live in a good city and didn't originally pick something that's gone the way of the dodo.

You may open yourself up to more opportunities by being familiar with multiple front-end frameworks, but you're going to be a worse fit for a lot of them, because if my stack is React and Django, why on earth would I hire a veteran React dev that has spent 3 months learning some Vue instead of some Django? Or AWS? Or anything else that I could possibly have in my stack. You're basically resigning yourself to hitting only one checkbox, when there are other candidates out there that can hit more.

The big front-end frameworks are far from the most egregious examples, too. You raised a great point about people not learning enough to be effective. I'm a big fan of React, but its ecosystem has a massive problem with this. People are so dependency-crazed that every single dev has experience with 10 out of the 200 commonly used dependencies, and it's always a different 10 to whatever is being used on the project. Most teams' solution to this is to crank their package.json up to 100 dependencies so everyone can feel at home, and everyone goes on these fucking odysseys of superficially learning a ludicrous amount of APIs other people have written without ever producing anything meaningful themselves.


> high-quality solution and solve difficult problems and one who can hack together something that sort-of works with enough Googling

Are you assuming enough Googling can't produce a high-quality solution?


Yes, that's an absolutely sound assumption when it comes to web development. The vast majority of resources are written by people that fall into the exact category stupidcar described, with superficial knowledge of the libraries they're using. If anything, Googling gets you further from the simple, clean solution a lot of the time unless you're already experienced and can sift through the bad sources of information.


>but up close the details of, say, managing state with hooks in React are very different from managing state with services and RxJS observables in Angular.

I would say learning the detailed difference when needed for the project is in fact extremely trivial. The wrong kind of learning is thinking that you're improving your skills by memorizing and learning all of these details even when it is not required by your current project.

I would say it takes at most two weeks to a month for a developer to get good with React. It's not even a big deal if you think about it, but hiring managers still look for "relevant experience."

Don't learn the details unless it's relevant to the project. To improve your overall skills as a developer: Learn the fundamentals.

Though I have noticed one thing. Many recruiters and even technical hiring managers are buzzword-driven. Buzzwords are often related to details, so to pad your resume you have to learn the details. Ultimately though, if you end up never touching that detail, or that detail changes, then you wasted your time learning it.

Learn the fundamentals because the fundamentals never change.

Honestly you're the kind of developer that the OP is talking about, but then again most developers are like that.


> They finish a project using React, then ask to be put on a Vue or Angular one next so they can 'widen their skillset'.

Seems like reasonable behaviour. When looking for your next job, the first question asked is "Do you know tech stack X?" People are always going to optimize for market demand. When companies stop caring about your familiarity with specific technologies, people will stop seeing learning specific technologies as an end goal.

> If you want to distinguish yourself from the crowd, learn a boring technology in an area that you're not as proficient in.

I'm not sure I agree. I've spent an inordinate amount of time learning about concepts, applicable with either boring or exciting tech stacks, that have never really caught on. To stand out, my time would have been better spent focusing on one thing and becoming more familiar with the equivalents of React, Angular, and Vue around that thing.

For personal growth I don't regret spending that time. I'm glad to have had the opportunity to expand my horizons. But in terms of standing out among others, it hasn't proven to be useful. Knowing both React and Vue is much more eye-catching.


> When looking for your next job, the first question asked is "Do you know tech stack X?"

I've never been asked that. I have had jobs where I was expected to know a particular programming language, but never some specific framework within it.

I don't doubt that these jobs exist. But if someone is hiring permanent employees on the basis of flavour of the month frameworks, they have no idea what they are doing, and I would advise against working for them.


> I have had jobs where i was expected to know a particular programming language

Which falls under the same thing we are talking about. If you know one programming language, you can figure them all out. You'll probably fall flat when it comes to idioms of an unfamiliar language, but the same can be said of the idioms of frameworks. Many programming languages even include their own frameworks out of the box as part of the standard library, blurring the lines further. As far as this discussion goes, there is no meaningful distinction to be made here.

> But if someone is hiring permanent employees on the basis of flavour of the month frameworks, they have no idea what they are doing

The ask isn't necessarily flavour of the month. I still see tons of jobs looking for people with experience in Rails, for example. It hasn't been popular for at least a decade. However, from the job seeker's perspective, only knowing Rails limits you to only Rails jobs, even when one knows everything about web development and could just as easily develop a web application in bash if you asked.

> and I would advise against working for them.

It is not an unreasonable ask, to be honest. Gaining familiarity with idioms is unquestionably the hardest problem in computer science, even above what the familiar joke suggests. While you can start to write code in a new language/framework almost immediately, it can take years to gain the understanding necessary to write great code. That is why businesses shy away from great programmers who don't know the idioms of a particular system adopted within the business.


Unfortunately they seem to still be in the majority. Companies are not interested in training and overestimate what it takes to learn a new stack. They also want people who can hit the ground running. I see that kind of job offer frequently.


It gets worse as a freelancer: for six months you may be doing React, then you're back to Angular, then you're on a legacy JSP or ASP project. You really can't blame people for wanting to optimize for future market demand, and many times it is not clear who is going to be the winner; in fact, most of the time the market splits between two or three technologies that solve that layer of the stack, and they can then be mixed and matched with other layers. I do believe there is a natural curiosity that drives developers to want to learn new stacks, but there is also an underlying fear of not knowing the next big thing and being left out cold in the job market.


Libraries, frameworks, and even languages are just implementation details. (If they were half as good as they purported to be, would they need to be pushed so hard? Food for thought.) A benefit of experience is that you start to see that every new shiny thing that comes out teaches you so _little_ compared to diving deep into different domains of programming.

It's fine to immerse yourself in learning a platform and different libraries really well. But when it starts to feel too same-y, that's your cue to branch out.

One problem is that devs focus far too much on their own perceived "employability," optimizing for jamming every single library on their resume, rather than learning to convey that they could spin up on a tech stack in a reasonable amount of time in a new gig.


> If they were half as good as they purported to be, would they need to be pushed so hard? Food for thought

The answer is yes, they would still need to be pushed hard. "Build it and they will come" works only in movies.


Don’t blame the devs - blame the market. Most companies insist on developers who don’t have to be trained on their stack. Most developers therefore do resume driven development. It’s as much of a “gravity problem” as being forced to “grind leetCode” to get into $BigTech.


The market is wrong, but devs should take responsibility for their own professional development. Resume-driven development may optimize for the market, but it probably isn't the fastest way to learn.


The purpose of learning tech is to put money in my bank account. Why would I take a job and work eight hours a day using non-marketable technology and then spend my free time doing side projects that “don’t count” as much as what you do on a job, when I could just choose a company where I get paid while learning new tech and can spend my free time on basically anything else?


Once you’ve been around long enough, it just doesn’t matter. There are many ways to write a resume that don’t focus on job history, so you can market yourself in a way that makes you most appealing to the company you want to be hired by. You also get bonus points for having a “unique but standard” resume that’s tailored to the work you’d be doing instead of where you’ve worked (unless that’s something you want to highlight because it might get you in the door faster.)


In my case, is 25 years “long enough”? [1]

The only way that I landed a job at $BigTech without having to go through the “leetCode”/algorithm dog and pony show, while being able to work remotely and without having to move from my relatively low cost of living area, was by doing plenty of resume-driven development and targeting my experience toward either Amazon (AWS) or Microsoft (Azure).

Before my current job at Amazon, I didn’t care what company I was hired by. In my local market (a large metro area on the East Coast) all companies were paying within the same range. Making sure that my resume was in sync with the market brought a lot of optionality (or at least it did before Covid).

[1] to be honest, I stayed at one job too long until 2008 and was firmly in “expert beginner” territory.


> who have this idea that 'upskilling' is all about using new libraries

Devs have that idea because that's how many companies hire.


> But if a candidate is motivated to take a job because they get to use shiny new technology, I would suggest that they are not, in fact, a good candidate.

In my experience the tech stack is a good indicator of how technology is viewed at a company. Java and C# shops I have encountered are very process-driven, bureaucratic firms. In my experience they are management-focused as opposed to tech-focused, whereas firms that use Python, Node, Haskell, Rust, Go, etc. tend to be more tech-forward. The reason I want to use "shiny new technology" is because I want to work at the latter as opposed to the former; I'd be happy to write even C or PHP at a tech-focused firm, but I rarely see jobs like that advertised.


Is being tech focused really better than being management focused? Surely that's just swapping one mis-focused model for another? I prefer to be in roles that are outcome and value focused. That should mean appropriate tech and sensible management but not for their own sake.


I may not be using the best terminology, but what I am talking about are firms where the majority of players consider themselves "non-technical". I haven't found these to be pleasant places to work or places that particularly value software engineers. I wish to work in places where my contributions can be understood and valued by the people I work with.


I don't think you can just blame developers. There is a real fear in the industry of missing out and being left behind, and that if you don't keep your resume filled with the latest buzzwords you will become unemployable. Because recruiters ignore Github and side projects, the only way you can keep that resume up to date is by either applying for jobs using <shiny new tech> or by forcing it into your work projects, whether it makes sense or not.

So yeah, magpie developers are a thing, but it's not just superficial developers, but superficial recruiters and companies as well.


I agree--I've built a career around Django, which is definitely a boring technology at this point, but there's not really any difficulty in hiring.

For front-end stuff I've been using React, and while React is a great technology, it did attract a lot of inexperienced programmers initially. Nowadays, React by itself isn't the new shiny, so that effect has diminished, but the magpie-ism has taken on a new form: I find myself on a lot of projects fighting to keep out bleeding-edge JS libraries with > 100 dependencies.

It's not hard to build a tool that demos well. It's much harder to build a tool that will withstand the test of time.


I have the same problem with devs who always want to use the newest stuff as with devs who have used the same tech for over a decade because they don't want to, or can't, learn something new.


I generally don't care what the tech stack is. I can learn how to program in just about any language and use just about any tool. I care about the domain. I care about the market for that domain. I care about what position the company holds in that market. But, primarily, I care about solving interesting problems in that domain. I want challenging problems to work on. I don't want to do the same thing over and over except this time in <fancy new tech>. You want me to put data from a form in to a db? Pass. You want me to figure out how to shine your company name onto a baseball as it is traveling to home plate? HELL YEAH!


I love this attitude and feel some accord at a personal/intellectual level. You learn “programming” first, not a stack. (Well, ok, you tend to learn one language first and then branch out, but I assume you have been through that stage). So programming and problem solving should translate fairly well across domains.

However, my experience with looking for jobs in London is that if I don’t have more experience with most of the stack than someone else in a company/recruiter’s candidate set, I will not get the job (regardless of how well I communicate my ability to work in teams or problem solve). Even after 15 years professional coding and a Masters in Software Engineering.

How does your position work for you? Have you been able to learn new stacks “on the job”? Can we characterise a hiring pattern for different geographic areas? Do you research a prospective employer or client’s problem domain before making contact with them, and base conversations around that?


> Do you research a prospective employer or client's problem domain before making contact with them, and base conversations around that?

Yes, I try to do exactly that. The goal is to be able to speak their language in the interview as well as determine if I'm interested. I try and find out everything I can about the company, their product and the product of any competitors. If I'm enthusiastic about it, I figure out where to send my resume.

As far as tech stacks, I've learned Java, Progress, C#, various scripting and shell languages all on the job without ever knowing them prior to getting the job (I started my career as a C++ programmer which I already knew, so there's that).

I think the key has been the fact that when I interview I already know about the company and their product and domain and I can ask pointed, intelligent questions and generally have a great discussion about it with them. The companies that have hired me knowing I don't know, say, Java, haven't cared. The companies that don't hire me because I don't know Java only care about hiring Java developers, and I'm glad they spared me the trouble.

I don't know if it is luck, but I generally don't get many rejections. I also don't generally send my resume to more than a few companies before getting hired.


Great response. Thanks for going into that much detail.


There is nothing superficial about being attracted to a company - especially a startup - that uses new technologies. When your startup fails - and statistically it likely will - the only thing I would have to show for it is a bullet point on my resume when I apply for my next job.

There are a lot more companies that want you to have experience with their technology stack than those that just want “smart people” (tm) and that will take the time to train you.

On the other hand, salary compression and inversion are real. Your market value will go up by $some_big_number but HR policies will only allow raises of $some_small_number. So you have to keep your eye on the market and be prepared to jump ship.

Anecdotally, I was once hired for a company that wanted to build a new product in .Net. The new product didn’t find a market, but their legacy product written in PHP did. None of us wanted to spend a year doing PHP development. All 14 of us left within six months.


> Anecdotally, I was once hired for a company that wanted to build a new product in .Net. The new product didn’t find a market, but their legacy product written in PHP did. None of us wanted to spend a year doing PHP development. All 14 of us left within six months.

Seems like exactly the point OP is making. By hiring devs that cared more about the stack than the product, your employer got screwed when the product with the boring stack took off.


And if we had stayed, we would have been screwed when we did get ready to leave and find another job and all we could talk about is that we developed in PHP for a year....

But the rest of the story....

The department we worked for was an independent startup that got acquired by a Fortune 10 company (at the time) shortly before they started ramping up. The manager of the department was one of the original developers and founders. Even he left soon afterwards.


> But if a candidate is motivated to take a job because they get to use shiny new technology

I'm not sure it's about shiny tech, but the opportunity to grow and learn, and use those skills to become more marketable. The best developers (in my experience) have a growth mindset, and that includes learning new technology and new techniques.


This is just incorrect. Unless you think you know some older technology completely, there's always plenty of room to grow and learn on any tech stack. Some of the technologies I use daily, such as vim and bash, have been around for decades, yet I still learn things about them on a regular basis.

In fact, one thing I've learned from working on bleeding-edge tech stacks is that communities around bleeding-edge tech stacks can often be narrow-minded in a way that hinders learning. Many (but not all) members of the Go community, for example, seem intent on not learning anything from other languages: if you point out that coroutines predate Go by decades and there's a lot to learn about their use from other languages, or that Go actually has a fairly weak type system, you'll get a lot of defensive pushback from Go fanboys who know literally nothing about other languages that contain coroutines, and don't understand that there's a difference between static types and strong types. There are good programmers in bad ecosystems--the folks who are bringing in generics might yet make Go a decent language--but remember that there was immense infighting to get there: I've been keeping track of Go long enough to remember all the people ranting that Go didn't need generics because it had `go generate`--a glorified C macro system (since they couldn't be arsed to learn from the mistakes of even one of the most adjacent languages) that breaks the entire point of not including generics in the first place (to keep the compiler single-pass).

If this is the sort of "growing and learning" you're talking about, I generally tend to avoid working with people who "grow and learn" in this way.


> If this is the sort of "growing and learning" you're talking about.

No, it's not. The key phrase is "growth mindset" [1]. That does not mean throwing away knowledge or blindly accepting the shiny new objects. I think that's where we're disconnected?

[1] https://www.brainpickings.org/2014/01/29/carol-dweck-mindset...


The best developers grow by solving progressively more difficult problems. And great developers? It's not about what tech they're using, it's about what tech they're creating.


I'd make a big distinction between entirely new languages, tools and frameworks and simply keeping up with the changes in a reasonably mature environment. For something that is still actively developed, and not in maintenance mode, it is a bad sign if it is on a very old version of that environment. There are a lot of nuances here depending on how disruptive, but also how useful, the changes in new versions are.

If you're far enough behind, you get problems with tools and dependencies that stopped supporting the old stuff. That can get very painful and annoying. I'd be generally careful about chasing all the shiny and new stuff, but there is also risk in lagging behind too far once it seems like most people jumped ship.


>But if a candidate is motivated to take a job because they get to use shiny new technology, I would suggest that they are not, in fact, a good candidate. In my experience, magpieism is associated with weaker programmers; not the very worst, but second quartile, say - people who get very engaged with superficial details, but don't think about deeper things.

You want engineers who love to grow, learn and solve problems. Startups in general have rather boring problems to solve compared to large companies. Learning and mastering a new technology, however, is a problem space that startups can offer their engineers.


Learning a new language or framework is not really that interesting, and certainly not worth changing job for. It can only last so long until it's boring. Solving a new kind of problem on the other hand is, at least for me, where the fulfillment lies.


+1 for use of the term magpieism ;-)


I didn't understand the reference. I looked it up and may have found it here: https://en.wikipedia.org/wiki/Magpie#Cultural_references

> In European culture, the magpie is reputed to collect shiny objects, often in fiction things like wedding rings or other valuable or significant objects (often causing consternation at the disappearance, and false accusation of humans in the plot of the story); the most well-known example probably being Rossini's opera "La Gazza Ladra" (The Thieving Magpie). Recent research [8] has shown that there is little truth in the legend, and that magpies - like many animals - are actually unsettled by shiny, blue, or otherwise unusual objects.


Might be time for the Magpie stack: a Rust-based static file blog generator (compiled via WebAssembly) via AWS Lambdas written in Go, wait no, Deno.

Or should there be a Rust based React framework? How would that work? If it exists then it’s gotta be more performant, so yeah, let’s use that.

I don’t know if this is magpie enough nowadays. It’s hard to even make fun of this because the irony is even the joke becomes outdated lol.


> How would that work?

There are a few different Rust frontend frameworks that compile to WebAssembly and are roughly based on the ideas of React. Yew is the most prominent.


I can't take any credit for it! That's an old Jeff Atwood term:

https://blog.codinghorror.com/the-magpie-developer/


This is known as The Python Paradox: http://www.paulgraham.com/pypar.html


>But if a candidate is motivated to take a job because they get to use shiny new technology, I would suggest that they are not, in fact, a good candidate.

I would argue that this attitude often exists at the corporate level as well.

Most companies I've worked at exist in a perpetual state of migration from monolith to microservices. Basically the overall architecture remains static in the sense that it's always a giant monolith sitting in the center with some satellite services surrounding it.

The migration to microservices and Kubernetes is 99% buzzword-driven and rarely has anything to do with whether or not the product will benefit from the migration. The weirdest part about it is that it's often a never-ending initiative. It's very rare for a company to actually arrive at a point where every component of their system is split out into small pieces. It usually just stays permanently as some services surrounding a monolith despite all the talk.

When entire companies are like this it's usually a sign that the entire company is weak. My experience has been that many companies have this attitude and many programmers share it.

If all/most programmers share this general attitude, then I wouldn't call these "new technology chasers" bad, as most people are like this.


>In my experience, magpieism is associated with weaker programmers; not the very worst, but second quartile, say - people who get very engaged with superficial details, but don't think about deeper things.

I may be biased because I know I talk a lot. But my experience has actually shown that this is not true.

People who don't know a lot of things tend to stay silent. This strategy works consistently to allow many people to hide extraordinary gaps in knowledge. It's very common and the strategy works.

When you see a person not talking too much... human bias tends to assume that he's just not a talker, or that he may be deep in thought. Never do you think that a person isn't talking because he doesn't know anything. To be unbiased is to consider all the possibilities, and basically 90% of people just skip over the thought that the person isn't talking because he doesn't know anything.

Now for a talker, the bias goes in the other direction because a talker has an incredibly higher chance of saying something you disagree with or is flat out wrong. This chance is higher EVEN when the talker is more knowledgeable than the non-talker simply because the talker places himself on the pedestal of judgement every time he opens his mouth while the non-talker avoids it.

If you know a non-talker who may or may not have kept his trap shut after you explained something to him, really the only way to confirm whether he knows what you're talking about is to ask him confirmation questions after you talked to him. If you don't do this, likely he'll just google the info later and pretend that he always was on top of everything.

You have to really watch out for this bias. Even as a talker I tended to bias towards thinking that the quiet nice guy was more intelligent. The reality is, "magpieism" has no correlation to how good or bad someone is... if someone talks a lot they could go either way, it's just that because they talk a lot, you have a ton of info on how good they are based off of what they say... while the non-talker you have no info and your judgement is (unknowingly to you) influenced by your biased optimistic assumptions.

If someone is quiet there is a higher chance that he's quiet because he's totally lost. And if he's quiet often... then there's a higher chance that he's totally lost all the time.


Another rule/heuristic that seems to work well, along with Rule 4 from the article, is having a limited number of golden tickets per team. A golden ticket gets used up when you use a "new shiny thing" or "cool but not widespread" thing.

For us, the risky choice in our stack [1] was Haskell. It's been amazing and has paid off immensely, but the challenges are very real. Over time we've wanted to go all in on things like ReasonML too, but we haven't pulled the trigger and have made safer choices instead, because that one golden ticket has been used up.

(I can't take credit for coming up with this, but I don't know who did! I'm sure it's already better articulated somewhere.)

[1] https://github.com/hasura/graphql-engine


I think the usual term is "innovation tokens", from the "choose boring technology" post at https://mcfunley.com/choose-boring-technology. It's definitely very helpful in preventing projects from choosing all the shiny new tech at once while not preventing innovation altogether. Some of the shiny tech is actually worthwhile, after all.


This is a great way of thinking about technical decision-making. I'm most familiar with this idea through the framing of "innovation tokens" from the essay "Choose Boring Technology" [1].

[1] http://boringtechnology.club


http://boringtechnology.club/ calls it "innovation tokens".


This convinced me the issue is not with people, it's with the system. If the author chose Markdown for their blog and yet can't seem to spare a single minute to read up on how it works, then I guess we can't expect people on HN, stackoverflow, etc. to learn it either. Input whitespace (including newlines) does not translate 1:1 to output whitespace in Markdown and this mistake happens a lot in comments here, on stackexchange sites, etc. https://dro.pm/a.png
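
For the unfamiliar: a single newline in Markdown is a soft break inside the same paragraph; only a blank line starts a new one. A quick sketch with the marked renderer (my pick of library, purely for illustration):

    // Demonstration with the marked Markdown renderer (an assumption;
    // any CommonMark-style renderer behaves similarly here).
    import { marked } from "marked";

    // A single newline stays inside one paragraph; browsers render the
    // "\n" as an ordinary space, so the two lines flow together.
    console.log(marked.parse("Line one\nLine two"));
    // -> roughly "<p>Line one\nLine two</p>"

    // Only a blank line produces a real paragraph break.
    console.log(marked.parse("Line one\n\nLine two"));
    // -> roughly "<p>Line one</p><p>Line two</p>"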

More on topic, I find it a bit disheartening that people are discouraged from picking a niche library and are instead supposed to look at GitHub's star system. So if it's not on GitHub or if my users don't star it enough, you shouldn't consider it for use in production? Shouldn't the documentation, code quality, open issues / maintainer responsiveness to issues, etc. speak for itself?


> More on topic, I find it a bit disheartening that people are discouraged from picking a niche library and are instead supposed to look at GitHub's star system.

Just because it's disheartening doesn't mean it's unwise.

> So if it's not on GitHub or if my users don't star it enough, you shouldn't consider it for use in production?

If it's big enough to be popular and not be on Github, that's a good sign. If it's not that popular and refuses to be (also) on Github, that's a really bad sign.

> Shouldn't the documentation, code quality, open issues / maintainer responsiveness to issues, etc. speak for itself?

No, because popularity adds value well beyond those things. It tells you that other people have found that the piece of tech is "good enough". They've already run into issues for you, so that you don't have to. They've already trained on it so you don't have to train. Popular projects are made better just by being popular, which is self-reinforcing.

Of course that means sometimes ill-conceived technology (like HTML/CSS/Javascript) wins out over contenders that are better "on paper".


> They've already run into issues for you

A star does not mean someone used a library for an extended period of time and decided to endorse the project by giving it a star. I expect most people use it as a like button (showing appreciation for or approval of the work and nothing else), a bookmark (since you can view your stars), or both.

Looking at my own stars on GitHub, the first one is something I never used, I just liked that someone documented the protocol for me and published some software around it, so I could make my (planned) open source alternative to the vendor's proprietary crapware more easily.

The second one is a factorio mod that I enjoyed using.

The third is a list with an overview of algorithms. I suppose I gave it a star for similar reasons as why I might give a comment or article an upvote.

The fourth is a simple PHP project that looks nice, but I honestly don't remember seeing it and I certainly never ran it.

So that's n=1, but given how easy it is to leave a star and that there is no way to leave a negative star or any sort of comment: no, stars have nothing to do with endorsement, quality, running into issues, or anything of the sort. If I want to leave a comment to indicate or warn that something is broken, of bad quality, or something similarly negative, a ticket ("issue") is what I might create, and so I circle back to my previous statement: look at other qualities like developer responsiveness in tickets, past tickets...

A "clones+downloads from unique IPs+users" or something similar might be a metric for how often this was actually attempted to be used, and given that most people wouldn't forward bad software to others, that might say something about it beyond some threshold. But it still doesn't tell you anything about their experience with it, for that you'll (currently) really have to look at other factors.


It doesn't matter. A project that has 500 stars is most likely going to have more active users than another one solving the same problem, but with only 5 stars.

Similarly, a project that has 5000 unanswered issues is going to have more users than one with 50 closed issues.

In the absence of any better metric, this is the way to gauge popularity.


> It doesn't matter. [Repeats previous argument.]

Not sure why I bothered replying to you if you're literally saying "I don't care what you just wrote"


If I didn't care what you wrote, I would not have replied to explain why your concern doesn't matter.

I did not repeat the argument. Let me explain it differently:

Let's say that on average N% of people give a project a star because they're active users and (100-N)% give a star for any other reason. There is no reason to assume that N will be significantly different for similar projects. I don't know N, but it doesn't matter how big N is, more stars still imply more active users.

Is there going to be variance here? Of course, but it's unlikely to distort the metric to the point where it becomes invalid, unless you have very few samples (stars).

Indeed, if there's one project with 10 stars and another with 5 stars, the chances are high that it's just noise. You cannot get a good estimate of which project has more active users in that case. However, you can tell that neither are popular and both are risky dependencies.
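
To make the cancellation concrete (all numbers invented):

    // Toy model: some unknown fraction of stargazers are active users.
    // The fraction cancels out when comparing two projects, so raw star
    // counts still rank projects by implied active users.
    const activeFraction = 0.2; // unknown in practice; made up here

    const starsA = 500;
    const starsB = 5;

    const impliedUsersA = starsA * activeFraction; // 100
    const impliedUsersB = starsB * activeFraction; // 1

    console.log(impliedUsersA / impliedUsersB === starsA / starsB); // true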


> I find it a bit disheartening that people are discouraged from picking a niche library and are instead supposed to look at GitHub's star system.

Yeah, it's pretty gross for two reasons:

1. Reinforces the broken belief that the best way to evaluate things is based on the perception of others, rather than the intrinsic qualities (like those you mentioned)

2. Adopts a purely consumeristic view of open source

The striking thing is that now these sorts of views are said out loud with no sense of shame.


> Reinforces the broken belief that the best way to evaluate things is based on the perception of others, rather than the intrinsic qualities (like those you mentioned)

It's not always the best way, it's a shortcut. While you're spending time figuring out which JSON library is the best by some metric that might ultimately be irrelevant to the task at hand, I'm going with the popular option that is almost certainly "good enough" for me - otherwise it wouldn't be popular. That gives me an advantage over you, 98% of the time.

> Adopts a purely consumeristic view of open source

I don't see how this relates to popularity. Most users of open source will always be consumers, whether something is popular or not. However, if something is popular, the likelihood that someone will contribute back also grows.

Furthermore, if something is popular and there's an issue, somebody else will most likely run into it before me, write about it, get feedback on it, and so on. That's a valuable contribution, even though it isn't code.


The tendency to go for the most popular library/framework/etc, with all the assorted justifications (which are valid to a degree and thus misleading), is an indication of just how rampant cargo-culting is in software. Everybody is always looking for the default thing to throw at the wall, hoping it solves their problem. And it’s fine for prototyping and fast-growth “products” (really MVPs and demos), but with anything serious you must do the required legwork to figure out what is actually needed, and it’s often best to limit dependencies on external libraries and frameworks as much as possible.


I disagree that this is cargo-culting. I'm not arguing for people to follow the latest hype, or do what everyone else does, in the hopes of winning.

I'm saying that if you have the choice between a handful of projects that solve more or less the same problem that you actually have, just going with the most popular one is probably going to be the best choice, because having all these other people in the same boat is valuable.

I agree with limiting dependencies and points of failure, but that's orthogonal.


I'd be careful with Github stars, but in general I don't think it is unreasonable at all to take the popularity of a project into account before choosing it. Though this matters a lot more for things like frameworks that touch all parts of your application compared to smaller libraries that are more contained and can be switched out easily.

The smaller a library is, the easier it is generally to just fork it and maintain it yourself if you need to. In that case there is no big danger in choosing an unpopular one. But for large, complex dependencies it can be very problematic if they become unmaintained. That is a risk that is always there, and that you have to keep in mind while choosing your dependencies.


The author's GitHub stars criterion was just one possible measure of how successful a technology is / will continue to be. That being said, GitHub stars are an OK indicator of how likely a project is to continue to be supported or be more broadly adopted.


It's common to write each sentence on its own line in the source text. It helps with version control. It doesn't mean the author thinks the line breaks will also appear in the rendered document.


You think the author meant for them to appear like this Like full sentences but with no punctuation in between This argument makes little sense to me Surely they did not mean it like this As the source code of the page shows, newlines were clearly attempted to be inserted and I very much doubt it was intended to only make the source code look pretty but not the article


Thanks for the tip, we just fixed those bullets.


I feel like this applies especially to the persistence layer: I pick Postgres basically every time, because I don't want any surprises at that layer. The rest you can chuck out and rebuild without too much pain.


I actually feel even better when I am persisting using static files (json, csv, parquet, etc.), but of course that's not always possible.


Three years ago, I would have agreed with you. Now, I have three years' worth of static files, and I am not so sure.


I use sqlite in almost all cases. You wanna store a file? Just put it in as a string. The amount of benefit I have gained by defaulting to sqlite for most tasks is immense.
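
A minimal sketch of what I mean, using the better-sqlite3 binding (just my pick; any SQLite driver looks about the same, and a string column works as well as a BLOB):

    import Database from "better-sqlite3";
    import { readFileSync } from "node:fs";

    const db = new Database("app.db");
    db.exec("CREATE TABLE IF NOT EXISTS files (name TEXT PRIMARY KEY, body BLOB)");

    // "Just put it in": store a whole file as a single row.
    db.prepare("INSERT OR REPLACE INTO files (name, body) VALUES (?, ?)")
      .run("report.pdf", readFileSync("report.pdf"));

    const row = db.prepare("SELECT body FROM files WHERE name = ?").get("report.pdf");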


I haven't done that in decades. Has something changed recently that made it easier to create an ACID data store with static files? Last I looked, ACID was tricky enough with static files that it was easier just to use PostgreSQL.
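
To give a flavour of the trickiness: even atomically replacing a single file takes deliberate care. The classic write-temp-then-rename sketch (using Node's fs; the helper name is made up) buys you atomic replacement on POSIX filesystems and, with the fsync, durability - isolation and multi-file consistency are still entirely on you:

    import { writeFileSync, openSync, fsyncSync, closeSync, renameSync } from "node:fs";

    // Readers see either the old file or the new one, never a torn write,
    // because rename() atomically swaps the directory entry on POSIX.
    function atomicWriteJson(path: string, data: unknown): void {
      const tmp = `${path}.tmp`;
      writeFileSync(tmp, JSON.stringify(data));

      const fd = openSync(tmp, "r");
      fsyncSync(fd); // flush the contents to disk before the swap
      closeSync(fd);

      renameSync(tmp, path);
    }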


The issue with this is there's no data consistency in the static files.


I do this on mobile quite a lot too! flat json or csv.


A true unixer.


These are well-reasoned defaults. I think the same can be said of architectures, not just technologies. I.e., when you're starting out, a monolith is likely a solid approach: it's easier to reason about, easier to deploy and debug, etc. But as you scale, you need to rethink your approach.


A related discussion here was the cofounder of Scribd talking about how Rails was a fantastic decision but he wouldn't have picked it for a new startup at the end of 2015: https://jaredfriedman.wordpress.com/2015/09/15/why-i-wouldnt...

https://news.ycombinator.com/item?id=10236210


This is the important line for me:

> Every time you aren’t boring in a technology pick, you’re likely not optimizing for your company’s best interests.

Really helps to underscore the difference between choosing something shiny and choosing something pragmatic and efficient. When we do the former (as I'm sure most of us tend to), it's very easy to kid ourselves that we'll somehow make the product greater as a result, but in reality we're only serving our own interests (learning something new and fun) rather than the company's!


Most people know this and have no problem with it. It's pretty well known that the company won't look out for your best interests so you'd better do it yourself.


I feel like this blog post is written towards people who shouldn't be making these decisions. I think the people making decisions like this should be in a senior, experienced position.


There's stacks of people in senior positions that fuck this stuff up constantly. The start-up graveyard is probably full of companies where the CTO came in with an impressive resume then spent 2 years building some massive scale kubernetes clusterfuck for his 5 users while another company ate his lunch with a shitty PHP app.


Indeed, this unfortunately does happen. There are also very smart but relatively inexperienced people who need to pick good technologies due to circumstance. Younger technical founders or early engineering hires come to mind.


I have seen a lot of software "engineers" who forget the primary goal of software engineering (solving a business problem) and instead see engineering as the goal itself, building an over-complicated, over-engineered "playground" of microservices that serves a whopping 20 requests/second and requires a full-time DevOps team just to keep the stack running. At that point building the stack appears to be the main purpose, and solving the business problem a "nice to have" side effect.


Any real project would have real requirements. These real requirements are always going to trump a star rating system, 'what's popular' or a generic guide on the Internet. You should define your requirements and then pick the tech which is best at providing a solution to this requirement.

What? I'm going to have a choice of two bits of tech and I'm gonna decide to use one over the other cos it's 'boring but not too boring'? Come on now. These are rules for people who don't know what they're doing, aren't they?

Understand your requirements. Understand the technology you propose to use.


I would add: have a process or framework for evaluating and adopting new technologies. Sometimes new technologies end up as part of the stack through pure momentum (hey, we built this prototype...) rather than deliberate choice. Be mindful of what the technology is being used for and what the costs of changing down the road might be.




