Hacker News is not immune to this behavior either. Nearly every day I read an article and think "wow, that might be really cool if I ever have time for it," but then I read the comments section and 90% of the comments go out of their way to disagree with EVERYTHING the author said. It can be really off-putting, to be honest.
Programming expertise is very fragile. A variety of changes in your environment, such as different coding standards, dependencies, programming languages, etc., can severely impact your performance. The result is a strong negative reaction to things that should be objectively neutral.
That is, a change in your environment can make you feel incompetent, and it is easier to get angry at the change in the environment than it is to be honest with yourself about what is happening.
I forget which book I ran across this theory in, but it explains a variety of conflicts that repeatedly come up about formatting, libraries, programming languages, operating systems, and so on. Furthermore it is easy to experience the feeling yourself when you go from an environment where you're competent to a new one. (At the moment I'm going through this having to learn Eclipse and Java, after many years of being primarily a Perl dev.)
Unlearning is generally as hard as or harder than learning.
And there's a strong tendency to view the first system you learned as "the proper way" to do things. Sometimes it is, sometimes it isn't.
The transition you're going through from Perl to Java is a pretty good example. You happened into a skillset early in your career which was highly capable and offered ample opportunities, but its flower has faded. That's something that the kids today who're building up competency in various currently-popular toolsets might do well to consider. They're starting to see hints of that as tools such as RoR are fading. What happens when we, say, bin the entire present set of Web dev tooling will be interesting.
And it's happened to whole industries before. Sucked to graduate in nuclear engineering in 1979, or petroleum engineering in 1990.
This doesn't mean you aren't correct. People may still be reacting negatively to change, because it does require work to learn new libraries, frameworks and languages. But that is not the same as saying that programming expertise is fragile.
"Programmer Knowledge": http://henrikwarne.com/2014/12/15/programmer-knowledge/
The ability to start working with a new language after reading a blog is useful but it doesn't mean you'll be effective and useful in that language in a week.
A person with the ability to program will probably have that raw ability in various languages. There is a knowledge core that is very transferable. An experienced programmer can come up to speed very quickly as concepts get mapped onto things they already know. A polyglot will have been through this several times, and won't be afraid of the process.
But there is a great deal of knowledge that we use every day that is environment dependent. A programmer with 20 years of experience in various languages will not be noticeably better by day 5 in a new language than someone who has been programming for only 2 months, but entirely in that language. A simple change can leave a programmer feeling incompetent.
Programmers who respond to this badly are not going to handle such transitions well. Furthermore if they get stuck in defensive behavior, then the initial discomfort will turn into permanent failure.
People who disdain math as "brain teasers" are deprived of the rush of knowing that some piece of their knowledge will stay true forever.
You can't get useful work done on real problems without learning a lot of that "fragile" stuff. How are things scheduled on your OS? Where do you find your logs? How do you run unit tests? What are your local coding standards? What is the library call for doing X? How do you find your documentation? What was that section of code you found foo in the other day?
All of this stuff is fragile. None of it is going to last. But it is context for your current life, and you're going to be more productive if you learn it. And when this stuff changes on you, you're going to feel the productivity drop. But there is no sense in hamstringing yourself so that you won't feel so bad later. You learn it now, and you learn the replacement later.
Back in the 1800s there was a belief that you could only learn so much, so you had to keep your brain clear for the important stuff. (Go read Sherlock Holmes for an example of this prejudice.) But these days we know better. People can practice learning. And those who do are constantly learning. About everything. Including trivia. And it pays off. It really, really does.
If you go farther back, you'll find more negative opinions of Java like the one I stated 15 years ago at http://www.perlmonks.org/?node_id=41244. I believe that the substance of my complaints was true then. They are true of some organizations today. But they are not true of all of Java, hence my willingness to use Java today.
Java isn't awful. Neither is PHP btw.
(I won't defend everything written in those languages though.)
Sometimes maturity cuts through it, but IME, sometimes it takes maturity and intelligence (though not a huge amount of it), and in rare cases, there's something more, like a primal competitive nature.
Many programmers are young and have just enough money, time, and knowledge to get into trouble.
This is compounded by the growth rate of the programming field. New, young programmers jump into the industry every year. So a young programmer can potentially encounter entire teams of equally young programmers, creating a school-like atmosphere of aggression and dominance. (Which is sometimes taken advantage of by employers.)
Where I work now, no one cares, and I get questioned for using anything that is less than 3 years old. xD
This could also be due to simple inexperience. The first time you come across something that seems amazing, it's easy to make the leap from "amazing" to "perfect" and become a zealot about it.
Then you live with the amazing thing for a while and discover that it actually has flaws that weren't obvious at first glance. Then you go through this cycle a few times and realize that everything has flaws, and that the art is learning how to identify the things whose flaws impact what you want to do the least, rather than being able to find the One Perfect Thing that solves every problem cleanly. And you come to value the tools that tell you their flaws and limitations up front over those that try to hide them behind blustery assertions of perfection.
But by the time you realize all that stuff, you're not young anymore :-D
So what that boils down to is that I have opinions that are semi-personal. A lumberjack or a secretary probably has much less personal investment in their field. Personal interests yield opinions, and where there are opinions there are people who are aggressive about them, whether it be music, religion, sports, cars, or technology.
Everyone likes to feel special/superior. There are many ways to do this. Easy ways are money and power. If you are smart, you can win arguments.
If you aren't any of these things, you can be superior in other ways. You can feel morally superior to those 'millionaires and billionaires'. You can save the planet or the animals, which those others for all their money/power/specialness don't care as much about (or so you tell yourself).
You can become a shameless hedonist. Those corporate fuddy duddies would love this lifestyle, you see, but they are cowards even though they are rich, right?
In every case it's about being unique or on top of those things or making a mark that will last. Special. Valuable.
When we program it's from our heads. If anyone questions why we do something, it's as though they are questioning our intelligence, so the natural (for some people) reaction is to be defensive.
When someone else comes along and says you made the wrong choice, or even if they stop at just championing their own, different choice and thereby 'implying' you are wrong, it's easy for individuals who might not have their self-awareness guards up to respond as if they were being personally attacked. Even if your original choice isn't something you, all else being equal, really care that much about.
I don't think it has anything to do with technology at all.
The thing they do not understand though is that what works on one individual might not be the best for everyone. And that we are all different and the best approach is to tailor the tools for what suits the exercise and the individual.
Now that I'm into technology I see the exact same thing and I'm just like face->palm. So I guess this exists in all fields.
My advice for someone new in the field is to listen to what the veterans say and try some stuff out. Then make up your own opinion and use the stuff that works best for you and the task at hand.
Filmmaking. I love my artform, but if you want an example of a culture where there are a lot of strong negative opinions about almost every potential methodology, it's definitely one.
Certainly it's frustrating for any front-end developer who has invested his time in learning Grunt only to find out he should now be using Gulp, or Broccoli, or make(1)...
I remember the last time there was this sort of fragmentation and polarization in the web framework world (2005-2007), I chose Python & Django over Ruby on Rails because, well, I liked it better (that and the RoR community had a certain evangelistic quality to it that turned me off, while Pythonistas are basically like "Yeah, it's a tool. I can build cool stuff with it. Let's move on and solve some problems"). I'll freely admit that it doesn't matter, Ruby is a fine language, and Rails is a fine framework. But they do largely the same thing, I've never once regretted my choice, and I've been able to build some pretty cool things with the energy that wasn't expended getting emotionally involved in language/framework/tool wars.
Well, except that saying that X and Y are effectively the same is not a great method of getting booked to speak at conferences about X and write articles for sites dedicated to Y. It's the truth, of course, but if your main goals are self-promotion and/or demonstrating how smart you are, it's not going to help you much.
If anything, denying it would be evidence of ignorance, no?
I've used both in side projects (where the choice is fully mine) and usually the decision comes down to libraries I'm going to use.
So the field is a firehose of high-profile projects, whether or not they're needed.
It's possibly a very large and complex generalisation of Conway's Law, operating between groups that barely communicate.
Then you get aggressive opinionation about the projects, but really that's a secondary phenomenon.
One problem is there are no objective metrics for framework or language quality. I don't think anyone knows if objective metrics are even possible, never mind how they would work.
So you pretty much just make a choice that works for you and stay with it until something obviously much better comes along.
And if it doesn't - if it's obviously much different, but not so obviously better - you can still be getting useful work done.
The seductive promise is that Framework X will make the job take half as long and produce half the bugs.
I doubt that's ever true in practice, for general values of X, especially when you consider learning/retooling time.
I am much more impressed with the integrity of open source projects that actively advertise the options -- especially when they cite advantages and disadvantages of those options. Actively misdirecting you to believe that there are no options is pretty much the opposite of that philosophy.
Whenever I somehow get time to work on a new "weekend project", I go through that every time. I know better and should stick to the rivers and lakes that I'm used to, but when working with (often) sub-par code in my day job, I always have the itch of trying new shiny things.
After I get that out of my system, I often keep my own "standard" stack and force myself to just say "No" to the new tools that are not fit for the project. At that point, I've probably wasted the entire weekend playing around with tutorials and fiddles to see that going through the learning curve isn't worth the time.
You read plenty of comments insisting you always "write tests first" or demand 100% test coverage, ignoring the fact that many businesses don't have the resources and just want something quickly. Even if it is a bit buggy, something that works for 98% of cases today is better than something that works for 100% of cases in two months' time.
I started filling in an application form the other day:
Do you prefer:
A) "Standard" solutions, because they work and are easier to implement and maintain.
B) Creating from scratch, because it fits the needs better and offers better performance.
Without a context this question is entirely pointless to me.
And I will be judged on it? These people obviously have their ideals, and I am going to be judged on them. There is no "it depends" in the answers.
Questions like that are like being asked if you prefer:
A) A backhoe, because it gets the job done faster.
B) A shovel, because you have finer control
without being told what it is that you're actually digging.
Absolutely not surprising, given that there is even a pretty popular language with indent-based syntax out there.
But that's part of the fun :(
Probably should have thrown in something about systemd, though. ;)
I should also point out that in theory vimsh is just as powerful a programming language as Elisp. People don't use them the same way, but you theoretically can.
That's a bit of a Turing tarpit, though. A one-instruction computer can compute any function, too, but I wouldn't want to use it.
Likewise, vimsh isn't awful, but it's not a general-purpose language (elisp has a lot of warts at this point, but it is general-purpose, even if it wouldn't be my first choice for anything but extending emacs) and it's Yet Another Language; at least elisp is a Lisp (which is a virtue).
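For the curious, the one-instruction machine mentioned above is a real construction: SUBLEQ ("subtract and branch if less than or equal to zero") is Turing-complete despite having exactly one instruction. A minimal illustrative sketch (the interpreter and the example program are my own, not from any particular project):

```python
# SUBLEQ: one instruction, yet Turing-complete. Each instruction is a
# triple (a, b, c) stored in memory:
#   mem[b] -= mem[a]; if the result is <= 0, jump to c; else advance by 3.
def run_subleq(mem):
    """Run a SUBLEQ program stored in mem; a negative jump target halts."""
    pc = 0
    while pc >= 0:
        a, b, c = mem[pc], mem[pc + 1], mem[pc + 2]
        mem[b] -= mem[a]
        pc = c if mem[b] <= 0 else pc + 3
    return mem

# Example program: add mem[9] into mem[10], using mem[11] as scratch.
prog = [9, 11, 3,    # scratch -= x  (scratch becomes -x; <= 0, jump taken)
        11, 10, 6,   # y -= scratch  (y becomes y + x)
        11, 11, -1,  # scratch -= scratch -> 0; jump to -1: halt
        3, 4, 0]     # data: x = 3 at mem[9], y = 4 at mem[10], scratch = 0
run_subleq(prog)     # afterwards, prog[10] == 7
```

It "works," in the same sense the grandparent means: you can compute anything with it, but nobody would choose to.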
The toxicity of this is not limited to front-end development, of course -- this is, in fact, the very framework we base modern democracy on. It's just lots of completely unqualified people (myself included) making a great deal of noise based on shitty information, then - once it inevitably goes tits up - we get together and point the finger at some scapegoat to absolve ourselves of liability.
We like to think that as a society we've developed from the primitive peoples of yore, but really all that's happened is that communication tech and industrialism have allowed us to mass-produce and saturate our world with the achievements of a very, very select few, thereby creating the illusion of universal progress. This entirely baseless pretence is giving unjust confidence to the (figuratively) blind, who are now leading us into a society of completely misguided moral absolutes: one in which local government officials will parent your kids for you with legislation, physicists will be lynch-mobbed for the wrong choice of shirt, and GitHub will outlaw the use of the word 'meritocracy' for fear of intimidating the weak.
So yes, these issues are very important, but HTML and CSS hate games are the least of our worries.
I've basically resolved to not complain about anything given to me for free, ever, at least on twitter, because I've felt so much of it. "Why I personally prefer X vs Y" is ok.
For twitter, it can be anything from a software project to some customer service guy somewhere, but it's important to remember that real people are out there.
That guy's boss might be a jerk to person XYZ about what you said about his software on twitter, even if you weren't particularly upset. And you definitely didn't make the author feel very good either. And who are you to complain when you couldn't build it yourself, and are putting in much less energy?
If you have a problem with free OSS bits, help fix it, or use something else.
If you've paid for something, this is what support departments and customer service groups are for.
( This blog is good reading on the subject too - http://powazek.com/posts/3368 )
Maybe it's because, as PHP programmers, we're used to being the butt of jokes; maybe because we think the language is eventually going to end up a backend serving JSON... but the people I've met at PHP conferences have been nice, humble, and interesting. More talk about projects than tech. I haven't met a lot of dismissive people or people badmouthing other languages.
I love PHP; there's some really neat modern OO methodology and cutting-edge ideas and best-practices that I'd love to share on HN, for example.
But with the PHP-haters, the downsides of sharing outweigh the benefits. So like the OP article says, quality PHP developers just put their heads down and get back to work. Let Laracasts do the best-practices evangelism and move on.
At this stage, though, Java is getting so old that it no longer seems a threat to people, so they are backing off (from the superficial, social point of view; there are still technical pros and cons to be debated).
There are so many tool choices that it's really really hard to even trust anyone's 'review' of the tool landscape, even in a particular vertical niche. And so many people that are writing the tools (and promoting them) are, in fact, primarily tool authors, not necessarily having to face the sort of 'in the trenches' problems that most devs have to deal with (both technical and political).
Denouncing tech XYZ because you're using ABC is often, I've found, done because the person wants some external feedback that they made the 'right' decision. Public (on forums or f2f) signaling of the choice you made, with some 'reason', lets people support you and your choice. There's certainly more to it, but that seems to be the root motivation I've seen in a lot of people in my locality who behave this way in public.
I've told the story about being at a ColdFusion developer conference, helping set up chairs the night before. One of the speakers started ragging on PHP. Of all the people to be picking on an 'easy target' like PHP, I would have thought CF devs would be more sensitive to that sort of trash talk, likely having been on the receiving end more often than not. Not this guy - we were all treated to a 5-minute rant about how shitty PHP is, how PHP devs need training wheels to use the internet, etc. Unbelievable, but right there in front of me.
When I did it, I was trying to make sure my reasons for my choices were 'correct' (tech X was faster to write, faster to execute, cheaper, etc). After a few more years around the block, it became apparent that there are generally so many more factors at play that are not always readily apparent, and often, the best choice is the one someone already knows, because estimating learning time for 'newtechX' is essentially impossible (doubly so for stuff that's not even out of beta yet).
Argh... maybe we can get Rachel to come speak at one of our local dev groups - this message needs to be driven home more, I think.
You are doing the we vs. them thing, not just in this sentence, but throughout the text. It's never the right approach. There's some us and some them in all of us. I see this often in writing on social issues in tech; a strong tendency to categorize and polarize, even when intentions are good. Regardless, I find the article well written and on point.
I use Knockout (which is supported, just had a release, picked up some new core devs and added component support) extensively and I like it a great deal.
It's not as large as either Angular or Ember but it's worth a look.
Steve Sanderson has an awesome video on it which is worth a watch even if you aren't interested in Knockout; the fella clearly loves and is passionate about what he does.
This article was a good read.
Not everyone has the happy circumstance to work with the "latest and greatest" stack.
I also used to be the guy to swear by Emacs and C, bitch about how awful Java and C++ are with all the crappy IDEs (despite never having tried them out, just reading articles and comments bashing them). Eventually I matured a little and tried both out, then tried a few IDEs like NetBeans, QtCreator, and PyCharm, and I actually prefer them now. If I had never broken out of the "hating circle" I would only be able to write programs in Lisp and C. Some of you may think that's fine, even preferable. Personally in retrospect I'm happy to have widened my horizons.
This is from my own experience; I used to do that.
There's also a Thing where you signal your knowledge about crappy solutions by peeing on the crap; this informs others that you have enough knowledge to Not Use Crap. (or, alternatively, showcase yourself as an elitist jackass. ymmv).
I've seen plenty of people present things as literally the worst, but they're doing so in a very tongue in cheek way.
Fascinating. In that this episode represents an only slightly exaggerated instance of a pattern of such profoundly dysfunctional and self-blocking behavior in the community that has become so widely entrenched that it is beyond epidemic proportions; it has practically become the new norm.
Or not, everyone's different.
Frankly, I'd much rather have someone like Linus chew me out, because at least it's an honest and clear criticism. The tech industry prides itself on being an open and transparent environment, and I don't think that should change because someone's feelings got hurt.
I suspect that in many cases he gets frustrated when people repeatedly do something he's asked them not to do because he has to review their patches over and over again.
Not sure how relevant frameworks are to the world of kernels but I'd imagine he'd chew you out if you tried to get any C++ into the kernel.
You can go to an interview as an expert in a given language, but if you don't know what framework a company just happens to use, you'll lose the job and they might give it to a novice who does.
Also, I would be curious to know which conference she is referring to. I realize this is a pandemic problem, but if she would name some names it might get the ball rolling on turning things around.
Freud had it that sublimated aggression provides energy. Like all things from a disputed theorist, I'm not sure if he's on the right page. However, if it were true, it would explain the value in developer holy wars.
Quote from the above link:
The sublimated libido, according to Freud, contributes to the formation and maintenance of permanent object relations and to the 'molding of psychic structures'; once the psychic structures are formed, the energy is at the disposal of the ego and the superego. Similarly, aggressive energy is 'neutralized' (Hartmann, 1952) and thus can be transferred from the id to the ego. According to Hartmann, Kris & Lowenstein, "the capacity to neutralize large quantities of aggression may constitute one of the criteria of ego strength".
Also, maybe part of the problem is thinking/assuming that a bunch of front-end devs are going to be "your people". Yeeuch. It's a job, some of those people you will like and get along with, most of them less so.
What's wrong is when someone is unhelpfully negative. This is the person who just complains no matter what happens.
There's a difference between this and just general "best practices". The latter are positive and suggestive whereas the former tends to be very absolute.
Relational Databases are a perfectly good tool and will fit 90% of the applications being developed. But NoSQL is shiny, and we have to try the new way (despite the fact the new way was what people were doing before relational databases came along). Likewise with back end languages. We have plenty of good mature solutions on the back end, but people want to use Nodejs. Fine if your use case is many concurrent connections, but otherwise why not use something more "traditional" (mature, tested, understood).
Some people may say front end developers are not "programmers" in the strict sense of the word, but I believe much of this still applies...
Instead of bashing technology X, show everyone how good technology Y is. There is no need to be negative about anything. There is enough negativity in the world to go around.
When I express my opinions I don't have to account for anyone. My opinions are my opinions and I'm going to be proud of them. And personally I want everybody else to feel the same. Be opinionated and be proud!
When I use "I" in a sentence, people compare who they are to me, by trying to find similarities.
That's why the former is more persuasive.
Also as a general rule they will get more linkbaity attention.
In the particular situation when platforms are multiplying and the years behind us are littered with the corpses of failed initiatives that promised a world enough and time, the call to learn The One Best Technology becomes stale and annoying. It looks like an invitation to waste our time rather than becoming more expert with the technology we've already invested a lot of time in, and are just hoping we can make that time worthwhile before it becomes obsolete.
It does suck that a man is the default for this, but IMO it's a projection of toxic masculinity (i.e. Patriarchy).
If I were to try and draw a line, I'd be more general and say that it depends on culture during early career. You can learn one of two lessons: that aggression is a required and accepted way to win an argument, or that aggression is an unacceptable way to reach the correct solution. Both lessons are true in certain contexts, and depending on which goal is valued more in your culture (winning the argument or the correct solution), you'll learn one or both of these two lessons. If you truly believe that you will find the most correct solution alone, without the help of your team, you're very likely to learn lesson #1 and behave that way. That type of person, though, is destined to make a huge mistake and learn the hard way how to collaborate.
You are dismissing the reality that some people are actually smarter than others. There are some people who really do have a real vision of a 'best' solution. However, it is true that collaboration is the requirement for a working relationship and thus a product out of that working relationship... But, the collaborative effort and resultant end product is always, by necessity, a non-ideal solution for at least a portion of the end-users. Keeping this in mind, it can be understood that "the best product doesn't win" -- rather, the most popular and most agreed upon solution or product is what wins.
In summary -- I agree that negativity and/or aggressiveness in pursuit of a solution leads to an unworkable /collaborative/ environment... But I disagree that the collaborative environment produces the best product or solution to a problem [programming or general].
Warning! I am not one of the smart ones. I tend to sit in a corner and doodle like the article writer. Perhaps, if the collaborative environment were less collaborative and more constructive, I would participate more... see Office Space for a better answer.
I never dismissed anything like that. If I did, please quote my dismissal. In fact, this sounds like an assumption you made before reading what I wrote. Change your assumption and try reading it again.
> There are some people who really do have a real vision of a 'best' solution.
This statement only works with quotes around "best" and phrased as "a best" instead of "the best." I do not question that it is possible for a single person to have a driving vision, but what happens when this person, instead of explaining their reasoning to their collaborators, uses aggression to dominate them and force them to accept his or her solutions? Do flaws get pointed out? Or are we under the assumption that "smarter" people do not make mistakes? If not, with everyone on a team looking up to a single person to make all the decisions, who will catch the mistakes?
The question here was never "collaborate vs. don't collaborate." We're talking about teams, where a degree of collaboration is necessary to move forward, and individuals who disrupt the collaborative process via aggression.
>But, the collaborative effort and resultant end product is always, by necessity, a non-ideal solution for at least a portion of the end-users.
How is this different from non-collaborative effort? What exactly are you saying here? That a single person can build a completely perfect product that meets the needs of all end users while a group cannot? What you have said above is true whether collaboration is involved or not -- everything has flaws.
>Keeping this in mind, it can be understood that "the best product doesn't win" -- rather, the most popular and most agreed upon solution or product is what wins.
In a working collaborative environment, these two can be the same thing. When one person makes all the decisions by fiat, the solution that wins is that person's favorite, not one that has been thoroughly vetted or necessarily correct. I would rather rely on a vetting process than on a single human's biases. This is what collaboration is for, diluting biases and natural cognitive flaws. No one is immune to all biases and flaws.
>Perhaps, if the collaborative environment was less collaborative and more constructive, I would participate more.
What do you see as the difference between "collaborative" and "constructive"?