I've seen this article before. A lot of people who are active on the internet like to assume that everyone is using the latest tech and can just use whatever tech they like. This is rarely the case. I've spent the majority of my career tending to old code bases.
There is a lot of legacy code that is still working fine. There are a lot of old C# code bases working exactly as intended on now-unsupported .NET versions.
One of the constant annoyances of some of the newer tech stacks is that the developers constantly break the API, meaning that when you upgrade the framework version you have to fix code, and probably tests as well. It is absolutely infuriating when an API that was working perfectly well gets changed.
That's why the JVM (and platforms with a similar philosophy), to me, is the best runtime there can be. Sure, there's fancy state-of-the-art stuff being integrated into it; but the most compelling feature is the continuing commitment to making dusty old jars work, unmodified.
For the most part the situation is the same with .NET. Any code written in .NET 2.0 should work with 4.7.
With the newer .NET Core, most stuff is now compatible, and it is pretty easy to write code that stays compatible. Stuff that isn't normally requires a compat shim that Microsoft provides, but I've not needed one so far.
My major frustration is with the JS frameworks. I wrote some perfectly good Angular 4 code and I had to change quite a lot of it to work with newer versions of the framework.
I haven't been working with .NET for very long, but my experience is that mixing .NET Core and .NET Framework frequently leads to very confusing, inscrutable error messages about incompatibility between the two. Though if it builds, it works fine.
The accepted method is to convert projects to .NET Standard and/or multi-target the project itself. I haven't run into any issues, but most of what I do is web projects; I wouldn't know if it's different for something like WPF.
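In case it helps anyone: multi-targeting is usually just a one-line change in an SDK-style project file. The target framework monikers below are illustrative; pick the ones your consumers actually need:

```xml
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <!-- Note the plural TargetFrameworks: the project builds once per moniker -->
    <TargetFrameworks>netstandard2.0;net472</TargetFrameworks>
  </PropertyGroup>
</Project>
```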
This describes virtually all of the important Rust libraries, which refuse to leave v0 because of semver and their urgent need to break everything. Anyone who complains is not "a part of the team" either. It's astounding how much post-purchase rationalisation exists in the community. Rust has plenty going for it; the people will destroy it before that matters, though.
Sorry guys, but if you can't manage the task of providing randomness as a stable basic library, then maybe all you have is an experimental language not fit for production? It's scary how the security and correctness of the language don't match the namespace problem that currently exists.
The single owner of one of the most popular crypto libraries hasn't responded to anyone in years. Hundreds of downloads a day, still. Evangelists, please take a step back and think about that.
I keep up with modern tech. I regularly learn new languages and frameworks. I lead agile dev teams. I work as a consultant modernizing devops and moving to cloud deployments. I write real software for real businesses doing real things.
I don't blog. I don't go to conferences. I don't go to hackathons. I don't have any meaningful presence on GitHub. Hacker News is about the only place I have even the slightest footprint, and most of what I talk about here has nothing to do with my career as a software engineer.
So what? The implication in this article is that if you're not out there trying to be visible, you must be some crusty old COBOL developer locked in the basement.
In my experience, the people who are the loudest are rarely the people who are doing the most. If you spend all your time trying to prove to strangers that you're hip to the latest trends, just how much time are you spending doing anything of actual value? Not everyone who has a big public presence is actually doing anything. Most of the discussions online and presentations at conferences are nothing new. Most of it is self-promotion. When I see people get all excited about tech, it's usually because it's new to them, not because it's a new idea.
Sure. I've met a lot of well-known authors at conferences and meetups. I've met people who work for big companies with R&D arms who are putting out content and products on the "bleeding edge" of technology (a la ThoughtWorks, IBM subsidiaries, etc.).
Most of these people love technology. They love learning about technology. They'd be perfectly happy working in technology, but because monetary success ends at being a specialist consultant in a super lucrative field... they stop.
Now they work in strategy and marketing around technology. Why? Because it's more lucrative. The job is more fun. They get to meet really, really smart people without having to work 60-hour weeks for $250K a year. (Which is great, of course, but these people have the soft skills to add a multiplier to that amount.)
I'd be happy working 9-5 as a CRUD software developer with a great team in an OK company.
I'd be happy working in a very specialist field, earning $250K with an amazing team, in an amazing (funded) company (for a few years before shareholders start making demands).
I'm happier working random hours around the world, meeting and talking with the smartest people in technology at a range of companies. I get to learn about what they've achieved and how they've achieved it... without having to burn the candle at both ends... and I get to just play with whatever tech I feel like playing with.
They're different careers. A lot of people dabbling, or dipping their toes into blogging etc., are just doing it for a better portfolio when they hop companies.
Yeah, it quickly becomes apparent in one's career that in order to actually realize any sort of personal vision it is necessary to be slotted into the "executive" box, except maybe in very large companies that have the spare resources to spend on Quality.
> If you spend all your time trying to prove to strangers that you're hip to the latest trends, just how much time are you spending doing anything of actual value?
This is something I've wondered about a lot of social media users, to be honest. I mean, if you're seemingly spending all day on Twitter or whatnot, how are you finding the time to actually do any work? Do any of these super outgoing, social-media-addicted, conference-every-week developers actually have a full-time job? Or is their entire career about promoting themselves?
I would absolutely hate to be interrupted in the middle of a good coding session by someone who wants to talk to me on Twitter about the hype of the day.
The big mess seems to be in the CSS/Javascript space, where fad frameworks come and go rapidly. There's more change than improvement. Visually, most web pages haven't changed much, and web development has become much harder.
But we can now scroll forever. Slowly and jerkily.
On the language front, we really should have had a replacement for C/C++ by now. We don't. D never got traction. Go is mostly for web back ends, where it has really solid libraries heavily used within Google. Rust looked promising but was taken over by the "functional" crowd and is doomed to Haskell obscurity. C++ itself is now trying to fake ownership tracking at run time, with mediocre results. The abstractions always leak because too many old things need C pointers. Java was supposed to replace C/C++ but became tangled in its own libraries and ended up replacing COBOL.
There's been some progress. There's now grudging agreement that the way to go is type declarations with some type inference. Having to declare every type is too much of a pain. Not declaring anything makes for unreadable programs. Function definitions really do need type info. So C++ added "auto" and Python added advisory type hints, moving both towards the center. Which is about where Go started.
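That center, with explicit types on function signatures and inference everywhere else, looks roughly like this in Python (a minimal sketch; the function itself is made up for illustration):

```python
def mean(values: list[float]) -> float:
    """Signatures carry explicit type info, for readers and for tooling."""
    total = 0.0        # locals are left to inference; annotating them is just noise
    for v in values:
        total += v
    return total / len(values)

print(mean([2.0, 4.0, 6.0]))  # prints 4.0
```

The annotations are advisory: the interpreter ignores them, and a checker like mypy enforces them only if you run it.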
>Rust looked promising but was taken over by the "functional" crowd and is doomed to Haskell obscurity.
I think that's unfair and inaccurate. Rust is going to remain niche because most developers are happy with garbage collected languages and have no need to switch to Rust. Rust will eat a bit into C++'s domain, but I don't think it'll ever take over the world. GC is good enough, works great now and worked great 20 years ago.
There are many applications where any unexpected garbage collection pause is absolutely detrimental. Even pauses when the kernel cleans up the page cache during a malloc are detrimental.
Your post doesn't contradict the post you're replying to. There are obviously many cases where it is not desirable to have GC, but there are even more where GC does just fine.
> Visually, most web pages haven't changed much, and web development has become much harder.
Both true, yet engineers keep switching to newer frameworks and libraries anyway, rather than sticking with what they have and just building stuff.
The question then becomes, assuming they are rational, why?
I'll propose at least one major reason: efficiency. What's improving isn't what is possible to build with these technologies, but how effectively those things can be built, to a reasonable degree of reliability, in the same amount of time, even accounting for the greater difficulty in learning and using the technologies.
That's where progress is being made here. Not in the what but in the how easily, how quickly, how well. You could do anything you can do in React (say) with vanilla JS, imperatively updating the DOM with cumbersome APIs etc. The question is how much longer would it take / what more could you have built in that time using React.
> we need to find a balance between those of us online yelling and tweeting and pushing towards the Next Big Thing and those that are unseen and patient and focused on the business problem at hand.
What makes him think we don't have a balance already? Most devs are the quiet ones who just do their job and then go home. Aside from participation on HN, I consider myself to be one of those quiet ones. I work on business problems with whatever tools are appropriate, which includes both modern stacks and legacy codebases.
Plenty of us are using the latest technologies to push our particular domains forward but we aren’t writing blogs or evangelizing. This doesn’t mean we are in the dark. It just means we get stuff done. It doesn’t mean we don’t have skills. What it does mean is I don’t need google or stack overflow to write my code for me. I can solve my own problems thank you.
Right, but some of the comments on HN imply that is the way people think - that most dark matter devs are using old code bases, or don't have the skills to be on the leading edge.
I would see that much more positively: most of those dark matter devs have the experience to know which old and stable technology will be superior to which hyped new fashion trend. So they know about all the new stuff but are still using old technology because it's the better choice.
I feel like the article misses out on two crucial truths:
1. Most of the new tools are a pure nightmare to maintain. RoR has breaking API changes often enough that you need multiple people to maintain a large codebase, whereas in PHP you'd only need one person, because things don't magically break that often. And MongoDB is so convoluted and complicated that almost nobody can deploy it correctly. And then there are the "new" container techniques, which are basically just an admission that you have no f*ing clue how to make your new tool set secure. In most cases, the loud new technology is just over-funded hipster stuff, not ready for the real world yet.
2. Posting things online (e.g. source code) will not only attract the attention of peers. You might also get flooded with rather demanding emails from technically incompetent people who complain that your free source code release didn't solve their problem.
Also, I wonder if the author is aware of IRC's ongoing popularity for private development discussion rooms. It seems many people thought Slack was new, but some of us programmers have been hanging out like that since '95.
> RoR has breaking API changes often enough ... magically break that often
According to the support policy, nothing magic about those breaking changes: "Breaking changes are paired with deprecation notices in the previous minor or major release."
I remember that we had one particularly horrible maintenance update when Arel queries that used to work just fine suddenly started always loading one wrong item. Turns out, they made an internal change such that our way of doing User.where(User.arel_table[:id].eq(2)) now returned nil for the Arel eq, so we were calling User.where(nil), which got translated into User.where(id: nil.id).
That said, I was referring more to the Ruby on Rails ecosystem in general. Our production deployment has 80 gems, which I'd say is a normal number for a big Rails webapp, and sadly many of them do not follow semantic versioning, so that minor gem version updates might contain unexpected breaking changes for us.
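For what it's worth, pessimistic version constraints in the Gemfile limit the blast radius of updates, though they only help to the extent a gem actually honours semver (the second gem name below is made up):

```ruby
# Gemfile
gem 'rails', '~> 5.2.1'          # allows 5.2.x patch releases, blocks 5.3.0
gem 'some_flaky_gem', '= 1.4.2'  # exact pin for gems that break on minor bumps
```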
The advantage that old languages like PHP or C++ have here is that their standard libraries have been around for so long that the API rarely changes these days.
> You might also get flooded with rather demanding emails by technically incompetent people who complain that your free source code release didn't solve their problem.
Once I learned to listen to incompetent users, I started to create usable software.
My gut reaction is still, "RTFM you're doing it wrong." But now I follow that up by figuring out how to make the software work like the user thought it should or I change it so they never get in the frustrating situation in the first place.
What I finally realized was that if one person got frustrated enough with my software that they took the trouble to post an incoherent and technically incorrect rant on Github then it's likely that 100s more users had similar problems and said nothing.
Now I receive effusive emails about my open-source software almost weekly.
I fully agree with you that listening to users is the best way to create usable software.
But after reading the article, I thought that the 99% number was about developers not joining the discussion with other developers. And for that purpose, I'm not sure if open source is still a good idea.
> You might also get flooded with rather demanding emails by technically incompetent people who complain that your free source code release didn't solve their problem.
This. It happens so frequently that making things OSS now carries an additional burden. I personally find it very painful to e.g. ignore the PR or issue opened by some hapless person in a bad situation trying to figure out the dumbest thing. It makes the alternative (never OSS'ing anything) sound better, if only for my own peace of mind.
PHP is horrible as a language, but it fills its niche very well:
* You can get it to run dang near anywhere.
* It has an extremely low barrier to entry for new devs (blessing and a curse).
* There's a large volume of code already out there.
* If you need something that's a little more than a static site but not a full blown custom system, it's one of the simplest ways to get something out the door.
* If you know how to use it, there's likely someone willing to pay you to do so on their behalf.
Eh, that's not really the case anymore either. It's made many improvements over the years. Not that I'm saying it's some wonderful language, but you can't really compare it to the PHP 4/early 5 days. I think credit is due to the folks who have brought PHP to where it is today.
Yes, each of those websites still uses PHP under the hood. But by that argument each of those websites can be counted as using C as well. As well as machine code.
It has nothing to do with the code being spaghetti or not. It's about a level of abstraction where I no longer consider something made with WordPress as something created with PHP. The PHP was already created, and then someone just clicked a few menus to choose which existing PHP he or she wanted.
The more fitting category would be just calling such websites "WordPress" websites and not PHP websites.
The categorisation you are trying to make us accept is ridiculous.
I can deploy a python web application I have written to a new web server within minutes by clicking deploy in bitbucket.
So using your definition:
* The code is already written.
* I clicked a few menus and I had the python code I wanted running.
It isn't a python application. However, when the web application receives a request it will run python code. It is clearly a python-based web application.
I wouldn't consider it python code if it is a cookie cutter website where the creator didn't have to write a single line of python. It's not about deployment it's about the process of creation. Why don't you consider each and every website running PHP also a C website?
> I wouldn't consider it python code if it is a cookie cutter website where the creator didn't have to write a single line of python.
So if I got someone else to press the deploy button in bitbucket, it suddenly isn't python code? That doesn't make sense does it?
> Why don't you consider each and every website running PHP also a C website?
The web site/app logic is written in PHP, not C. The runtime for the vast majority of PHP deployments happens to be written in C.
Additionally, while I did say interpreted earlier, that really isn't true anymore. PHP runtimes these days tend to break the PHP script down into bytecode. There are alternative runtimes for PHP just as there are for Java or .NET.
There are at least 4 I can think of. Two of those aren't C (one is C++, one is .NET). However, there is nothing stopping you from writing a runtime that runs on the JVM / Lisp / Go / Brainfuck / LOLCODE / 68k assembler.
> So if I got someone else to press the deploy button in bitbucket, it suddenly isn't python code? That doesn't make sense does it?
You completely ignored the most important sentence in my comment which addresses this.
> The web site/app logic is written in PHP, not C. The runtime for the vast majority of PHP deployments happens to be written in C.
The logic is written in PHP just as much as it is written in C. The logic is actually written in the language the creator used, which is the WordPress UI.
In the enterprise world, very much so. PHP is super easy to deploy, requires little maintenance, and you can easily hire good admins to keep things running smoothly.
On the other hand, not even using a service like Heroku can save us from constantly having to patch or update the Rails apps.
I wonder if Scott Hanselman has ever had to maintain a large system while having limited resources to do so. Churn is bad for most, apart from companies who must sell new things to keep going, like software companies. Moving fast and breaking things is great when you don't have to consider churn cost. Calling the anti-churn crowd "dark matter developers" seems at least a bit derogatory.
I've worked on a team where every dev said "we need to upgrade X, we're Y versions behind" and management always responded with "no budget, this is not a priority". This continued until our hand was forced by a company-wide initiative to modernize some aspects of our architecture, and all of a sudden our team was frowned upon for being so far behind. Then we had to spend many months on an "upgrade project". I'm sure this same story has played out many times over in many other companies. In my own experience, lack of tech currency and accumulation of tech debt is often a management/organization problem. Managers won't get rewarded for upgrading things until their boss makes it a priority and there is some highly visible "upgrade project", so why bother doing so now?
So, I think there are two antipatterns and a pattern here.
Antipattern one is the one where you just get on the treadmill and churn, baby, churn. Constantly replacing things, spending way too many resources on tech and not enough on meeting customer needs. This is sort of the "new sports car every two years" model, or as a certain person who hates this site enough to make linking to him pathological calls it, the Cascade Of Attention-Deficit Teenagers model. Nothing stays around long enough to enter long-term maintenance, because you keep chasing after the hot shiny thing instead of having the discipline to maintain a mature product even though maintenance isn't fun.
Antipattern two is where you don't have the resources to do maintenance or to chase after the shiny thing. This is basically treating the code as a monument, or as a museum exhibition -- every time a new feature is done, you hang it in the museum, and then you go work on the next feature. The problem with this approach is that Someday The Bill Comes Due. A dependency that you have hits EOL and nobody's made a new version in three years. You spend your time writing code to duplicate features that your language's standard library added four years ago but that you don't have access to because you haven't updated your stack in eight years. New features get harder and harder to add because it gets harder to reason about what any one piece of code in the program actually does, or everything is bottlenecked by the one guy who understands the codebase. Your bus factor is one if you're lucky and zero if the bus came.
The pattern is treating your code like a house. It needs upkeep. If you just leave a leaky bit of plumbing long enough you're going to have a flooded basement, so you bite the bullet and you fix the plumbing when something happens. But you don't trade in for a new house every two years either. You have a project list and you tackle it as best you can and sometimes you take a few days off work for a big project like painting or what have you. And you build equity that lasts.
Sure, there's an extreme on both ends. The problem you're describing is that the upgrades suggested were probably not frictionless, managers were burnt before by needless upgrades that took too much resources for little value. If tech companies took much more care wrt upgrades and not breaking stuff, then users of the tech would not see upgrades as needlessly risky.
An example of this is .NET Core vs .NET Framework. .NET Core broke many things and did not implement full compatibility from day one; it introduced .NET Standard, which as of the latest version is not implemented by .NET Framework; it introduced a new toolset (dotnet) alongside MSBuild (bear in mind that VS still uses MSBuild); and so on.
This is all a matter of incentives, both for tech companies and for their workers: they are often rewarded for short-term impact that is easy for managers to see, not for bugfixing, stability or long-term support. It was not always like this. In the 90s, MS would be picked because they took care not to break stuff. Migrating .NET 1.1 to 2.0 was super easy.
There is always going to be major flux as new architectures evolve until they are stable. If this had been hidden, there would have been little input from outside Microsoft.
While there have been legitimate problems with this process (for example, the explicit "go live" given by Microsoft before things were truly stable), there was a choice made to develop in public, and the net result (sorry) seems positive to me.
Depends on the success metric. If you look at the cost of the tech churn to the businesses that have to spend much more than before to achieve similar results, then it is not a positive result for anyone but tech providers and the (young) developers needed to redevelop. I wonder how much businesses will tolerate this if the promised efficiency gains do not materialize for most of them.
> I wonder how much will businesses tolerate this if promised efficiency gains do not materialize for most of them.
I think you've stumbled on the core issue here. For an unchanging company for which their digital business does not need to evolve very much, having a basic IT department may just work. But as we digitize more and make technology a core competency and critical differentiator, a legacy, inflexible tech stack becomes a burden. It doesn't let you add the features that your customers may want, features which might be critical to a continuing and successful business relationship. Your customers then have similar demands from _their_ customers... all the way back to the primary sector.
I know Scott meant nothing derogatory by it. "Dark matter developers" refers to the developers you don't see. If someone is offended by that, I think it's a sign that they desperately wish they could be at the leading edge, but aren't allowed to. There's a ton of developers in the industry who have a different set of values than that.
That's a nice way to describe something, but it doesn't have to be a problem. Some devs around me (partially me too) just don't want to tweet or write blogs about tech, yet we could be working on bleeding-edge tech or writing COBOL at the same time.
Some people maybe don't need help, guidance, or anything like that; maybe, just maybe, we like things as they are now.
Maybe we'll start doing all of social stuff some day.
While this is perfectly fine for you and for everyone who makes this understandable decision, what tends to happen is that people learning about the bleeding edge of technology tend to have a very different perception of what bleeding-edge tech is. Every new buzzy technology is so oversold that it makes developers overly cynical and disappointed when it doesn't deliver on the promises made.
If you, or others such as yourself, did blog about success stories, or real-world experiences in general, other engineers would perhaps have a more sober perspective on the bleeding edge and would make more informed decisions when deciding to switch stacks, instead of just "hey, it's cool and the latest fad".
Again, I don't really expect you to change your behavior at all, this is just a description of what happens in aggregate.
I believe that many programmers, especially junior ones, want to try out and work with the newest tech just because it looks great on their CV.
I've been in some projects where more experienced team members very clearly reminded everyone that technology X would not be useful here, but others on the team just resented them for disallowing the cool stuff.
So this could also be a conflict between the company's goals (stable product) and the individuals' goals (cool CV).
Well, I have the honor of hanging out drinking beers with people who deploy their parts of a larger web application on Kubernetes, independently of another team who appear to use Svelte, and another who uses ABAP, which is basically COBOL, and a third who actually supports a banking application written in COBOL.
There are stories there which could certainly help engineers around the globe, but those friends of mine just don't feel any urge to write about them.
The fact that you go mostly "unseen" online seems completely orthogonal to whether you love programming and keep up with the latest technologies, etc. I would consider myself "dark matter" in the respect that no one knows who I am and I'm not constantly blogging or tweeting about stuff. That doesn't mean I'm not interested in, passionate about and highly competent in technology. I find it kind of insulting actually that someone would assume that just because I'm not loud about it means anything else about how much I care about it or how good at it I am. I'm not totally surprised by this though; for most people, if you're not jumping up and down telling them about how awesome you are they have no way of knowing. I see this a lot in my current job; all the developers know which of their colleagues are kick-ass at their jobs, but you have to sell yourself and your accomplishments to everyone else.
TL;DR, I know this is an old article, but it seems there's still some perception that us dark matter folks are all just passionless drones in it for the paycheck, and while there's surely some percentage of us for which that's true, I think it's completely wrong-headed to assume that's how we all are.
This is a really good point too, especially when you consider the whole 99% rule or whatnot. Tech community sites like Hacker News, and web dev sites like Smashing Magazine or CSS-Tricks, have a large audience of lurkers who keep up to date with everything mentioned there yet don't spend their time posting about it on social media. Hell, I'd say I'm in the same boat there too. I read a lot of these kinds of sites and blog articles, but generally don't talk about 'em online much.
Of course that's not to say being in it for the paycheck is necessarily a bad thing. Not everyone needs to be 'passionate' about their job to do well in it, and if say, my doctor or surgeon is good at what they do, I don't really care if they spend all their free time talking medical issues on social media or posting on Hospital News or what not.
Yeah, I agree, although even a few years ago I probably wouldn't have, to be honest.
I think there's just a lot of black-and-white thinking going on around this stuff, i.e. you're either this or that (cutting-edge rock star or old crusty dinosaur...), and I just don't believe that's how it is. It's multidimensional; we're all somewhere on the various spectrums, and we should stop putting value judgements on which side of the spectrum is better or worse. We all do very different things with very different requirements, so stop prescribing what's best for everyone.
Chris Coyier's The Great Divide (https://css-tricks.com/the-great-divide/), although nominally about front end development, does a pretty good job explaining this problem, and I don't think this is unique to FE at all.
You will be surprised, but doctors also have their professional social networks, discussion boards, news sites like HN, various conferences, etc. Especially nowadays, when a huge chunk of medicine is a technology too.
I think that by virtue of being here to comment at all, you are not one of these dark matter devs.
The group of people he is describing I think largely don't bother to come to communities like this. They do their job, go home and only know about developments in tech if it is relevant to their job and most likely because it comes through in-house channels.
I know a lot of people like this. Many of them aren't even aware that automated testing is a thing. They hardly know anything about any programming languages or technologies except for those they work with.
I don’t think it’s about ability or interest. It’s about presence. There’s loads of developers out there with a range of skills. I work with devs who are great who you’ve never heard of and others who aren’t so great who you’ve never heard of.
I’d consider myself in the 99%. I aspire to market myself but I’ve got lots of interests and I’m quite a private person so I find marketing myself in conflict with that aspect of my personality. But get me one to one and I’ll happily talk about my latest interest (which by the way is Directed Graphs).
Same. I'm just horribly bad at self-marketing and generally have zero interest in getting attention. I used to think that as long as I was doing good work people would notice, and sometimes if you're lucky that can happen, but there are lots of situations where you're doing yourself a major disservice if you're not selling yourself and your accomplishments. You just shouldn't depend on any one else to advocate for you.
That doesn't mean it's easy though. I still hate it and have to make a conscious effort to market myself.
Maybe I'm just missing the point of his article then. The way I read this though is "hey there are lots of devs out there who never post on social media or write blog posts. That's because they don't care about technology or being on the cutting edge, they just do their jobs and go home." If we disagree on that point, fair enough, I might be projecting some of my own angst here.
Back in the usenet days it was said that 99% of users were readers and 1% were writers. The same is true for the web.
That is, I think "dark matter" developers read blogs but they just don't write them.
As for conferences, these are widely seen as a scam. I remember sitting through the platinum sponsor's presentation at the end of a conference, seated next to some tired defense contractors from the East Coast, and later drinking in the hotel bar with the sponsor's director of marketing, who felt he hadn't gotten what he paid for.
Scott talks about how devs don't do anything to be seen, yet if you look at his Twitter stream it's mostly just him tweeting his own posts and retweeting other "famous" developers. He, and the majority of other very visible people in the community, gatekeep the blogosphere by failing to help anyone else break in (not necessarily consciously, of course; they may just never see the less famous devs' posts).
Scott could be a much bigger part of the solution than he chooses to be.
I feel obliged to jump in and defend Scott here - I just don't think what you are saying is a fair summary of his behaviour.
For sure, I see him doing what you say, but I also see him regularly retweeting and responding to comments and questions from accounts with few followers, and particularly so from people from minorities within the developer community.
There are plenty of people I can think of who behave in exactly the way you suggest, mainly spinning the flywheel of boosting the online presence of already famous people and not engaging much with outsiders. Scott not only isn't one of them; he's perhaps one of the best examples I can think of of someone actively doing the opposite.
I was about to comment this same idea to defend Scott. To further support it, I recommend that anyone look at the guests he interviews on his podcast, https://www.hanselminutes.com/. He goes out of his way to find and interview guests who are under-represented. Scott Hanselman is someone who I believe definitely does NOT gatekeep, and tries as best he can to give people a chance to "break in".
onion2k, I ask you to look again at his Twitter stream and podcast, and reassess your evaluation of him.
It seems like you're missing the point here. The "dark matter" developers aren't being prevented from being internet famous by some mean old clique of tech bloggers that won't let anyone else in. They are the ones who don't care at all about being internet famous, don't write blogs or tweets at all, and don't know or care what other people are blogging and tweeting. They just write code that solves business problems and then go home and do non-tech stuff.
This is not a problem that needs to be solved, it's just something to be aware of.
Yeah, there's definitely an element of unintentional gatekeeping in many communities/fields, especially given how many 'popular' or 'famous' creators only seem to mention other popular or famous creators. This is often made worse by their tendency not to look for the original source of an article or news post, and to end up retweeting or sharing the aggregated version from some popular site or creator.
For dark matter developers, there is no "gatekeeper" since there's no gate they want to cross: those developers aren't posting, and they don't value the visibility that it sounds like you do.
You should listen to Scott's podcast: he has a wide array of guests, in many diverse topics (even topics that could be seen at odds with his employer, Microsoft). Most of them are people I've never heard of. If there are gatekeepers, Scott isn't one of them.
I don't think the point of his article is that this is a "problem" to be solved. And personally, I don't think these people owe you anything. Twitter etc. are all open platforms, there's nothing preventing your stuff from being seen by others.
I keep a client whose business has not much to do with my current line of work, just to keep in touch with these guys and this environment. I feel it is important to solve real problems, real users' problems, from time to time; I learn a lot from them. On top of that, I am a client of theirs and we live in the same city, so there is mutual interest in things going smoothly.
And yes, they learn a lot from me. Sprinkling in some new tools like git and Docker, encouraging people to migrate their systems off PHP4, etc. can't hurt.
"Dark Matter Developer", what label is that for a developer doing his thing without going to conferences, doing blogs, etc..? This should not be on HN imao.