One devious form of bullshit is always using the simplest of examples to prove out an architecture pattern and call it a day. I've lost count of how many times I've seen patterns with glaring open questions and pitfalls the author avoids. Then the cacophony of likes follows from folks who are likely very early in their career or hobby and don't know any better. This is very prevalent on iOS development blogs, as authors rush to write their blog posts after every WWDC.
> The fundamental deficiency with our society is that we value talking about X more than performing/implementing X.
I am so guilty of this. I spend so much time on here and other platforms just mindlessly consuming and participating in discussions about software engineering, new technology, best practices, etc. So much so that I barely program anymore (except for my job, where none of this information is important, for irrelevant and complex reasons).
I have ideas and projects I want to work on, but I think my brain has been rewired over my life to the point that I get the sense of reward of finishing something just by procrastinating/reading, not by working towards whatever goal.
It's like information sugar -- sugars are the easiest thing for the body to convert into energy. The body likes sugar because it can get "what it wants" while expending the least amount of effort. My brain has apparently implemented the same logic when it comes to personal projects.
I believe this is a very old problem, which resurfaces in different forms depending on the individual and the context in which they function. I'm weakly certain that C.S. Lewis even devoted one of his Screwtape Letters to exactly this problem, but couched in the context of taking action versus writing about taking action.
But since I can't find the money quote right now, and I'm not entirely sure that it was in The Screwtape Letters rather than some other Lewis book, here's one that addresses an oblique facet of the same paradox:
"the more often he feels without acting, the less he will be able ever to act, and, in the long run, the less he will be able to feel"
In practice, communication is essential. We expect closed feedback loops in our technological systems. Turns out that systems made of humans also need feedback loops.
So I don't know if communicators are treated as "more valuable". Maybe. But it might be fair to treat systems lacking closed feedback loops as missing value. In some cases, they may be borderline useless.
Point being, a great piece of tech needs [1] someone to demonstrate its value. Especially technologies that require many people to maintain them. That's how we close feedback loops in communities and organizations.
[1] OK. Technology-based performance art or side projects are possible. Just don't be shocked if nobody cares and they die off when their original creators trade them for other interests.
I’ve learned to wait a couple of years for dubdub hype to subside, and “lessons learned” to be shared.
I get tired of the "write an app in five minutes" demos. As someone who regularly works on shipping apps, I can report that there is a world of difference between academic demos and something that can be shipped.
At this point I'd rather see the "how to fix it when things go wrong" demo. Of course, very few tools can show that either, because it's messy or it's too hard to show all the pieces.
I feel like classic object-oriented tutorials talking about fruits and apples, with no connection to actual apps you'd build, are ones to add to this pile.
Coming up with good examples of inheritance is hard. Rather than teaching with bad examples, that difficulty should have been taken as a warning that inheritance should be much less commonly used.
Oh man, who can forget the interview questions back in the nineties. I go to an interview after spending all week writing trading systems in C++ and it’s always “let’s say you are building a car, how would you structure the classes?”
In the same vein, I'd love to see more articles about recursion that don't even mention, let alone demonstrate, solving factorials. There's gotta be something else.
Trees or graphs would be great! Or arbitrarily nested objects. Those things have probably represented 90% of the recursive functions I’ve actually written
Or parsing. One of the most intuitive and elegant real-world uses of recursion. I've been writing an XML parser and everything is recursive. Of course, because XML is fundamentally a recursive data structure, a tree.
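To make that concrete, here's a minimal sketch of the kind of recursion that actually comes up (Swift, with a toy Node type I made up for illustration): walking a tree whose shape the function simply mirrors.

```swift
// A toy XML-like tree: each node has a name and zero or more children.
struct Node {
    let name: String
    var children: [Node] = []
}

// Collect every node's name, depth-first: handle this node, then
// recurse into each child. The recursion mirrors the shape of the data.
func allNames(of node: Node) -> [String] {
    [node.name] + node.children.flatMap { allNames(of: $0) }
}

let doc = Node(name: "html", children: [
    Node(name: "head"),
    Node(name: "body", children: [Node(name: "p")]),
])
print(allNames(of: doc)) // ["html", "head", "body", "p"]
```

No factorials required: the base case is just a node with no children.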
Who in their right mind would sincerely reach for recursion to solve factorials or Fibonacci first?
Even the maths-minded people would write the sequence bottom-up and stop when it reached their desired T(n); when programming it, they'd reach for a loop-like construct they could stop when a condition was met.
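For what it's worth, a sketch of that bottom-up version (Swift; the function name is mine):

```swift
// Walk the Fibonacci sequence with a plain loop, stopping at n,
// exactly as you'd write it out by hand. No recursion involved.
func fib(_ n: Int) -> Int {
    var (a, b) = (0, 1)
    for _ in 0..<n {
        (a, b) = (b, a + b)
    }
    return a
}

print(fib(10)) // 55
```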
Coming up with protocol-oriented programming was probably one of the bigger disservices Apple has done its developers. There are still people who think everything needs a protocol, or it isn't soundly structured.
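A hedged illustration of the ceremony I mean (all names made up): a protocol with exactly one conformer, adding indirection without ever enabling a second implementation.

```swift
// "Everything needs a protocol": one protocol, one conformer, forever.
protocol UserFetching {
    func fetchUser(id: Int) -> String
}

struct UserFetcher: UserFetching {
    func fetchUser(id: Int) -> String { "user-\(id)" }
}

// Unless a test double or a second backend is actually planned,
// the plain struct alone would do the same job with less noise.
```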
I may be wrong on this, but weren't the fruit analogies originally meant to describe the difference between class-based and prototype-based languages, and then they were extended to actually teach the concepts in a way that didn't work?
I remember my professor using them to try and teach pre-standard C++, and when I pressed him to explain actual use cases, I figured out he didn't understand them himself.
I have the feeling it was meant as a bridge for Lisp programmers moving to class-based languages, but that may be because we learned Lisp first.
Anyway, that context is the only place I found value in it.
Actually, these are not bad examples per se if what you want is to understand object-oriented concepts; the problem is that for many scenarios OOP is just a bad model.
I've been learning SwiftUI for macOS to explore the plausibility of a particular idea I had, and I do think there's some truth in this. So far, I've found it best to seek out more robust, somewhat complete examples that use a particular API in a more realistic context, as well as many other discrete tidbits of information wherever they may be found.
In this case, that's the hackingwithswift blog posts, the hackingwithswift youtube tutorials that follow the blog posts, swiftbysundell, and a number of other independent developers that have fairly lengthy but digestible detailed tutorials on specific APIs; using all of it feels a little like the process of getting a fingerprint registered on a new phone, where you get decent coverage and breadth quickly but then fill in the gaps.
However, I don't think this is unique to WWDC or Swift or iOS; it's just that those platforms typically demand only the newest versions of iOS or macOS, or even unreleased hardware, to get much value out of them, so I can't imagine you'd be implementing them any time shortly thereafter.
> The academics in computer science have gotten into the "structured programming" rut over the past several years. They claim that programs are more easily understood if the programmer uses some special language constructs and techniques. They don't all agree on exactly which constructs, of course, and the example they use to show their particular point of view invariably fit on a single page of some obscure journal or another-- clearly not enough of an example to convince anyone.
Most scientific publications are BS published for the sake of pumping up your numbers.
Most student theses are BS written to pass the exam.
Most financial analyses, most TV news reporting, most tweets, most books are BS.
Ever read doctors' or lawyers' trade magazines? Most pages are BS on top of BS.
Paid content, content produced to fill space and make volume, low-quality authors trying to make it, agendas everywhere, and of course automated crap.
Why do you think HN is so successful?
The concept?
The tech?
Of course not.
It's the quality of the moderation, and the fact that they used their ability to publish their own BS to the front page in moderation: just enough to help them, but not so much as to destroy the average HN quality.
Dedication to quality over the long run, and restraint in milking the cow. That shows vision, and it's hard.
Many years ago a friend of mine proposed a theory of his (I use the word "theory" in the loosest possible sense here, more student dorm than lab) that sounded a lot like this. In his version, most of everything is shit. Most music is shit, most movies are shit, most pizzas are shit. Only a minority of any given thing is actually of high quality.
I haven't seen much over the years to convince me that he or you are wrong
Based on a couple of his best known stories that I read when I was pretty young, I revered him as an author.
But a few years ago, I found out that his short stories had been collected in a multi-volume series, and I started reading them.
I was pretty crestfallen; I don't use the term "cringey" much, but that's how I would describe a lot of them. I intended to read the whole series, but haven't and probably won't.
Not all of them were terrible, but the famous short stories he wrote were presumably famous because they stood apart.
Also he seems to have flirted with Dianetics in its early days, even basing some of his fiction on it.
So, I guess when you look deeper, he himself proved the law. On an individual basis, you have to produce a lot of crap to generate a masterwork.
Yes, it does kind of prove the point, but you can't really confirm it if you only perceive an author, a genre, a category, whatever, through some sort of filtering mechanism.
I think you can turn this saying around and consider that if 90% of something does not appear to be crap, and you want to understand its true nature, and maybe find some things others have missed, you need to go to the raw data source.
I didn't end up thinking his short stories were as good as I expected, but I did feel like I understood the person better. And even maybe human nature, or how authors progress, a little better.
HN has decent technical discussions and some really good historical/personal anecdotes. Otherwise, it falls short and lacks ideological diversity. And there is also bullshit here.
What would proper ideological diversity look like? Or, which underrepresented ideologies would make this a better place, in your opinion?
It doesn't seem like a monoculture to me: depending on the topic there are usually at least two or three significantly different viewpoints. I guess it wouldn't hurt to have more, but it doesn't jump out at me as a glaring deficiency.
HN is strongly inclined towards "unreflective instrumentalism". From "Three Kinds of Anti-Intellectualism: Rethinking Hofstadter":
> A third type of anti-intellectualism implied in Hofstadter's analysis may be termed unreflective instrumentalism, defined here as the devaluation of forms of thought that do not promise relatively immediate practical payoffs. In its narrow conception of practicality, instrumentalism suppresses questions about the ends toward which practical and efficient means are directed...

> ...The hostility of an instrumentalist business culture toward intellect expresses not only an impatience with ideas that are deemed impractical or utopian but also a disdain for purely theoretical inquiry as a valuable activity in its own right. Theoretical scientists, for example, are called upon to demonstrate practical applications of their work, and esoteric research is held up to public ridicule with the award of the Golden Fleece. Scientists are sometimes provoked to argue for the value of their research on the grounds that it may eventually yield useful outcomes that are not immediately apparent, producing in their own behalf testimonial examples from the history of science. That such defenses are necessary in the competitive struggle for research dollars is indicative of the power of instrumental criteria to define the direction in which scientific and other kinds of knowledge will be allowed to grow...

> ...Likewise, liberal education is increasingly defended on instrumental grounds. The liberal arts graduate is presumed to possess skills in analyzing, synthesizing, organizing and communicating complex ideas, skills that are (as we now say) marketable in the corporate environment of the post-industrial society. The phrase "ivory tower" has become an almost universal term of contempt. Beyond the ivory tower, it seems, lies a university-industrial complex in which intellect is seen principally as a useful problem-solving tool.
I think you're going a little bit far, but I tend to agree. My reasoning for this is to look back at the experts of the past.
Unless it's a hard scientific fact, a law of physics, most things are really just opinions that people tend to agree with. Remember leaded gasoline? How about DDT or asbestos? How about Freud? Or, of the things the Bible likes to talk about: slavery.
Even in the 25 years I've been in programming, it seems like every "Proper Way To Do It" has aged about as well as milk on a hot day.
> The only thing I follow is keeping stuff simple.
Some other things I’ve kept around:
- Complex systems must develop out of a simple system (there’s a clever quote I can’t find right now) - also known as “make it work, then make it better”
- Bugs shall become test cases
- Bug reproducibility should be maximized (crash tracking / good logs, sacrifice performance if necessary)
- Fail fast
- Composition over inheritance (arguably a subjective one, but I've never regretted doing it; a sketch below)
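A minimal sketch of that last point (Swift, with made-up types; an illustration, not the one true way):

```swift
// Composition: Logger holds a writer rather than inheriting from one.
struct ConsoleWriter {
    func write(_ line: String) { print(line) }
}

struct Logger {
    let writer: ConsoleWriter // composed in, not inherited from
    func log(_ message: String) { writer.write("[log] \(message)") }
}

Logger(writer: ConsoleWriter()).log("hello") // prints "[log] hello"
```

If a second writer ever shows up, that field can become a protocol then; until that day there's no subclass hierarchy to fight.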
I wouldn't say that any of those are bullshit per se.
Leaded gasoline reduced knocking/increased compression in contemporary engines, asbestos is a top-notch insulator, and DDT did kill a lot of bugs. Each just has really unfortunate side-effects, most of which were known at the time--if not appreciated widely enough.
This is grossly reductive. There is a huge spectrum of quality in content. Most of what you describe, formal studies by experts, is on a much, much higher tier than the 99% of content most people read.
Comparing informed formal studies to the uninformed opinion pieces most of the tech crowd reads and implying they are somehow the same is borderline supporting disinformation.
There is absolutely a big spectrum of quality in the higher tiers of content too. People in those domains spend countless evenings discussing it.
But the vast majority of people are not reading anything at that level. The typical tech blog is closer to talking about what kind of sandwich you like for lunch than anything that will advance the field.
I'm not sure what you mean by reductive in this case. 90% of kids' plays are crap and 90% of Broadway shows are crap, with no need to compare those two domains.
I think HN is a madhouse. Any article I click on, there are things I've never heard of that sound like made-up techno-jargon, like programming languages I've never seen used in a real job, let alone met anyone who uses them.
But I go to the comments and there's 150 people who all know every intimate detail of this technology already. Experts in the subject.
HN is read by experts in any number of fields, from biotech to medicine, physics, farming, and nearly anything else. If there are 150 people programming in Zig, most of them likely read HN. The more niche the tech, the higher the odds, I think, that its users are here. But to your point, I don't know why some of these things rise to the front page in the first place.
My current frustration is links to blog posts. I'm starting to ask myself "who is this person and why should I care about their opinion?" It's one thing to link research, or even a book, but a few of the blogs lately.... The topic sounds interesting, but the content sometimes makes me realize it's just some dude's blog and not professional advice. I'd kinda like some kind of (opinion) tag on those.
I think you touch on a good example of why discernment is such a valuable skill, a skill that should forever be practiced and honed. I don't mind reading some dude's take on things, for even if he's wrong he might drop an item or two that gets me thinking. For instance, I've been reading (and re-reading) Robert Henri's The Art Spirit, which, if you judge the book by its cover, is focused on painting, yet I find ideas and advice in it that I can apply all across my life to great benefit.
I've read a lot of blogs on here that may or may not be filled with good information, but within many of them I've found what I consider to be great ideas.
There is very little in the blogosphere of 2023 that isn’t just some rando’s hot takes or a regurgitated idea that consists of a few Wikipedia pages sent through a word spinner.
The truly interesting blogs are those written by known subject matter experts with success in their field and something interesting to say, such as original research or personal observations resulting from their expertise.
I'd say the content has become unreachable. I wanted to find some examples of the piano programs students performed to get into a music bachelor's/master's, and I couldn't find any.
So while I found a post here or there in a forum, generally speaking it was from kids projecting their next 3 years.
So we live in an era where everyone is sharing their life, experience, and knowledge, but nothing is reachable/discoverable, or it's so gatekept that it's only shared with followers and the platform.
It is also the nature of the medium (i.e. short, pithy comments) that makes one come across as an "Expert" without the writer having intended it.
The way I deal with it is to treat comments strictly as data/pointers/advice which I then look up in some detailed book on that domain. In other words, don't take anything at face value, but do your own study.
Finally, it also seems to be the case that HN has more exceptions to your statement than are found at other sites, i.e. we have more outliers showing up here.
I've noticed this pattern happening in tech blogging, dev-dot-to not being the least of it:
“I’ve been through a web dev bootcamp 6 months ago and I’m three months on my first tech job ever, and here is my enormous trove of wisdom I’m about to bestow upon you.” What follows is, at best, very superficial, and at worst, just plain wrong and bad stuff.
This is recent. Back when I was more active in hackintosh communities, one would very often encounter tutorials clearly made by people with no clue, who just cargo-culted their way through, passed their cargo-culted thing on, and had the chutzpah to call themselves "experts".
Oh, and there are mediocre journalists (or sometimes even vloggers with no training) who all of a sudden start thinking of themselves as experts on something and start touring radio shows, podcasts, and TV, giving opinions left and right; these are probably the worst.
> mediocre journalists (or sometimes even vloggers with no training) who all of a sudden start thinking of themselves as experts on something and start touring radio shows, podcasts, and TV, giving opinions left and right; these are probably the worst.
We vendors pay for those and have some random journalism grad or ChatGPT ghostwrite them.
B2B SaaS and Enterprise Sales absolutely depends on SEO from a brand and pipeline creation perspective due to legacy marketing leadership and perverse incentives.
This is starting to change (a la Product-Led Growth, Persona Segmentation, Direct Sales, etc.), but to paraphrase Max Planck, GTM strategy only advances one retirement at a time.
" On est toujours le con de quelqu'un, et tant pis pour lui". -> "We are always someone's jerk, and so much the worse for him".
Agree with the article; I just wish it didn't conflate "bullshit / actually harmful content" as much with "stuff that is valid, but only in a given context".
200% agree that nothing online (nor IRL, to be fair) should be taken as universal truth. I see the same at conferences, with certain topics/fads/trends being absolutely overrepresented compared to what's actually happening on the ground.
I interpreted "bullshit" to mean fluff. Much of the content I encounter appears to be regurgitated trivial examples from tutorials. Entire websites are dedicated to this. The goal generally appears to be attaining some type of influencer status or getting money from ads. Few people offer helpful content that provides a deeper understanding or solves a tricky problem. Probably because they can't.
You get a lot more views for a video on how to index a column in MySQL than for how to approach evaluating your needs and choose a suitable database. The latter would be outdated in a few years as new technologies emerge, while the former can be padded with enough fluff to show two adverts. So we end up with the content you know today, rather than what you'd really like to see. It's good to remember this and interact with those who do publish the latter content, plus it makes the algorithm feed you more content that's worth your time.
That is the sort of thing I'd think LLMs (with suitable training data, that's not overrun with promotional material) would be good at. I've had ChatGPT recommend technologies to me before that I hadn't thought of or didn't know (much) about.
"helpful content that provides a deeper understanding or solves a tricky problem" is a lot if work to create, even if you know the solution or can solve the problem easily.
I can empathize with people who dont spend their limited free time on explaining others how to solve problems out of the goodwill of their heart and I am eternally grateful for the tiny minority that does this.
I am also grateful to the people that do actually share exceptional content. The problem is that search engines don't know the difference and the trivial seems to drown out the useful more and more over time.
Not every decision has to be made with maximum research. That's usually a waste of time. It's more efficient to try a couple of things until something works and move on ("satisficing"). A key skill you learn over time is knowing which decisions actually matter. But it's not inherently bad that there's no good reason for something; it really depends on the context. And sometimes things that had good reasons turn out bad anyway, because the assumptions proved wrong or circumstances changed.
> Not every decision has to be made with maximum research.
I’ve been at a couple companies that wouldn’t take any proposal seriously unless you showed up with a list of citations to blog posts, books, or even podcasts.
The root cause was a management structure that wanted to do everything with a maximum of evidence.
It opened the door to a lot of terrible decisions winning for no reason other than someone found a blog post that Google does it this way, or Uber wrote a blog post about this, or Martin Fowler wrote a post about that.
The most egregious abuse was when a team that had to deal with maybe 100 logins per day spent over 6 months researching how to build their auth system to match Big Tech. They could have picked any off the shelf solution and been done in a week, but instead it became an endless boondoggle of research, presentations, proposals, and committees. Several people were even planning conference talks around it, so it started to evolve into whatever would sound best for their talks.
That was my cue that I was at the wrong type of company.
Or, in general, wanting to do "what the industry does" rather than the solution invented by the people they pay, who actually work with the product and its customers.
> The most egregious abuse was when a team that had to deal with maybe 100 logins per day spent over 6 months researching how to build their auth system to match Big Tech. They could have picked any off the shelf solution and been done in a week, but instead it became an endless boondoggle of research, presentations, proposals, and committees. Several people were even planning conference talks around it, so it started to evolve into whatever would sound best for their talks.
To be entirely fair, after dealing with the reverse way of solving it ("just the simplest solution that works", which was just a bunch of static passwords per app), I'd say spending a bit extra to start with a good auth solution in your 50-person company will save a massive amount of pain when the company grows in internal service count, users, and compliance requirements.
I've been on the flip side, where a few Google searches would have contradicted much of the fun complexity proposed by the developers of the system. Or the PMs "talked to the users", didn't write anything down, and then demanded features that weren't used at all.
Just enough research should be the theme for any decisions of great consequence.
A good life lesson in general. My version of it: "most of the most important decisions you make in life will be made on the basis of insufficient information."
As programmers we may be used to the notion that we can optimize if we know enough. We often can't. Decisions won't get better with more information because it's still insufficient. All you can do is use your best judgment -- of when to use your best judgment. And live with the consequences.
That doesn't help you make the decision. But it can help you avoid spending too much time kicking yourself over it.
So I come from a background of Civil Engineering. We basically don't do ANYthing arbitrarily. There's a reason for basically every aspect of a design. Sometimes there are competing criteria that leave it to humans to decide, but honestly it's rare. Usually the constraints drive the design from concept down to minute details.
Within the constraints and criteria, there's codes and manuals that cover almost everything.
It's always seemed to me that software has really suffered from a lack of this approach.
Is it the best way to maximize innovation? No, but in some ways, you might be surprised how much placing heavy constraints on a project will drive innovation.
I was in a meeting one day, explaining some tech decisions I had made. At one point, my manager interrupted me, looking pale and completely out of place. He asked, "I don't get this approach. Can you share with me the tech blog where you read about this?"
(...)
I couldn't help but think: would a random blog post from the internet have more respect and authority than my own decisions and explanations?
Being as generous as possible and without other context, I would interpret that as a request for some background info you trust that they could use to catch up on those concepts in their own time without derailing the meeting by getting you to explain it. Especially if most others present probably already understand it.
Or at least that is how I would've tried to convey it if I was your manager.
It would to me. When you are doing something no one has written about, it makes me think "We must be doing something impossible or we haven't identified the true problem to solve. It's unlikely we have a problem no one else has ever had." I've spent so much of my career gluing frameworks together that writing my own code feels like a code smell. Whenever my coworkers have written something new I spend 80% of my time fighting it and wishing it was a pattern or library.
Most of the time, you should not be inventing your own equivalents of off-the-shelf algorithms. (Let alone something like crypto.)
How much you evaluate the borrowed alternatives (or whether you even evaluate alternatives) entirely depends on how much that choice matters in the situation.
Situations in which the choice doesn't matter are very common, and in those situations evaluating alternatives is wasted effort.
The scientist isn't concerned with "does it matter" but with "does it make an observable difference". But the latter concern is often a luxury or idle preoccupation for the engineer.
Writing code is mostly a kind of engineering; more rarely is it science.
Engineers usually select existing materials, and existing patterns for combining them, to put together a solution.
Engineering isn't entirely creative in every detail, like painting or music. Even painters and musicians reuse from others.
Counterpoint: writing things from scratch once in a while sharpens your skills, helps you make better decisions about tradeoffs in the future, and lets you evaluate different implementations without having to resort to a search on Hacker News, Stack Overflow, or GitHub to find out what the trendiest option is.
Take exercise for example: it’s all unnecessary work, but it makes you stronger so that you maintain your fitness longer and so when you need to be strong for something, you’re ready
Agreed. I also like calculus as an example. First you've got to write out derivatives by hand; then you learn you can just "bring the exponent down in front, then subtract one from it" (d/dx xⁿ = n·xⁿ⁻¹).
Doing it the long way, without the shortcuts, helps you understand it inside and out, I'd think, for when you do use existing code.
I find it's sometimes fun to write out a program to do something without any (or at least very minimal) imports, just to see how I'd solve it (for a reasonably real use case). It helps to better understand the concepts/tricks the "real" implementation might use.
Some fun examples: building an animated UI spinner "by hand", or setting up a pub-sub implementation for passing data around my app.
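For the pub-sub one, a bare-bones version of the kind of thing I mean (Swift, no Combine or NotificationCenter; names are made up):

```swift
// A minimal publish/subscribe hub: subscribers register closures,
// and publish fans each event out to all of them.
final class EventBus<Event> {
    private var subscribers: [(Event) -> Void] = []

    func subscribe(_ handler: @escaping (Event) -> Void) {
        subscribers.append(handler)
    }

    func publish(_ event: Event) {
        for handler in subscribers { handler(event) }
    }
}

let bus = EventBus<String>()
bus.subscribe { print("got: \($0)") }
bus.publish("hello") // prints "got: hello"
```

Writing even this much by hand surfaces the real design questions fast: unsubscribing, weak references, thread safety -- exactly the tricks the "real" implementations have to handle.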
If you're not making something from scratch in every development assignment, probably get another job.
The article isn't simply disparaging using existing code, but using an existing approach.
Using an existing approach might mean, say, reading a paper about it and writing your own implementation from scratch, using only the paper's description and pseudo-code as an example.
That goes a long way to keeping you on your toes as a programmer, and can exercise some computer science muscle as well.
In some respects, being a software engineer today is really hard. You want to do the right thing, so you don't want to invent a solution yourself. So you look for available resources; you go online. There you're going to encounter either:
1. Spam from junior devs who started a blog to build a name for themselves.
2. Spam from SEO mills peddling junior-level wisdom so they can get ad money.
3. Spam from software vendors that try to convince you that X is the best solution for everything. (Looking at you, Confluent.)
4. SEO spam by software consultancies mainly interested in hitting the relevant buzzwords so the suits will find their site.
5. Spam from FAANG employees showcasing their Google-scale solutions, made to attract talent.
What is the engineer to do? Either cargo-cult the FAANG engineers (you don't have their scale, but at least these are actual solutions), or invent it yourself, which may be scary after you've read all this conflicting advice.
We really are drowning in spam. Thank heavens we employed humanity's best to create LLMs so we can now automate this. /s
But seriously, seems like most useful information is siloed in the brains of battle worn engineers who are surrounded by an inflation of junior devs. I have a feeling this field grew too fast and we are losing a lot of wisdom and knowledge.
> Spam from junior devs who started a blog to build a name for themselves.
I often find this is the most useful category of the 5 you listed. Junior devs writing something like "How to set up webpack" often explain things in plain words and just spell out how to get the thing working. There's humility to the post, like "I don't know what all these options do, but this is what worked for me".
I find that a lot more honest/useful than say category 5 there where I have no idea if what they're talking about will actually work for me or how much effort it's going to take to integrate because the assumption there is you have big teams who can spend all their time maintaining these tools.
I am completely baffled by this comment. You're saying the _juniors_ are creating overly complicated modern frameworks, or are the ones pushing them?? It's clearly the opposite, in my opinion. Big companies/teams are releasing tools that may work well at their scale but are overkill for most, and they're championing them as good solutions for everybody. Juniors are the force working in the opposite direction, trying to simplify/untangle all this complexity; they don't have the experience to recognize when it's overkill, and they also want to get hired at all the companies that use these tools, so they have to learn them.
While I do think that most of the useful pages either come from juniors or from someone promoting their own product, I think there are a ton of bad pages from juniors that have outright incorrect or dangerous information, too. So it can be hard to tell what advice to use and what not to if you're at a level that you're asking the question in the first place.
> you want to do the right thing, so you don't want to invent a solution yourself
How is that the right thing?
Over 90% of the time, the fastest, most maintainable, and simplest solution is to do it yourself.
Yeah sure, there's a few horror stories out there of homegrown database engines that reinvent every wheel of RDBs the wrong way, have been unmaintained for 10 years and the original dev has left.
But those are the exception. The things you shouldn't invent yourself, you can count on one hand. And they're obvious.
The spam is only the first hurdle. You have to find a solution that doesn't suck. Evaluating whether it sucks would take at least 3x as long as writing it yourself, so you can only give it a cursory glance.
There's a chance it sucks in non-obvious ways that you're gonna find out over time, most likely in production, through a slow trickle of WTFs over the months as you work around increasingly weird bugs. There's of course a chance it's good. But you're gonna have to take the risk.
Then comes the maintenance phase. Of course it's gonna redesign its interface every month. Until it suddenly gets abandoned and starts to bitrot. Eventually it stops working when you upgrade something unrelated.
Then you're not only back to square one but have to find a replacement fast because now you're relying on it.
The stuff you wrote yourself 5 years ago, on the other hand, still works exactly like it did when you wrote it with minimal changes.
Yeah. Fundamental libraries at the right level of abstraction, most of which you're probably already familiar with. Those are the obvious things you shouldn't write yourself.
100% agree. All the best devs I know literally do not give a flying fuck about blogs at all (with a few exceptions). They read white papers and textbooks from 40 years ago (this is a bit of an exaggeration).
In my team (big tech company), if there's some new problem to solve, you write a design doc and validate it with your colleagues. If nobody has a better idea than you, then you implement what you had in mind. We're not incentivized to produce perfect designs, but hopefully something that works and is simple enough that you can implement it within a few months. It's pretty rare that we have to implement entire complex systems from scratch. We also have a lot of internal tools and specific problem domains, so I rarely feel the need to look for an existing solution.
> But seriously, seems like most useful information is siloed in the brains of battle worn engineers who are surrounded by an inflation of junior devs. I have a feeling this field grew too fast and we are losing a lot of wisdom and knowledge.
I think knowledge siloed in the brains of people working somewhere successfully is underrated. These people just don’t have time to write a blog or don’t need to due to their success. Maybe also because the topics that make them successful aren’t that consumable compared to $FAVORITE_NUMBER_STEPS to use this new and shiny framework.
I hate almost all software. I hate the IoT. I have a loaded gun by the printer.
I want to live in the woods. I only like software because it's kind of like doing math and there's interesting logic. Every extra feature is like killing a tree.
Now let me tell you how to make software, which I hate, and assume you also hate.
Like, are there any tech blogs by people who actually like the modern tech ecosystem?
Cynicism is cheap and boring. It's also devoid of insight or content. Anybody can say "everything sucks" and we've all heard it before.
There’s a happy medium between toxic positivity/ignorance and dismissive cynicism.
>modern tech ecosystem
The tech ecosystem is extremely vast. I hate npm and server side js, I think the entire paradigm is pants-on-head, so I ignore that kind of content. I like the “blogs” from people like Aphyr, Google project zero and also random hackers, technical SaaS companies like cloudflare and the myriad databases like cockroach.
Start by ignoring Medium and Substack for technical blogs; they're filled with novices using them as resume builders to get their first software job by regurgitating other tutorials they've read.
Industrial Society and Its Future, and the various influences for it? The Golden Spruce?
I mostly only pay attention to anything tech-negative at all if it's got some kind of historical or human interest.
I've read a lot of the Suckless stuff though, and the Worse is Better stuff, because I always see it on places like this and thought I should probably at least know what it is.
But it always seems like people from less technical fields have clearer writing on why they like these things. Tech people are always fighting an uphill battle to make tech simple and logical, meanwhile encapsulated complexity is a lot of why most people like it.
Whereas Swiss watchmakers, mathematicians, and DIY woodworkers are working in a medium that naturally supports simplicity, and talking to an audience that appreciates it more, so they don't have as much conflict obscuring things.
I see that differently. Mathematicians often come into a field that's complex and filled with fudge factors and layers of engineering, and make it clear and simple. See the history of electrical engineering in the 20th century.
Mathematicians come in and see the possibilities for deep innovation and making stuff work that didn't before.
Engineers take their work and figure out how to make it last 100k hours and be made of cheap pot metal and plastic and be a drop-in replacement for the previous thing, and to be safe if someone misuses it, how to make it with existing production lines, what extra features could be added cheaply, etc.
Modern tech couldn't exist without both (even if they can sometimes be the same person), but usually neither side is all that excited about the other's work; if everyone is getting along, they seem to just recognize the value in it but be glad they get to stick to their side.
When they don't get along the mathematicians are like "Wow, why are you adding this useless stuff" because they don't realize the extra feature only costs 2 cents a unit and doesn't add much weight, and the engineers are like "Why do we even need those number poets" forgetting that their whole job is using their stuff.
Compared to most other people and industries techies are actually relatively good on this front, however I still agree with the article that there is a lot of cargo culting.
The author mentions a few reasons why this occurs, but I think it's primarily due to time constraints. Not even "I have deadlines", but "I cannot physically audit everything".
You simply don't have enough time to independently verify every decision that has been made in the codebase, and even if you did, by the time you were finished there would be new edits and additions.
Chesterton's fence also applies here. A decision made on a whim might take you hours to investigate, gather context and decide whether it was appropriate or not.
---
Something the author doesn't mention which I think matters a lot here is the level of impact the decision has.
Deciding what to name a variable? Unimportant and reversible. Low impact on other people.
Deciding which framework to use? Important and irreversible. High impact on other people.
Having the correct context and a good mental model for making decisions can help you a lot here. Outsourcing your thinking is useful and correct for a lot of things, but sometimes it can be terrible.
Figure out which problems you cannot safely outsource and spend your brain power on them. They tend to be things which are core to your work, so investigating them almost always gives you valuable knowledge and context that you can share with the rest of your team!
I've seen people eagerly suggest the use of GraphQL without being able to explain the idea behind REST or SOAP, what can be done with them, their shortcomings... People in the field, especially the fresh-out-of-college, get informed about technology by promoted content on social media, especially any technology out of MAANG, and automatically assume it's an industry standard.
Stay in the field long enough and the number of anecdotes will grow so much that you will notice the pattern, unless your pet rock’s intelligence can surpass your own.
Also, wasn't there a link [0] just a couple of days ago about how most of the "serious science research" done on software development estimation is based on datasets that are either smaller than the number of anecdotes a typical software developer will easily encounter in five years, or are outright made up? Just what kind of data would you expect, and from whom?
Most people have no desire to actually use their brain and think independently. For one thing, other people who've blindly accepted a point of view from some authority figure might be nasty to them. Also, they can't possibly imagine that somebody whom they really like/respect or who helped them get some code working might be wrong sometimes.
There are very good evolutionary reasons why most humans are wired to conform rather than to think. If you thought too independently in a pre-modern society, you were liable to end up burned at the stake or otherwise executed in an especially brutal and inhumane manner for "heresy" or "witchcraft". While this is adaptive for living in a pre-modern society, this is maladaptive for living in a liberal democracy where being able to think independently and to respectfully disagree with others are among the most critical life skills. This tendency towards mindless conformity is also adaptive for an industrial assembly line but maladaptive for a modern knowledge economy, where productivity depends upon creativity and critical thinking, not on following orders and repetitively doing the same thing.
It's funny to me that you write this down without being reflective about your own claims.
I don’t buy the quasi-scientific part about evolutionary reasons.
And for that matter, I don't even buy your first claim, that people don't want to think independently.
But it doesn’t matter in the end, the whole point of the article - to me at least - is that it’s a good thing to always take the time and at least understand the reasons for choices, and challenge them if it makes sense to do so.
If I allow myself a wild stab at things, I think it’s more that people are “trusting” by nature and assume that things are well thought-out, which the article points out.
> While this is adaptive for living in a pre-modern society, this is maladaptive for living in a liberal democracy where being able to think independently and to respectfully disagree with others are among the most critical life skills.
That is utterly false when it comes to science and engineering, where you're standing on the shoulders of the proverbial giants in order to do anything, and reinventing wheels is mainly the path to low productivity.
This is nonsense. Leveraging the collective wisdom of the society you’re in is probably the most adaptive thing a human can do in any given situation.
Doing the same thing you observe others doing is going to be the safest choice most likely to preserve your life and allow you to reproduce, and it’s not a close contest at all.
Each architectural pattern produces its own fruit, which can be good or bad.
I think the error that technical people make is assuming that what they've settled on as the truth IS the ONLY truth, and that they are right and everybody else is wrong. I think the author makes this error too ("Everybody else's technical content is WRONG").
I think this article glosses over quite a number of different things so the conclusion doesn't always follow.
It says that people who copy stuff without thinking are consumers instead of creators. The implication is that this is wrong, when many of us spend most of our time aggregating, not creating. Look at a load of game shows: how many are creative? Not many; most are simply aggregations, but done in a way that makes the sum much better than its parts.
However, if you then ignore that and look at what they say about this copying: again, the tone is that it is wrong to copy without "a bit of skepticism", but you know what? Sometimes I don't care. I need to know how to extract something from a string; I find an example; it works; analyzing it might well be a waste of time. Might there be a faster/more efficient way? Probably, but that would take even more time to find, to save some CPU that is not worth the investigation.
There is also an implication that most of this copying is an appeal to authority, e.g. I will copy this because person X is experienced or highly regarded, but again, I don't believe that at all. I can't remember the last time in my nearly 30 years in the game that I sought out a specific person because I believed in their authority; most of the time I suspect we try to find the easiest article that meets our needs and perhaps seems reasonably up to date.
I don't care that nobody is perfect because anyone worth their chops should know that the code you copy from somewhere else often lacks context and information about trade-offs but we are trusted to make the call as to whether the risk is high enough that we need to go back to first principles.
In most cases, it is more annoying for me for someone to waste time reinventing something than it is for someone to get something a bit less than perfect and maybe having to revisit it later.
The biggest issue I run into is developers wanting to put some one-liner hello-world example into production. There is no monitoring or logging; nothing updates the operating system or the application environment it runs in; the hello world invariably runs as root and fails as any non-privileged user; etc. Any sort of database access reads the entire database on every access and loops through all the rows performing its own query logic -- until the database gets too big to fit into RAM. It's a huge surprise when data gets discarded from memcache or redis. Over and over again I see these same issues.
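To sketch the database part (Swift, with hypothetical fetchAllUsers/fetchUsers functions standing in for a real driver; this is an illustration, not any specific library's API):

```swift
struct User { let email: String }

// Hypothetical stand-ins for a real database driver.
func fetchAllUsers() -> [User] { [] } // i.e. SELECT * FROM users
func fetchUsers(where clause: String, _ args: String...) -> [User] { [] }

// The anti-pattern: read the entire table into memory, then run your
// own query logic in a loop. Fine in the demo, dead once the table
// outgrows RAM.
func findUser(byEmail email: String) -> User? {
    fetchAllUsers().first { $0.email == email }
}

// What the hello-world examples skip: push the predicate to the
// database and let an index do the work.
func findUserIndexed(byEmail email: String) -> User? {
    fetchUsers(where: "email = ?", email).first
}
```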
As I've been looking into freelance teaching and researching what kinds of services people provide, it's become astounding to me how poor most freelance teaching is.
There's a bunch of people with the skills of a first year CS student offering "learn to code" education out there.
But it sort of makes sense, those are exactly the kind of people who would want to make a quick buck on the side, not seasoned professionals who are making 6 figs already.
It's just unfortunate that people get sold this idea of "becoming a coder in a few weeks" and go nowhere because of it.
This is so on-point. Nearly every single article I have read on HN that covers my own area of expertise has been bullshit. I come here mostly to call BS before people blindly believe it.
From one throwaway to another, this is a legitimately useful service and I hope you and others keep doing it. At this point I flip through the HN comments before the articles. The primary source links alone are a treasure.
Probably 20% of fake jobs generate fake content. Startups have long since become about the fashion of technology rather than the tech itself, as it once was. Even PG doesn't post on this forum anymore because of the unsolvable problem of wrong/fake commentators. There's more and more politics and fashion in IT, and fewer technologists. There are only 28 million programmers; the other billions of people don't even know how it works, but they have learned how to copy and sell. The market wins!
I'd say this is partially because creating something genuinely original and free-standing is a rare occurrence. You're usually coding against some API or framework or whatever... and that interaction leads to common patterns, so copy-pasting makes sense.
Perhaps that's the appeal people see in writing their own programming language? It's somewhat free-standing.
A quasi-bell-curve, is it not? One could paraphrase it as "on average, content is average". Not trying to be facetious. But sifting out large quantities of general dross has been a common requirement 'in tech' since before the days of perusing CD-ROMs full of shareware and dial-up bulletin boards.
Like everything else, it's a balance. Nobody can build a modern system in finite time that will be maintainable, secure, fast, etc. without relying on their own and others' experience ("consuming").
Most tech tutorials are terrible. It's some junior engineer who has no idea what they're doing and barely got something to function. It's the blind teaching the blind.
Try to find a sample SQL database, for any vendor, that does not employ artificial table primary keys. This was a requirement in the 1980s so that your queries would execute before the heat death of the universe, but that has not been the case for decades. Microsoft is particularly guilty of this. Artificial keys are an anti-pattern. (And, btw, there's a dearth of literature on why artificial keys are an anti-pattern.)
Here's the only sample database I know of that consistently uses natural keys across all tables, created by a SQL educator who knows his stuff.
Never specifically heard about this terminology until now, but after looking, yeah, I can see why ppl prefer 'artificial' keys to natural ones.
From the animal shelter example, for people it uses email as the 'natural' key, which immediately runs up against 1,2,3 and 7 in https://beesbuzz.biz/code/439-Falsehoods-programmers-believe...
"Natural Key" actually refers to column(s) already available in the dataset you are turning into a row that is unique by row, and thus a candidate for being the key. (Subject to all the normalization, etc.).
An artificial key is a column of made-up data added to the rest of the column. It has no use other than to provide uniqueness. In most cases there is already a unique candidate key in your data.
> Artificial keys are an anti-pattern. (And, btw, there's a dearth of literature on why artificial keys are an anti-pattern.)
Why, in specific, are artificial keys an anti-pattern?
Seems like they preemptively solve problems. For example, using an SSN as a "natural" primary key is both a potential security nightmare and a data normalization nightmare if someone shows up with two or more SSNs, which does happen. (Yes. Yes. My great uncle had at least two SSNs. You won't convince me it can't happen.) Similarly for other "natural" data which isn't bound to obey the properties DBAs need for primary keys.
I literally just read a very convincing post, with good examples, on why relying solely on natural keys is rarely a good idea. Nothing in that repo convinced me otherwise (and alarm bells certainly went off when I saw a table called Species where the primary key was a varchar field "Species" with values like "cat" and "raccoon", neither of which are species at all).