Phrases in computing that might need retiring (sicpers.info)
183 points by mpweiher on July 29, 2022 | 285 comments



Strong disagree on "technical debt"

Rather, the entire world is built on it and there's some level of acceptance required, but we definitely need a shorthand phrase for "we're doing something stupid that will save us time now and make us hate ourselves a year from now"

Edit: thinking on it, also "devops". It's really a job title for "sysadmin but paid and owned by the software department to keep IT's fingers out of everyone's shit" but with a flashy name that sounds important so mgmt will agree to it. It's meaningless to us but it's a hyper-important industry term that saves us all a lot of suffering. It also must necessarily remain vague such that mgmt won't understand it and know to deny the posting. It's sorta like how Snapchat used an obfuscated UI to keep parents out.


> also "devops". It's really job title for "sysadmin but paid and owned by the software department to keep ITs fingers out of everyone's shit" but with a flashy name that sounds important so mgmt will agree to it

If that's what "devops" is now, sure, kill the term. Devop is not a job title, and I will die on that hill. The concept of DevOps is close collaboration between development and infrastructure teams throughout the development process, rather than throwing things over the wall and saying "that's infrastructure's problem now". Having a completely separate team entirely misses the point because it should be one team.


If that's what "devops" was before, we needed to kill the term earlier than you might think. Devops was originally a set of software release practises and culture that result in higher quality software, including high release cadence, blame-free postmortems, small patches, time for experimentation, automated testing, continuous integration, etc.

These practices normally require close collaboration between dev and infra, but are separate from it, because you can have close collaboration on big, quarterly releases based on manually verified long-lived feature branches -- yet that is not devops.


I get where the author is coming from. The meaning of "DevOps" is in the name - it stems from a culture of encouraging _devs_ to do _ops_, and rejecting the antiquated separation between developers who write the code and the sys-admins who deploy the code.

The term "DevOps" doesn't need retiring, but "DevOps Engineer" does. DevOps is a culture and a practice that allows developers to iterate quickly by taking ownership of the operations of the software they build. Every engineer is a DevOps engineer. But such a widespread diffusion of responsibilities doesn't mean the specialists suddenly disappear. It's not like every dev knows how to configure a new multi-region cluster with automated failover. Instead, they know how to use whatever platform they've been empowered with to help them manage such operations.

For this reason, I think "Platform Engineer" is a better term than "DevOps Engineer" for describing the differentiated specialization of this role. Every developer in an organization engages in "DevOps" on a regular basis, whenever they debug a CI pipeline or deploy a new service. But someone needs to build, set up, and maintain all the tooling that enables those _devs_ to do _ops_. In my mind, that's the specialized role of a Platform Engineer.


But DevOps does not mean "devs" doing "ops"! I feel like I'm repeating myself now, but devops is a culture promoting lean-inspired practices to reduce batch size and increase cadence. Go back to the original clique of people who started talking about devops and that was what they were discussing.

I can absolutely see how the name "devops" is confusing and easily leads to the sort of misunderstanding your comment propagates. I don't know why they picked that name. Maybe it was about applying concepts from operations research and business operations to development -- who knows.

Either way, the way I read your comment, it is more proof that the name is misleading and should be retired.


The title/subtitle of the talk mentions both of these definitions: https://youtu.be/LdOe18KhtT4


This is the problem I suppose, that the definition is fuzzy. I'd say it's to have teams where "you build it; you run it" and so the dev and ops are done by one team. That means teams will produce things that are easy to operate, as they'll be doing it.


Or else "let's have the programmers do production support too, they're paid to much to get a full night's sleep".


Related: a team claims to follow TDD but also has a test engineer whose job it is to write automated tests.


DevOps was what you meant, and still is in some places. But in reality it is either a fancy term for sysadmin, confined to cloud providers, or "I don't know what I need, but they can probably do everything". You might be delighted by the term DevSecOps as well, which is DevOps with the confusion factor on steroids.


> If that's what "devops" is now, sure, kill the term. Devop is not a job title, and I will die on that hill.

I'm afraid you are already dead then...


Right, and "debt" is a beautiful word here. It accurately describes receiving the value today, but then owing that time back, with interest, in the future.


Tech debt is one of the few metaphors I love because it’s so accurate. What a lot of engineers struggle with is that a manageable level of debt is fine for a business, and in fact superior to being cash-constrained.

Some forms of debt are a headache if the business goes south but tech debt can be ignored if the software project goes nowhere. So the “terms” are good.

Rather than complaining about tech debt, one should communicate the advantages of paying it down or the risks of taking on too much. But the metaphor works and simply referring to “debt” will not dissuade people because a bit of debt is fine.


I think those who object to the metaphor do so because they interpret "technical debt" as "technical consumer debt". Consumer debt being a relatively benign form of debt, they don't believe it conveys the gravity of the matter at hand.

Consumer debt is mostly unsecured, almost perfectly fungible across term structures, and refinanceable on a whim within a very competitive market of lenders. It's rare for it to be essential for a consumer to take on debt to continue to live[1] and there are typically safeguards in place to ensure consumers don't take on too much debt burden. The only debt that most consumers will come across that imposes any restrictions, or moves slowly, is a mortgage on a residential property. When we talk of "technical debt" we are talking of something that has none of the nice properties of consumer debt. If I had sufficient cash I could pay off all my consumer debt within two minutes with a few taps of my phone screen.

On the other hand, I interpret "technical debt" to be akin to a complex structured financial product. Maybe there are convertibility clauses between debt and equity, perhaps there is variability in the rates payable, or optionality that can be triggered by various real-world events. This form of debt is not liquid, it is not refinanceable, and it can be hard to find someone to take it off your hands for any amount of money! Nonetheless, sometimes it is essential for a business to take on this sort of debt to continue to thrive. That is an apt meaning for "technical debt" in my opinion.

[1] rarity depends on how socialised one's healthcare system is!


I agree, and I love the metaphor too, but you have to admit that something about the metaphor isn’t reaching people’s brains the way a good metaphor does.

A good litmus test here is to ask someone, “what if I told you that technical debt was originally a good thing?”... Like “Yes! Let’s go and get some technical debt, it will be great!” And so, can you understand why it might have started out that way?

People who really get the metaphor can understand why it was originally a good thing to be desired. Because debt is a useful tool for the same reason, and if you can picture what “being unable to take on technical debt” looks like, you can understand that it was a reaction to waterfall-style approaches.

But a solid 80-90% of people in tech don't understand how it could possibly be positive... to them, “tech debt” is just shorthand for the stuff that is causing development to go slower than they would have liked.


The big difference between technical debt and actual debt in the money sense is that real debt is taken on consciously, with an agreement between debtor and lender. Because of this, money debt can be worked into the planning and the risk can be calculated at the time the loan is initiated. Circumstances can change over time, but there is a bargain between the debtor and the lender. There are things like interest rates and collateral that are agreed upon when the debt is incurred.

Technical debt is by definition created to save time, so there is rarely an understanding of the risk that comes with the debt until the debt must be repaid. This means that the eventual payback can range from inconsequential to disastrous. The point is that the debtor won't know until it comes time to pay the debt back. Technical debt is often incurred in lieu of actual planning, not because a debtor is consciously and rationally weighing current reward against future risk.


Simply because it's a lot nicer to say than "my predecessor did dumb stuff". Especially when the predecessor is still around and outranks you.


The predecessor or old self did dumb stuff because of being pressured to deliver something quickly and had to cut corners. I don't see how he'd be offended that now it's time to go back and do it properly.


> People who really get the metaphor, can understand why it was originally a good thing to be desired. Because debt is a useful tool for the same reason

In that case, it is a very apt metaphor because most people naturally see debt as a bad thing. But with some nuanced thinking and deeper investigation you recognise that it can be useful sometimes.


I think they weren't saying that it's not an apt metaphor, but that it's not a good one, because now you first need to explain to people why regular debt is not necessarily bad, before being able to move on to your actual point that technical debt is also not necessarily bad.


I agree completely, it's a powerful concept, just too narrowly applied.

The 'technical' part points out that it's a special kind of debt, one that isn't measured directly in money. What's the debt? The debt is time.

You can't move some technical (which isn't a noun), or some money (which is), from one account to another, to pay down a time debt. You have to pay it off with skilled labor, which takes time in a way that isn't fungible with money: there's a lot of COBOL out there which can be lightly modified, or replaced, but that company can't find the hours of skilled labor to do more with it, at any price.

A codebase isn't the only time debt a company can incur, any process can be stuck in a suboptimal frame because no one wants to sign off on the time it would take to make it right.


Except it doesn't. It's a one-sided and misleading concept.

The problem is usually finding a balance between "Let's hack some suicidally awful crap together quickly to see if we even have a market and then fix it when we have real income, except everyone knows we won't so it will be a permanent drag on the business" and "Let's build a beautiful extendable maintainable paragon of elegance and purity and ignore the fact that it'll take three years when we have six months of runway."

Too far in either direction will kill you. The sweet spot is between those extremes, and finding it is extremely difficult.

It's unhelpful that there isn't a word to describe aiming for that balance, never mind hitting it.


It's not one-sided though. Debt is often a very useful tool. You take on debt now because you can do more productive things with the money that will be worth more than the payments you have to make on it later. This is an extremely apt description of technical debt. Oftentimes people will make bad decisions about what debt to take on, and the same is true for technical debt; oftentimes people will overestimate the value they'll get out of the debt, which is again true for technical debt. But debt is still a useful tool, and technical debt is a useful compromise you make to ship software.


Your description is apt when you are talking about an analogy between technical debt and personal debt.

For companies, whether to use debt or equity to finance their balance sheet is just a technical decision. Either way, you have to pay the cost of capital.

(I.e., even if you finance your project from equity and not from debt, it still has to be better for your shareholders than just giving them the necessary capital back via a stock buyback.)

A company can have debt as a permanent feature of its balance sheet, just like equity.

Funny enough, I suspect from a corporate finance point of view, technical debt should actually be called 'technical equity', because technical debt only gets expensive when your project takes off. If you never end up using that piece of code, the technical debt never has to be paid. But it gets more and more expensive, the more successful your project is.

Just like selling 50% of your startup to an investor (as equity) gets more and more expensive (in retrospect), only if your startup really takes off. Debt stays the same price, whether your startup is middling or goes to the moon.
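To put made-up numbers on that: borrow $1m at 10% and it costs $100k a year whether the company ends up worth $2m or $200m; sell 50% for that same $1m and, in retrospect, it cost you $1m in the first case but $100m in the second. Cut-corner code behaves like the second case: nearly free if the project fizzles, very expensive if it takes off.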


I get why Technical Equity is a better term, but only a small percentage of the population really understands what equity means, while debt can be explained to a 5-year-old.


Yes. Though I wouldn't even say it's a 'better term'; just because the metaphor might hold slightly more water in some highly technical sense, doesn't actually make it a better term. ;)

This reminds me: if you went to a farm and actually 'picked the low hanging fruit first', they would likely fire you. The fruit higher up on the tree typically ripens faster, so should be picked first. (But the phrase as a metaphor is fine, and everyone knows what it's supposed to mean.)


It's usually only a "beautiful extendable maintainable paragon of elegance and purity" in the eyes of the original architect. Everyone else sees it as leaking abstraction with bolts everywhere to keep the original idea somewhat working... Or just a massive pile of technical debt.

Code quality judgments always depend on how well the person making the judgment understands the software. While I'm sure that everyone will agree that there are some clearly better ways of doing things, they sure as hell won't all agree on what those clearly better ways are.

One person's pile of garbage is the next person's perfect implementation with easy to understand procedural logic.

Please take note that I'm explicitly not saying that any implementation is better than another. I'm just trying to convey that the term technical debt very much depends on the mindset of the person looking at the implementation.


How is debt not accurate here?

You wanna build something? You may over-invest (too much debt), invest just right (manageable debt) or wait until it's too late (the paragon you mention).

Debt is a nasty word and it's meant to be but like some other bad things if you know what you can manage you can come out on top. People take loans all the time because they want to achieve that sweet spot you mention.


Take on lots of debt and you'll have a quick flash and then years of problems. Refuse to take on any debt ever and you may never get to university or buy a house.

Taking on debt is not inherently good or bad, it's context dependent.

I feel it maps to technical debt very well.


Ah, thank you, this is actually the argument, not what the article said.

Unfortunately I've become a bit disillusioned that we can reclaim the conversation to be about the optimal level of quality, and what tradeoffs are acceptable.

Office bullshit language seems to have strangled nuance so hard that I'm now sure most of these conversations are useless, and we'd better spend our focus on finding and working with mature adults who don't need that much convincing.


Not sure about that. A large part of the company (everybody close to the money) probably sees debt as a good thing because it lets them buy stuff. Of course you have to pay creditors, and how do they do it? By taking on more debt. It works well until you have to pay it off, all of it. So if they ever enter a discussion about technical debt, expect them to suggest taking on more debt (and moving faster.)

Building without foundations, driving blindfolded, etc., could be better metaphors everybody can agree upon. If you're talking with a CFO, talk about high leverage (as with derivatives.)


Why would you have to pay off all the (financial) debt?

Debt can be a permanent part of your company's capital structure just like equity can be a permanent part. Or do you have a plan to buy back all outstanding stock?

(Yes, real-world debt in the form of bonds has a due date, when you have to roll it over. But that's an accident of history. Companies could instead sell perpetual bonds and put options on those bonds for the same effect.)

In finance terms, loosely speaking debt is the part of your capital structure whose cost is fixed in nominal terms. Equity gets the remainder of your income stream.

About high leverage: the more of your income stream you parcel out for fixed payments, the more you 'concentrate' your equity and the variability of the residual income it gets.

(Overall, I blame tax systems that give preferential treatment to debt over equity. Roughly, you can pay the capital cost of your debt with pre-tax money, but the capital cost of your equity comes out of post-tax money. Put them on equal footing, and you'll solve quite a few problems.)


I'd argue that "debt" instills an unrelated meaning. As well as "legacy" and "debt" mean different things, yet may be related in some situations.

What is technical debt today might actually be considered no debt at all later on (or even from the beginning, in the extreme case where you're building something for one-off use), for multiple reasons.

The context matters a lot, and begets a more elaborate discussion than just "that's technical debt", which in itself says little about the possible or mandated actions.

(edited to refine)


I prefer "tech investment"! It's more positive ;)


> … "devops". It's really job title for "sysadmin …

Uhm, that's what it has become, but not what it ought to be. I define «DevOps» as an experienced software engineer with a black belt in networking and system administration. It almost never works the other way round, though, and ex-sysadmins who choose to transition into a DevOps role as an exit path from editing configs in vi remain as cantankerous and unhelpful as ever before. From my own anecdotal and likely highly biased experience, I have encountered exactly one positive example of the reverse transition.


I’ve found a lot of devops to be generally unhelpful and adversarial as well. They seem to forget the developer part of the job title and don’t help the developers deliver value to the customer. Instead, some devops teams just break things unannounced, make sweeping changes that leave everyone unhappy, require unnecessary extra steps for simple procedures, and remain passive or negative.

I call those teams “developer obstructions” or DevObs for short.

Really, this probably highlights the problem of having devops be its own role instead of a responsibility of a software team.


That resonates with my experience at large as well, which I find to be unfortunate.

I actually have come across a number of highly knowledgeable and talented sysadmins along the way, and I truly wanted the Dev part of DevOps to work, and invested a lot of personal effort into trying to make it work, although in vain each and every time.

Such individuals also have a tendency to create new, deeply entrenched silos that continue on to keep developers in shackles.

> I call those teams “developer obstructions” or DevObs for short.

With your permission, I will ecstatically add the new term into my daily professional vocabulary.


It’s very unfortunate! I don’t really know of a good solution other than stronger and more collaborative leadership preventing these “us vs them” silos. In effect, the developers should be considered the customers of the devops team, so the devops team should be doing everything they can to aid the developers. Or just have them on the same team.

Yes, definitely you can use it; credit would be nice though if the expression becomes well known, ha ;)


DevOops


> "we're doing something stupid that will save us time now and make us hate ourselves a year from now"

Not to be pedantic, but the metaphor is intended to speak to the economics of taking short-term gains at long-term expense. Developers need a way to communicate to the business what cutting corners will cost them, financially. The concept of "technical debt" isn't perfect but it's a start. It speaks to the fact that if we don't adhere to engineering principles that allow for easy maintenance, expansion and extension, then the company will pay the price when it wants to engage in those endeavours.
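To put toy numbers on it (mine, purely illustrative): suppose a shortcut saves ten days now but adds 20% drag to every later change in that area. The "interest" is then calculable, and so is the break-even point:

    # Made-up numbers, just to show the principal-plus-interest framing.
    saved_now_days = 10      # the shortcut saves two working weeks today
    touch_cost_days = 2      # a normal change in this area takes 2 days
    interest_rate = 0.20     # each later change runs 20% slower

    changes = 0
    interest_paid = 0.0
    while interest_paid < saved_now_days:
        changes += 1
        interest_paid += touch_cost_days * interest_rate

    print(changes)  # -> 25: past this many changes, the shortcut
                    # has cost more time than it saved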

If we're talking about these things in terms of "we hate ourselves" then we're having an emotional conversation. Unfortunately, that's all too common and it comes about because we didn't have these conversations in terms of dollars early enough. At that point you don't just have a technical debt problem, you have a morale problem. Engineers quit and people are brought in who don't understand the spaghetti coded system and end up exacerbating the problem because "we need to get this feature out before <insert arbitrary deadline that has some apparent meaning to the business>."


In Corporate and Enterprise environments, the term 'Technical Debt' is now commonly used for systems and platforms that are beyond their supportable lifecycle, but are still in use for reasons, yet need to be replaced as soon as possible. These systems were never just hacked together with the knowledge they'd be a problem in the future.


It's known now that they're a problem, and they're choosing right now to continue with that problem rather than spend the time/money/effort to change them. That's still technical debt. If it helps, you can see it as starting right now, rather than when the code was written. Either way, they're choosing to continue to have problems rather than pay off that debt... yet.


Wouldn't "legacy" be a better word then? Meaning it wasn't meant to be a debt but it's still the legacy left to you today


Shouldn't those systems be named "legacy system" then?


You'd think so, yes. But... it's Corporate, nothing in those places makes much sense.


Strong disagree on your strong disagreement.

As the term has been corrupted, it just means "Code that isn't the way I would have done it."

You won't have to fix it a year from now -- you'll just be cursing at having to understand it.

And eventually, management will become convinced that everything has to be thrown out and rewritten; maybe it'll even be true.


> It's really a job title for "sysadmin but paid and owned by the software department to keep IT's fingers out of everyone's shit"

Most devops aren't really fully-fledged sysadmins though.

The job of babysitting software and coming up with SLOs was always the part of system engineering that I hated. It wasn't digging through production outages with strace and tcpdump. They actually defined that as something we didn't need any more, so all DevOps gets is metrics and logs, they can't touch production at all, and that job was defined out of existence.

(Then every now and then I see some devops person pleading for information on how to deploy tools like tcpdump and strace to production and I can see that the technical foundation of devops is a pack of lies)

Also, devops is more "oh fuck they want to give all the devs a pager, quick hire some junior devs to give the pagers to and we'll call them devops".


(and now we're getting "Platform Engineer" which seems to mean "oh fuck those asshole system engineers were doing something productive before other than just being dicks about uptime and production access, how do we get them back?")


I think "technical debt" is a good metaphor, as long as you recall that taking on debt to fund investment can be a good idea sometimes. Debt isn't always bad in the long term.


It's arguably what makes it such a good metaphor.


In the video realm, corp IT always came in and screwed up the production gear because they were so confused by the use cases. Instead, we started a Production IT department. Corp IT sees a large array and wants to tune it for lots of small random I/O; prod IT sets it up for large continuous I/O. Corp IT sees a network port and plugs it into the corp network; prod IT connects it to the isolated storage network, and doesn't even give the nodes access to the corp network. Corp IT wants to install management software; prod IT knows it'll only steal resources from performance and doesn't bother.

If a corp IT and a prod IT employee come in contact, a huge explosion occurs, not quite as bad as matter/antimatter. A healthy argument on best practices nonetheless.


The trouble with the term "technical debt" is that people tend to use it retroactively to describe messes which may not have been necessary or the result of an intentional technical choice.

It's uncomfortable to say, "In retrospect that was an unfortunate, unnecessary, and shortsighted choice that didn't really save any justifiable amount of time or money", so instead people say, "We've accumulated some technical debt" to cover up mistakes instead of acknowledging and learning from them.


The problem with tech debt is that it instills some kind of linearity -- you can pay it off whenever.

Often times "tech dept" comes from the fact that we do not truly know what to build. Hence we use the simplest measure to satisfy what requirements there are. Call it the "Occam's razor of software development".


Truth be told, because nobody knows anything, it's often easier to deal with an "Occam's razor" designed system that tried to stay simple than with a prematurely overengineered mess.


Ah yes, sorry, I was unclear in my original comment.

The point is to stick to the simplest implementation and not try to reduce "tech debt" by building premature optimizations.

I am very much for dropping the tech debt term.

(https://www.madsbuch.com/assumptions/)


There are some differences between “tech debt” and “debt”:

Debt has to be serviced, and if it isn’t some asset is probably forfeited or you can be forced into bankruptcy. Not so for tech debt.

Debt is known for certain, in a measurable quantity. Tech debt is not measurable and only exists to the extent people argue it exists. Two engineers may disagree on how much tech debt there is.

Debt can be used to fund things with a clear nexus: I borrow $1m to buy a house. Tech debt is taken on often unknowingly, or even as a result of software dependency upgrades (or cloud) that silently turn debt-free software into indebted software.

There are probably plenty of other differences that make it a sort of half-applicable metaphor.


> Debt has to be serviced, and if it isn’t some asset is probably forfeited or you can be forced into bankruptcy. Not so for tech debt.

It is so for tech debt though. If you keep on accumulating it, at some point it's gonna have very negative effects: a ransomware incident, data leak or even a total company-wide standstill.


Say you have a product, and you add no new features (you fix some bugs and patch for security). The interest payments are a lot less than if you are actively developing new features and therefore adding complexity.

So it is dynamic in a way that normal debt isn’t.

It can be worse than debt or not as bad depending on the nature of the debt and its consequences.

If you have something written in OG Perl and the messiness makes all your devs leave and you can't find anyone with the skills, that can be really bad! Although only if that product is your only source of revenue.

Say like many small companies trying to make it big you have “bread and butter” product A and “next gen unicorn” product B.

If B is far along and customers are transitioning from A to B, then tech debt on A is less expensive than if B is stalled and in a quagmire.


In the forward-looking sense you are using it, sure. In the backward-looking way it's usually used ("development on that is slow because it's full of technical debt"), it's usually 10% what you are talking about and 90% over-engineering and stupid things that were done for no good reason at all, where no one at the time was even thinking about the tradeoff for speed.


It's called a short-sighted decision.


Providing value now for a user or customer which will need to be refactored later (iff the project continues) is not short sighted.


It's a trade-off.


"engineering"


finally! A phrase so broad with so many applications that nobody will know what we're talking about!


Have you heard of Dynamic Programming?


The definition I have of DP is: "refactoring recursive algorithms into iterative ones by storing intermediate values (generally into arrays). It generally has performance gains due to avoiding redundant calls, at the cost of making algorithms less clear / readable"

At least that's pretty much what cs classes taught a few years ago
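For example, the textbook case (a minimal sketch of my own, not from the thread) is Fibonacci, refactored from naive recursion to a table:

    def fib_recursive(n):
        # Naive recursion: re-solves the same subproblems over and over,
        # so the number of calls grows exponentially in n.
        if n < 2:
            return n
        return fib_recursive(n - 1) + fib_recursive(n - 2)

    def fib_dp(n):
        # Dynamic programming: iterate once, storing intermediate values
        # in an array so each subproblem is solved exactly once -- O(n).
        table = [0, 1] + [0] * max(0, n - 1)
        for i in range(2, n + 1):
            table[i] = table[i - 1] + table[i - 2]
        return table[n]

    assert fib_dp(30) == fib_recursive(30) == 832040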

That seems pretty well-defined to me?


Oh, dynamic programming as a concept makes perfect sense and it's very useful and well-defined.

However the origin of the name is bonkers. See https://en.wikipedia.org/wiki/Dynamic_programming#History

> I spent the Fall quarter (of 1950) at RAND. My first task was to find a name for multistage decision processes. An interesting question is, "Where did the name, dynamic programming, come from?" The 1950s were not good years for mathematical research. We had a very interesting gentleman in Washington named Wilson. He was Secretary of Defense, and he actually had a pathological fear and hatred of the word "research". I’m not using the term lightly; I’m using it precisely. His face would suffuse, he would turn red, and he would get violent if people used the term research in his presence. You can imagine how he felt, then, about the term mathematical. The RAND Corporation was employed by the Air Force, and the Air Force had Wilson as its boss, essentially. Hence, I felt I had to do something to shield Wilson and the Air Force from the fact that I was really doing mathematics inside the RAND Corporation. What title, what name, could I choose? In the first place I was interested in planning, in decision making, in thinking. But planning, is not a good word for various reasons. I decided therefore to use the word "programming". I wanted to get across the idea that this was dynamic, this was multistage, this was time-varying. I thought, let's kill two birds with one stone. Let's take a word that has an absolutely precise meaning, namely dynamic, in the classical physical sense. It also has a very interesting property as an adjective, and that is it's impossible to use the word dynamic in a pejorative sense. Try thinking of some combination that will possibly give it a pejorative meaning. It's impossible. Thus, I thought dynamic programming was a good name. It was something not even a Congressman could object to. So I used it as an umbrella for my activities.


The job of a good engineer is to know which type of debt is good under the circumstances, and which debt is "toxic".


Precisely

You can make literally anything work with enough Arduinos and duct tape, but should you?


It's only short-sighted if you don't plan to pay off the debt.


In reality, nobody ever pays it off until absolutely necessary.


> It's sorta like how Snapchat used an obfuscated UI to keep parents out.

If anyone else is also confused by that part, there's some context: https://news.ycombinator.com/item?id=19022640


Having worked on projects drowning in technical debt, it's a very real concept.


It is a very real concept, but as a term it's no longer useful because the MBAs have heard it already. I don't look like a stereotypical programmer (tall white guy with good hair), so I hear what the "business owners" say when they think there are no programmer dorks in the room, and they passionately believe that "technical debt", "code quality" and "refactoring" are all scams made up to siphon money out of productive organizations (the article itself hints at a similar belief on the part of the author).


Hah. "The business" vs. the programmers. I hate that concept but it's something we must live with.


Well, remember, they think of it as management vs. (lowly, replaceable, necessary evil) "workers".


I like the term technical debt but people forget what debt is.

Everything has a cost. If the value added is more than the cost then it is not debt. Technical debt is the set of areas where the costs exceed the value added.


If this were a comment on HN, I would give it one of my rare downvotes. It's dismissive and negative without any compensating value or positivity. Maybe suggestions of terms to use?


I always find odd the position of telling people never to point out problems, only to offer suggestions or solutions.

From something as basic as a management philosophy of "don't come to me with problems, come to me with solutions", to this "don't suggest terms not to use, suggest terms to use".

A person can identify a problem without having a solution, or identify something is wrong without knowing the full solution.

So in this context a person can believe people using "DevOps" or "Agile" or "Tech Debt" in various ways to be a problem without having to suggest a solution to the problem.

If we just ignore problems we have no personal knowledge of the solution to, then I think that would end in very bad results overall. Take, for example, a basic concept I think everyone would understand: my car is making an odd noise. I know that is a problem, but I have no idea what the solution is. I don't think anyone would say you should just ignore it, or that you have to present the auto mechanic with a fully fleshed-out solution to the problem before taking it in for service.


It's not that you can't do it, it's just low-effort noise without a solution. Oftentimes the downsides are apparent and the complaints don't add much without some discussion of alternatives.


We regularly confuse the ability to enumerate the downsides of something with some sort of notable epiphany.


> I always find the position of telling people to never point out problems, only suggestions or solutions odd.

The article went lower than pointing out problems. It veered into toxicity, its claim to "irreverence" notwithstanding. From the first:

> Object-Oriented Programming: Luckily the industry doesn’t really use this term any more so we can ignore the changed meaning. The small club of people who still care can use it correctly, everybody else can carry on not using it. Just be aware when diving through the history books that it might mean “extreme late binding of all things” or it might mean “modules, but using the word class” depending on the age of the text.

The reference to the "small club of people who still care can use it correctly" is elitist, and he does not bother to define the term (other than in a meaningless call-out to his fellow club members). The casual readers who are not members of his self-described club are not left with helpful information to decide for themselves when and and what use of the term may or may not be wrong. It's just "If you know, then you know" phrasing.

The tone does not improve through the rest of the article. I'm not sure this fellow should be teaching beginners how to program.


Wait...you took TFA to be entirely serious? Try re-reading the 2nd paragraph.


The same can be said about your comment.


Yeah, and?

I can dismiss something as shit without suggesting anything better. Sometimes things are just shit.


But "this" is a comment on HN!

Not sure you can downvote yourself though. Myself, I don't have any downvotes, so I can't help you.


Genuine question. Why bother downvoting, and not just move along? Perhaps there should be a button for 'disagree strongly'.


Those are polar opposites.

A button for 'disagree strongly' should upvote a comment.

When Einstein and Niels Bohr 'strongly disagree' on quantum physics in 1927, they start a debate that makes both of them more enlightened.

When a bullshit artist, who couldn't care less about the truth, hijacks a discussion to make it all about Him (it's almost always a dude), everyone becomes dumber.

"Moving along" is not enough because the amount of energy needed to refute bullshit is an order of magnitude larger than is needed to produce it.

You need to actually downvote and hide it.


> make it all about Him (it's almost always a dude)

eyeroll


I wonder whether a voting system with more than one dimension could work here?

(I am sure people have tried it before..)


Like Slashdot, perhaps?


> (it's almost always a dude)

Yes you are https://news.ycombinator.com/item?id=32247166

> hijack a discussion to make it all about Him

Yeah, irrelevant misandry will do that.


I have seen 25 years of internet trolling, and this kind of attitude is gendered, yes. Not in a binary way, but in a bimodal one. I could have avoided making the point, because it's obvious, though. I don't hate men, because as you found out, I'm one myself.


> Him (it's almost always a dude)

I'm curious as to how you know the sex of anonymous posters.


On websites where it's anonymous, I infer that from what I have seen over 25 years on websites where it's not, and from attitudes people have IRL.


What sort of rules do you use to derive sex from text? Can you state some?


I can't do that in general. Certain attitudes OTOH are heavily gendered, like the psychology behind internet trolls, which at the simplest level is that they want everyone's attention to orbit around them.


Do you think women and girls don't want everyone's attention to orbit around them?


For women online in particular, I'm pretty sure lots of them would rather enjoy less attention on them personally. Too much is not pleasant for _them_. I'm thinking in particular about friends who are content creators: minding their own business and receiving comments on their looks, insults, sexist comments, unsolicited dick pics, rape and death threats... I'm sure sexism has nothing to do with that.

OK, I'm done on this topic. Maybe there are lots of women behind internet trolls and then I'm wrong; that's an empirical question. My comment was on "strongly disagree" vs "bullshitting".


My followup question is: why did you feel the need to claim that the answer to an empirical question is something you haven't verified empirically?


100% you are a dude.


My guess is that HN is mostly men, so you're likely to be right, but probably for the wrong reasons.


What rules did you use to infer my sex from the text of my comment?


Apparently you are really curious about getting to the bottom of this so I did my research.

For me it's a vague intuition. But women are much better at recognizing "men talk" than I am.

So I asked one woman how she does it and here is her answer

> Men make comments without revealing their gender, but we can see the comments are coming from a man's point of view.

> How do we know?

> We have a lifetime of experience listening to a man's point of view.

> In other words, they show their asses much more than they think.


So men are the only people who comment without revealing their gender? What about trans individuals? Does that mean I can reliably mask my gender by deciding whether or not I mention it in my comments and posts?

And this still doesn't answer what rules you, or the "men's talk" expert you asked, actually use. Anybody can claim they have "a lifetime of experience" in anything they want, but without either an empirical success rate or a set of explicit testable rules, it's all bullshit out of their ass.

>We have a lifetime of experience listening to a man's point of view.

I and a lot of other men have a lifetime of experience listening to women's point of view too, and yet I can find no sane man who claims they can identify gender from text, unless in very special cases.


PS: This is hilariously dumb and brain-dead. I can't stop re-reading it back and laughing like a maniac.

My followup question would be : what is your IQ, and that of the woman you asked?


From the standpoint of someone discovering Hacker News, it's renowned for accepting (or at least upvoting to the top) only quality contributions, while this article is literally a shitpost.

I've added Twitter to my BlockSite list and redirect it to the Hacker News homepage, so that I get smarter every time I am tempted to read social networks during my productive time. An article like this makes me feel like I should redirect Hacker News to Twitter instead.


Of all the stuff that is happening with technology today, especially with social media, this is what triggers you?


Not everyone is worried about the same things. Everyone lives in their own bubble.


I was expecting "master" and "slave" to show up since I've seen that argument being made (https://www.eetimes.com/its-time-for-ieee-to-retire-master-s...), but I have a hard time taking it seriously.

I agree about AI though. The rush to exploit the phrase as a marketing term has killed its original meaning. Once we actually have AI we're going to need something else to call it. I've always kind of liked the distinction Mass Effect made between Virtual Intelligence and Artificial Intelligence/Synthetic Intelligence, but I guess it's too late to start calling what we have now VI. Synthetic/Inorganic Intelligence is still on the table for the real thing though, and it'd be better than something like AI+ or AI 2.0.


Same (re: master / slave)...

At a previous job, a coworker presented a compelling argument for removing "whitelist" and "blacklist" terminology from our codebases / jargon. Even though the etymologies of the words don't describe them as being based on race / racial judgments, their first recorded usages were in the 16th-17th centuries, when mass enslavement was in full swing. Setting aside the historical context, they also help reinforce the idea of "white is good, black is bad" [0].

At that job we moved to terms like excludeList, allowList, denyList, etc.
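A minimal sketch of what the renamed version tends to look like (hypothetical names, not our actual code):

    # The names state the policy directly; no color metaphor to decode.
    ALLOW_LIST = {"alice@example.com", "bob@example.com"}
    DENY_LIST = {"mallory@example.com"}

    def may_send(address):
        # Deny wins over allow; anything unlisted is rejected by default.
        if address in DENY_LIST:
            return False
        return address in ALLOW_LIST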

[0] https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6148600/


Purely from a clarity standpoint, I'm in favor of killing off whitelist/blacklist. They just introduce an extra step in figuring out what something is—whitelist usually means stuff to allow, blacklist is stuff to deny, but I at least have to dig that up from memory since white/black don't mean that at all. Exclude, allow, deny, ignore, block, etc. are all more obvious, and in many cases more descriptive. In most cases, simply replacing "blacklist" with "blocklist" helps. Though I have more trouble with "whitelist", since it doesn't directly suggest anything to me; I have to contrast it with a blacklist, and then remember what blacklist means (for authors or actors or whatever), and then map it back to what's actually going on. (For whatever reason, I have less trouble with using "whitelist/whitelisted" as a verb.) Please please use a term that describes what you're actually doing with the list.

Don't get me started on "false positive/false negative"...


> whitelist usually means stuff to allow, blacklist is stuff to deny, but I at least have to dig that up from memory since white/black don't mean that at all.

Isn't that true for most words/phrases? No words inherently mean anything. We assign meaning to them. I suppose that if enough people settle on a replacement for 'whitelist' and 'blacklist' those words will be limited to history, but right now it will mean being forced to remember multiple phrases for the same thing since others will certainly still use the current terms.

In the context of your own use/code it doesn't matter much, but every mail provider (just one example) will be dealing with blacklists that describe themselves as such and incoming requests from people looking to be whitelisted which means they'll still have to take whatever steps are needed to recall what those words mean, and must still be trained on the meaning of those words, all while still having to remember not to use those words in their own writing/code/documentation and having to remember what new words they should replace them with/translate them as.

Maybe there is too much jargon in tech in general, and I certainly don't object to efforts to reduce it, but I don't see it as saving anybody much time/effort in the near term.


that first paragraph is peak Hacker News lol


> Even though the etymologies of the words don't describe them as being based on race / racial judgments

Then why should we change them? I know you said:

> Setting aside the historical context, they also help reenforce the idea of "white is good, black is bad"

But if we can't trust people to infer a word's meaning based on the context, we have a much bigger problem imo. Another example of this is the word gentleman. If you refer to a gentleman's club, that's a very different thing than calling a man a gentleman. We trust that English speaking people don't start conflating the two contexts together, because most people possess the basic ability to understand the idea of context.

I don't get when or why we suddenly decided that people can't infer the context of words. And in this specific case, if you can't infer that a blacklist is not a racial insult, but just a term that denotes a list of denials, then you have some other issues to sort out.

Put another way, should we stop referring to profitable markets as markets in the black[1] because they put a positive spin on the color black? No, it's just dumb. I trust that the average human is capable of parsing out an inferred context of a common phrase in their native language.

Furthermore, master isn't even synonymous with slavery[2]. We have "a master craftsman" and "master chefs", which aren't evil. The word "boss" is also derived from master[0], but I'm sure we would all agree that it's ridiculous to remove the word "boss" from our vernacular just because the word it derives from (master) was, at one specific point in history, used in a specific sense (slave master) that was a bad thing.

Edit: I'd also like to add that colors carry a certain symbolism (that may vary across cultures), and that this has nothing to do with racist intent. White has been used for centuries to denote purity, and black is the opposite of that. People aren't black/white in that sense, because skin color isn't even black/white in that sense. The Bible refers to white as a symbol of purity, but the Jews were not "white" by any means. And apparently in Chinese culture, black is associated with heaven[3]. That doesn't mean that your skin color makes you "heavenly" or "evil". I trust the average person is able to infer context.

[0]: https://www.etymonline.com/word/boss

[1]: https://www.investopedia.com/terms/b/black.asp

[2]: https://www.etymonline.com/word/master

[3]: https://en.m.wikipedia.org/wiki/Color_in_Chinese_culture


> I don't get when or why we suddenly decided that people can't infer the context of words.

Probably around the time people decided intent no longer matters and that words are violence.


First, because there often isn't a context where the word has the correct meaning in the first place. If you have a list of disallowed curse word substrings and Dr. Theodore Fucknuckett gets upset at not being allowed to log in, then adding "Fucknuckett" to something called a "whitelist" is using a somewhat tenuous connection to the meaning of whitelist. "Allowlist" is more direct. Also, I just made it up, and I gotta say that Dr. Ted has an awesome last name that I'm going to have to reuse for something sometime.

Dictionary meaning of whitelist:

1. A list of people or organizations that have been approved to receive special considerations or privileges.

2. A list or collection of people or entities that are known, trusted, or explicitly permitted.

Are you giving the word "Fucknuckett" special privileges? Sorta. Is Dr. Fucknuckett explicitly permitted? Kinda. His name is, at least.

Master vs slave is different but similar. They are often used in a somewhat strained metaphorical way. There are often better terms. Client/server, primary/secondary, primary/replicant, authoritative/copy. (I don't actually have a problem with "master copy".)

Second, often the only context from which you are deriving meaning is the slavery meaning of master/slave. A master/slave database setup gets its meaning from the master telling the slaves what to do. And once again, it's a strained metaphor, since the significance of the master/slave setup is usually about the data, not the control.

Anyway, the point is that other definitions of the word "master" are irrelevant if you're drawing upon the slavery meaning in the first place.

Those don't invalidate your argument, but if you're balancing the usefulness of terms against potential offense or whatever—well, I argue that the usefulness is not all that high.

Third, relying on the cultural meaning of colors is problematic for the very reasons you cite. I'll give another: Lucifer means "light-bringer". How is anyone supposed to guess which meaning you're drawing upon for a given context unless they already know?

I also happen to be sympathetic to the argument that introducing more negative associations to the word "black" is a problem, given the large number that are already in use. But I'm not arguing that here; I'm just arguing that these words and contexts aren't useful in the first place. And that when the context *is* useful, it's very often drawing upon the slavery meaning. (Whether that's a reason to stop using it is a separate question.)


> A master/slave database setup gets its meaning from the master telling the slaves what to do.

This I agree with. I suppose my problem is that whatever you decide to call it, that dynamic doesn't change. Although I've yet to see an article on the subject of the use of Master/Slave in computing which doesn't bring up America's historical use of slaves, the dynamic being described is older than humanity itself.

Simply calling a thing something else is never going to be enough when the problem people have is with the very concept of the thing being renamed. It's doomed to fall victim to the euphemism treadmill. No matter how abstract and convoluted our terminology gets trying to avoid talking about it, what we're describing is still right there in our faces. Rather than try to pretend the very concept of Slaves and Masters doesn't exist, it'll save a hell of a lot of time to just continue to call it what it is plainly.


> Dr. Theodore Fucknuckett gets upset at not being allowed to login

This is known as the "Scunthorpe Problem".

https://en.wikipedia.org/wiki/Scunthorpe_problem


> First, because there often isn't a context where the word has the correct meaning in the first place.

In this case, there's an unequivocally correct meaning to the terms blacklist and whitelist that has nothing to do with race. Take the popular show The Blacklist[0], which goes so far as to title itself Blacklist, and I'm assuming that's because blacklist is such a well-known term that it immediately tells you what the show will be about.

> (I don't actually have a problem with "master copy".)

Isn't this exactly what a "master" branch is supposed to indicate? I was under the assumption that this is exactly why the name "master" branch was chosen. We don't have a master branch commanding slave branches how to code themselves. We have a master source of truth that all the sub branches derive from and eventually merge back into.

> A master/slave database setup gets its meaning from the master telling the slaves what to do

Fortunately I've never run into this naming schema for databases. Typically it's production, development, QA, etc. for databases.

> Third, relying on the cultural meaning of colors is problematic for the very reasons you cite

Well this is the whole point of language. Of course I'm not going to understand the subtle cultural references in South Korean culture, because I'm not South Korean. Likewise I wouldn't expect people from different cultures to understand my cultural nuances. That's why good communicators tailor their speech to their audience.

This doesn't mean you can't use cultural references though. And it certainly doesn't mean that one culture can impose arbitrary language constructions on another culture because they're offended by the meaning. Mature people will usually tell the offender that they've been offended. Then the offender should be able to explain they meant no offense: in my culture this term means this. And if the offended person understands that different cultures have different contexts, then they can happily move on with their life and understand that specific subtlety.

My biggest complaint is nobody is asking to change these terms because they've suddenly been deemed inadequate at explaining the concept. People are trying to change these terms in the name of "social justice". If people can't understand that language evolves over time, and they can't separate their emotions from simply understanding the context of the situation, then we have much bigger problems.

How are we supposed to study earlier works of literature that have ugly things in it? As we've seen with To Kill a Mockingbird, people are trying to tear everything down, even though the book is about the evil of racism! [1]

This author writes a more nuanced article on why changing language for social justice is a bad idea [2]. All in all, we need to stop injecting racism into everything. It's getting to the point where we can't say black because that may offend somebody, and that, my friend, is insane lol.

[0]: https://en.m.wikipedia.org/wiki/The_Blacklist

[1]: https://www.nbcnews.com/think/opinion/why-are-we-still-teach...

[2]: https://seattlecollegian.com/op-ed-blacklist-and-whitelist-a...


I've always felt the more compelling reason is that a blacklist is never a list of black items.

The number of times I've seen a blacklist used as a list of users that are denied a restriction (i.e. a double negative -- they're able to do something) is too many.


> I've always felt the more compelling reason is that a blacklist is never a list of black items.

Why assume it's the items that should be black and not the list itself?

Really, blacklists have nothing to do with the color of lists or the items. It's from the 1400s when "black" was used to mean "censured or disgraced". We see that in phrases like "Black Mark" (as in "a black mark on your record") where at least the mark presumably had a color (“a black cross or other mark made against the name of a person who has incurred censure, penalty, etc.,”).

In that context it makes perfect sense for a blacklist to be a list of things which are being penalized/censured and for an opposite list to be described with the opposite color.


imo the more compelling argument is that the first time you hear "whitelist/blacklist" you have to ask what each means (at least I did). AllowList and DenyList are names that say exactly what the mechanism is.


We should not surrender language to political forces. What started with 'Blacklist' is now 'Recession' 'Vaccine' 'Woman'. Read Orwell.


I find it pretty helpful to just accept the new meaning of "artificial intelligence": a program making a decision you didn't explicitly code it to make. Note that this includes even the most basic statistical decision methods, which the term is used for not too infrequently.

We already have more specific technical words for the approaches that are generally sold as "artificial intelligence", like neural networks/machine learning etc.
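As a toy illustration of that loose definition (my own sketch, nothing from the article): even a trivial nearest-mean classifier qualifies, because the decision boundary comes from the data rather than from a rule anyone wrote:

    # Nobody codes the boundary explicitly; it falls out of the
    # (made-up) training data.
    spam_scores = [0.9, 0.8, 0.95, 0.7]
    ham_scores = [0.1, 0.2, 0.05, 0.3]

    spam_mean = sum(spam_scores) / len(spam_scores)
    ham_mean = sum(ham_scores) / len(ham_scores)

    def classify(score):
        # Assign to whichever class mean is closer.
        return "spam" if abs(score - spam_mean) < abs(score - ham_mean) else "ham"

    print(classify(0.75))  # -> spam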


>Once we actually have AI we're going to need something else to call it

You mean Artificial General Intelligence? ;)

On a more serious note, it is a shame that we ended up calling it AI instead of something innocuous like "pattern matching code". There's so much bullshit and misunderstanding that wouldn't be there if we had named it something less impressive.


This isn't really a problem of the term itself; it's more that people want their software product to sound impressive, so they borrow the term liberally. That is, it's marketing's fault.

People have been arguing since Minsky or before where the "Actual Artificial Intelligence" line falls. Thermostats? Graphing calculators? Roombas? Software games? ...


> People have been arguing since Minsky or before where the "Actual Artificial Intelligence" line falls.

I think it'll find a way to tell us. Same with people worried about giving AI rights. Once it cares enough to ask, it should get them.


Oh man, I wish most of this would go away.

RE: Agile 'We have to find a new way to discuss the idea that maybe we focus on the working software and not on the organisational bureaucracy, and that way does not involve the word…'

Which is rather hilarious, because the only people I ever hear use the word agile are the bureaucrats.

RE: DevOps, I still don't know what it actually means. When I first heard the term, it was used to tell us that us developers need to do our own operations. And we did. Cool.

Then it turned into 'developers need to work with the operations people.' Wait what is this just a mantra or something?

Now it's just a fancy word for 'sysadmin', apparently?

I'd probably add 'microservice' or even 'service' to the list. It's become extremely apparent that very few people have any idea what they're doing or trying to express in that realm.


> When I first heard the term, it was used to tell us that us developers need to do our own operations. And we did. Cool.

That is not at all what it ever meant, and is very much the opposite of the original meaning. For what it's worth, I personally think this is probably the best approach now though.

> Then it turned into 'developers need to work with the operations people.' Wait what is this just a mantra or something?

This is sort of it. Calling it a "mantra" is fine, but it was meant to be a cultural shift. It was actually pretty successful, which I think is why most people are a little mystified as to what it was originally about. It has become just the way things are done, so it is difficult to recognize as a change. Back before this change started to happen, developers and operations teams used to be much much more divided. They rarely spoke to each other until applications were preparing for release, and then it was the dreaded "throw it over the wall".

Operators didn't want devs touching their environments and breaking things. Developers didn't want anything to do with production, because once the code was released their job was done. The whole idea was to align these incentives, and to modernize Operations (which was frankly very stuck in old mindsets).

> Now it's just a fancy word for 'sysadmin', apparently?

This is what people mean when they say "if you are hiring 'DevOps engineers' you're doing it wrong".

Anyways, it's basically a dead phrase. If you've been in the industry less than 15 years or so, it's basically meaningless to you. The movement it came from achieved its goals so well that it no longer has any useful meaning. It now just serves as a way to nerdsnipe people who were there into explaining what it actually meant.


DevOps has in my experience become full stack + server maintenance + misc tool development.


I recently moved from software dev to devops and this sounds about right! I'm supposed to understand the product team's entire codebase, write the tooling, create/update Terraform for the infra, implement CI/CD stuff, plus act as some kind of IT guy with AWS access.

Don't get me wrong, it makes a nice change from pure development, but it's clear that in most places, 'DevOps' is not so clearly defined.


Yeah, likewise. Alongside that I'm also expected to do backend development. I feel like I'm allowed no downtime and never get any "easy wins". I wonder if that hastens the onset of burnout.


I for one am rather fond of just calling it a "one-man IT department" or a "generalist" (for whatever reason it looks like this has become a dirty word lately)


Sounds like a previous job where I was informally titled 'Head of stuff'


At my company, it's now just a term for the person who has full AWS access.


Opsec hates devops!


DevSecOps hate themselves


Where I worked, at least, DevOps didn't do anything "full stack" and had nothing to do with developing or maintaining any of the actual code that got deployed. They were responsible for working with the developers to come up with the best solution for a secure and fast way of deploying and running an application. Sometimes they would help write custom deployment scripts or offer suggestions on how to make things easier from a deployment and/or performance point of view, but they never actually touched the code in any way.


Not really fullstack if you ask me, more like deployment tools + cloud infrastructure maybe with a bit of glue code sprinkled in.


In my experience it is rebranded sysadmins.


I've been hired for multiple DevOps roles. For me it means sysadmin (happy Sysadmin Day btw) but also being responsible for managing the plethora of AWS and other services, CI/CD pipelines, dealing with security audits, the occasional dive into back-end code, and being on hand for advice/comments.


DevOps is a sysadmin who doesn't own his infrastructure, mixed with a developer who doesn't own his code.


"So, what would you say you do here?"


Regarding DevOps, a major aim was to embed operational concerns into all stages of development. In that sense, it was a massive success; a developer can no longer just say "here's a binary that I made on my machine, make it work in production", throw it towards ops, and sign off.


> Regarding DevOps, a major aim was to embed operational concerns into all stages of development. In that sense, it was a massive success...

I strongly disagree here.

If it was such a success, then why are there devops jobs at all? If it was meant to signify that developers should share responsibility on deployments and integrations, what's the role of a "devops engineer"?

It seems to me that self-proclaimed devops practitioners tend to draw lines to limit their responsibility in the software building process, which is also counterproductive.


>devops practitioners tend to draw lines to limit their responsibility in the software building process

I'd say you hit the nail on the head here. I would agree that DevOps does indeed stem from operations folks saying that they don't want full responsibility for how software runs in production, instead introducing tooling and processes to force more of these concerns onto devs.

As to terminology, "DevOps engineers" are ops engineers who adhere to this approach (or being a bit cynical, those who just have familiarity with relevant tooling and like the sound of that term).

And as to whether this is productive or not, I would argue that it's made the individual developer somewhat less productive on the metric of "writing code", but the overall organization more productive on the metric of "deploying working code that serves users".


DevOps is like what happened to DBAs: devs get to do another role, while still being responsible for their existing roles, without extra compensation, all while eliminating a whole job (ops or DBA).


About 20 years ago I had a job where I was called an 'Integration Engineer'. It would be called Dev Ops now. It will be called something different in 20 years. Who cares?


I thought devops was the cattle not pets mentality. Script all the things. Then CI. Then Terraform. Not necessarily that tool, but that class of solution.


Sinister used to mean left-handed. "A shambles" used to mean a slaughterhouse. Sanction used to mean allow; now it means not allow. It is fine for words to change meaning, even to their own opposite. Natural language has done that forever.

Edit: you're all right, sanction has both meanings.


While it's natural, I'm sure most words didn't change their meaning within a few years of introduction.

> Natural language has done that forever.

We've also "retired" words. On a large scale, as a matter of fact, so it's not really an argument against getting rid of "agile".


"Ell" used to mean arm.


> Sanction used to mean allow, now it means not allow.

It still means both, at least as a verb.


It still means both as a noun, at least in the singular.


> Edit: you're all right, sanction has both meanings.

And autoantonyms are not good; they're ambiguous and lead to confusion.


"Sanction" has to be an autoantonym. To explicitly allow something is to implicitly forbid whatever is not that something, and to explicitly forbid something to implicitly allow whatever is not that something. Or, at minimum, the not that something in both cases exists in some ternary in-between state.

Thus, any word that exists in this vein is going to end up meaning what is forbidden and what is allowed.

One can sanction a country by only allowing it to trade oil, or one can sanction a country by forbidding it from trading food, clothes, and manufactured goods.


What are your thoughts about the term “literally”? It’s one of my bugbears, because the term has lost its meaning and I don’t know of any good replacement…


It's just the latest in a long line of adverbs for truthfulness that morph into something else. "very" (from latin for true), "really" (from real), "honestly", "truly" and so on. "Actually" seems to still hold on to some meaning but is now on the verge of becoming socially unacceptable. I guess this says something about human nature.


> I don’t know of any good replacement…

My Chambers dictionary suggests these alternatives: actually, really, absolutely.

It also says, "literally is in common use to intensify an idiom, and this is not incorrect".

My general rule is that usage defines meaning. If lots of people use literally as an idiom intensifier, then I'd be wrong to argue. However, my GCSE English teacher said that we need to consider the intended audience and form. Me saying, "They literally flew down the road" to my mates in the pub is fine. It is not fine in formal writing.


It lost its meaning as early as 1876, apparently, so you can’t blame kids these days.

But I don’t like its use as a hyperbolic no-op.


I like this definition, from wikipedia:

"Pragmatic sanction, historically, a sovereign's solemn decree which addresses a matter of primary importance and which has the force of fundamental law"

It has just been corrupted in various ways. It's no longer only a sovereign's decree, but any authority. It doesn't necessarily have primary importance. And the force it carries depends on the authority.


> Sanction used to mean allow, now it means not allow.

I can't sanction that description I'm afraid.


"Sinister", still left in Italian ("sinistra").


> "Agile: Nope, this one’s in the bin, I’m afraid. It used to mean “not waterfall” and now means “waterfall with a status meeting every day and an internal demo every two weeks”. We have to find a new way to discuss the idea that maybe we focus on the working software and not on the organisational bureaucracy…"

Yup. :-(


Maybe we should try a 1990s-style waterfall dev shop. Play Radiohead and Pearl Jam in the office. Everyone wears shorts + tennis shoes with super long white socks. Monthly PowerPoint updates on how things are going; the PM gets to have some fun with Sharpie markers, ticking off checkboxes next to each feature that was planned. A dedicated QA department, and they have the coolest arcade machine, the Street Fighter machine from 1 Arcade! Ritchie, one of the frontend devs, was able to hack it so it also accepts Dogecoin. Morale is high. The project is over by October, 2 months early, and everyone hires a VW bus to go on a road trip. Customers are delighted and next year starts with another waterfall project.


You made this sound so appealing, I actually want to work on a project like that now.


Frontend devs? It's 1990s!


Arguably - if you knew about RS232 and friends, were familiar with some terminal hardware (I knew more than I should have about Wyse 60s), understood the tty interface, and programmed using terminfo/termcap/curses… then you might be called a front end programmer from the 90s.

Which, dear god, I just realised I was a front end programmer in the 90s!


AFAIK, waterfall was always a straw man, set up to be knocked down. Nobody sane ever did a pure waterfall model. The model came from http://www-scf.usc.edu/~csci201/lectures/Lecture11/royce1970... and that very paper describes it as risky. Notice diagram 2 (the waterfall), 3 (the dream) and 4 (the reality).

We had expensive agile consultants coming in a bit after 2000. They were telling us the current official process, RUP, was waterfall, and that we were completely misguided. Of course, this was a typical management-saves-face move: devs had been asking to use RUP as described (i.e. more iterative), even as managers expressly forbade the use of agile processes up to that point. RUP as intended was basically doing 2 to 4 week iterations, each a mini-waterfall (thou shalt not call them sprints).

Somehow, expensive consultants delivered an agile more waterfall than RUP.

The actual value from RUP was a clear description of who does what, who requires what, and how do we call each part of the process. The Agile consultants saw no value in this, and gave us service based service service services implemented by services. We had standups reused as meetings to find out if service was meant as ITIL/SOAP/process/micro/business/something nobody ever heard of. This distinction was important so we could fill in all the important agile documents, which were not meant to be read by anyone, but filling them in was the only way us lowly peasants would not stray from the true golden agile path.

Oh well, if there is an ASCII code for the END-OF-RANT character, you can place it here: <<EOR>>


Ha, this is a good rant and I like it.

Rational was all that and more in the 90s but ultimately AFAICT they were trying to make design into a coding exercise under a different name and with a fancy “language”; but that’s what code is for.

Unfortunately my reality has been that some shops and some managers actually do believe that the process is design -> build -> deploy and that any errors or delays in step 2 are the engineering team's fault.

I mean, they don’t say that bit out loud or anything.


I remember working at a company whose "agile transformation" was a disaster, costing the company boat loads of money. After that, we used RUP because agile was a dirty word in that office. We even had posters and such on the walls praising RUP.

Nothing actually changed. Standups became morning meetings, sprints were iterations, sprint review became feature review, and sprint planning became elaboration meetings.


Well, it's very difficult (and probably misguided) to throw away waterfall completely. In reality, most "not waterfall" approaches are just "smaller waterfalls" with less formality. You still need some semblance of requirements (what you want to build) and some amount of design; you need to implement, you need to test, and you need to maintain.

Those steps could take months, weeks, days, hours, or minutes. And they could require a lot of written documentation with format reviews, a few discussions between people, or some thinking.


In my world, the detailed design happens as late as possible, and is quite fluid, as we get feedback from the implementation. This is certainly my main take away from DDD.

In waterfall as I understand it, each step is separated in time and team, and the feedback loop from implementation is lost.


Sadly agree here. Maybe we can find a shibboleth. Or is there one and I don't know it!? Oh no!


The term "waterfall", like the terms "capitalism" and "meritocracy", was coined to criticize the (previously unnamed) concept that it defined. "Waterfall" was Royce's term for the way software projects ended up being organically managed by people who weren't putting much thought into optimizing their organization. While I definitely agree that the "waterfall" style of "define everything up front, and then do all the stuff that you defined" is absolutely braindead, the alternatives are unfortunately even more braindead.


So true. I'm borrowing this!


Object-Oriented Programming: "Luckily the industry doesn’t really use this term any more"

Wait really? What world are they living in? Am I just out of touch or something?


Bubbles. Software isn't just whatever the author's particular bubble happens to be.

OOP is still very alive and well, mainly because an awful lot of the most widespread languages are OOP. How can you talk about Java and Python without using OOP?


OOP became a marketing term, but no longer sells designs, languages, etc. OOP alone used to be hot. Now it is either just assumed, or eschewed, or considered ambiguous.

We’re getting there with FP, too. OOP…which kind? FP…which kind?


Please note that the author never specifies that their context is marketing or management. They instead use a far broader term, “computing”. It would be very difficult for huge swaths of developers to do their job without using OOP and other attendant terms.


Just because it is not a selling point, doesn't mean people don't still use the term a lot. I used to only know C, OOP is still a relatively new thing for me.


OOP is pretty well defined.

99% of the time it means type dependent namespaces.

Class A.foo and Class B.foo both represent separate namespaces in which a field or method foo can be defined; the namespace is directly tied to a type. In Haskell, every function is defined in what amounts to a global namespace. This is particularly annoying when you have records that share the same attribute names.

You might argue that structs fit this definition but they are missing type dependent namespaces for functions/methods and the fact that you can define fields and methods to only be accessible within their type dependent namespace, not outside of them.
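
To make that concrete, here's a minimal Python sketch (the names are mine, purely illustrative): the method name `area` is looked up inside each class's namespace, so two types can define it without collision, which is exactly what classic Haskell record fields couldn't do.

  class Circle:
      def __init__(self, r):
          self.r = r

      def area(self):        # lives in Circle's namespace
          return 3.14159 * self.r ** 2

  class Square:
      def __init__(self, side):
          self.side = side

      def area(self):        # same name, separate namespace: no clash
          return self.side ** 2

  print(Circle(1.0).area())  # 3.14159
  print(Square(2.0).area())  # 4.0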


By that, then, you mean undefined.

It once meant that three features appeared: encapsulation, inheritance, and constrained runtime binding.

Any time it is used without implying those, it is just more or less noise.

That said, OO has become a niche technique: it fits certain problem spaces well, others very badly. Any big enough system will have some parts that could meaningfully be described as mostly OO. But there is no value in purity, so none in distinguishing those bits, particularly, or excluding non-OO notions there.

Trying to cram all solutions into the OO shoe by providing no support for any other organization (cough Java) damaged the brand, probably irreparably. It still makes sense to say part of a system has an OO flavor, for exposition, but there is little value, otherwise.


I think we have to look to Alan Kay for an original definition, since he invented the term, and according to him, inheritance is not a necessary part of object orientation. In Smalltalk (and various Lisps), inheritance is something that is created from objects and messages, not a fundamental feature of the language. In [one place](http://userpage.fu-berlin.de/~ram/pub/pub_jf47ht81Ht/doc_kay...) he defines it as "OOP to me means only messaging, local retention and protection and hiding of state-process, and extreme late-binding of all things."
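
For what it's worth, that definition can be sketched without classes at all. Here's a toy illustration in Python (my own construction, not Smalltalk): an "object" is just something that receives messages, keeps its state hidden, and binds behaviour to the message name as late as possible.

  def make_counter():
      state = {"count": 0}              # local retention, hidden state
      def receive(message, *args):      # late binding: dispatch by message name
          if message == "increment":
              state["count"] += 1
          elif message == "value":
              return state["count"]
          else:
              raise ValueError("does not understand: " + message)
      return receive

  counter = make_counter()
  counter("increment")
  counter("increment")
  print(counter("value"))   # 2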


Alan Kay's definition of OO has drifted over the years as he tries to exclude any language that is not Smalltalk.

He is free to do that, but we have no need to play along.


This is the fruitful definition of OOP to me. None of that Java Corporate insanity


I love how the "well defined" claim starts with a definition I've never heard for OOP.


Is using structs in C OOP by that definition?


Would be, but isn't.


I've observed the same for at least the past 20 years: most people who are working as programmers can't or won't write object-oriented code, nor will they write functional code. For whatever reason, even if they're working in a language like Java that's fundamentally object oriented, they'll write primarily procedural code (mostly static functions accessing datatypes with no functionality besides getters and setters): either they don't understand that there's an alternative, or they see some benefit in it.
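
A minimal sketch of the contrast (hypothetical names, in Python for brevity): the procedural style described above keeps data dumb and pokes at it from free functions, while the OO alternative puts the invariant next to the data.

  # procedural style: a bare record plus free functions that reach into it
  class Account:
      def __init__(self, balance):
          self.balance = balance    # data only, no behaviour

  def withdraw(account, amount):
      if amount > account.balance:
          raise ValueError("insufficient funds")
      account.balance -= amount

  # object-oriented style: the data owns its behaviour and guards its invariant
  class BankAccount:
      def __init__(self, balance):
          self._balance = balance

      def withdraw(self, amount):
          if amount > self._balance:
              raise ValueError("insufficient funds")
          self._balance -= amount

  withdraw(Account(100), 30)          # invariant enforced by convention
  BankAccount(100).withdraw(30)       # invariant enforced by the object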


I'm a functional guy myself, so I think OO is a downgrade from plain ol' procedural code. Regardless, the procedural soup model gets applied because it's the end result of too many hands touching any given codebase. Elegant abstractions take time to grok, and are opinionated. If you start a project with a UML diagram, I'm going to be disgusted. If I start a project by defining the whole thing as a left fold over an infinite stream, someone else is going to be appalled. Abstractions have a target audience, and that rarely aligns with corporate hiring practices.

If you're working on a given project for a long time, you'll appreciate the abstractions. If you just want to monkeypatch in a quick fix/feature because your boss told you to, you find where it fits, you hammer it into place, and call it a day.

Rinse repeat for a few years, and all corporate code becomes procedural.


You aren't out of touch. Some people are just living in bubbles. There's still a massive amount of Java, C# and other OOP work going on out there.


Just because it is in Java (equivalently, C#) does not mean it is OO. Java just makes coding the non-OO parts of a system awkward and unnatural.

Of course most parts of most systems are non-OO.


> Luckily the industry doesn’t really use this term [OOP] any more so we can ignore the changed meaning.

OOP is still a widely used term today.

> It used to mean “not waterfall” and now means “waterfall with a status meeting every day and an internal demo every two weeks”.

That doesn't sound waterfall-y at all.

> If you can hire a “DevOps engineer” to fulfil a specific role on a software team then we have all lost at using the phrase DevOps.

DevOps was never about eliminating all operations roles. It is about better cooperation between dev and ops.

> This one used to mean “psychologist/neuroscientist developing computer models to understand how intelligence works” and now means “an algorithm pushed to production by a programmer who doesn’t understand it”.

No, that's never what it meant. It was always a branch of computer science, going as far back as Alan Turing.[0]

> Previously something very specific used in the context of financial technology development.

Not at all. It was coined by Ward Cunningham[1] who happened to be working on financial software. The metaphor was never meant to be restricted to financial software. The fact he was working on financial software is largely irrelevant other than possibly having given him inspiration for the metaphor.

> Was originally the idea that maybe the things your software does should depend on the things the customers want it to do. Now means automated tests with some particular syntax.

Again, completely wrong. The article that introduced BDD was explicitly about automated testing.[2]

[0] https://en.wikipedia.org/wiki/History_of_artificial_intellig...

[1] http://c2.com/doc/oopsla92.html

[2] https://dannorth.net/introducing-bdd/


> That doesn't sound waterfall-y at all. Sounds like an iterative approach to me.

I think you missed the crux of the issue: in a lot of companies, Agile is definitely waterfall-ish, with a lot of pre-planning, design working a few sprints in advance, a lot of "getting it perfect" during PR reviews, no follow-up, no evaluating the work midway, no admitting failure and reverting it before finished, etc.

> Having roles for operations is not incompatible with DevOps at all.

I think you also missed the problem on this one. If you have "roles for operations" that doesn't involve development, it's not a devops role, it's just an operations role. Sure you can have devops in the mix, but someone doing operations exclusively isn't devops.


> I think you also missed the problem on this one. If you have "roles for operations" that doesn't involve development, it's not a devops role, it's just an operations role. Sure you can have devops in the mix, but someone doing operations exclusively isn't devops.

It seems we both agree that having a DevOps engineer is not incompatible with DevOps?


We don't. "Operations" is not a synonym "DevOps".

Devops does involve operations, but is not exclusively that.

Having an Operations person work together with developers doesn't automatically make the Ops engineer into a DevOps engineer. Just as it wouldn't make the non-operations developers into DevOps engineers.

Maybe the better term would be "Operations Engineer for multi-disciplinary Team". Not DevOps.

EDIT: It was previously written "It seems we both agree that having a Ops engineer is not incompatible with DevOps?"


I feel like you misunderstood me. I wasn't arguing about the definition of a "DevOps engineer". Since the author didn't give one, I went by the charity principle and used a definition that would make their argument stronger (a DevOps engineer that really is just an operations engineer). But even with that definition, having an operations engineer on a team does not preclude it from practicing DevOps and is arguably in line with the DevOps practice of increasing collaboration between dev and ops.


The point of the article is that even if we go by your interpretation where "a DevOps engineer that really is just an operations engineer", then we can completely retire the term, since it's not really special and we can just keep using "Operations". The difference between Ops and DevOps isn't just in how the team is organised, but rather what the job itself consists of.

Moreover, in a lot of places there are "DevOps" teams that consist only of Operations people, which is even worse. The usage of DevOps is to attract candidates, not to differentiate the skillset needed.


> The difference between Ops and DevOps isn't just in how the team is organised, but rather what the job itself consists of.

Do you mean that a DevOps engineer's role and skillset is different from an operations engineer? And that hiring DevOps engineers instead of operations engineers is more in line with DevOps?


If we go by the original usage of the term, then yes, it's different skillsets between Ops and DevOps. You can't announce an Ops position with a DevOps title since the two are different by definition.

However, even if we go by your definition that it is the same thing, then the "Dev" part is completely redundant and we should just call it "Ops"; the exact same role doesn't warrant a new title. I doubt, however, that that's the case the original article is talking about.


So there is a specific role and skillset for DevOps engineers (going by the original usage of the term), correct? Don't you feel that if you can hire a “DevOps engineer” for a specific role on a software team then we have all lost at using the phrase DevOps?


> So there is a specific role and skillset for DevOps engineers (going by the original usage of the term), correct?

Be careful with that misquoting there: I said "different skillsets", I didn't say necessarily roles. An Ops and a DevOps person will know different things.

> Don't you feel that if you can hire a “DevOps engineer” for a specific role on a software team then we have all lost at using the phrase DevOps?

So, you copy-pasted this from the article. If so I'll assume you're agreeing with them and this conversation can be over.


"Continuous integration" just means "automated tests on every push", and no longer means anything much to do with integration, which I always found kinda funny.

It sometimes even means automatic running of scripts that aren't even tests, like automatically generating html from markdown on push. You could say that's continuous deployment, not CI, but you'd be fighting a losing battle. The yml file you wrote to run the deploy, what do you call it? It's a "CI script".


> no longer means anything much to do with integration, which I always found kinda funny.

I thought the point of CI was to avoid (what we would now call) long-lived feature branches; e.g. a team spending 6 months on their own copy of a codebase, then struggling to merge it back into the mainline when finished.

In other words, CI isn't just about running tests on push; it's about the frequency of pushes, often several times a day.

Perhaps it's a good thing that such practices have become so ubiquitous (largely thanks to DVCS) that we don't really think of the alternatives any more :)


When the term 'continuous integration' was originally conceived, the point was to avoid branching altogether. It was not feature branches that were the problem (there was no such thing at the time); the problem was synchronising work between groups that were working on different parts of the same application.

So the original CI server concept was that everyone would get a nightly update. Having a branch run for even 2-3 weeks means that you are not practicing continuous integration ... as originally conceived.

I'm not convinced that it's a good thing that you and so many others have stopped questioning feature branching.


>> Having a branch run for even 2-3 weeks means that you are not practicing continuous integration ... as originally conceived.

Unless you're updating the feature branch from the main branch frequently (at least daily), thereby continuously integrating your code from the mainline.

Problems arise when everyone's working on their own long-lived feature branch and the mainline doesn't move until they all try to merge their feature at the end of a sprint. Refactoring makes this worse.


REST is missing. Should be about hypermedia. Now means JSON API.

See: https://roy.gbiv.com/untangled/2008/rest-apis-must-be-hypert...
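
Roughly, the difference looks like this (an invented order resource with HAL-style `_links`, just as a sketch): a plain "JSON API" response gives you data, while a hypermedia response also tells the client what it can do next.

  # what most "REST" APIs return: data, with all valid
  # actions hard-coded into the client
  json_api_response = {
      "id": 42,
      "status": "shipped",
  }

  # hypermedia: the response itself advertises the available transitions
  hypermedia_response = {
      "id": 42,
      "status": "shipped",
      "_links": {
          "self":   {"href": "/orders/42"},
          "cancel": {"href": "/orders/42/cancel"},
          "track":  {"href": "/orders/42/tracking"},
      },
  }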


I build infrastructure for small/medium size projects in a reproducible, code-based, extensible and scalable way, usually cloud based. Businesses I work with need a quick and efficient solution to serve what they build; they do not want to (or cannot) dedicate full-time internal resources to it, but require a production-ready setup that their developers are not qualified to build. Sometimes they just require some guidance or validation. They like to call me DevOps.

DevOps, SysOps, Sysadmin, Infrastructure engineer, Cloud engineer... The title doesn't matter so much, but what I like in the term DevOps is how it links the dev side to the infra side, as the infra itself ends up being already built in a cloud environment.

I also believe there was a time when we needed to distinguish between traditional "sysadmins" and developers who specialized in building cloud infra, but not anymore.


> Reasoning About Software

What's wrong with this? I need a way to describe techniques that make it easier to...well I don't even know another way to say it. Maybe I'm biased by my experience in formal verification, a field in which it's expected that you can formally study a program's behavior.


It should be replaced with the more accurate term 'handwaving about software'.


Perhaps "API" could be added to this list. I've been jobhunting this week, and first I heard recruiters use "API" to refer to what I'd call a "service", then heard other quite senior developers do the same thing.


APIs, frameworks & libraries are so often being used interchangeably & its getting confusing for us too, specially if planning for a new feature to integrate

"is it a framework? is it a library? is it a API" whenever I have a cursory glance at a product on HN/Github or likewise


APIs are simply a property of any system you want a program to interact with. The confusion comes from the synecdoche of referring to web applications that are only interacted with via their API as APIs.

The more interesting one is frameworks and libraries. There's a gradient between them based, IMO, on how opinionated they are. Working with a framework means filling in the blanks, e.g. "when we receive an HTTP request to this endpoint, this application code should run", in contrast to a library, which is just a barrel of functions to call. But that's a fuzzy line.
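
A rough sketch of that gradient (Python, invented names): with a library you drive the control flow and call its functions; with a framework you register blanks to be filled in, and it calls you.

  import json

  # library style: a barrel of functions; you drive the control flow
  def handle_order(raw):
      order = json.loads(raw)
      return {"status": "ok", "id": order["id"]}

  # framework style: you fill in the blanks and the framework calls you
  routes = {}

  def route(path):
      def register(fn):
          routes[path] = fn   # the "framework" stores your callback...
          return fn
      return register

  @route("/orders")
  def orders_endpoint(request):
      return handle_order(request)

  # ...and decides when it runs (a toy stand-in for its event loop)
  print(routes["/orders"]('{"id": 42}'))  # {'status': 'ok', 'id': 42}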


I mean, a library has an API (the public facing stuff you're supposed to use), a framework probably also has an API (or rather, a bunch of them), and a service also probably has a Web API (assuming you're programming against said service).


Yes, but it's not me getting confused. The documentation is a mess many times, though.


Changing words reflect worldly dynamics. Just as people use money to achieve aims, there is a meta-game in etymology, beyond the communicative use of words. Words don't just randomly change or 'evolve' as if by Darwinism. Coining a good phrase can make a business or career. Power comes through words. An interesting analogy connects words to money.

  They are 'minted', by small groups or individuals, to serve a local
  need.

  They can be 'issued' by PR/news/sales people, spin-doctors and
  propagandists, or enter popular use through education or popular
  science.

  They lose their 'currency' like "coins that wear down and lose their
  faces".

  They are 'appropriated', taken and reused by other groups.

  They can come back into circulation as old ideas cycle round.

Much of it is simply down to changing (and mostly improving) technologies. It doesn't make sense to talk about mainframes or peripherals in an age of "clouds" and tightly integrated handheld clients. But in computing I've also seen plenty of churn with meanings that reveal a battleground of ideas, frequently with near religious fervour. For example, wasn't "Object Orientation" more than a mere programming approach? A couple of decades ago it was heretical to talk of other paradigms. And yet, in some progressive circles to talk of 'objectification' is anathema.

The rise and rise of "management", and its constant, chameleon-like self-reinvention, has muddied many waters. Many new meanings seem to grow from the bureaucratisation of society through code, a restless tendency toward make-work, anthropomorphism and even deification. The "cloud" is more than a mere model for computing infrastructure. At the very least I think it's a political idea.


Hard agree on AI, hard disagree on Tech Debt. The sheer force of opinion saturated into so few words made me smile :) My highlights:

- OOP 'modules, but using the word class'

- Agile 'It used to mean “not waterfall” and now means “waterfall with a status meeting every day and an internal demo every two weeks'

- AI 'an algorithm pushed to production by a programmer who doesn’t understand it'


I like defining AI as a catch-all for higher-order solutions. Rather than defining a specific process for taking in input and producing the desired output, you define a process for taking in input to produce a process that takes in input and produces the desired output. That ends up including a lot of boring applications, like SAT solvers, Bayesian statistics engines, as well as the more hip deep learning stuff.

ML is the specific case where the inputs to both the higher level and base level processes are similar, and the goal is for the application to identify patterns to apply to specific cases.
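
As a sketch of "higher-order" in that sense (a least-squares line fit standing in for the fancy stuff; the names are mine): instead of hand-coding the decision rule, you write a process whose output is the decision rule.

  def fit_line(xs, ys):
      """Take data in, return a function that makes predictions."""
      n = len(xs)
      mean_x, mean_y = sum(xs) / n, sum(ys) / n
      cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
      var = sum((x - mean_x) ** 2 for x in xs)
      slope = cov / var
      intercept = mean_y - slope * mean_x
      return lambda x: slope * x + intercept   # the produced process

  predict = fit_line([1, 2, 3, 4], [2.1, 3.9, 6.0, 8.1])
  print(predict(5))   # roughly 10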


> AI 'an algorithm pushed to production by a programmer who doesn’t understand it'

This is precisely how I read it too. The microsecond I hear a person say “AI” or “ML”, I’m thinking “oh ok so it’s untraceable, non-deterministic bullshit I’m gonna be held responsible for if I approve this.”

Probably still net profitable to continue using this marketing language. Certainly does me a favor too.


I do find “BDD” a frustrating term.

It was coined by Dan North [0] to refer to a coherent set of sensible TDD practices — give tests expressive names, say “behaviour” not “test”, write executable “given/when/then” acceptance criteria — but in the intervening 16 years (!) it’s mutated into… I don’t even know what.

There’s a community of “BDD” practitioners who talk about all sorts of ideas that aren’t necessarily even specific to software engineering. Which is fine and good, but it’s rendered the term essentially meaningless in a software context.

These days I usually say “double-loop TDD” or “outside-in TDD” when I want to emphasise the use of executable acceptance tests; it’s impossible to predict what (if anything) “BDD” will mean to anyone.

[0] https://dannorth.net/introducing-bdd/


There's a lot of people taking this article seriously, but I'm pretty sure this is satire:

> So should we just abandon all technical terminology in computing? Maybe. Here’s an irreverent guide.


Technical Debt is a very descriptive term that everyone can relate to. The problem is we don't have a consensus on the nature of the interest on the debt, and what happens when it doesn't get paid back.

In my opinion, most technical debt has usurious interest rates.


As a "DevOps Engineer" I strongly agree we need to stop using the term, at least as a job title. Where I work, I help automate our infrastructure management and develop tools that make it easier for product engineers to do DevOps, such as making monitoring solutions easier, injecting deployment credentials into CI/CD pipelines, abstracting certain common but complex architectures, etc...

But at least half of the recruiters who contact me clearly expect me to be a glorified sysadmin who codes maybe one day a month. It's bad enough that I changed my job title on LinkedIn to "Software Engineer (DevOps)", which seems to have helped a little bit.


So a Platform Engineer, then.

DevOps is not a role, it's not tools. It's a culture.

Once the term is used outside of that, it has become useless.


Agile [...] It used to mean “not waterfall” and now means “waterfall with a status meeting every day and an internal demo every two weeks”.

"Waterfall" is a straw man invented to sell Agile, which has evolved into a perverse, almost psychotic Taylorism... although I believe programmers themselves share a bit of fault in that. They assumed good faith on the part of the people managing them. That was not a wise move.

Technical Debt [...] Now means whatever anybody needs it to mean if they want their product owner to let them do some hobbyist programming on their line-of-business software, or else.

Uh, whose side is this guy on? Is he defending "product owners"? Is he saying we should forcibly typecast programmers to line-of-business grunts with no autonomy? If so, why should we take him seriously?

If programmers aren't getting personal investment in their career by the top executives, then they have every right to do "hobbyist programming" that will increase their employability, and they shouldn't have to justify themselves at all in doing so. The fact that we've failed to reach tribal agreement on this is an embarrassment.


Quite disagree on almost all of them. What does OP mean by retiring a word? He and his colleagues are unable to use them in the right context?

It's so easy to criticise everything and everyone for not matching some personal purity standard, or to point out inconsistencies, but unless the audience is very young, that critical stance doesn't make you sound smarter; it just shows that you either have less real-world experience or aren't empathetic.

Come on, the whole thing is a cheesy, senseless post aiming to be provocative; the author doesn't even understand what technical debt is, possibly because of his professional background in monstrous organizations. In order to understand what agile is, in spite of it not matching all of its original goals, one should spend time at some old dinosaur-like company and come back wishing for any kind of cargo-cult agile.

Perhaps OP should start to see marketing-powered buzzwords as tools of social transformation that make all of us evolve into a more "wise collective" by building a common culture, understanding, and framework of thought to handle situations.


So, just because people misuse the term, we should find a different term --- to be misused again? Agile is what agile manifesto defines. What the author defines sounds more like Scrum. And we can call the corporate BS, Jira-based development. Technical debt also has a very specific meaning, nothing about hobbyist programming at all.


Yes and no. Whether or not you should use a word depends on whether the word will be understood to mean what you want to convey when you use it. If you mean “agile” in the original way, but people hear “agile” and think you mean the modern so-called “agile” (full of bureaucracy, and many people actually hate it), then communication will fail. Therefore, using the word “agile” will fail to aid in communication, and you should stop using the word for that. If you want to wage a campaign to change people’s understanding of the word, please do, and I wish you luck. But you can’t simply use a word and then pretend that the fact that people misunderstand you is somehow their fault.


Fair point. Perhaps, “agile as in the manifesto” would be a way of conveying in a concise way the intended meaning.


So victims of either gratuitous overuse, or complete disregard for the original meaning of the term.

For instance, "agile" is rapidly becoming "just another way for management to track time and set OKRs". "DevOps" seems to be now "when a sysadmin is capable of writing infrastructure orchestration code".


DevOps engineering seems to mean "engineer who knows Java AND yaml"


I'll add 'UI/UX' or 'UX' when they're clearly discussing user interface alone.

They seem to be treated as synonymous but aren't, and using them that way is a very strong sign that the person either isn't good at UI (which at its core is substantially about clarity), or is just none too bright and willing to absorb buzzwords without understanding them.

"The user experience (UX or UE) is how a user interacts with and experiences a product, system or service. It includes a person's perceptions of utility, ease of use, and efficiency"

or

"According to Nielsen Norman Group, 'user experience' includes all the aspects of the interaction between the end-user with the company, its services, and its products"

from https://en.wikipedia.org/wiki/User_experience

.

and

"In the industrial design field of human–computer interaction, a user interface (UI) is the space where interactions between humans and machines occur. The goal of this interaction is to allow effective operation and control of the machine from the human end, while the machine simultaneously feeds back information that aids the operators' decision-making process

...

User interfaces are composed of one or more layers, including a human-machine interface (HMI) that interfaces machines with physical input hardware such as keyboards, mice, or game pads, and output hardware such as computer monitors, speakers, and printers"

https://en.wikipedia.org/wiki/User_interface

If I were hiring for a UI role and found somebody talked about UI/UX but couldn't explain or distinguish the difference, that would be a strong mark against them.


I would be happy to see 'UX' go and good old 'usability' make a comeback. Sometimes UX seems to be code for switching focus away from usability and over to styling.

'UX' is just too broad to be a useful term. You can't perfect every aspect of the user experience at the same time, you need to focus on one thing at a time, and for that you need more focused words.


I agree that DevOps is often misused, but the example is misleading. Many organisations might recruit a DevOps Engineer who can build out a CD pipeline and train up existing devs on how the tooling and the principles of DevOps work.


But isn't that just a sysadmin? Or am I old for calling tiktoks vines?


> Agile

> Nope, this one’s in the bin, I’m afraid. It used to mean “not waterfall” and now means “waterfall with a status meeting every day and an internal demo every two weeks”. We have to find a new way to discuss the idea that maybe we focus on the working software and not on the organisational bureaucracy, and that way does not involve the word…

I've noticed that demos tend to lead to "demo driven development" model where software devs write code to impress other software devs, without any real connection to customers, and with the benefit of a decade of time most of those got dustbinned pretty quickly.


The author left out "software engineer" - too close to the bone, perhaps, given his bio as presented below the article, but still a term that could easily be given the same treatment as any on this list.


DevOps should really mean "Someone who builds systems and pipelines that allow developers to more seamlessly and sensibly deliver value to the end user".

What DevOps actually means in practice is "We had someone without any operations knowledge build a system before someone with the correct design skillset arrived, you can't remove it or change it because it is business critical, your job is now to maintain this like a classical Sysadmin would".

(I have been an Ops/DevOps engineer for slightly more than 15 years)


This is so pretentious. I don't see any arguments in this piece.


"Blah blah considered harmful", "code smell", "anti-pattern"

I hate them because they've become slang for any code that isn't written the way we like


All patterns are anti-patterns.

A pattern is just something your language is not expressive enough to capture and package in a library. If your language is evolving, what used to be patterns migrate into libraries. New patterns arise, and are later captured.
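
The Iterator pattern is the classic example. Here's a sketch in Python (my example, not the parent's) of the same behaviour before and after the language absorbed the pattern:

  # the pattern, written out by hand, as older languages required
  class CountdownIterator:
      def __init__(self, n):
          self.n = n
      def __iter__(self):
          return self
      def __next__(self):
          if self.n <= 0:
              raise StopIteration
          self.n -= 1
          return self.n + 1

  # the same thing once the language captures the pattern
  def countdown(n):
      while n > 0:
          yield n
          n -= 1

  print(list(CountdownIterator(3)))  # [3, 2, 1]
  print(list(countdown(3)))          # [3, 2, 1]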


Also describing any software as "modern".

I find this especially amusing when it's used to describe things that are 50 years old (e.g. programming languages with "modern" features like type inference)


Best practice.


A lot of these are problematic because there was money - lots of it - to be made by misusing the terms.

See “decentralised” for a good example.


"Full stack"

(except when referring to someone who can do everything from CPU design and assembly programming right up to UI work)


They need to be able to grind silicon and make wafers in a pressure pot too.

Full stack means you can build a web based app alone, that involves databases, backend, server management (this one is arguable), frontend development and design. No need to go lower really.


>Full stack means you can build a web based app alone, that involves databases, backend, server management (this one is arguable), frontend development and design. No need to go lower really.

I mean I can do that but some parts of that stack will be more sophisticated than others.


"Full stack" usually means "I'm proficient with React, so why wouldn't I be on Node?".


I am an EE who works with microcontrollers on a daily basis. I know some HTML, I play around with SDL and ncurses a bit, and I can do some internet stuff. I am clearly a full stack dev, right?


Do you have a microservices-based REST API with graphical frontend (web and Electron based) deployed/available on redundant cloud providers enabling secured, direct interaction with high- and low-level operations of your microcontrollers, such as environmental data gathering, software/firmware deployment and updates?

That's a fuller stack. I probably missed some things.


Oh, and I designed a UI for my smartwatch OS. For which I wrote everything from the ground up.


Still not quite there. You haven't mentioned whether you make pancakes or not.


I do sometimes make pancakes.


I always thought "stack" referred to just the web development stack, meaning someone who can write both frontend and backend code.

The impression I get from someone using this expression is that the speaker is oblivious to other kinds of software.


Full-stack means you can contribute to code pretty much anywhere in the product’s stack. Why would you have to do CPU design and assembly programming to contribute to an average software product?


“If you wish to make an apple pie from scratch, you must first invent the universe” - Carl Sagan


Under AI:

> "actually a collection of if statements but last I checked AI wasn’t a protected term”

This describes an Expert System which has been a form of AI since at least the 70s. If the author doesn't want it classified as AI it's the author that is trying to shift the definition, not that the definition has drifted.


I prefer an old professor's definition of AI: "Computer research in commercially non-viable solutions." Anything AI discovers that is useful gets rebranded into some other tech. Machine learning is the most recent thing that's starting to break out of the AI umbrella.


AI is the computer science term for some magic we will discover in the future. Once an AI technique becomes something real and useful, it gets its own name and is no longer AI. If we could all agree that AI just means something we don't understand yet, it might still be a useful term.


“psychologist/neuroscientist developing computer models to understand how intelligence works” is a very niche domain of artificial intelligence which has, for half a century, meant "the attempt to duplicate intelligence in algorithms"


I am curious about this statement:

  [OOP] might mean ... “modules, but using the word class”
I would argue that "modules" is almost as useless a term as "OOP". What does the author mean exactly?


It seems to me they mean to write statements for the sake of making statements.

Nothing of substance there, just unjustified and undifferentiated drivel about language and the (lack-of) meaning of words.


Author doesn't mean. Where the statement is accurate, if anywhere, the use of the term doesn't mean.


I think modules are collections of methods unattached to a class/object.


> But there is a potential for confusion with the minor but common usage “actually a collection of if statements but last I checked AI wasn’t a protected term” which you have to be aware of.

Did someone call my name?


AI should have been called 'dynamic algorithms' or 'learned algorithms' from the start, so as to get rid of this ridiculous notion of 'intelligence' that confuses everyone.

OO, DevOps are fine.


I wouldn't mind if "blockchain" as a buzzword died out. A blockchain is almost never the solution to a problem. And I don't even understand the point if it isn't decentralized.


"Blockchain" is more of a punchline now than an actual serious word used in technology.


Recruiters still say it. Because of course they do. :-)


I’d claim artificial intelligence has been broadened to the point of encompassing any statistical algorithm, regardless of whether it’s used correctly. I’ve seen rolling averages described as AI.


Ok so ignoring something exists solves the problem? C’mon. Try harder.


I don't think that's the point the author is trying to make.


Nonce is UK slang for a paedophile. I've had lots of emails over the years because one of the system error messages would say "Nonce not found".

Could be worse though.


"analogies suck and we shouldn't use them"


Not specific for computing but I hear it all too often and I would love to remove it from existence.

> "Moving forward"


Thought it would be more technical things like a TTL on an IP packet or a disc for storage.


It's so wrong and also so right that I'm not sure if it's satire.


I laughed out loud at the "waterfall" with stand ups. Totally agree.


This dude's blog has more elitist bullshit than a kardashian orgy


The AI one is spot on, but could use more bile.

But I'm stealing the last line on reasoning about software, "If Tony Hoare were not alive today he would be turning in his grave."

Needs an entry for Crypto. Used to be cryptography, now it means ponzinomics, or something...


That's it?


This is bait.


Hard agree.



