Hacker News | rednafi's comments

Neat. I maintain a blog myself[1] and prefer reading content written by actual human beings, not corporate shills or spammers masquerading on Medium and Dev.to.

But I feel like the whole indie web thing hasn’t taken off because of discoverability issues. RSS and Atom are nice, but they aren’t mainstream enough. Also, adding support for them is difficult for non-technical or even semi-technical people.

My blog does support RSS, and I use a reader to keep tabs on people I find interesting. But personally, I’m not a great fan of the protocol itself. It’s old, written in XML. There is JSON Feed, but that’s not widely supported and is fragmented as hell. Also, most RSS readers are just firehose feeds and don’t offer much in terms of organization.

I’m yet to find a solution for this that I genuinely like.

[1]: https://rednafi.com/


Are we really abandoning established, stable protocols because we don't like the serialisation format they use? The practical difference here between XML and JSON is negligible, the value here comes from the ecosystem (which is extensive on the RSS/atom side, and non-existent on the other). As a user, you'll never interact with the XML. As a developer, if you're interacting with the XML rather than using one of the many, MANY, libraries, you're doing something wrong.

Not saying we need to abandon it, but I’m not a big fan of RSS itself.

Yes, there’s an ecosystem, but it’s neither extensive nor mainstream. Feed readers are hit or miss, and I haven’t found one I like. Subscribing to 50 people is enough to make the feed unusable since there’s little to no organization.

While this isn’t entirely the protocol’s fault, its poor state is largely due to its lack of mainstream adoption—too few people care about it. The protocol itself might also be part of the problem.

So discoverability is still a problem because not enough people care about the existing solutions.


Okay, I'll shill my feed reader since it's an example of one that lets you organize feeds and doesn't present a firehose: https://addons.mozilla.org/en-US/firefox/addon/brook-feed-re...

It's only for Firefox though, because I like my reader being integrated into my browser, and Firefox was the only one that supported a sidebar at the time. Looks like Chrome supports sidebars too now. So mayhaps I'll update it.


Shilling is welcome as long as it’s not corporate ;) Love FF, this looks promising.

> Subscribing to 50 people is enough to make the feed unusable since there’s little to no organization.

What kind of organization do you want? Every feed reader I've ever used let me categorize/organize feeds in whatever way I wanted, but it's a manual process.


Exactly this, but with some automatic content grouping. Also, the ability to read the whole content in the reader instead of having to go to the site. But that depends on how RSS/Atom exposes the content; this is why I am not a big fan of the protocol. Too much fragmentation: RSS, RSS2, Atom.

I'm not sure I understand what you mean by 'automatic content grouping'. Are you talking about somehow automatically grouping the posts from disparate sites into buckets based on some criteria? Newsboat, for example, lets you do this with tags and queries: https://newsboat.org/releases/2.19/docs/newsboat.html#_query...
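For instance, a hypothetical Newsboat `urls` file can tag each feed and define a query feed that groups matching posts automatically across all subscriptions (the feed URLs and tags here are made up):

```
# Tag feeds at subscription time
https://example.com/index.xml tech rust
https://blog.example.org/feed.atom tech
# A query feed: auto-collects all unread posts tagged "rust"
"query:Unread Rust posts:unread = \"yes\" and tags # \"rust\""
```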

I'm also not sure what fragmentation has to do with anything. I don't think I've used a feed reader that didn't understand all current flavors of RSS and Atom, so it makes absolutely no difference what the webmaster decided to use, my news reader can figure it out.
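To make the point concrete: a reader needs only a few lines to normalize the flavors into one shape. A minimal sketch using just the Python standard library, assuming only title and link matter (the feed strings are invented examples, not real feeds):

```python
# Normalize RSS 2.0 and Atom entries into the same (title, link) shape.
# Real readers do the same thing, just far more thoroughly.
import xml.etree.ElementTree as ET

RSS2 = """<rss version="2.0"><channel>
  <item><title>Hello RSS</title><link>https://example.com/rss-post</link></item>
</channel></rss>"""

ATOM = """<feed xmlns="http://www.w3.org/2005/Atom">
  <entry><title>Hello Atom</title>
    <link href="https://example.com/atom-post"/></entry>
</feed>"""

ATOM_NS = "{http://www.w3.org/2005/Atom}"

def entries(xml_text):
    root = ET.fromstring(xml_text)
    if root.tag == "rss":  # RSS 2.0: items live under <channel>
        return [(i.findtext("title"), i.findtext("link"))
                for i in root.iter("item")]
    # Atom: elements are namespaced, and the link is an attribute
    return [(e.findtext(ATOM_NS + "title"),
             e.find(ATOM_NS + "link").get("href"))
            for e in root.iter(ATOM_NS + "entry")]

print(entries(RSS2))  # [('Hello RSS', 'https://example.com/rss-post')]
print(entries(ATOM))  # [('Hello Atom', 'https://example.com/atom-post')]
```

The webmaster's choice of flavor disappears behind one function, which is why fragmentation rarely leaks through to the user.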

It is a little bit annoying when the webmaster doesn't put the full text of the article in the news feed, and instead wants you to actually visit their site to read their stuff. I'd guess that they do that to make sure that you actually visit the site once in a while and might accidentally view an ad so they can make a few cents or they hope that you might see something else on their site you might be interested in or whatever. It also saves some bandwidth by not downloading the full text of an article if it turns out that I wasn't interested in it.


> I'd guess that they do that to make sure that you actually visit the site once in a while and might accidentally view an ad so they can make a few cents or they hope that you might see something else on their site you might be interested in or whatever.

I can't speak for others, but as someone who hand codes my HTML, it is a non-trivial amount of work to convert the entire text of a page into an RSS feed, whereas it is easy to add the headline, datetime, and summary.
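For illustration, the summary-only workflow really is small. A sketch with the Python standard library, using a hypothetical post; a full-text feed would additionally require escaping and embedding the entire article body, which is the extra work being avoided:

```python
# Build one RSS <item> from post metadata: headline, datetime, summary.
from email.utils import format_datetime  # RFC 822 dates, as RSS expects
from datetime import datetime, timezone
from xml.sax.saxutils import escape      # escape &, <, > for XML

post = {
    "title": "A hand-coded post",
    "url": "https://example.com/posts/hand-coded",
    "published": datetime(2024, 1, 2, tzinfo=timezone.utc),
    "summary": "Short teaser; read the rest on the site.",
}

item = (
    "<item>"
    f"<title>{escape(post['title'])}</title>"
    f"<link>{post['url']}</link>"
    f"<pubDate>{format_datetime(post['published'])}</pubDate>"
    f"<description>{escape(post['summary'])}</description>"
    "</item>"
)
print(item)
```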

I get that it is pretty cool to read stuff inside an RSS feed reader, but it doesn't fit with my current workflow. My site has no ads, tracking, or cookies (aside from the ones imported by the odd embedded YouTube video).


I also hand code my website and rss feed, so my feed is basically a changelog disguised as a blog post masquerading as a news feed. If you want to read the full articles, you have to visit the site.

> the ability to read the whole content in the reader instead of having to go to the site. But that depends on how RSS/Atom exposes the content;

It rather depends on the amount of content the RSS _author_ includes in the RSS feed. There's nothing in the RSS/Atom protocol that prevents you from reading the entire article, but some website creators decide to truncate the feed content.
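For reference, RSS 2.0 leaves this entirely to the publisher: `<description>` can hold a teaser, while the common `content:encoded` extension (namespace `http://purl.org/rss/1.0/modules/content/`, declared on the feed's root element) carries the full article HTML. A made-up item with both might look like:

```xml
<item>
  <title>Example post</title>
  <link>https://example.com/example-post</link>
  <!-- short teaser; truncated feeds stop here -->
  <description>First paragraph only…</description>
  <!-- full article HTML, readable in-place by most readers -->
  <content:encoded><![CDATA[<p>First paragraph…</p><p>…rest of article…</p>]]></content:encoded>
</item>
```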

My RSS reader of choice, InoReader, has the option to download the original website which solves the problem. However, I have over 200 feeds and it's rare to find one without the entire content being included.


A lot of services (like Feedbin in my prior reply) and a lot of reader applications will permit different ways of viewing the data to get full content to appear even in truncated feeds. That said, non-full-content feeds are pretty rare outside of corporate media.

Honestly, I don't think it's a problem with RSS as a format. It's a problem with clients.

We have been letting that ecosystem rot. Maybe it's time to fix it up. A lot of these old internet projects were developed by one person to scratch a personal itch. That doesn't seem to happen so much these days; we expect companies to do it for profit.

I don't really know what you mean. There's a ton of feed readers, both from an application and server side. I don't really need a lot of organization, but I've never seen a reader without support for folders. If you need more than one layer of hierarchy at 50 blogs... I have no idea what you're doing. I follow like 250 blogs and have just two folders, maybe, and it's super maintainable.

Anyway, services like Feedbin have been going strong for a long time, have a rock solid syncing system with great tools for things like seeing frequency of posting and abandoned or moved feeds, folders, automatic filters, and broad support in the app ecosystem if you don't like their apps or web experience (which is very good).

RSS is absolutely extensive and has millions of users. It's at least as mainstream as Mastodon/ActivityPub, it's just not talked about as such, and that's _excluding_ Podcasts as a use case.


The comment read to me as being more about distribution that has some network effect. RSS/Atom is fine.

Not sure I fully understand what you mean by distribution here. What I meant is that the ratio of personal blogs supporting RSS to the total number of blogs is too low, indicating a lack of adoption.

I’ve emailed many technical writers whose work I liked, asking them to add feed support, and most didn’t bother. One possible reason could be the protocol itself and the fragmentation between RSS, Atom, JSON Feed, etc., or it could be something else entirely.


I think the indie web hasn't taken off because it's...indie...and it's competing with businesses that spend lots of money on growth. This will always be the case. You have to jump into the melee if you want the eyeballs, or just be content on a free island. Personally I find plenty of actual human beings publishing on popular platforms.

I remember the Internet before Google, Facebook, YouTube, Myspace, etc. The whole thing was what is now being referred to as 'the indieweb' and it was the best incarnation of the Internet.

Consolidating most of the web into giant content silos is one of the worst things to happen to it


> Consolidating most of the web into giant content silos is one of the worst things to happen to it

I'm not sure that "giant content silos," alone, is the harm.

But as soon as you start adding "algorithmic feeds," and "supported by advertising," then all the dark "engagement hacks" start showing up, and it turns toxic in a heartbeat.

To use a specific example, I don't think LiveJournal, despite being a "giant content silo" back in the early 2000s, was particularly harmful. It was a chronological feed, with pagination - you had to decide, at the bottom of the page, to click next. You didn't have "endless scrolling." And because it was purely chronological, if you refreshed, and there was no new content, well, go do something else. Nobody has posted anything new. If it got too much to manage, there was the ever-popular "LJ Friends Cut" - trimming who you follow to people you actually get value out of.

It was a useful ecosystem, but didn't have any of the nasty dark corners of our modern content silos. But it was also not ad-funded - it was funded by premium memberships, and IIRC some merchandise sales, and in general, "funded by the people who got value out of it," so the goals of those funding it were generally aligned with the goals of those running and using it.

Dreamwidth, today, is a fork of LJ that seems to be doing just fine with the same approach LJ had. It is a "moderate sized content silo," at least, and it doesn't have any of the dark patterns of modern ad-based platforms that I've seen.


Yes, and back then the way to go was to have a personal website with links to all your friends and some other cool sites around the web. If you wanted more discoverability, you'd join a webring (or try your shot at getting listed in a directory).

I'm still a fan of webrings. Hosting a personal website is now easier and cheaper than ever, but it's not the norm anymore. Back then it was, and since you had nowhere else to go, more people were browsing the directories, weblinks, and links on personal pages than now.


Well, that also democratized publishing. For better or worse. Anyone can have a voice now that can spread without needing access from (human) gatekeepers or mastering arcane incantations. Starting from when there were like 4 TV channels, I think this has generally been a good and lauded direction.

One of the things I noticed is that the behavior of bloggers nowadays is different from the past. The "blogosphere" used to be rife with links to other blogs they found interesting, which facilitated discoverability so much!

I created a GitHub repo where you write markdown files as blog posts. And it has a GitHub action that automatically publishes to GitHub pages. One can simply fork and make their own.

Here is the blog that I wrote about how I created that repo (so meta) https://blog.tldrversion.com/posts/vibe-coding

And this the GitHub repo for that https://github.com/veeragoni/blog


My blog works similarly [1]. Everything is written in Markdown, then Hugo builds the site, and GitHub Actions publishes it to Pages.
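As a sketch, the pipeline described (Markdown → Hugo → GitHub Pages) commonly looks something like the workflow below; the action names (`peaceiris/actions-hugo`, `peaceiris/actions-gh-pages`) and versions are illustrative, not necessarily what either blog actually uses:

```yaml
# Hypothetical .github/workflows/deploy.yml
name: deploy
on:
  push:
    branches: [main]
jobs:
  build-and-publish:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: peaceiris/actions-hugo@v3
        with:
          hugo-version: 'latest'
      - run: hugo --minify          # build the site into ./public
      - uses: peaceiris/actions-gh-pages@v4
        with:
          github_token: ${{ secrets.GITHUB_TOKEN }}
          publish_dir: ./public     # deploy to the gh-pages branch
```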

While my blog gets around 20k monthly views, discoverability is still a problem.

[1]: https://rednafi.com/misc/behind_the_blog/


The reason the indie web hasn't taken off is that the masses don't care about that kind of content, and they never have. The people who are interested in home grown blogs are dwarfed by the masses that came online by way of billion dollar marketing budgets, driven by the business mechanics of dopamine farming. The indie web will always be relegated to nerds and eccentrics.

Exactly this. And this is a good thing. Small communities have good properties that just don't scale.

And the people that want big communities generally want things that federated networks can't offer (i.e., the ability to be authoritative, to gather enough attention to make money, to transfer money). And because of government interference in such things, no non-incorporated network will ever be able to provide them. Attempts to cater to the masses are a waste of time.


The indieweb cannot, by definition, go mainstream.

I think what is meant, though, is that the generic Web has shifted the balance away from what was once diffuse to what is now consolidated. That personal connections could and would be monetised was not a given, but it has become the norm rather than the exception. That people are retreating from the networks that enable that is not surprising.

> discoverability issues

Webrings[0] are somewhat being used again. I keep a list of the blogs I follow[1] in OPML & HTML, so that you can either bulk-subscribe, or browse through blogs that you might find relevant; you can do the same!

On RSS readers/organization, I didn't need a solution, because "personal blogs" post rarely enough that even following ~100 blogs, I see 3~5 updates per week.
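An OPML blogroll like the one linked is just a small XML file that any reader's import function can bulk-subscribe from; a made-up example (hypothetical titles and URLs):

```xml
<!-- blogroll.opml -->
<opml version="2.0">
  <body>
    <outline text="Personal blogs">
      <outline type="rss" text="Example blog"
               xmlUrl="https://example.com/index.xml"/>
      <outline type="rss" text="Another blog"
               xmlUrl="https://blog.example.org/feed.atom"/>
    </outline>
  </body>
</opml>
```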

[0]: https://en.wikipedia.org/wiki/Webring

[1]: https://blog.davidv.dev/blogs-i-follow.html


Another good resource is https://searchmysite.net

It is great for searching the indieweb with your interests as the keywords


I also maintain a rough list of people I follow on my own blog.

But I’ll be honest: I came to the game way later than most of the veterans here (circa 2018). I don’t understand how webrings work, what problem they solve, or how to add one to my Hugo-generated static site.


Medium and dev.to are possibly the worst sites in the industry.

Dev.to started out pretty well, tried to keep the algorithms to the minimum, but ended up being flooded with half-assed beginner content and promotion listicles.

It's almost like it's impossible to start a platform without algorithmic curation nowadays and not have it turn into a place of repetitive low-effort content.


Along with Hashnode. They all started with the promise of democratizing blogging, only to adopt all the dark patterns within a few years.

Yes that one too. All three are trash tier shit content.

>hasn’t taken off because of discoverability issues

I just don't think of all of indie web as mass media. Blogs can be for friends and colleagues, they don't necessarily want to maximize 'reach'.

I hope part of this movement manages to reset the whole dynamic of social media. Imagine if instead of always writing for the panopticon, you were just writing to people you cared about. Maybe not even publicly available by default.


One more reason to abandon analytics scripts is the obsession over stats. I fell into this trap when a few of my posts surfaced on the front page here.

There was a period where blogging was seen as a great way to make easy money, so everyone ended up with ads and analytics on their sites, obsessing over maximizing reach, just like how YouTube is nowadays.

Perhaps most people just never went back to thinking of blogging as something you do for the sake of it instead of for some expectation of financial compensation in the future?


I think it is a change in proportion instead of a change in perspective. The people who write for the love of the process were doing it before it became mainstream and many kept doing it after too, or found a different venue, maybe even just writing on paper or a typewriter, as it wasn't about the income for them anyway.

In the early days when 90% of bloggers were doing it for passion it seemed like most blogs were good. There was an inflection point around the MySpace & LiveJournal days where it became very easy to be a blogger and some really good writers who otherwise wouldn't have set up a server were part of blogging.. but it went as you say, riding the commercialization train, until 90% of bloggers are doing it for income (or link-farming or whatever). But that doesn't mean all the passionate bloggers became commercially-biased, they are just harder to find.

That's my reading of it, though. I'm sure it's biased by nostalgia. And I'm sure there were people who got caught up in the commercialization and maybe it went as you say, that they never went back to doing it for non-commercial reasons. Like when a hobby becomes an occupation becomes a source of stress. But I've also seen quite a few old blogs that still don't run ads and still post occasionally.


I tried the whole "ads and reach" thing for a while, discovered I actually don't care about it for "beer money" levels of revenue, and went back to just blogging about that which I care about, for the intrinsic benefits of having to write my projects up.

- It forces me to finish things. I was, prior to having a blog, fairly prone to "90% done, I'll finish it later..." sort of stuff, which led to a lot of mental clutter from "having to keep track of what was still inflight."

- I can, after a project is done, confidently flush all details of it from my brain, because anything I found odd or notable that would be worth remembering is noted in my blog posts.

And often enough, I'll end up down a weird research rabbit hole I wouldn't have otherwise gone down, learning about new subjects, so I can write something up with what I feel is enough understanding to competently write about it.


This is just what I want to get to: write-ups as some form of personal accountability, and to reiterate my thought process and learning. My problem is that I rarely feel I'm competent enough to really contribute to the subject, so I just try to make it more of a workshop log than a resource for others.

What about AT Protocol blogging? Or wouldn't that get into the AT search index? https://github.com/rwietter/atproto-blog

> RSS and Atom are nice, but they aren’t mainstream enough.

That's a feature, not a bug. "Mainstream" is where the corporate shills and spammers you want to avoid are.


Is there a MotherJones of indie RSS?

Sports machen?

Yes. My native tongue is Bengali—the 7th most spoken language in the world.

I learned English at school and later started working remotely in places where English was obviously the primary language. Then I moved to the US and spoke Bengali only at home.

The final nail in the coffin was when I moved again to Berlin, Germany, and started picking up a little German just to get by. Now I speak English at work, Bengali at home, and a tiny bit of German. These days, my Bengali is a horrible hodgepodge of English and Bengali words strung together by alien conjunctions and interjections.


People from the subcontinent have this weird habit of adding English words randomly in their conversations. Sentences in Hindi/Bangla/Tamil etc and then straight up English words, and then switch back.

Haven’t seen anyone else do that.


Do you mean Anglicisms? Those are very common in many languages nowadays. Especially those in the west. Youth language in the German speaking areas of Europe is around one fourth English words.

Yes, I’ve seen kids in Berlin do that as well, but this is not a recent phenomenon for Indians. I’ve observed it for more than 15 years now

There was an interview with Gukesh Dommaraju after he won the chess world championship, and the interviewer from India asked him to respond in his native tongue, which, so far as I understand it, is Tamil. It was very odd listening to it when I could understand about 30% of the words!

Germans do that all the time. See Berlinglish. So do Ukrainians, although less often.

It is called «code switching», and it is also very common amongst Hongkongers and Singaporeans.

Gen Z Spanish speakers do this all the time.

I feel the same, but for Django, even though I don’t write Python as much these days.

I worked at two startups before realizing that they weren’t what I wanted out of life. Work is important to me, but so is life. I was duped into this vision of entrepreneurship and making it big in startups. While every startup founder likes to spin the story of wanting to be a unicorn, in practice, few do. I happened to work at ones with an insane workload and a less-than-survivable payout.

I interviewed like crazy and landed a position at one of those well-known companies. Large companies have their fair share of problems, but it suits my lifestyle. I do my work diligently and then clock out. It’s liberating to know that if I don’t respond to a pager, the world isn’t going to end and some third-rate CEO/CTO won’t scream in my face.

Different people want different things out of life, and I made a vow never to join a startup or scale-up with fewer than a thousand employees.


You need a critical mass for any decent discussion to happen, but there’s also a tipping point—go beyond that, and no matter what you do, it turns into chaos.

A bunch of people here have mentioned having great conversations back in the old Usenet days. Maybe because the barrier to entry was higher, naturally filtering people out (aka a gatekeeper).

Now that anyone can post anything online, every discussion attracts a bunch of people throwing random shit for no reason. This is why I usually prefer Lobsters over Hacker News.

Anytime someone shares a bad experience with JS, Vim, or Emacs, an armada of people shows up, ready to obliterate them. Trolls existed before 2010 too, but they weren’t nearly as bad.


Fascinating read. Those suggesting Mongo are missing the point.

The author clearly mentioned that they want to work in the relational space. Choosing Mongo would require refactoring a significant part of the codebase.

Also, databases like MySQL, Postgres, and SQLite have neat indexing features that Mongo still lacks.

The author wanted to move away from a client-server database.

And finally, while wanting a 6.4TB single binary is wild, I presume that’s exactly what’s happening here. You couldn’t do that with Mongo.


This comment appears to be the only place between the article and the thread where Mongo getting recommended is even mentioned. Maybe stop giving them free press. I certainly haven't heard anyone seriously suggest them for about ten years now.

Edit: ok the other guy mentioning mongo is clearly being sarcastic


The binary would only contain the database engine; not the database. It’s probably a few MB, if I had to guess.


Presumably the database needs to be distributed to servers, too. The engine needs to access something. This is a necessity whether or not it's referred to as a binary.


The person suggesting MongoDB in the replies isn't serious; it's just a meme reference.


JavaScript is an awful language. That doesn’t mean Java is the answer. There are other languages, such as Go, Rust, Python, and C#, that offer better protection without sacrificing ergonomics.


Maybe that’s because you only know one language and focused on using that everywhere.

I have worked at two of those named companies where people made the conscious decision to move away from a TS backend because JS is an awful language to use for system-critical things.


Care to expand?

I've been writing backend TypeScript for 8 years and it's more than fine.

It's not my favorite language but it's a great compromise, and I would never choose a Lisp, Haskell, Java or C over it (the other languages I know).


JavaScript, in general, is awful as a language in terms of design, and more astute people have wasted many words on this before me.

It was originally designed for writing throwaway frontend code, but people liked it so much that they started using it to build their system architecture—only to realize it doesn’t work well for anything beyond glorified RPC backends.

The type system is wishy-washy, and TypeScript needs a massive type space to compensate for it. Python is also dynamically typed, but it is strongly typed, which saves you from silent runtime blowups. JavaScript doesn’t even properly blow up at runtime—you just get [object Object] or some random undefined error. TypeScript is a fantastic piece of engineering from the C# guy, but even they couldn’t fix all of JavaScript’s language-level blunders.

Few people want to build mission-critical backends on a weakly typed language.

The ecosystem is a mess, and things randomly break after a few days. My Python and Go apps from 2018 still work exactly as they did on day one. Go’s gorm and Python’s SQLAlchemy are the default ORMs that pretty much everyone uses. And how many ORMs does the JS ecosystem have?

And let’s not even start with frontend frameworks—no one loves them two days later.

The Next.js project I built yesterday is already showing 69 vulnerabilities. This sorry excuse of a language, coupled with terrible design and an indecisively childish community, makes it difficult to take seriously.


You're throwing out random things, but you have failed to explain real-world pains.

By the way, you can be extremely strict and safe in TypeScript. It's really up to the team using it, but you can encode virtually everything and have it fully type-safe.

Also, there are great tools like effect-ts if you're more functionally leaning.


If you really want to waste time on types, the bolted-on TS is never going to work as well as Golang, where they're native. Maybe TS types are better than Python types, is all.


Golang's type system is not as expressive as TypeScript's.

It's fine to not know about the goods in the TypeScript ecosystem.

Again, I recommend checking out libraries like effect-ts or effect/schema, or even better, trying them.

https://effect.website/docs/getting-started/why-effect/


To be honest, I don't like the look of Effect for the same reason I don't like error handling in Golang. It's easier to assume anything can fail (which is 99% correct) and use exceptions. But anyway, every language can be extended in many ways, so I can really only compare default vs default.


In common JS or Py use cases, static typing makes your code less safe, because all that extra time you spend on it is less time writing tests. Meanwhile the types won't catch any bugs even the most simple test coverage would.

Sure, JS has some quirks that language design gurus complain about, but nothing that actually matters on the job. Like, the == operator is weird; oh well.

Python is rightfully king for certain things like data science, but parallelism and package management are two big messes in it. Parallelism is being addressed with asyncio, but it's kinda too late. You know packaging is broken because every Python repo has a Dockerfile, which is not something you see in JS, where npm is solid.

Golang is more for different use cases. Looks good but really should have done error handling like Rust, or used exceptions.


> npm is solid

sure it is


It's solid enough that people don't fall back to Docker.


> My Python and Go apps from 2018 still work exactly as they did on day one

My NodeJS apps from 2014 still work exactly as they did on day one as well, what's your fucking point?


My point is that the JS ecosystem is overrun by script kiddies who will throw a hissy fit every time someone criticizes JS, and you just proved that ;)


This is an embarrassing comment; there are few things this world needs less than more smug engineering tribalism. Even weakly typed languages. TypeScript has many downsides, as does Python, and if you think the downsides of Python are trivial, then I would invite you to take a look at the sea of competing tools for environment management. I need a Python env—now will that be with pyenv or venv or pipenv or pyvenv or virtualenv or poetry?


I’ve been writing JS and then TS on the backend since 2013. I’ve built large apps and startups, and while it does the job fine, I’d not go with JS on the backend anymore.

We built some stuff with Go and then went with full .NET. It's like fresh air after years of JS.


What exactly did you like better in .Net than in TypeScript?


- Proper static typing. This is not .NET specific, but bolting types onto a dynamically typed language only works to some degree. It also opens the gate to runtime type introspection, such as generating OpenAPI definitions with no input other than the handler signatures themselves. This is handled fine by a full-stack TS framework like Next or Nuxt, but then you are trapped in that implementation.

- Proper multithreading and performance story. Performance itself is not the most important thing when your app mostly waits on IO. However, you still get to use thread-pooled async in .NET or parallel goroutines in Go, and they sometimes make a difference.

- ASP.NET Core is fully feature-packed and easy to work with. .NET has a more cohesive ecosystem where tools, libraries, and frameworks are designed to work together seamlessly. In JS runtimes, you often need to piece together different approaches and deal with compatibility issues.

- Proper standard library. Both the Node.js stdlib and the JS globals are really weak compared to what Go, .NET, Java, etc. provide out of the box. I hate dealing with npm dependencies with a passion at this point, and you can have minimal dependencies when the standard library is decent. There is also the fact that upgrading an npm package is a dice roll, as you need to trust the semver or check update logs. At least with a statically typed language, you get to catch public API changes in your dependencies at compile time, not at runtime.

- LINQ is good; LINQ expressions are great. I don't really love ORMs, but I tend to use Entity Framework Core on top of low-level SQL access in .NET. Makes life easier.

I don't hate TypeScript though. I still use it on the frontend obviously and I'm fine with it but given the choices on backend, it does not make much sense for me anymore.


I know about 10 languages by now, and JS is one of the later ones I learned. Nah, it's the best at what it does. And if I drew a graph of every project that's moved from language A to B, it would have cycles.


So is Go, and so is Python. The LLM craze shifted the focus from JS to Python a while back, and I’m all for it.

That doesn’t mean I’m under the delusion that JS is going away anytime soon, at least from the FE. But its monopoly on the web took a big hit—and rightfully so.

JS is a terrible language, and people who only know that one language will fight tooth and nail with anyone who opposes them.

This gave birth to the awful trend of using JS in the backend. I understand that sometimes JS is the only option for frontend work, but keep that junk out of my sysarch. There are better languages your backend can benefit from—Go, Rust, Zig, Kotlin, Python—anything but JS/TS.

I’m glad the industry has learned from its mistakes. I’m seeing far fewer Node.js backend jobs and a lot more Python, Go, Rust, and C# roles.


Jobs are regional. Here in Estonia I see a lot more Node.js jobs YoY, very few Python gigs, zero Rust jobs, and very few C# jobs. Heck, PHP has more jobs than C#. Node/JS seems to compete with Java here for job popularity, according to local job boards, and most Java jobs also require Node.js/JS knowledge.


Europe is always playing catch-up with the US. Even so, the number of Node.js backend jobs has declined in both Berlin and Amsterdam.

I’m also seeing a ton of Java, C#, and Go jobs in London. At my workplace, Wolt/DoorDash, all new services are being spun up in Go, where Node.js used to be the default.

Plus, LLMs have gotten really good at Node.js and Python, driving backend salaries down quite a bit. In this age, being a one-trick pony just won’t cut it anymore.


Getting kind of tired of this Europe vs US trope. Europe is a huge place, with each country having a rather unique economy; making a blanket statement like you just did entirely ignores that. And where does this constant need to compare one place on the planet with another come from? I don't live my life comparing everything about my country with another country; do you? Development trends themselves tend to be cyclical.

As for LLMs, from what I see, the languages that LLMs are most often used for are those with the lower barrier to entry and those that universities and bootcamps teach, which would very much also include Java. However, what I see is actually my salary going up as the average candidate quality has been decreasing, almost in correlation with the usage of LLMs. So while maybe the junior/mid-level salary goes down, the senior salary, in my observations, has not. And I keep hearing from companies how finding actually capable developers has become harder and harder, so I expect my salary to only go up.

Now, of course, being a one trick pony has never been a good thing, I agree on that (did you assume that I'm a just JS dev?), but I was just answering to your original comment.


> This gave birth to the awful trend of using JS in the backend.

Right, famous developer Ryan Dahl who knows only one language.



