For example, on Twitter/X it has become obvious in an aggregate sense, but it's hard to tell which individuals are bots. On Reddit it depends on the sub. Niche communities feel the same as ever, while larger subs tend to feel more robotic, but it's still hard to distinguish on a case-by-case basis.
Again, the map is not the territory. On Twitter, I followed people I know and companies/groups I'm interested in. That's because I want to know what's happening around them, and the words are just the medium. Anything else is junk because I do not care. There may be some random thing that catches my interest, but that usually takes more interaction than just reading. It's all about intent and context, not the words themselves.
My friend is a manager at a local business, so he's in control of its social media. 100% of that business's social media posts are written by AI, and it's not obvious from the feeds either.
He did a presentation on making a fs in rust, and the response from this dev was "we won't be converting to your religion. We are going to continue modifying c code and your dumb rust code will always be a second class citizen."
Literally just hostility for hostility's sake. He wasn't proposing to rewrite everything in rust. He wasn't proposing that everyone learn rust. He was just giving a freaking presentation on making a rust fs that integrates with the kernel and was immediately trashed for spreading religion or whatever.
What was literally said didn't include the words "always" or "your dumb rust code". From the subtitles with slight manual fixes:
> I suspect part of the problem here is you're trying to convince everyone to switch over to the religion as promulgated by rust and the reality is that ain't going to happen because we have 50 plus filesystems in Linux, they will not all be instantaneously converted over to rust. Before that happens we will continue to refactor code because we want to make the C code better. If it breaks rust bindings, at least for the foreseeable future the rust bindings are a second class citizen and those file systems that depend on the rust bindings will break and that is the rust bindings problem, not the file system community at large's problem and that's going to be true for a long long time, okay, and I think we just simply need to accept that
That seems rather technical to me. Putting words into their mouth because they had a snarky tone, or claiming the presenter was "immediately trashed for spreading religion or whatever", hardly is.
They literally said "spreading your religion" at the very start of their critique. I think I'm correctly reading the tone of the rest of the comment even if I didn't quote them verbatim.
From the closed captions with minor editing. In response to "we are going to provide API bindings"
> I suspect the problem here is you are trying to convert everyone to the religion as promulgated by rust and the reality is that ain't gonna happen. Because we have 50 plus filesystems in linux and they will not all be instantaneously converted over to rust. Before that happens we will continue to refactor code because we want to make C code better. If that breaks the rust bindings, at least for the foreseeable future the rust bindings are a second class citizen. Those file systems that depend on the rust bindings will break and that is the rust bindings problem, not the file system community at large's problem. And that's going to be true for a long long time, OK. And I think we just simply accept that, right, because the answer "you are not allowed to refactor C code because it would break 5 critical file systems that distros depend on" is like a non-starter, OK.
> They literally said "spreading your religion" at the very start of their critique.
And that critique came near the end, after all the other people who asked questions or made comments. So no, the presenter wasn't "immediately trashed" when making that presentation.
Linus has the wisdom to see the writing on the wall: the kernel is becoming too complex to be maintained and extended by newcomers, and the pool of C ninjas who can handle it reliably is shrinking. This kind of foresight is what makes Linux trustworthy in the first place. A lot of current individual contributors lack that vision and fail to understand the reason for introducing what they see as unneeded complexity.
The other reason Linux is trustworthy is its culture of technical meritocracy. But Rust is not a purely technical choice, as Rust cannot per se do things that are impossible in C (you can do anything in C), so the introduction of Rust falls outside the parameters of the regular rough-but-mostly-fair Linux exchanges. This leads to toxic reactions from the less socially able.
> But Rust is not a purely technical choice, as Rust cannot per se do things that are impossible in C (you can do anything in C)
Have anything in particular in mind? It's been my understanding that Rust actually can do everything C can, but the additional safety guarantees imposed by the type system make it a bit harder.
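To make that concrete with my own sketch (nothing from the thread above): the escape hatch is `unsafe`, which gives you the same raw-pointer operations C has; the type system just makes you opt in explicitly.

```rust
// A sketch: C-style raw pointer arithmetic in Rust. The same
// operation is possible, you just have to opt in with `unsafe`.
fn main() {
    let mut buf = [0u8; 4];
    let p = buf.as_mut_ptr();

    unsafe {
        // Roughly `*(p + 2) = 0xff;` in C.
        *p.add(2) = 0xff;
    }

    assert_eq!(buf, [0, 0, 0xff, 0]);
}
```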
I like this rebuttal. It disentangles data from interpretation and knowledge. This distinction helps us to solve problems associated with data and is a core tenet of science and problem solving.
Increasing the amount of generated data and not jumping to conclusions at the same time is how we avoid getting stuck in misconceptions or plain ignorance.
From the data I've seen and read, it seems like low interest rates are (politically) alluring because of short-term metrics, but very consistently bite back via inflation and bursting bubbles.
Is there any wisdom and data that challenges or expands on this that a layman can understand?
So, bubbles are hard, especially when the underlying fundamental thing does not respond well to price increases (NIMBYism, zoning, weaponized environmental legislation, etc.)
Tech bubbles are doubly so, because now that "total addressable market" has exploded thanks to the internet everything is a bubble. (Starting with the good old dot-com one, and crypto, and AR/VR/Metaverse, and now AI. Folks are comparing Nvidia stock graphs to Cisco around 2000.)
....
Likely the HN crowd overestimates the effect of ZIRP on tech because of how crazy the last 10-15 years have been. (But we see that the economy in general performed extremely well, real wages increased, etc.)
Yes, of course there were (and are) a lot of scummy ventures, but there will be more.
...
Regarding rates, the picture is a bit more complex. So coming out of the 2008 crash the Obama/ARRA fiscal stimulus was too small. (And of course it got hyper-over-politicized on "both sides".) And the monetary response was also lackluster.
But we also know now (benefit of hindsight!) that the rates before 2008 were too low.
One important take-away is that of course everyone wants growth, more growth would help lower taxes, inflate away deficits, yey! Prosperity is good!
But ... we can't simply buy growth with endless loans from our future selves. Real economic growth requires increasing productivity, which requires applying new (better, more suited to the situation, more profitable) technologies, which requires change (duh), but an aging population is very change-averse. The US, with its political gridlock, tends to go on wild goose chases, spending enormous amounts of money on bullshit instead of picking better technology.
(And here technology includes social technology too. It ranges from things like "lack of funding reform for fire departments leads to them taking too-large firetrucks everywhere to be able to bill a lot, which leads them to not sign off on thinner roads, which leads to too-wide roads where motorists drive too fast, which leads to too many injuries and fatalities" to all the usual suspects, like the lack of gun control and of bribery rules for the courts.)
It's not kids or Roblox specifically, it's gamers and platforms/games with "micro-transactions" etc.
When I was younger and still played online games regularly, I was initially stoked about cosmetic micro-transactions in (competitive) online games. Not because I wanted to buy them, but because these would fund the continuous development of my favorite games without affecting their integrity (no "pay to win" mechanisms).
Later I found this was a Faustian bargain. It turned these games and communities around them into something that I don't want to participate in.
These days I don't mind as much, because among the sea of predatory, tacky, or otherwise low-quality crap there are more high-quality, original, and interesting games (typically made by small teams) than I will ever be able to play.
I don't know anything about Roblox specifically. On one hand the comment above is tragic, but on the other hand my understanding is that it motivates kids to play around with Lua. If that's the case, then I'm all for it, because for me and many others that kind of thing is how we found our way into our profession as developers.
I think you're right on with the micro-transactions; Roblox is particularly bad for it. One of the games on there my boy likes is Rainbow Friends, it's some sort of tame horror-survival game where he loves exploring around and playing as the different characters. If I could just buy that game as a 'full unlock' or something I probably would; instead it's $$$ "micro" transactions for every little thing, and it really isn't a habit I want to get the kids into.
This is why I don't stress too much about validating game state in server scripts. It lets the kids cheat clientside if they can figure out how to rewrite and load the Lua scripts.
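Roblox scripting is Lua, but here's a language-agnostic sketch of that tradeoff (hypothetical names, nothing Roblox-specific): a server script can either cap what the client claims against what the server knows, or trust it outright, and only the trusting version lets edited client scripts pay off.

```rust
// Hypothetical sketch, not a Roblox API: a server handling a
// client-reported score update.
struct Update {
    player_id: u64,
    claimed_score: u32,
}

// Server-authoritative: cap the claim at what the server knows is
// possible since the last tick.
fn apply_validated(server_score: u32, max_gain_per_tick: u32, u: &Update) -> u32 {
    u.claimed_score.min(server_score + max_gain_per_tick)
}

// Client-authoritative: trust the claim. Anyone who can rewrite the
// client scripts can report any score they like.
fn apply_trusting(u: &Update) -> u32 {
    u.claimed_score
}

fn main() {
    let cheat = Update { player_id: 7, claimed_score: 999_999 };
    println!(
        "player {}: validated={}, trusting={}",
        cheat.player_id,
        apply_validated(100, 10, &cheat), // capped at 110
        apply_trusting(&cheat),           // cheat succeeds: 999999
    );
}
```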
So you can tell what the URL might point to by looking at it. That’s one of the important things mentioned in the article linked on this HN post: URLs are used by both computers and people.
It's also for crawlers. When doing technical SEO, having a human-readable slug in the URL is low-hanging fruit that is often overlooked. Both this and a <title> structure of `CURRENT_CONTENT_TITLE | WEBSITE_NAME` are quite trivial to implement and provide a significant uplift in both SEO and UX.
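As a minimal sketch (hypothetical helper names, not from any particular framework), both pieces are a few lines of string handling:

```rust
// Sketch: derive a human-readable slug from a page title, and build
// the `TITLE | SITE` <title> tag. Non-ASCII handling is elided.
fn slugify(title: &str) -> String {
    title
        .to_lowercase()
        .chars()
        .map(|c| if c.is_ascii_alphanumeric() { c } else { '-' })
        .collect::<String>()
        .split('-')
        .filter(|part| !part.is_empty()) // collapse runs of separators
        .collect::<Vec<_>>()
        .join("-")
}

fn title_tag(page_title: &str, site_name: &str) -> String {
    format!("<title>{} | {}</title>", page_title, site_name)
}

fn main() {
    let title = "Why URLs Matter: Humans & Crawlers";
    assert_eq!(slugify(title), "why-urls-matter-humans-crawlers");
    // Prints: <title>Why URLs Matter: Humans & Crawlers | Example Blog</title>
    println!("{}", title_tag(title, "Example Blog"));
}
```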
> Can the authors (holding the copyright) distribute code under a different license when there were external contributors under GPL?
No. If any contributions were made under the GPL, they'd have to get permission from every contributor to change the license to AGPL, or remove those contributions.
Most projects like this require you to license your contributions in a way that lets them sell non-AGPL licenses (see their Contributor License Agreement): https://cla-assistant.io/paradedb/paradedb
So basically they can do whatever they want with your contributions.
By default the contributors also hold copyright and need to consent to the separate licensing.
However, this is typically solved with a contributor license agreement (CLA): before submitting a PR, contributors click through a form declaring that they own the copyright for the PR and granting the organization a license to relicense the work and its derivatives. Sometimes these agreements transfer the whole copyright to the organization.
I didn't check, but according to some comments here, the CLA in this case is already embedded in the AGPL license.
In principle this scheme guarantees that the original organization always has special rights over the work of the entire open source community, as it can dual license all the derivative works.
This might be a viable licensing scheme for Swiss government contractors now. The federal government requires open source licenses for all software projects as of last year or so.
(A)GPL+CLA might be a good way to serve the interests of the Swiss people while preserving the flexibility and competitiveness of contractors, allowing them to retain proprietary licenses where needed or wanted.
Follow up:
Am I correct in thinking that this might slightly hinder contributions on one hand, but ultimately anyone could still maintain an _independent_ fork?
Or in other words: Would contributions to a fork still require signing the CLA and essentially allow the original authors to dual license any such contributions?
Contributions to a fork could be made under just the AGPL, without any CLA (the same could be done for the original repo, but such contributions won't be accepted there). Then the entire fork is effectively AGPL-only.
I don't think any original CLAs would apply to the fork, unless the fork owner is the same legal entity/successor as defined in there.
The same goes for the original authors: they'd need CLAs from all fork contributors.
Spreading copyright among as many entities as possible to make relicensing more difficult is not a problem you should "solve". It's a feature of the license. A CLA ensures that contributors will be treated unfairly, and I hope it deters many potential contributors.