Huh, this is interesting. Great idea for a project. Is there much out there with similar aims? I absolutely don't know the field well enough to say.
I think this could really benefit from documentation, and maybe a sort of tutorial that walks you through the steps of building something with these. Even just commenting the various parts of the examples, "here we need to foo so we can flerb the qux", "to do xyz, we need to …" and so on.
"Investing" in code readability and comments might also be worthwhile. If I wanted to use nux to build something, I really wouldn't know where to start, even with the examples: the comments in the header files are pretty terse, and reading the code itself is daunting since there's quite a lot of it, fairly few comments, and a lot of acronyms and short identifiers. This would also make it easier for others to contribute.
But having a toolkit for testing out ideas and prototypes without having to worry about all the more boilerplate-y parts of kernels seems very useful.
I think Go thrives in a much more popular field (web services, etc.), and it's also designed to be easy to pick up, so I'm not surprised this was the case.
1.0 was in May of 2015, but that was closer to an MVP with stability guarantees than the full language. It wasn't until 2018 that Rust was really usable for most tasks. Everything you might have learned back then is still available; it's just that a lot of restrictions have been removed, with more removals to come.
"Big words, phrases and memes"? You got this distressed about an article using industry standard terminology for its target audience and assuming that you'd read (or would read) the post it linked to on Conway's law?
The non-profit Plastic Deposit Organisation, responsible for managing Denmark's container deposit system, estimates that this change alone will enable them to collect and reuse approximately 70 million additional bottle caps annually. This equates to 140 tonnes of plastic each year.
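A quick sanity check of the quoted figures (the 2 g/cap result is derived here, not stated in the article):

```python
# Sanity-check the press figures quoted above: 70 million extra caps
# per year said to equal 140 tonnes of plastic per year.
additional_caps = 70_000_000   # caps per year
plastic_tonnes = 140           # tonnes per year

grams_per_cap = plastic_tonnes * 1_000_000 / additional_caps
print(grams_per_cap)  # 2.0
```

About 2 g per cap, which is in the right ballpark for a plastic bottle cap, so the two numbers are at least internally consistent.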
This assumes a 90% cap return rate before (which seems low) and a 100% return rate afterwards (I'm not in Denmark myself, but I can't be the only one who has returned zero of the new caps versus almost 100% before).
The whole thing smells like a made up issue concocted by some company wanting to sell their bottle cap solution.
Honestly yeah. The EU is run entirely by PMC people who don't understand or care about the effect on lower-class and frankly less intelligent people's lives.
In most ways they were far more social than modern social media, in that they were about socializing. The distinguishing characteristic that sets modern social media apart from the old school stuff is the performative aspect of it—where everyone is now encouraged to behave as a content producer optimizing for engagement—which is hardly social.
Those media do not have algorithms, feeds, followers, profiles, influencers, likes, or any of the features that many people point to as the toxic aspects of pretty much every commercial social media site of the last decade.
I’d say LiveJournal was the tipping point where the internet became very self-centered and your value on the platform was measured by how much engagement you were able to get.
Up until that point, in a world before blogs, social sites were mostly centered around shared interests, and communities would aggressively police off-topic content.
> Usenet, BBSs, mailing lists etc. are social media
In a generic sense, yes. People did socialize.
But "social media" today really means: a proprietary platform controlled by a single corporation, where all the user interaction is ultimately just a ploy to keep the participation metrics up so the corporation can profile you better and sell more advertising.
Not everyone on Twitter uses their real name. Meanwhile, I knew the real names of about half of the top 20 most active users on a retro-gaming phpBB board in the early '00s, had met many in person, and knew where everyone lived, what other hobbies they had, and what they did for work or school.
Real names were absolutely used on Usenet especially in the early days, ditto for mailing lists (and still are for that matter), even though technically they are pseudonymous. In any case pseudonymity doesn't seem like it's relevant for whether something is a social medium or not – many social media are pseudonymous (or even anonymous, like the chans). HN is pseudonymous. Reddit. Tumblr. The various Fediverse services.
I wouldn't call the old stuff social networks. What made social networks a new thing was the social graph of connections becoming the information architecture of the content rather than topics. You found stuff (or it found you) by person rather than subject.
Usenet was topic based (e.g. Reddit seems closest these days), mailing lists were usually topic based, forums were organised around topics, etc.
Right, and which particular "fact" was it? Somehow I doubt it's an actual biological "fact" but rather a conservative "alternative fact" that ignores actual biology.
You generally need line of sight for microwave links, and the longest known link is something like 350 km; even using tropospheric scatter you can't go over ~500 km, so you'd need multiple relay stations with microwaves.
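To see why line of sight is the limiting factor, here's a rough sketch using the standard radio-horizon approximation (the 1800 m antenna heights are my own illustrative numbers, not from the comment above):

```python
import math

def radio_horizon_km(height_m):
    # Distance to the radio horizon for an antenna at a given height
    # in metres, using the common 4/3 effective-earth-radius
    # approximation: d ≈ 4.12 * sqrt(h). Without atmospheric
    # refraction the factor is ≈ 3.57.
    return 4.12 * math.sqrt(height_m)

def max_los_link_km(h1_m, h2_m):
    # Two stations are within line of sight up to roughly the sum of
    # their individual horizon distances.
    return radio_horizon_km(h1_m) + radio_horizon_km(h2_m)

# Two mountaintop stations at ~1800 m each get you close to the
# ~350 km record-length links mentioned above.
print(round(max_los_link_km(1800, 1800)))  # 350
```

So record-length microwave links need mountaintop-class endpoints, and anything transatlantic is hopeless without relays.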
Using a lower band and bouncing off the ionosphere gets you much further, but it only works if the ionospheric weather is OK, and it'll have lower bandwidth (not necessarily a problem, though). Encryption is also a no-no for amateur radio in most countries.
> If one would like to interconnect various meshes, I can't think of a way to avoid using the Internet to trunk traffic across the Atlantic let's say.
Radio links with a regular ol' dipole antenna? Although I think most countries prohibit encrypted traffic for amateur radio purposes, and the link would be kinda spotty and dependent on ionospheric weather.
Crossing the Atlantic with radio is totally doable, even with low power, if you accept communicating at a few bits per second (see for example https://en.wikipedia.org/wiki/FT8 and https://wsjt.sourceforge.io/wsjtx.html). As propagation is time-of-day dependent, you'd need a way to switch bands (daily propagation patterns on 7 MHz are totally different from those on 14 MHz, for example) depending on the ionospheric conditions, plus a multi-band antenna. I'd rather use a pair of beams (this kind of thing: https://en.wikipedia.org/wiki/Yagi%E2%80%93Uda_antenna#/medi...) pointing at each other rather than a dipole; the gain is much higher, so you need less power. But even using multiple bands, there will be moments when no connection is possible at all.
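The band-switching logic could start out as simple as this toy heuristic (the hour thresholds and two-band choice are my own illustrative assumptions; a real station would consult actual propagation predictions or ionosonde data):

```python
def pick_band_mhz(local_hour):
    # Toy day/night heuristic for HF band choice:
    # 14 MHz (20 m) tends to open during daylight, while
    # 7 MHz (40 m) generally carries further at night.
    # Real operation would use propagation-prediction data,
    # not a fixed clock rule.
    if 6 <= local_hour < 18:
        return 14
    return 7

print(pick_band_mhz(12))  # 14
print(pick_band_mhz(2))   # 7
```

Even a crude rule like this captures the point in the comment: the same pair of stations needs different bands at different times of day just to keep a path open at all.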
Google used to have superior translation, but that hasn't been the case for years now. Based on my experience, DeepL (https://www.deepl.com/) is vastly superior, especially for even slightly more niche languages. I'm a native Finnish speaker and I regularly use DeepL to translate Finnish into English in cases where I don't want to do it by hand, and the quality is just way beyond anything Google can do. I've had similar experiences with languages I'm less proficient in but still understand to an extent, such as French or German.
> Apple’s current implementation of lightweight virtualisation still has no support for Apple ID, iCloud, or any service dependent on them, including Handoff and AirDrop. Perhaps the most severe limitation resulting from this is that you can’t run the great majority of App Store apps, although Apple’s free apps including Pages, Numbers and Keynote can still be copied over from the host and run in a guest macOS.