I made a few fun plugins that would encode data (after deduping, compressing, and encrypting it) into PNG or BMP and store it on Flickr (which gave you 1TB of free storage for images) or Picasa (now Google Photos). It was actually relatively efficient and no slower than the other methods, and it looked super cool to have albums full of what looked like images of static. It was a blatant violation of their ToS, so obviously not serious.
The code is still online; it's from 2014/2015. The Flickr plugin with the PNG encoder is here, and I'm not entirely sure if I ever published the Picasa one.
FUSE let Linux do all kinds of interesting things. Forgot about this one, thanks!
DSLreports discussion from 2004: https://www.dslreports.com/forum/r11192502-GmailFS
Once the cloud storage companies launched and provided free/cheap tiers with huge storage, we mostly lost the need for it.
I still have emails that I uploaded with it. Need to sit down and look at how that extension worked.
In the desktop world, this is pretty much the same idea as FUSE  for filesystems. It's really fun/easy to use FUSE libraries in languages like Python  to make mountable filesystems this way, which then allows for integrations with all of your favorite local software / shell commands / etc.
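To make the FUSE idea concrete, here's a minimal read-only filesystem sketch. It assumes the third-party `fusepy` package (`pip install fusepy`); the file name and contents are made up for illustration, and the filesystem logic is kept in plain functions so it works even before mounting.

```python
import errno
import stat
import sys
import time

# The entire "filesystem": path -> file contents (illustrative).
FILES = {"/hello.txt": b"Hello from FUSE!\n"}


def file_attrs(path, now=None):
    """Return a stat-like dict for `path`, or None if it doesn't exist."""
    now = time.time() if now is None else now
    common = {"st_ctime": now, "st_mtime": now, "st_atime": now}
    if path == "/":
        return dict(common, st_mode=stat.S_IFDIR | 0o755, st_nlink=2)
    if path in FILES:
        return dict(common, st_mode=stat.S_IFREG | 0o444, st_nlink=1,
                    st_size=len(FILES[path]))
    return None


def list_root():
    """Directory listing for the root directory."""
    return [".", ".."] + [p.lstrip("/") for p in FILES]


def read_file(path, size, offset):
    """Serve a read() of `size` bytes at `offset`."""
    return FILES[path][offset:offset + size]


def mount(mountpoint):
    # Imported lazily so the pure logic above runs without fusepy installed.
    from fuse import FUSE, FuseOSError, Operations

    class HelloFS(Operations):
        def getattr(self, path, fh=None):
            attrs = file_attrs(path)
            if attrs is None:
                raise FuseOSError(errno.ENOENT)
            return attrs

        def readdir(self, path, fh):
            return list_root()

        def read(self, path, size, offset, fh):
            return read_file(path, size, offset)

    FUSE(HelloFS(), mountpoint, foreground=True)


if __name__ == "__main__":
    mount(sys.argv[1])  # e.g. python hellofs.py /tmp/hello
```

After mounting, `cat /tmp/hello/hello.txt` and any other local tool just work against it, which is exactly the integration win described above.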
> proxy.golang.org does not save all modules forever. There are a number of reasons for this, but one reason is if proxy.golang.org is not able to detect a suitable license. In this case, only a temporarily cached copy of the module will be made available, and may become unavailable if it is removed from the original source and becomes outdated. The checksums will still remain in the checksum database regardless of whether or not they have become unavailable in the mirror.
I can't remember the last time I created a website in a professional setting that would work without JS. Everything is written in things like React, Angular and Vue.js these days. And that seems to be the case for most modern websites. At least things made here in Oslo.
* instant load times
* stable performance
* no tracking, and sometimes even no ads
* no stupid animations that break scrolling, no prompts to sign up for newsletters, etc
* no hijacked back button, or disabled highlighting/copy/paste, or pop ups/unders
(also, I just checked and noticed that Twitter refuses to work if you disable js. Fuck you, Twitter.)
And yet in every thread there's at least one person complaining about the posted link not working with JS disabled.
Sure, if you disable JS you'll have a worse experience, but it's your fault, not the site creators'.
Many people use the web to buy things. Button presses, logins, pizza tracking monitors, they all use JS.
There might be tricks using CSS and images too.
Sites like the new Reddit design, Facebook, Twitter etc. are just incredibly slow even on my beefy desktop, and it's 100% because of how long the JS takes to execute.
We're offloading the performance cost of developer experience and server-side rendering onto the end user, at the price of a crummier user experience on many websites that strictly wouldn't need JS for most of their functionality. I'm not happy about this trend.
> viewing text/parsing XML
> allowing remote sites to execute code on your machine, immediately when you load a site.
That's quite a big jump, regardless of all the browser sandboxing.
Yes, initially the web could only display hyperlinked text. The same can be said for many technologies/inventions, should we therefore never expand the capabilities of our tools? What is the difference between the web and your operating system in that regard? Why are OS APIs so different?
We can also look at the positive effect this evolution has had where what used to be platform specific tooling is now often simply available via a URL. I much prefer that over random executables that are not sandboxed and by default have full access to all your data. Yes, this can be mitigated, but the average user won't.
No, it's not all sunshine and roses; we've made trade-offs with regard to performance and UX, among other things, but this is still an ongoing process, as the modern web is relatively young and changing.
The problem is that, usually, turning off "current capabilities" leads to a faster loading, lower distraction, less ad-contaminated, less janky, and all around better experience. In an effort to squeeze every ad dollar out of the eyeballs crossing the page, sites are using modern capabilities as weapons against their users.
If sites continue to work js-free, there's a simple switch to enable to improve my browsing experience.
And it even defeats the anti-adblock crap reasonably often.
Usually it's people using JS where they shouldn't, and tracking from FAANG and others, that are the reasons to block JS. You are painting a wrong picture there.
If 95% of the web devs used JS in appropriate ways and it was not used so much for spying on people, well, then it would be a different story.
These people, who advocate to use aged JS libraries to poorly implement modern HTML5 features while breaking accessibility, are the weirdest form of Luddites ever.
We ought to be reining in what JS can do and removing access, not adding more. For one thing, it shouldn't be able to send data without our say-so. It's insecure and spying-enabling by design. Why does clicking a link mean the page that loads gets to send my mouse movements and keystrokes to its master? That's crazy, and has been a major contributor to the new norm that all kinds of privacy invasion are fine. "It's just 'telemetry', what's the big deal?" Ugh.
I think JS gets a bad name when people use it to make crazy modal popups or inline video ads or change the way the page scrolls. Beyond that, it's cool that devs can get really creative with a website, and I love coding in JS. But you're also adding a lot of complexity for that. HTML/CSS are fine for creating a website that communicates information and maybe even looks nice, and they aren't actually a programming language; they're just data. JS is a full programming language that gives you enough rope to hang yourself, and I think developers kinda go off the rails messing with their sites and ruin the user experience.
It is cool that I can have whoever execute code on my machine without worrying if it will get privileged access to it. That is a pretty amazing feature of JS/browsers.
Security too! https://www.vusec.net/projects/smash/
The modern web is anxiety-inducing and incredibly scary to people that pay attention. I don't want to spend an hour checking the js on sites before I use them to make sure they aren't malicious/mining bitcoin/whatever, so disabling JS is an easy out that preserves my sanity, and gives me a better experience. No cookies, no popups, no paywalls, no ads, no lag.
But the vast majority of websites that use React are the moral equivalent of driving an empty school bus to the store to buy a loaf of bread: It's massively wasteful and frankly stupid.
Yes that's exactly what the luddites are arguing in favor of, rolling back progress, and dramatically stripping away capabilities. They don't consider any of it to be a net positive, they don't think of it as being progress.
In my observation they also typically want to go back to not having graphical user interfaces. They like a nice command line interface as a way of life. It seems silly to stop there though. The computer should be gotten rid of just the same to be philosophically consistent.
First and foremost, the luddites you speak of are programmers or sysadmins. They have been using terminals and are still using terminals daily, and they see the benefits that those tools have to offer. Namely, they see composable, interoperable programs that abide by the philosophy that programs "should do one thing well" as the benchmark for real progress. I would say I agree with the merits of this perspective. But I still use VS Code in addition to vim, because I'm not a zealot, and there are times when I want to edit something in a very flexible way that VS Code better facilitates.
Both ways of doing things have their merits I suppose. It just hurts a bit to see something simple and powerful be wrapped and rewrapped in progressively less helpful proprietary systems and given a JS front end that lacks all of the focus and freedom that charmed us with the systems to begin with.
I think you could argue that the ubiquity of HTML led people to see value in something like XML back in the late 90s / early 00s, and that HTML drove XML's invention and adoption, not the other way around.
The whole idea of articles, documents, and so on, for the web was semantic content held at a url, crosslinked to related semantic content, etc.
I do understand the idea of SPAs, but not the idea that everything should be an SPA.
This is such a basic thing to refuse that what’s actually weird is that you find it weird.
3 megs of unnecessary data meant that he would sit 3 unnecessary minutes waiting for a page to load.
This is not to say that JS isn't being abused a lot -- but then again, it's soft tech, it's the internet and the abuse comes at little cost (compared to most industries). I always thought it's a huge part of what makes it fun and speeds all progress along like crazy.
Calling someone Amish of the internet because they refuse to be tracked, refuse to take part in voluntary remote code execution or just do their computing on a computer from 2010 on a poor internet connection is just sad.
Doing so is not progress.
Hi, this is World speaking.
You can pretend it's not me. That's a bit silly but oh well, it's hardly a contender for the most outrageous thing that happened around here today.
You can also choose to only partly partake in what's going on RIGHT NOW and be very cool with that decision. Like, oh I don't know, the Amish.
But this is what I am today. Please get real.
If you believe there's more to your point, please restate it in a less patronizing manner.
Anyway, I thought I jotted my point(s) down quite amusingly, but oh well, hit and miss. Restated more plainly:
- I am acknowledging what the world is and where that leaves anyone disabling JS indiscriminately (sic) in 2021, solely because of the technology and not because of individual usage.
- I think it's important to acknowledge what the world is to have any serious discussion about the world. How about you?
- I made no statement about what I want the world to be. You have been inferring that (wrongly, but, oh well, it barely matters).
- Being compared to the Amish is not "sad" to me, and it's not meant to be offensive. What I think is sad is if you choose to be disconnected (like the Amish) but are unclear that you are disconnected because of personal considerations (unlike the Amish), and then get angry at the thing you are disconnected from because it doesn't care about your personal considerations.
Enjoy your 30MB js files and 4k autoplaying video ads that follow your scroll and jump around the screen.
I don't mean to sound flippant, nor do I mean to insult anyone's favorite browsing style, but insisting on this might be shouting into the wind? I could never make a business case for it, at least.
I propose that rather than ‘Web applications’ these be called ‘browser applications,’ because they are applications that run within the browser runtime. It is basically a coincidence (and a convenience) that they happen to rely on the same technology used for the Web.
What I don’t understand is why folks write browser applications for things that the Web platform already handles well. The Web is pretty cool! Not perfect, but then nothing is.
> The Web is a web of hyper-linked pages
The web is an open specification that has evolved greatly over time. Pretending these principles are religious is exactly the antithesis of an open standard. It requires debate and discourse.
I use Vivaldi because it makes it easy to have JS off by default and enable it for each website that really needs it. (I don't know if Chrome/Chromium makes this easy too.)
So you fill in this entire form, hit submit, and then find out that you really needed JS and now you have to fill in the form all over again. How is that easy?
It is easy because I don't get popups asking about cookies or location access, and no fancy scroll effects. I also disabled animated GIFs.
If you are a webdev and you don't work on an SPA, but for some reason your site is half broken with JS off, then use the standards and put up a message that informs the user that JS is required.
I would not set this up on some random person's computer. Ad blockers too can screw with shopping carts, and some websites also beg you to stop the ad blocker, so I wouldn't install an ad blocker for just any random person either. But here on HN, JS off by default is good advice; most should understand that if a page is blank or some button does nothing, it might be a JS-related issue.
No, you need it to pad your resume.
That said, a lot of what you are doing in a browser application is re-implementing your own version of stuff that the browser already does, and the browser's version of it is accessible by default. So a JS-free application is almost always fully accessible for no extra development cost (as long as you check it for things like foreground/background contrast).
With how prevalent tracking on the internet is, despite GDPR's attempts: yes, it is an issue.
That's due to GitHub Pages not having any rewrite mechanism other than the 404.html hack. You can host the same thing on Netlify/Vercel/any other host with proper rewrite support, including a server of your own, and not have this problem.
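For illustration, on Netlify the rewrite is a one-line `_redirects` file at the root of the deploy (the path pattern and 200 status are Netlify's own convention; other hosts have equivalent config):

```
# Serve the app shell for every path, as a 200 rewrite rather than a redirect
/*    /index.html    200
```

With that in place, deep links resolve server-side instead of falling through to a 404 page.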
I think you are overestimating the number of people who have JS disabled outside of this website.
It works on 96% of users' browsers.
> Service workers are disabled in private browsing mode because they are impossible to use without setting up state tied to the origin and URL scope. This inherently requires writing to disk.
Had no idea this wasn't permitted.
Well that's a start.
A while back I wrote a blogpost about using PNG chunk functionality to "hide" arbitrary data in PNGs without affecting the way the PNG is displayed. I'd love to see this being used together with this project.
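The trick relies on PNG decoders being required to skip chunks they don't recognize, so a private ancillary chunk can carry anything without changing how the image renders. A stdlib-only sketch (the chunk name "puNk" is just an illustrative choice that follows PNG's lowercase-ancillary/lowercase-private naming rules):

```python
import struct
import zlib

PNG_SIG = b"\x89PNG\r\n\x1a\n"


def make_chunk(ctype, data):
    """Serialize one PNG chunk: length, type, data, CRC."""
    return (struct.pack(">I", len(data)) + ctype + data
            + struct.pack(">I", zlib.crc32(ctype + data)))


def minimal_png():
    """A valid 1x1 8-bit grayscale PNG built from scratch."""
    ihdr = struct.pack(">IIBBBBB", 1, 1, 8, 0, 0, 0, 0)
    idat = zlib.compress(b"\x00\x00")  # filter byte + one pixel
    return (PNG_SIG + make_chunk(b"IHDR", ihdr)
            + make_chunk(b"IDAT", idat) + make_chunk(b"IEND", b""))


def embed(png, payload, ctype=b"puNk"):
    """Insert a custom chunk right before IEND."""
    iend = png.rindex(b"IEND") - 4  # back up over the length field
    return png[:iend] + make_chunk(ctype, payload) + png[iend:]


def extract(png, ctype=b"puNk"):
    """Walk the chunk list and return the first matching chunk's data."""
    pos = len(PNG_SIG)
    while pos < len(png):
        length, = struct.unpack(">I", png[pos:pos + 4])
        if png[pos + 4:pos + 8] == ctype:
            return png[pos + 8:pos + 8 + length]
        pos += 12 + length  # length + type + data + CRC
    return None
```

Any image viewer will show the embedded file as the same 1x1 image, while `extract` recovers the payload byte for byte.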
“Hosting SQLite databases on GitHub Pages or any static file hoster”:
You can still make out who's who in them. Sorta.
If you encode some binary data in a fake photo, after a resolution change and a lossy compression, you will only get garbage.
This was 20 years ago or so... So many memories!
It's not free for the image hosting sites.
That being said, it's a cool util.
It's a hard life running an image uploading service. You have to wonder how much they're being paid in ad revenue.
I use it a lot for a project trying to detect manipulated images including stego encoded images.
https://i0.hdslb.com/bfs/article/484c1e1a0c41d4482f6fc121132...
At a guess, it's the encoding of a binary file into pixel color values in a large image.
Imgur has really fallen over the years. It was first created by a redditor who was frustrated by how terrible free image hosts were at the time, like ImageShack and Photobucket. If an image hit the front page of reddit, it would very quickly turn into a "This image has reached its bandwidth limit", not to mention you usually couldn't directly link an image, and the image view page was riddled with ads.
Over time, though, Imgur has tried to become a social network, and is trying to stop being the go-to place when you just want to host an image.
But, to be fair, I don't really blame them. Image hosting gets expensive quickly, especially when everyone on reddit uses you. Of course, now reddit lets you post images directly to them, which on one hand, leads less traffic to Imgur, but on the other hand, the traffic going to Imgur is now more likely to see ads and browse the gallery, rather than just directly download an image.
A new host emerges to address the sorry state of existing hosts, with a streamlined and straightforward path to uploading. Then the cold, hard reality begins to set in: you cannot make money as a dumb pipe that delivers user-supplied images. Monetization strategies begin subtly at first. Before you know it, the site is desperately trying to sell your image on a coffee mug, or hijack links to display ads, or grow into a social networking site. Something, anything, that pivots away from straightforwardly posting images and linking to them.
One day, some bold new creator sees the sorry state of existing hosts, recognizes an opening, and the cycle begins anew.
Or yeah, just use DigitalOcean or someone else. They're pretty cheap, too.
Another option might be Dropbox. Not sure how quickly their free hosting will crap out if it hits reddit front-page level traffic, but it should be good enough to share here or with your friends or whatever.
If anything is provided for free someone will come and ruin it...
I mean, if you gotta serve a malware payload, you are not going to host it yourself, so hiding it in an image on a public website is a good strat.
If the image gets recompressed during sharing, is it still possible to decode the website from it?
One of the problems I had when trying to build something like this in the past is that saving & sharing images was very lossy at the time. So I ended up using iCalendar and PDF attachments.
This is one of those things I would almost certainly only play with for fun, but it’s so damn clever, I just love it.
copy /B foo.gif+bar.zip foobar.gif
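The concatenation trick works because GIF decoders parse from the start of the file while ZIP readers locate the central directory at the end. Python's `zipfile` tolerates prepended data for exactly this reason, so the same polyglot can be built and verified in a few lines (the leading bytes here are a stand-in, not a real GIF):

```python
import io
import zipfile

# Stand-in "GIF": only the zip side of the trick is exercised here.
gif_bytes = b"GIF89a" + b"\x00" * 100

# Build the zip half in memory.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("bar.txt", "hidden in plain sight")

# Equivalent of: copy /B foo.gif+bar.zip foobar.gif
polyglot = gif_bytes + buf.getvalue()

# The combined file still opens as a perfectly valid zip archive.
with zipfile.ZipFile(io.BytesIO(polyglot)) as zf:
    print(zf.read("bar.txt"))  # b'hidden in plain sight'
```

Image hosts that recompress uploads will destroy the appended archive, which is why this only survives on hosts that serve files byte for byte.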
As for downvotes, just ignore them, there are pitchforks everywhere, not worthy to even spend one second thinking about.