Deploy a website on imgur.com (github.com/etherdream)
542 points by zjcqoo on Sept 8, 2021 | 179 comments



This reminds me of my (now dead) open source Dropbox alternative Syncany [1] and all the different storage plugins I made for it [2]. Long story short, a storage plugin only had to implement the API methods upload, download, list and delete, so you could literally use anything as a storage backend (FTP, SFTP, S3, WebDAV, ..).

I made a few fun plugins that would encode data (after deduping, compressing and encrypting it) into PNG or BMP and store them on Flickr (which gave you 1TB of free storage for images) or Picasa (now Google Photos). It was actually relatively efficient and not slower than the other methods, and it looked super cool to have albums full of what looked like images of static. It was a blatant violation of their ToS, so obviously not serious.

The code is still online, it's from 2014/2015. The Flickr plugin with the PNG encoder is here [3], and I'm not entirely sure if I ever published the Picasa one.

[1] https://www.syncany.org/

[2] https://github.com/syncany?q=plugin

[3] https://github.com/syncany/syncany-plugin-flickr/tree/develo...
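
For anyone curious what such a plugin boils down to, here is a minimal sketch of the idea in JavaScript (Syncany itself is Java, and all names here are made up for illustration):

  // Hypothetical minimal storage-backend interface: anything that can
  // implement these four operations can act as a remote for the sync engine.
  class MemoryBackend {
    constructor() { this.files = new Map(); }
    async upload(name, bytes) { this.files.set(name, bytes); }
    async download(name)      { return this.files.get(name); }
    async list()              { return [...this.files.keys()]; }
    async delete(name)        { this.files.delete(name); }
  }

  // The engine only talks to this interface, so an FTP, S3, or
  // "PNG-on-Flickr" backend is just another implementation of it.
  async function roundTrip(backend) {
    await backend.upload('chunk-0001', new Uint8Array([1, 2, 3]));
    console.log(await backend.list());              // ['chunk-0001']
    await backend.delete('chunk-0001');
  }
  roundTrip(new MemoryBackend());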


We maintain a similar JavaScript library at work called jIO [1][2]. It provides the same API to different storages (currently Memory, IndexedDB, WebSQL, WebDAV, Dropbox, GoogleDrive, ERP5) plus handlers for extra functionality and complex storage trees (e.g. zip, union, query, replicate, crypt storage). It's relatively straightforward to extend and query data sources with an open API, since you just have to reimplement the jIO API methods.

[1] https://jio.nexedi.com/ [2] https://github.com/nexedi/jio


I love these kinds of hacks. I can't remember how exactly it worked, but I remember a program or maybe browser extension from back before Google Drive existed that used Gmail's then-generous email storage space to store files.


Ah yes, gmailfs[1] in 2004 used Linux FUSE (Filesystem in Userspace) to translate local directories to your (at that time, rapidly and absurdly growing) Gmail quota.

FUSE let Linux do all kinds of interesting things. Forgot about this one, thanks!

[1] DSLreports discussion from 2004 https://www.dslreports.com/forum/r11192502-GmailFS


I had a project in college (~2007-2008) for this! My project partner and I set up a sharded filesystem using gmail account storage, storing metadata in the title and splitting encoded chunks across emails and accounts. The storage limit per gmail acct was in GB, but the attachment limit per email was ~20mb. We spent quite a few nights figuring out bugs in our algo to stitch together major and minor chunks of string-encoded binaries.

Once the cloud storage companies launched and provided free/cheap tiers for huge storage we mostly lost the need for it.
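
The splitting and stitching part is simple in principle; here is a rough sketch in modern JavaScript (the original was a college project long before this, so treat the names and the exact chunk size as illustrative only):

  // Split a file into numbered chunks that fit under an attachment limit,
  // carrying the metadata (file id, chunk index, total count) in the subject.
  const CHUNK_SIZE = 18 * 1024 * 1024; // stay safely under a ~20 MB cap

  function splitIntoChunks(fileId, bytes) {
    const total = Math.ceil(bytes.length / CHUNK_SIZE);
    const chunks = [];
    for (let i = 0; i < total; i++) {
      chunks.push({
        subject: `${fileId}:${i}:${total}`,  // metadata in the title
        body: bytes.slice(i * CHUNK_SIZE, (i + 1) * CHUNK_SIZE),
      });
    }
    return chunks;
  }

  // Reassembly: order by the index in the subject line and concatenate.
  function stitchChunks(chunks) {
    chunks.sort((a, b) => a.subject.split(':')[1] - b.subject.split(':')[1]);
    const out = new Uint8Array(chunks.reduce((n, c) => n + c.body.length, 0));
    let offset = 0;
    for (const c of chunks) { out.set(c.body, offset); offset += c.body.length; }
    return out;
  }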


The gspace extension! - https://www.ghacks.net/2007/03/07/gspace-firefox-extension/

I still have emails that I uploaded with it. I need to sit down and look at how that extension worked.


> a storage plugin only had to implement the API methods upload, download, list and delete, so you could literally use anything as a storage backend (FTP, SFTP, S3, WebDAV, ..)

In the desktop world, this is pretty much the same idea as FUSE [1] for filesystems. It's really fun/easy to use FUSE libraries in languages like Python [2] to make mountable filesystems this way, which then allows for integrations with all of your favorite local software / shell commands / etc.

[1] https://en.wikipedia.org/wiki/Filesystem_in_Userspace

[2] https://github.com/libfuse/python-fuse


I wonder if it would be possible to use the Go module cache server. :-)

> proxy.golang.org does not save all modules forever. There are a number of reasons for this, but one reason is if proxy.golang.org is not able to detect a suitable license. In this case, only a temporarily cached copy of the module will be made available, and may become unavailable if it is removed from the original source and becomes outdated. The checksums will still remain in the checksum database regardless of whether or not they have become unavailable in the mirror.

https://proxy.golang.org/


I've never heard of Syncany, but I took a similar approach for my personal file hosting application. I wonder if I could implement similar silly approaches to file storage :p


This reminds me of when I used a Google Docs spreadsheet as my database. Moderators need to edit the db? No problem! Here is the link to the spreadsheet. Need a previous version? No problem! Copy paste!


I once made a music player similar to this. It converted music files into image files, and saved them to Flickr. One trick that I employed was that the output image files were also visually recognizable, featuring album covers and lyrics.[1][2] The way I used to hide information was pretty barbaric, but it was still a fun experiment.

[1] https://www.flickr.com/photos/barosl/albums/7215763383971356...

[2] https://github.com/barosl/flickr-music-player
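
For anyone wondering what the "barbaric" encoding can look like, here is a minimal hand-rolled sketch in browser JavaScript (not the code from either project) that packs raw bytes into canvas pixels and exports a lossless PNG:

  // Only R, G, B are used; alpha is forced to 255 because canvases
  // premultiply alpha and would otherwise corrupt the data.
  function bytesToPngBlob(bytes) {
    const payload = new Uint8Array(4 + bytes.length);
    payload.set(new Uint8Array(new Uint32Array([bytes.length]).buffer), 0); // length header
    payload.set(bytes, 4);

    const pixels = Math.ceil(payload.length / 3);       // 3 payload bytes per pixel
    const side = Math.ceil(Math.sqrt(pixels));
    const canvas = document.createElement('canvas');
    canvas.width = canvas.height = side;
    const ctx = canvas.getContext('2d');
    const img = ctx.createImageData(side, side);

    for (let i = 0; i < pixels; i++) {
      img.data[i * 4]     = payload[i * 3]     ?? 0;    // R
      img.data[i * 4 + 1] = payload[i * 3 + 1] ?? 0;    // G
      img.data[i * 4 + 2] = payload[i * 3 + 2] ?? 0;    // B
      img.data[i * 4 + 3] = 255;                        // opaque alpha
    }
    ctx.putImageData(img, 0, 0);
    return new Promise(resolve => canvas.toBlob(resolve, 'image/png'));
  }

Decoding is the reverse: draw the PNG onto a canvas, read getImageData(), and take the first four bytes as the payload length.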


Having the images show cover art is super clever and practical!


This is similar to, but not exactly, how the PC port of Kingdom Hearts 3 stores its save files: image and data in one nice .png.

https://www.pcgamer.com/kingdom-hearts-pc-save-files-are-tuc...


This has its place on Hacker News because it is a fun hack, but it's not at all practical. Not only does it not work in all browsers, even fairly recent ones (service workers), but even when it works it requires JavaScript to be enabled and imposes a non-negligible additional load time. More importantly, with the recommended setup using a 404.html file as a "catch-all", all your pages are served as 404 errors, which can be a problem for a lot of things (starting with search engine indexing).


I often see people speak about requiring users to have JS enabled as a problem. But is that really an issue? I remember 5-10 years ago I would use things like NoScript to disable JavaScript, and only enable it on specific sites. But with the way modern frontends are, I would expect most websites not to work at all if I disabled JS. So I would expect most users to have JS enabled anyway.

I can't remember the last time I created a website in a professional setting that would work without JS. Everything is written in things like React, Angular and Vue.js these days. And that seems to be the case for most modern websites. At least things made here in Oslo.


People who have been on the web since forever, understand how it works, and still complain about JS are the weirdest form of tech Luddites there are. Such a weird line in the sand.


It's the kind of thing where you have to try it for yourself. No javascript == best user experience possible.

* instant load times

* stable performance

* no tracking, and sometimes even no ads

* no stupid animations that break scrolling, ask you to sign up for newsletters, etc

* no hijacked back button, or disabled highlighting/copy/paste, or pop ups/unders

Unless you're trying to play games, or use some complicated web application like an IDE, javascript offers absolutely nothing of value to you. It only exists to benefit the owner of the website. For 99.999% of web browsing use-cases, HTML/CSS is all that's needed.

The only situation I can think of where javascript has the potential to offer an improved user experience is with infinite scrolling. That's personal preference though, so if it's something you value, I could understand wanting to enable javascript.

(also, I just checked and noticed that Twitter refuses to work if you disable js. Fuck you, Twitter.)


> No javascript == best user experience possible

And yet in every thread there's at least one person complaining about the posted link not working with JS disabled.


Of course, since that means the page requires javascript, and is thus providing an inferior user experience. Who wouldn't complain about that?


Your comment reminds me of this good old meme: https://i.kym-cdn.com/photos/images/newsfeed/001/016/674/802...

Sure, if you disable JS you'll have a worse experience, but it's your fault, not the site creators'.


You can still track without javascript. One way is using invisible img tags. It can be less effective though.


> For 99.999% of web browsing use-cases, HTML/CSS is all that's needed.

Many people use the web to buy things. Button presses, logins, pizza tracking monitors, they all use JS.


None of those require JS at all, except the pizza monitor.


How can you send information to your backend server without JS?


Back in the day (I'm 33 and thus quite old in web years) we would use forms.


Submit buttons or links. Then the whole page reloads.

There might be tricks using CSS and images too.


This is like a technical invocation of Poe's law.


Full time freelance web dev here. Literally every site I make requires JavaScript to run. No one has ever complained about this.

It would be much harder to make them work without JavaScript.


I'm not that old nor opinionated, but I don't think disabling JS for the majority of browsing is weird. I had a mid-tier Android phone (a 250-euro Nokia) that after a few years of use simply crawled on normal websites that were a bit JS heavy. I'm not even talking about React/Vue SPA apps.

Sites like the new Reddit design, Facebook, Twitter etc. are just incredibly slow even on my beefy desktop, and it's 100% because of how long the JS takes to execute.

We're offloading the cost of developer experience and server-side rendering onto the end user, at the price of a crummier experience on many websites that strictly wouldn't need JS for most of their functionality. I'm not happy about this trend.


How is it weird? You literally go from

> viewing text/parsing XML

to

> allowing remote sites to execute code on your machine, immediately when you load a site.

That's quite a big jump, regardless of all the browser sandboxing.


I see people comment this on HN on a daily basis and it always boggles my mind. For me this reads as: 'This thing that was initially conceived years ago, when computers were incapable of much more than displaying text, should never evolve/change to take advantage of current capabilities'.

Yes, initially the web could only display hyperlinked text. The same can be said for many technologies/inventions, should we therefore never expand the capabilities of our tools? What is the difference between the web and your operating system in that regard? Why are OS APIs so different?

We can also look at the positive effect this evolution has had where what used to be platform specific tooling is now often simply available via a URL. I much prefer that over random executables that are not sandboxed and by default have full access to all your data. Yes, this can be mitigated, but the average user won't.

No it's not all sunshine and roses, we've made trade-offs with regards to performance and UX among others, but this is still an ongoing process as the modern web is still relatively young and changing.


> take advantage of current capabilities'.

The problem is that, usually, turning off "current capabilities" leads to a faster loading, lower distraction, less ad-contaminated, less janky, and all around better experience. In an effort to squeeze every ad dollar out of the eyeballs crossing the page, sites are using modern capabilities as weapons against their users.

If sites continue to work js-free, there's a simple switch to enable to improve my browsing experience.

And it even defeats the anti-adblock crap reasonably often.


We have HTML and CSS, which evolved quite a lot and already allow for what 90% (estimate) of the websites do.

Usually it is people using JS where they should not, and tracking from FAANG and others, that are the reasons to block JS. You are painting a wrong picture there.

If 95% of the web devs used JS in appropriate ways and it was not used so much for spying on people, well, then it would be a different story.


[flagged]


Would you please stop breaking the HN guidelines? You've been back to doing it repeatedly lately. This is not cool.

https://news.ycombinator.com/newsguidelines.html


It's possible to create JavaScript-less sites with partial hydration, e.g. by using Svelte and Elder.js. They load fast and run fast. They are favored by search engines. They have far fewer issues with accessibility.

These people, who advocate using aged JS libraries to poorly implement modern HTML5 features while breaking accessibility, are the weirdest form of Luddites ever.


JavaScript's a massive security threat. It's really weird to me that people seem to just assume it's fine, and isn't the most dangerous damn thing in common use on computers. Every time someone (usually Google) pushes another way for it to touch hardware, I'm surprised that most developers are like "oh good, so glad, can't wait 'till Safari catches up in 5 years". Um... no? It's a terrible idea? Please don't ever?

We ought to be reining in what JS can do and removing access, not adding more. For one thing, it shouldn't be able to send data without our say-so. It's insecure and spying-enabling by design—why does clicking a link mean the page that loads gets to send my mouse movements and keystrokes to its master? That's crazy, and has been a major contributor to the new norm that all kinds of privacy-invasion is fine. "It's just 'telemetry', what's the big deal?" Ugh.

"That's alarmist, JS is super secure" right, and most folks weren't worried about their CPUs betraying them until Meltdown and Spectre—smart money says there is a vulnerability we'll find shocking in one or more JavaScript implementations, right now, waiting to screw us.


> smart money says there is a vulnerability we'll find shocking in one or more JavaScript implementations, right now, waiting to screw us.

https://www.vusec.net/projects/smash/


We spend a ton of time locking down OS API/ABIs to prevent sandbox violations, and I wouldn't trust a shared server with sensitive data unless I had an IT team working on it. JS seems to be a lot better with sandboxing, though. You still have to really worry about CSRF. I use multiple profiles to ensure sketchy sites can't get at my data.

I think JS gets a bad name when people use it to make crazy modal popups or inline video ads or change the way the page scrolls. Beyond that it's cool that devs can get really creative with a website and I love coding in JS. But also you're adding a lot of complexity for that. HTML/CSS are fine for creating a website that communicates information and maybe even looks nice. And they aren't actually a programming language, they're just data. JS is a full programming language and gives you enough rope to hang yourself, and I think developers kinda go off the rails messing with their sites and ruin the user experience.

It is cool that I can have whoever execute code on my machine without worrying if it will get privileged access to it. That is a pretty amazing feature of JS/browsers.


It is a weird hang up because JavaScript/scripting was already ubiquitous in the 90s! I programmed "Dynamic HTML" pages as a summer job in '98 or so when I was in high school.


> we've made trade-offs with regards to performance and UX among others

Security too! https://www.vusec.net/projects/smash/


There was never a technical limitation to running software over a network, unless you go back to before computer networking was invented, far far before the internet. Putting all the capability into the browser is what people have a problem with.

The modern web is anxiety-inducing and incredibly scary to people that pay attention. I don't want to spend an hour checking the js on sites before I use them to make sure they aren't malicious/mining bitcoin/whatever, so disabling JS is an easy out that preserves my sanity, and gives me a better experience. No cookies, no popups, no paywalls, no ads, no lag.


There's nothing wrong with using a school bus to carry 30 children to a school, just like there's nothing wrong with using JS to render a highly-interactive SPA.

But the vast majority of websites that use React are the moral equivalent of driving an empty school bus to the store to buy a loaf of bread: It's massively wasteful and frankly stupid.


> should we therefore never expand the capabilities of our tools?

Yes that's exactly what the luddites are arguing in favor of, rolling back progress, and dramatically stripping away capabilities. They don't consider any of it to be a net positive, they don't think of it as being progress.

It's not specific to JavaScript, it's far broader than that. It's an ethos.

In my observation they also typically want to go back to not having graphical user interfaces. They like a nice command line interface as a way of life. It seems silly to stop there though. The computer should be gotten rid of just the same to be philosophically consistent.


I think you are right in many respects. I would not consider myself one of the accused luddites, but in their defense, this ethos isn't baseless.

First and foremost, the luddites you speak of are programmers or sys admins. They have been using terminals and are still using terminals daily, and they see the benefits that those tools have to offer. Namely, they see composable, interoperable programs that abide by the philosophy that programs "should do one thing well" as the benchmark for real progress. I would say I agree with the merits of this perspective. But I still use VSCode in addition to vim, because I'm not a zealot and there are times when I want to edit something in a very flexible way that VS Code better facilitates.

Both ways of doing things have their merits I suppose. It just hurts a bit to see something simple and powerful be wrapped and rewrapped in progressively less helpful proprietary systems and given a JS front end that lacks all of the focus and freedom that charmed us with the systems to begin with.


Technically you already allow remote code execution via XSLT.


In a mere 20-ish years. Crazy.


And how many years did HTML exist before JavaScript was invented?


100000 years?


You just restated the belief as if the reason is self-evident. What's the reason? Security? Even displaying a JPEG has had security vulnerabilities. You can't really seem to escape that just by not executing code. And no PDFs too, I guess, because they contain code?


Reducing the attack surface is the pragmatic thing to do and it just happens that js alone makes a several orders of magnitude difference on its own. Don't let perfect be the enemy of the good.


HTML has never been XML.

And HTML + CSS3 is Turing complete without Javascript anyway: https://accodeing.com/blog/2015/css3-proven-to-be-turing-com...


Amen. If memory serves correctly, HTML was SGML-inspired, and once XML became popular there was an XHTML standardization push for a brief time.

I think you could argue that the ubiquity of HTML led people to see value in something like XML back in the late '90s / early '00s, and that HTML drove XML's invention and adoption, not the other way around.


Yes, but JavaScript has been a thing since the '90s and was widely used by web sites even back in the days of GeoCities. So it's really a bit weird to complain about something that has been part of the web since almost its beginnings, and arguably has been one of the main drivers of the 'rise of the web' itself.


I don't run with JS disabled, but still hate things like a blog where the html is essentially empty and only filled with content via JS.

The whole idea of articles, documents, and so on, for the web was semantic content held at a url, crosslinked to related semantic content, etc.

I do understand the idea of SPAs, but not the idea that everything should be an SPA.


No, it's not. These people know exactly what they want to avoid.


You do realize those people you call weird mainly refuse to enable JS by default because of the amount of tracking it enables, right? Not to mention those “designed” articles (The Verge style) that load at least 3 megs of a godawful amount of crappy slider libraries and crappy stuff?

This is such a basic thing to refuse that what’s actually weird is that you find it weird.


3 megs!? That won’t even fit on two of my floppy disks


I worked with someone not so long ago who was on a 750 KiB connection. I myself remember that speed as being blazingly fast at one point - so fast that we could start listening to a song before Kazaa finished downloading it.

3 megs of unnecessary data meant that he would sit 3 unnecessary minutes waiting for a page to load.


I had a bad connection for the majority of my internet-using life. People don't understand how much of that data loading is superfluous. But to call people who worry about that weird is truly strange to me.


Same here. I'm still on a 3Mb DSL connection, and to be quite honest, according to my graphs, it's closer to 196KB/sec. Why in the world would I enable JS when I mostly don't need to? I'm just trying to read text!


Not only tracking. For text sites it's just a waste of bandwidth. And where I live, internet is slow.


If you choose to flat out disable JS in 2021 (by all means, go for it), understand that you are the Amish of the Internet -- so, please, behave like the Amish of the Internet. Be cool with the consequences of your deliberate choice. Do not expect other people to work around you or, worse, resent them for not doing so.

This is not to say that JS isn't being abused a lot -- but then again, it's soft tech, it's the internet and the abuse comes at little cost (compared to most industries). I always thought it's a huge part of what makes it fun and speeds all progress along like crazy.


No.

Calling someone Amish of the internet because they refuse to be tracked, refuse to take part in voluntary remote code execution or just do their computing on a computer from 2010 on a poor internet connection is just sad.

If you want to live in a world where everything is an SPA, by all means go for it, but be prepared to be criticised when a blog post or a news story that is literally a bunch of text paragraphs interleaved with images requires JavaScript.

Doing so is not progress.


Use Safari - anti tracking measures etc.


I don't see how locking myself into a walled garden would improve the situation.


> If you want to live in a world where

Hi, this is World speaking.

You can pretend it's not me. That's a bit silly but oh well, it's hardly a contender for the most outrageous thing that happened around here today.

You can also choose to only partly partake in what's going on RIGHT NOW and be very cool with that decision. Like, oh I don't know, the Amish.

But this is what I am today. Please get real.

Your friend, World


You're just annoyed because people are complaining.

If you believe there's more to your point, please restate it in a less patronizing manner.


I don't know turminal, that really doesn't sound like me.

Anyway, I thought I jotted my point(s) down quite amusingly, but oh well, hit and miss. Restated more plainly:

- I am acknowledging what the world is and where that leaves anyone disabling JS indiscriminately (sic) in 2021, solely because of the technology and not because of individual usage.

- I think it's important to acknowledge what the world is to have any serious discussion about the world. How about you?

- I made no statement about what I want the world to be. You have been inferring (wrongly, but, oh well, it barely matters)

- Being compared to the Amish is not "sad" to me and it's not meant to be offensive. What I think is sad is if you choose to be disconnected (like the Amish) but are unclear that you are disconnected because of personal considerations (unlike the Amish), and then get angry at the thing you are disconnected from because it doesn't care about your personal considerations.


I suppose you also think condoms are for the amish?

Enjoy your 30MB js files and 4k autoplaying video ads that follow your scroll and jump around the screen.


> I can't remember the last time I created a website in a professional setting that would work without JS.

A website which does not work without JavaScript is by definition unprofessional. That doesn't apply to web applications, which of course use JavaScript heavily, and need to (at least until WebAssembly is ready for prime time — and even then JavaScript is required as a shim).

The Web is a web of hyper-linked pages, documents. It does not require JavaScript, although of course many pages are improved with the dynamic behaviour enabled by JavaScript.


An odd definition of unprofessional. Building and testing a site for running without JS is a non-zero cost. But I've never seen usage stats that show that demographic above 1.5% of the browser market share – and those are old stats. It was sub 0.5% for the sites that I ran at the time (2018).

I don't mean to sound flippant, nor do I mean to insult anyone's favorite browsing style, but insisting on this might be shouting into the wind? I could never make a business case for it, at least.


> That doesn't apply to web applications, which of course use JavaScript heavily

I propose that rather than ‘Web applications’ these be called ‘browser applications,’ because they are applications that run within the browser runtime. It is basically a coincidence (and a convenience) that they happen to rely on the same technology used for the Web.

What I don’t understand is why folks write browser applications for things that the Web platform already handles well. The Web is pretty cool! Not perfect, but then nothing is.


By definition? Whose definition?

> The Web is a web of hyper-linked pages

The web is an open specification that has evolved greatly over time. Pretending these principles are religious is exactly the antithesis of an open standard. It requires debate and discourse.


FWIW, I do still totally use NoScript, and am somewhat picky with what I enable; quite a lot of blogs (and news sites too) are in fact readable enough without it. And in the case of blogs in particular (especially on the Blogger site), I quite often just skip reading them if they don't show anything with JS disabled, as this is plain dumb first of all, but also tends to be a strong signal that the particular site will be super slow, annoying, and hard to read with JS enabled anyway.


If the website is a blog post or some news, it should work without JS. You don't need an SPA to show some rich text.

I use Vivaldi because it makes it easy to have JS off by default and enable it for each website that really needs it. (I do not know if Chrome/Chromium makes this easy too.)


> I use Vivaldi because it makes it easy to have JS off by default and enable it for each website that really needs it.

So you fill in this entire form, hit submit, and then find out that you really needed JS and now you have to fill in the form all over again. How is that easy?


I am a developer; I can guess which sites need JS, like payment sites or sites that are actually SPAs. Check all the submissions from HN and let me know how many have forms, load fine without JS, don't complain that JS is off, and then fail on submit.

It is easy because I don't get popups for cookies or location access, and no fancy scroll effects; I also disabled animated GIFs.

If you are a webdev and you don't work on an SPA, but for some reason your site is half broken with JS off, then use the standards and put up a message that informs the user that JS is required.

I would not set this up on some random person's computer; ad blockers too can screw with shopping carts, and some websites beg you to stop the ad blocker, so I would not install an ad blocker for any random person either. But here on HN, JS off by default is good advice; most should understand that if a page is blank or some button does nothing, it might be a JS-related issue.


Whose fault is that? The one who disables the JavaScript not actually needed to submit a POST request, or the one who implements a web service which requires client code execution to receive a POST request (or worse, who uses GET for POST)?


> You don't need an SPA to show some rich text.

No, you need it to pad your resume.


The best justifiable use case I've seen for building a site that also functions without JS is accessibility in the Section 508 sense. It's been a few years though, so perhaps new standards have emerged for this purpose.


There are. You can make a complex JS browser application that's completely accessible if you follow every single recommendation, and ARIA-tag every element that needs it.

That said, a lot of what you are doing in a browser application is re-implementing your own version of stuff that the browser already does, and the browser's version of it is accessible by default. So a JS-free application is almost always fully accessible for no extra development cost (as long as you check it for things like foreground/background contrast).


We run a sizeable webshop that can fall back entirely to non-JS, with plain forms and HTML, for whatever reason. Not based in Oslo though.


I often browse the internet through Firefox Focus with JavaScript disabled and the web is a breeze and really fast. Additionally, it helps in escaping paywalls for most news sites.


Modern frontends can work with JS disabled because of SSR.


>I often see people speak about requiring users to have JS enabled as a problem. But is that really an issue?

With how prevalent tracking on the internet is, despite GDPR's attempts: yes, it is an issue.


But Javascript has little to do with tracking.

Third-party cookies are a much bigger concern there, and they're mostly orthogonal to the use of JavaScript.


JS is used for fingerprinting and very often to load additional trackers.


> with the way it is recommended to use it using 404.html file as a "catch all", it means all your pages are served as 404 errors

That's due to GitHub Pages not having any rewrite mechanism other than the 404.html hack. You can host the same thing on Netlify/Vercel/any other host with proper rewrite support, including a server of your own, and not have this problem.
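
For instance, on Netlify the equivalent catch-all is a single rewrite rule in a _redirects file, which serves the shell with a proper 200 status instead of a 404 (Netlify-specific and from memory; check their docs):

  /*    /index.html    200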


> but even when it works it necessitates javascript enabled

I think you are overestimating the number of people who have JS disabled outside of this website.


I've never worked on an enterprise-level site where more than 1% of users had JS disabled, or a percentage big enough to make a dent in conversions.


> Not only it doesn't work in all browsers even fairly recent ones

https://caniuse.com/serviceworkers

It works in 96% of users' browsers.


Apparently the demo doesn't work in Firefox 92's private browsing mode, so.


https://bugzilla.mozilla.org/show_bug.cgi?id=1320796

> Service workers are disabled in private browsing mode because they are impossible to use without setting up state tied to the origin and URL scope. This inherently requires writing to disk.

Had no idea this wasn't permitted.


> The tentative plan is indeed to use temporary disk storage that's encrypted so that in the event of a browser crash the data will remain unreadable. We're starting with IndexedDB support for this mode of operation and bug 1639542 is the meta-bug for that. This is a longer term effort.

Well that's a start.


Thanks for pointing out the ticket, I also wasn't aware of it.


Requiring javascript is fine in my opinion. Almost everyone has javascript enabled. And the few power users without javascript will enable it if they want to.


Yes, but as you said it's fun, and it would actually be practical if you put the unpacking server-side, as a way to save hard drive space. Or with a CLI tool. It could even shard the content among several images, or you could create a FUSE driver for it. Combine it with WebTorrent or a P2P WebRTC CDN for additional coolness.


> This has all its place on hacker news because it is a fun hack, but it's not at all practical.

No shit.


Relative to usage, it seems reasonably OK to rely on service workers. Something like 96% of browsers-by-usage have access to them. And probably most of those few users who use an incompatible browser can switch to a compatible one.


It's a fun project decoding a PNG image client-side. How do you propose it should be done without JavaScript? How does SEO make any difference - it's clearly a toy project, a proof of concept, not meant for real-world use.
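
Roughly, the mechanism being discussed is a service worker that fetches the image once, unpacks the bundled files from it, and answers page requests from that in-memory bundle. A hand-wavy sketch (not the project's actual code; decodeFilesFromImage stands in for the PNG-unpacking step, and the URL is a placeholder):

  // sw.js -- serve a site out of an image bundle (sketch only).
  const IMAGE_URL = 'https://i.imgur.com/xxxxxxx.png';
  let bundlePromise = null;

  async function loadBundle() {
    const res = await fetch(IMAGE_URL);
    const bytes = new Uint8Array(await res.arrayBuffer());
    // Assumed helper: turns the pixel data back into a Map of
    // path -> { body, contentType }.
    return decodeFilesFromImage(bytes);
  }

  self.addEventListener('fetch', event => {
    const path = new URL(event.request.url).pathname;
    event.respondWith((async () => {
      bundlePromise = bundlePromise || loadBundle();
      const files = await bundlePromise;
      const file = files.get(path === '/' ? '/index.html' : path);
      if (!file) return fetch(event.request);   // fall through to the network
      return new Response(file.body, { headers: { 'Content-Type': file.contentType } });
    })());
  });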


_____ requires a browser, so it's less resilient than a solution without a browser dependency.


Except for the SEO issues, everything else sounds like modern web stuff.


Great post, very fun project.

A while back I wrote a blogpost[1] about using PNG chunk functionality to "hide" arbitrary data in PNGs without it affecting the way a PNG is displayed. I'd love to see this being used together with this project.

[1] https://blog.brian.jp/python/png/2016/07/07/file-fun-with-py...
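
The gist of the chunk trick: a PNG is an 8-byte signature followed by length/type/data/CRC chunks, and decoders ignore ancillary chunks they don't recognize, so you can slip your own in before IEND without changing how the image renders. A rough JavaScript sketch (not the blog post's code, which is Python):

  // CRC-32 as required by the PNG spec (bitwise variant, no lookup table).
  function crc32(bytes) {
    let crc = 0xffffffff;
    for (const b of bytes) {
      crc ^= b;
      for (let i = 0; i < 8; i++) crc = (crc >>> 1) ^ (0xedb88320 & -(crc & 1));
    }
    return (crc ^ 0xffffffff) >>> 0;
  }

  // Insert a private ancillary chunk (type "prVt") right before IEND,
  // which is always the final 12 bytes of a well-formed PNG.
  function addChunk(png, typeStr, data) {
    const typeAndData = new Uint8Array(4 + data.length);
    typeAndData.set(new TextEncoder().encode(typeStr), 0);
    typeAndData.set(data, 4);

    const chunk = new Uint8Array(12 + data.length);
    const view = new DataView(chunk.buffer);
    view.setUint32(0, data.length);                      // length of data only
    chunk.set(typeAndData, 4);                           // type + data
    view.setUint32(8 + data.length, crc32(typeAndData)); // CRC over type + data

    const iend = png.length - 12;
    const out = new Uint8Array(png.length + chunk.length);
    out.set(png.subarray(0, iend), 0);
    out.set(chunk, iend);
    out.set(png.subarray(iend), iend + chunk.length);
    return out;
  }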


Fun fact: you can create binaries that can execute bit-for-bit on many platforms:

https://justine.lol/ape.html


I wrote this in another comment, but this is the same method Kingdom Hearts 3 used on the PC version for its save files.

https://www.pcgamer.com/kingdom-hearts-pc-save-files-are-tuc...


This reminds me of some wonderful 90s web games. Before XHR, an early web game built for Disney used a .gif as a one-way websocket to stream data into the webpage. This is due to the fact that a .gif animation can be as long as you want and a server can generate an infinite animation, where some frames have content that you want to push and others are empty frames. Or in other words, they were using primitive video streaming to encode relevant information within the frames.


Interesting, that reminds me of what somebody did to display an always up-to-date clock [1].

[1] https://news.ycombinator.com/item?id=14996715



That's demonstrating both server delays and a hacky way of updating a normal PNG. No APNG there.


If Imgur supports HTTP headers to get part of the image (byte ranges), then you can embed a SQLite database in the image.

“Hosting SQLite databases on GitHub Pages or any static file hoster”: https://news.ycombinator.com/item?id=27016630
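
The building block for that is plain HTTP range requests; whether Imgur's CDN honors them for arbitrary byte offsets is the open question. A quick check looks something like this (the URL is a placeholder):

  // Ask for only the first 4 KiB. If the server honors ranges, the status
  // is 206 Partial Content and Content-Range describes the slice returned.
  const res = await fetch('https://i.imgur.com/xxxxxxx.png', {
    headers: { Range: 'bytes=0-4095' },
  });
  console.log(res.status, res.headers.get('Content-Range'));
  const firstPage = new Uint8Array(await res.arrayBuffer());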


Shameless plug of my old dead project: this is great, it reminds me of my Android app "Smozzy" which let you browse the web via SMS/MMS. I apparently posted it on HN exactly 10 years ago tomorrow! https://news.ycombinator.com/item?id=2976764


I remember this! What have you been up to since?


That was a summer project after my last semester of undergrad, posted at the beginning of my first semester in a PhD program in AI (lucky timing there). Graduated in 2017 and have been working in the field since.


I once had a silly idea of encoding all my data as images (and other possible formats) and storing them on various sites not designed for storing arbitrary files, such as imgur. For robustness, I'd need to store several copies on different sites, and run a tiny server which would make sure all the data is still there and reupload it if something was deleted.


You could also do this with YouTube videos, btw. There is also a GitHub project doing that, IIRC.


One of my coworkers uploaded all his high-resolution photos to some site (probably Yahoo! Photos?) and lost the local copy. He was surprised that the site only stored the photos at a lower resolution than the original files.


Heh. Almost like having old style photos, you stored the negatives in grandma's garage, but they got wet and now have mold on them.

You can still make out who's who in them. Sorta.


In a normal photo, after a resolution change and a lossy compression, you still have the photo.

If you encode some binary data in a fake photo, after a resolution change and a lossy compression, you will only get garbage.


It depends on the encoding. If it's sufficiently low frequency and includes error correction, then your data could survive. You won't get very good efficiency though!


I still remember Warez being hidden in JPGs before torrents and general purpose filehosters became popular.


Yes! I remember when I figured out how binaries were encoded for Usenet! I’m old enough to remember Usenet but too young to have used it for anything but warez or media.


This concept is old... Back in the day we used these kinds of techniques to sneak all sorts of content onto ImageShack et al. Back then we didn't have Dropbox, online storage wasn't even a thing, so we sneaked movies, software or anything else into dozens of image files, hosted in plain sight.

This was 20 years ago or so... So many memories!


> You can use image hosting sites as free CDNs to save bandwidth costs.

It's not free for the image hosting sites.

That being said, it's a cool util.


I'm pretty sure it's also against the ToS of the image hosting sites, and they'll nuke the account if they realize that's the case.


You can use steganography by only using the least significant bit in an image, so it's not visually detectable that the image has a website in it. With compression it would be completely undetectable.
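
A minimal sketch of the least-significant-bit idea on canvas pixel data (illustration only; it assumes fully opaque pixels, and any lossy re-encoding by the host would still destroy the payload):

  // Hide `bytes` in the least significant bit of each R/G/B value.
  // Capacity is width * height * 3 / 8 bytes.
  function embedLSB(imageData, bytes) {
    const data = imageData.data;                       // RGBA, one byte per channel
    let bit = 0;
    for (const byte of bytes) {
      for (let b = 7; b >= 0; b--) {
        const i = Math.floor(bit / 3) * 4 + (bit % 3); // skip the alpha channel
        data[i] = (data[i] & 0xfe) | ((byte >> b) & 1);
        bit++;
      }
    }
    return imageData;
  }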


Yes although it's very unlikely that'll happen.


Sweet, how about hosting websites via Tweets?

https://twitter.com/rafalpast/status/1316836397903474688?s=2...


This is pretty ingenious. I wonder what happens if you "host" illegal content on imgur with this?


That happens constantly. Imgur has to review all content uploaded, and you can also report it.


Yes, but some image formats are valid even if the end of the file is garbage, meaning you could display a fully functional image and still host hidden content. Not easy to check.


I see what you mean. In that case someone just has to report it to their abuse department. Eventually someone will locate the offensive image and trace it back to imgur. Even if they can't immediately see it, they can see the abuse reports and investigate.

It's a hard life running an image uploading service. You have to wonder how much they're being paid in ad revenue.


I doubt most of the money comes from ad revenue. More likely, they get more cash from invisible PR campaigns; it's the same goal as ads, but more efficient, since the Imgur user base thinks the content they see is 100% organic. And so it probably sells for a lot more to bigger players.


I think a big portion actually comes from image hosting for other websites. Stack Exchange, for example, uses Imgur for image hosting when you attach an image to your answer.


Here is a tool if people are interested - https://stegonline.georgeom.net/upload

I use it a lot for a project trying to detect manipulated images, including stego-encoded images.


Just re-encode all uploaded .pngs. No need to check!


And how can they know that this [0] is illegal?

[0] : https://i0.hdslb.com/bfs/article/484c1e1a0c41d4482f6fc121132...


Could you perhaps qualify what makes that image illegal? I don't want to open it not knowing what I'm in for.


It appears to be mostly random-looking pixels with some black streaks and a top section that is noticeably dissimilar to the rest of it.

At a guess, it's the encoding of a binary file into pixel color values in a large image.


"You don't have permission to access the URL on this server."


You need to spoof the Referer (or just copy the link and open it in a new browser tab).


Sounds like a great way to encourage Imgur to clamp down CORS and ruin the fun for everyone.


I'm surprised they haven't already.

Imgur has really fallen over the years. It was first created by a redditor who was frustrated by how terrible free image hosts were at the time, like ImageShack and Photobucket. If an image hit the front page of reddit, it would very quickly turn into a "This image has reached its bandwidth limit", not to mention you usually couldn't directly link an image, and the image view page was riddled with ads.

Over time, though, Imgur has tried to become a social network, and is trying to stop being the go-to place when you just want to host an image.

But, to be fair, I don't really blame them. Image hosting gets expensive quickly, especially when everyone on reddit uses you. Of course, now reddit lets you post images directly to it, which on one hand sends less traffic to Imgur, but on the other hand means the traffic that does go to Imgur is more likely to see ads and browse the gallery rather than just directly download an image.


This is the circle of life in image hosts.

A new host emerges to address the sorry state of existing hosts, with a streamlined and straightforward path to uploading. Then the cold, hard reality begins to set in: you cannot make money as a dumb pipe that delivers user-supplied images. Monetization strategies begin subtly at first. Before you know it the site is desperately trying to sell your image on a coffee mug, or hijack links to display ads, or grow into a social networking site. Something, anything, that pivots away from straightforwardly posting images and linking to them.

One day, some bold new creator sees the sorry state of existing hosts, recognizes an opening, and the cycle begins anew.


https://imgbb.com/ is what I use now instead of imgur (or I self-host). It's light and easy, will see how it plays out in the future. I still remember liking ImageShack lol


I've thought about self-hosting, but I don't want to wake up one day to a $100 bill from AWS because a hosted image hit the front page of reddit.


I just use a regular VPS, so there's no AWS mystery billing going on anywhere AFAIK. Haven't hit the front page of Reddit but I have hit the front page of Hacker News for awhile.


I have thought about using a Lightsail instance. The cheapest one is only $3.50/month and includes 1 TB of transfer and should be plenty powerful to serve up static files.

Or yeah, just use DigitalOcean or someone else. They're pretty cheap, too.


AWS honestly always scared me because of the black-box aspect, but I use my single VPS for a bunch of sites with different functions and like having an understanding of what's going on there. But any $10-$20/month VPS would be able to host that kind of thing without too much issue.

Another option might be Dropbox. Not sure how quickly their free hosting will crap out if it hits reddit front-page level traffic, but it should be good enough to share here or with your friends or whatever.


There are countless other "free" image hosting services


I love fun projects like this! It seems like you're restricted to 5 MB (Imgur's limit); perhaps there is a way to chain multiple images together? I wrote another project in the same vein [1] that uses Discord file uploads: it compiles everything into a massive blob, then splits it up into many smaller chunks, giving (in theory) infinite cross-domain uploads. Seems to be still going strong.

[1] https://github.com/5ut/DiskCord


You could store stuff in URL shorteners, but I imagine that's against their terms. Also: how does this break Imgur's terms? If 'an image is just an image', I imagine it's okay to host websites in an Imgur abstraction layer, but I feel Imgur wouldn't be too happy about that. Obviously you wouldn't use this thing for anything super important.


Url shorteners, brilliant!


And if you use png, the image could even display as a regular one on imgur.


TIL GitHub READMEs support video playback now.


Sourcehut does not—nor does it support SVG. If you need something animated, you must use animated PNG (APNG) or GIF.


Nobody said anything about sourcehut though.


Do they have to? HN always talks about alternatives in every article. It's a bit of perspective on what another Git forge is doing in case people were curious. I've been both upvoted and downvoted for the comment so someone found it interesting.


This reminds me of the time I got free data transfer on a major cellular carrier by writing a script that would take large files, split them into 1MB chunks, prepend a PNG header to them, and send them through said carrier's email-to-MMS gateway (a corresponding script on my phone extracted the original file). My carrier's email-to-MMS gateway didn't re-encode images and I had an unlimited texting plan, so this was a nice setup. There was a de facto transfer limit of 500MB/day (Gmail limits you to 500 emails a day, with 1MB/email, and the email-to-MMS gateway rate-limited smaller email servers after 100 messages).


Nice. Reminds me of a semi-popular image host that executed PHP code embedded as an EXIF comment more than 10 years ago. Fun times.


Will imgur be able to detect this and remove it? Because it looks like many parties might try to abuse it: "free hosting".

If anything is provided for free someone will come and ruin it...


Imgur is already providing free hosting, and as the FUSE-over-Gmail driver has shown, it can be abused, although it's not very practical, and it probably already is.

I mean, if you gotta serve a malware payload, you are not going to host it yourself, so hiding it in an image on a public website is a good strat.


This is really great! There's an API to save images locally on most platforms, including mobile (Save To Camera Roll).

If the image gets recompressed during sharing, is it still possible to decode the website from it?

One of the problems I had when trying to build something like this in the past is that saving & sharing images was very lossy at the time. So I ended up using iCalendar and PDF attachments.


Now put this on the Blockchain and you will have miners across the world hosting it for you.


What a clever hack! The follow-up project freecdn [1] is even more robust, from what I can glean using Google Translate.

This is one of those things I would almost certainly only play with for fun, but it’s so damn clever, I just love it.

[1]: 635484


Don't these sites have a right to "reencode" your image as needed, which would destroy whatever carefully encoded data would be contained in it?


Reminds me of using

  copy /B foo.gif+bar.zip foobar.gif
on imageboards back in the day.


Is it possible to implement this using CSS instead of JS, since CSS is Turing complete?


You can use reddit to simply paste the code and comments as a branch.


Sweet, good job. The only thing is my Firefox 91 on KDE Neon says it does not support service workers, but the demo video looks great.


Firefox 92 on Windows doesn't either.


And I am being downvoted for some reason? Sometimes I do not understand this community.


Apparently it only happens in incognito.

As for downvotes, just ignore them; there are pitchforks everywhere, not worth even spending one second thinking about.


Demo didn't work (Chrome mobile).


I opened the sample site in brave on mobile and it doesn't load. Meh.


If you could expect it to work every time you used it, it wouldn't be brave.


Brave blocks a lot of stuff related to service workers.


It doesn't work for me on Brave desktop either.


turns out it doesn't work for a lot of other people...


> Web2Img is a tool to bundle your web files into a single image, and extract them via Service Worker at runtime.

Oh, too bad service workers and JavaScript are disabled. Your website won't load images then.



