
This is awesome and what I initially set out to do, but I always found a way to talk myself out of it. I dream big. I finally found a project I could stick with through "completion," but it took me two years to build alone. I've only lost money, but it's been a wild ride of an experience. This tiny-project approach sounds like a lot of fun, and conveniently my project will help me down that path while I try to market my work.


I'm working towards building the best possible experience for hosting multiple sites on a single machine at a fixed price at https://acrobox.io. Abstracting Docker didn't make the cut for my MVP but it may be the right fit for some of the commenters here.


That's one of many things that led me to build https://acrobox.io. Abstracting Docker away for a bit more of a Heroku-like experience is on the horizon. I think it is well positioned between something like Dokku and Render, but I'd be happy to hear what you think.


Hello!

I often see Ask HN threads discussing how people are deploying their apps, specifically solo founders.[^1][^2] Solutions range from Heroku to AWS to niche products or even rolling your own VPS, each with their own set of tradeoffs. I'm not alone in seeing that web development is complex.[^3] What complexities are absolutely required? What complexities are worth it? What complexities are worth paying for a solution to, and how much are they worth?

The job I had just left was making good money running off of a single VPS. Some prolific solo indie hackers have shared the minimal hosting and deployment solutions that brought them to what many would define as success. Removing distributed systems and per-app/per-database/per-X costs is a huge weight off of a creative and experimental solo founder's shoulders.

But even on a single VPS there are so many decisions to make and so much to configure. Non-root SSH access, a network-layer firewall, independently scalable block storage, automatic software updates, and deduplicated, incremental, encrypted off-site backups are the absolute bare minimum standards I hold for customer-facing applications, and I can only hope the same of others. You probably want SSL/TLS certificates with HTTP/HTTPS and naked/www redirects. Where do you store local data? That's included in your backups, right? What about file permissions?
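For concreteness, the first few items of that baseline might look roughly like this on a Debian/Ubuntu VPS. This is a hedged sketch, not Acrobox's actual provisioning; the username "deploy" is hypothetical:

```shell
# Non-root SSH access: create an unprivileged user (name is illustrative)
adduser --disabled-password --gecos "" deploy
usermod -aG sudo deploy

# Network-layer firewall: deny everything inbound except SSH and web traffic
ufw default deny incoming
ufw allow OpenSSH
ufw allow http
ufw allow https
ufw enable

# Automatic software updates (security patches applied unattended)
apt-get install -y unattended-upgrades
dpkg-reconfigure -f noninteractive unattended-upgrades
```

And that still leaves block storage, backups, TLS, redirects, and permissions to sort out by hand.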

Acrobox provides guidance and structure to all of this. Not all complexity can be removed, but I've done my best to minimize complexities and maximize freedoms, carving a new niche in this space.

I value your constructive feedback. I'm here to answer questions or find my email on my website if you prefer.

Thanks,

Phil

P.S. I wrote more in depth about the history on my blog.[^4]

[^1]: https://news.ycombinator.com/item?id=28838132

[^2]: https://news.ycombinator.com/item?id=16472887

[^3]: https://news.ycombinator.com/item?id=28275873

[^4]: https://pnelson.ca/posts/gone-indie


I wrote a temporal expression package, https://github.com/pnelson/te, to escape cron for background workers in greenfield work. I started writing an English-language expression parser for user-facing work; it functions a little like the reverse of the posted website, though it may be a bit more limited than using the package directly at present.


Same! I switched back to Firefox a few years ago when I discovered that logging into Gmail on Chromium (not Chrome!) crossed a hard boundary by also signing me into the browser account sync. I tested a few times to confirm before switching and I've never looked back.


My experience moving to Go from languages like Python and Ruby required a significant mind shift. I couldn't port my solution - I had to revisit the problem from the perspective of a Go programmer.


I launch my browser with profiles instead of using containers. I have three at the moment:

      firefox --no-remote --profile "$XDG_CACHE_HOME/firefox/home" --class="browser-home"
      firefox --no-remote --profile "$XDG_CACHE_HOME/firefox/work" --class="browser-work"
      firefox --no-remote --profile "$(mktemp -d)"


Profiles and containers solve two different problems. Multiple profiles keep cookies from a website you only log into from one profile invisible to the other profile. Containers go further: they prevent one website from leaving cookies that can be seen by any other website even within the same profile, and prevent that website from seeing any cookies other than its own.


AFAIU containers can be and are shared by multiple domains, though domains pinned to exclusive containers won't be visible to one another.

Opening a domain in a container, then other linked sites from that page, generally remains within the same container.


According to this thread, they largely solve the same problem:

https://news.ycombinator.com/item?id=20258432


Thank you. Just this past weekend I came to the realization that, after using JavaScript on and off for over 10 years, I don't actually know JavaScript. Not even close. I'm sure I'm not alone. Anyway, I've just skimmed through your offering and it looks like an entertaining read. This came at a good time.


JavaScript is the one language developers think they can use without having to actually learn it. :-)


This is fine for some scenarios but I've had a series of problems which drove me away from this kind of setup.

Default CarrierWave (Rails gem) settings try to clean up the tmp files after file uploads. This is normally a good thing, but Ruby thinks I am on a Linux box, so it tries to `unlink`, which fails catastrophically because the mounted drive is NTFS.

Then there are NPM modules. It is kind of funny because Node.js and NPM are actually pretty good on Windows for development purposes anyway, but when I run them from my VM, my OS is detected as Linux and file system operations like symlinks are attempted. I couldn't even install Express.

These are just a few of the things I ran into.


The symlinks problem can mostly be resolved by adding this magic incantation to your Vagrantfile:

      # to allow symlinks to be created
      config.vm.customize ["setextradata", :id, "VBoxInternal2/SharedFoldersEnableSymlinksCreate/vagrant", "1"]

That said, I have had times where npm via a PuTTY window refuses to install modules. However, I also have npm installed on my Windows box, so I can just drop down into a cmd window when that happens and run the npm command there.

