I recently finished tearing down all my self-hosted services. Things facing the public internet might be convenient, but heck if I want to deal with that security nightmare on an ongoing basis.
All my git repos now live in a "git" folder I share amongst my fileserver and other machines with syncthing, hosting only the bare repos. If I need pull requests or issue tracking, it's got to be something that goes with the repository - as it always should've been, really.
If you’re the only user, you can avoid almost all the security issues by not exposing any HTTP ports to the internet, and instead accessing any web interfaces through SSH port forwarding. For example:
1. Run Gitea on your server, but make it only available on localhost (see the config sketch after these steps).
2. Connect to your server via SSH, and use the -L flag to forward the port that Gitea is using e.g.
ssh -L 8888:localhost:8888 me@myserver.com
3. Access Gitea by opening http://localhost:8888 (or whatever port you’re using) in your browser.
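For step 1, a minimal sketch of the relevant bit of Gitea's app.ini, assuming a stock install (the port is a placeholder and should match whatever you forward):

  [server]
  ; listen on loopback only, so nothing is reachable from outside the box
  HTTP_ADDR = 127.0.0.1
  HTTP_PORT = 8888

With that in place, the SSH tunnel above is the only way to reach the web UI from another machine.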
However, this doesn’t work if you’d like to collaborate with anyone who you don’t trust enough to give SSH access to your machine.
It would also be annoying or impossible to open the web interfaces on mobile devices.
This, 1000%. There's good reason infrastructure-as-a-service has become so popular: self-hosting _sucks_. Not only is there the constant and ever-growing security nightmare for which companies like GitHub have entire world-class teams, but let's not forget that self-hosting means you're a sysadmin again. Which _sucks_.
The interesting write-up would be the one describing what the experience has been like 6-, 12-, 18-months down the road. It's easy to be optimistic and idealistic at the outset, much more difficult to remain so in the face of the day-to-day realities.
(Nearly) all my projects live in an account on uberspace.com, a German shared hosting provider with a great offering.
You get a shell (ok, no root/sudo because shared hosting) with 10GB of space, databases, and whatnot, plus a vibrant community providing installation guides for a massive amount of tooling: https://lab.uberspace.de/
I am in no way affiliated with them, just a happy customer for 10+ years now.
Btw: it is pay-what-you-like, with a minimum of 1€ per month.
The hard part is remembering to update all of your software, and making sure you do it fast enough in case there is a zero-day.
The worst part is discovering that you installed a simple app and forgot about it; hackers got in, encrypted your files and database, and now want a ransom.
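It doesn't save you from the apps themselves, but automating OS-level security updates at least shrinks that window. On a Debian/Ubuntu box (a sketch; package names differ on other distros) that's roughly:

  sudo apt install unattended-upgrades
  sudo dpkg-reconfigure -plow unattended-upgrades   # enable the periodic security-update job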
What do you think about a service like the one I'm building, https://cloudblast.network (opens Apr 1)? Do you think it's something you'd try? It makes it easy to use self-hostable software, Gogs for example, and combines discovery, hosting and backup. Updates are automatic.
1 > Does this service allow me to deploy the apps on my own VPS and/or dedicated server?
2 > Does this service offer to host the apps I deploy?
3 > Does this service allow me to specify a docker container?
4 > There should be a full list of supported deployable apps.
5 > Can I use forks of the available apps, or deploy from a git repo?
6 > Do I get analytics (cpu, ram, visitors)?
7 > [Assuming clear answers on earlier questions] So I can deploy a ghost blog from my phone? Maybe this could be elaborated.
8 > Any plans to release a video example of the service's usage?
ANSWERS
1. On the roadmap, you'll have the ability to automatically deploy your apps to several locations (VPS, AWS, home servers, etc.)
2. Absolutely. Nearly any web app can be wrapped with a definition file and added to your personal library and shared with others. I find wrapping an existing app takes less than an hour.
3. Yes. Apps are docker containers wrapped with a definition file.
4. For sure! Once the service launches there will be a free tier with a fully searchable index.
5. Eventually you'll be able to choose versions of apps and import from repos.
6. That's an interesting idea. CPU, RAM, disk use, and bandwidth are tracked; number of visits could be too. I could see this being available as a power-user option.
7. Yes. It would look like opening the browser on your phone, logging into your account, and clicking the Ghost graphic. Now you have a URL to your new public instance of Ghost.
8. There's an instant use free tier but we're considering also making a short 60 second "Intro to using Cloudblast" video to jumpstart new users.
Sourcehut assumes you know your way around Linux and want to spend the time to understand how to host the services you need. The expectation will be that, if you run into issues, you're willing to debug and submit patches if necessary.
Alternatives like Gitea/Gogs are hostable as essentially single binaries and have clear guides on how to host on Docker. They're less feature rich, but more widely used in a self hosting context.
If you're looking for a middle ground, I've had good luck hosting GitLab, but it will consume more server resources than anything mentioned thus far and be a little more difficult to keep up to date.
I wouldn't. The last time I tried to use their software, I asked if anyone had any insights into how to use it in Docker (much of it is written in Python) and the response I got was cringeworthy. I'm just some person on the internet, so take that for what you will.
All of the sr.ht software is packaged for Alpine, so you can just install and configure them in your Docker container, the same way you would install and configure them in a VM.
There's no push-button magic for it, presumably because SourceHut the company (ie. Drew) isn't interested in running it in containers, but nothing is stopping you from trying. It's different from most Python software in that you're expected to use your OS's package manager and standard service tooling.
I did figure this out, but I decided not to run SourceHut mainly because of Drew's comments in the channel.
> (ie. Drew) isn't interested in running it in containers, but nothing is stopping you from trying.
Yes, he is. I asked in SourceHut's Freenode channel and Drew's comments were literally, "we don't take kindly to your kind around here, don't ask again."
There's an aspect of readability that is in the eye of the reader.
I've been thinking about this a lot. Often I hear (especially old-guard) hackers talking about ease and simplicity, but what they really mean is that it's familiar to them.
As someone who has been saying [0] for almost a year now, ever since GitHub made teams free [1], that you should at least set up a self-hosted Git repository or use one as a backup, I welcome anyone moving from GitHub to a self-hosted solution.
There are several open-source projects that have been doing the same thing on self-hosted solutions: their own GitLab instances (GNOME, GTK, etc.), cgit (WireGuard, Linux, etc.), and so on. ReactOS keeps a backup mirror in case GitHub goes down. [2]
We really need to get away from the idea of 'centralising everything'.
Linux even used GitHub once, for a very short period, as a backup location for hosting the main repo.
The workflows Git was developed for in the first place have little to do with the non-Git, mostly walled-garden features provided by hosted SaaS, be it GitHub or even GitLab.com.
Of course some projects find it convenient, but I'm not impressed by the embrace/extend approach, especially when it shapes technical practices because of workflow limitations that exist for mere business needs: if a merge request is not possible from and to any host, that's a big loss. That matters particularly because of geostrategic restrictions (mainly, in practice, the US "randomly" banning other countries, or the reverse: countries "randomly" banning various, mainly US, services).
Interesting! Recently I started self-hosting my personal repos using gogs (https://gogs.io/). It's really lightweight, and I can even run it on an arm64 box (the same one I use for my NAS).
If you’re looking for a more complete yet still lightweight alternative, give gogs a look.
Some background for those who may not know which one to choose: Gitea forked a while back due to the belief that the maintainer of Gogs was not moving fast enough. I can't speak to the quality, but the difference in activity level is extremely evident if you compare the two projects.
What arm64 box do you use? RPi4? I'm interested in having my own NAS, but I want it to be as seamless as possible, and I'm not sure if a Raspberry Pi would achieve that. That being said, I also don't want to spend too much lol... Perhaps I'm asking for too much here.
You can disable registration (and authentication through third-party providers) if that's what you mean.
Personally I'd rather use gitea though. It has a more healthy development community (as in more than one developer) and is gaining new functionality at decent pace.
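For Gitea that's a single setting in app.ini, if memory serves (section and key per its config cheat sheet):

  [service]
  DISABLE_REGISTRATION = true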
Probably a naive question, but is a git server required?
Could one imagine a git repository stored as static files (e.g. in a cloud bucket), with the client then running all the git functionality?
Or, a "serverless" git, with cloud functions?
(I guess concurrent writing would be an issue.)
EDIT: I don't mean ssh'ing to a server with git installed on it. I was thinking of a git-based protocol but based on static files. But thanks for the tips.
You can read up on the --bare option; you can point git at a remote location such as a network drive or even a local folder (I don't recommend this, of course), but my point is there is no hard restriction within git as to where a repo is hosted.
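A minimal sketch of that setup, with the share path as a placeholder:

  # create a bare repo on the mounted share
  git init --bare /mnt/share/project.git

  # point an existing working copy at it and push
  git remote add share /mnt/share/project.git
  git push share main

  # anyone else with the mount can clone straight from the path
  git clone /mnt/share/project.git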
A long time ago, when I was working at a company that still prescribed SVN, I used a backed-up network drive as the location for my bare git repos. I just used it to manage prototypes or feasibility work, nothing serious, but it was more familiar and productive for me than the accepted SVN workflow.
While I recently gave up self-hosting anything because I felt it was too time consuming, and instead rely on public services like Bitbucket for git hosting, I still have a Gitea instance running in Docker on a host that is set up to mirror all my Bitbucket repositories.
While probably not really needed, as Git repositories are self-contained, I find it nice to have a "backup" on hand.
There's also Pagure[1], which is in-between Gitea/Gogs and GitLab.
Unlike most Git forges, Pagure has features that support decentralized development. For example, Pagure supports submitting pull requests with Git repos on any server (regardless of whether it's running Pagure or not) with its remote pull requests feature. Issues and pull request metadata are all stored as git repos using JSON files as data, making it easy and portable to other Pagure instances and easy to convert for any other system.
There's development ongoing by one of the other contributors to build an extension for Pagure to support Forgefed[2], a protocol to federate and decentralize code project management. And unlike most Git forges, Pagure is really easy to deploy, update, and modify. Naturally, this means Pagure is packaged in virtually all major distributions[3][4][5][6], too. :)
I was using self-hosted GitLab (plus runner). It did the job perfectly well but was too resource heavy. Moved to Gitea and haven't looked back. Extensibility/plugins are not there yet, but for "git" itself it works wonders.
And if you need less than that, you can do a lot with just SSH and file system permissions for read-only deploy keys. I wrote a script that uses a directory structure with symlinked SSH keys as its config to add/edit/remove SSH access.
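For the read-only side, one common pattern (a sketch, not necessarily exactly what my script does) is a forced command in authorized_keys that only permits fetches; the key material and paths below are placeholders:

  # ~/.ssh/authorized_keys entry for the deploy key
  command="/home/git/ro-shell",no-port-forwarding,no-pty ssh-ed25519 AAAA... deploy@ci

  # /home/git/ro-shell
  #!/bin/sh
  # git-upload-pack serves clone/fetch; anything else (e.g. git-receive-pack
  # for pushes) is rejected, so the key is effectively read-only.
  case "$SSH_ORIGINAL_COMMAND" in
    "git-upload-pack "*) exec git-shell -c "$SSH_ORIGINAL_COMMAND" ;;
    *) echo "read-only deploy key" >&2; exit 1 ;;
  esac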
It appeals to me because I had gitlab for our 2 person team and it was just wrong sized for us. So many features went unused. Frequent upgrades meant an increasingly powerful server was required for doing basically nothing most of the time. CI/CD is not something we need.
If you've got a small team such that managing SSH keys is reasonable, you can just keep using git over SSH and add CI/CD via githooks. It's honestly very, very easy: put an appropriately-named shell script in the repo's hooks directory and git automatically executes it. Just make a script called post-receive that kicks off your process when a push has been made. The script will receive all the information it needs to know what branch/ref changed.
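A minimal post-receive sketch along those lines; the deploy path and branch name are placeholders:

  #!/bin/sh
  # hooks/post-receive in the bare repo (must be executable)
  # git feeds one "<old-sha> <new-sha> <refname>" line per updated ref on stdin
  while read oldrev newrev refname; do
    if [ "$refname" = "refs/heads/main" ]; then
      # deploy: check the pushed code out into a work tree, then rebuild/restart
      git --work-tree=/srv/myapp checkout -f main
      # systemctl restart myapp   # or kick off your test/build script here
    fi
  done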
I find that gitlab is generally overkill. Gitea [0] is quite simple to deploy and uses much less resources. If you do want CI/CD later, you can integrate a self-hosted Drone [1] instance.
Git web interfaces should be Bring Your Own, really. I never bother using those as I can have a consistent interface inside emacs. Why should the repository owner choose which interface you view it on?
I really don't want to have to pull an entire repo to check something out. I read open source code waaaaaaaaaaay more than I have any need to actually run it or do anything to it locally.
Because most self-hosted Git services like Gogs and Gitea include features not part of git, such as pull requests, releases, and issues, which require some kind of database.
That's an entirely different concern. If people didn't shove their repository in alongside those features, and instead exposed them in a standalone, documented manner, I would probably be tracking those features from inside my editor just as easily.
Just use rsync.net. All you really need is a secure, SSH-accessible system with the git suite. Publishing a website is not necessary since you can browse the code locally through gitweb.
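Roughly, on any SSH host with git installed (rsync.net included); host and paths are placeholders:

  # create the remote bare repo and push to it over plain SSH
  ssh you@example.net "git init --bare repos/project.git"
  git remote add origin you@example.net:repos/project.git
  git push -u origin main

  # browse locally instead of relying on a hosted web UI
  git instaweb --port=1234    # needs gitweb plus a local httpd (lighttpd by default)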
"Yesterday (2021-03-28) two malicious commits were pushed to the php-src
repo 1 from the names of Rasmus Lerdorf and myself. We don't yet know how
exactly this happened, but everything points towards a compromise of the
git.php.net server (rather than a compromise of an individual git account).
While investigation is still underway, we have decided that maintaining our
own git infrastructure is an unnecessary security risk, and that we will
discontinue the git.php.net server." - https://externals.io/message/113838
I've been self-hosting https://gitea.io/en-us/ with no issues. One-liner to spin it up in Docker; exposed on the LAN only and easily accessible from anywhere with an SSH tunnel thru my bastion. Trouble-free workflow and I feel safe knowing my source code doesn't live on a computer somewhere in someone else's closet[1]
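For reference, the one-liner is roughly this, assuming the official gitea/gitea image (volume path and host ports are placeholders; binding to 127.0.0.1 or a LAN-only address keeps it off the public internet):

  docker run -d --name gitea \
    -p 127.0.0.1:3000:3000 \
    -p 127.0.0.1:2222:22 \
    -v /srv/gitea:/data \
    gitea/gitea:latest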