Why bother? Because a read-only, static Git mirror is more secure: there's no server software (other than a basic web server) that can be exploited. And nothing is always easier to maintain than something.
You still need SSH (or a git:// daemon) for write access, unless the entire environment is just for read-only cloning. But SSH access is presumably something the core contributors have already set up and use. (Probably shell access, too, but SSH logins can always be forced to invoke git.)
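As an illustration of that forced-invoke trick (the key, options, and username here are invented), it can be a single line in `~git/.ssh/authorized_keys` that locks the key to git-shell:

```
command="git-shell -c \"$SSH_ORIGINAL_COMMAND\"",no-port-forwarding,no-X11-forwarding,no-pty ssh-ed25519 AAAA... alice@example.com
```

With that in place, the key can run git-upload-pack/git-receive-pack for fetches and pushes, but an interactive login attempt gets refused by git-shell.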
IMO, static, read-only HTTP cloning is a killer feature of Git which AFAIK cannot be matched by CVS, Subversion, Mercurial, or Fossil, at least not out-of-the-box.
From a hook on the read-write master repository, outside the web server sandbox.
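For context, a hook like that can be quite short. This is only a sketch (the mirror path is my own illustration, not from the original setup): mirror every ref into the tree the web server exports, then regenerate the metadata files that dumb-HTTP clients need.

```shell
#!/bin/sh
# post-receive hook on the read-write "hub" repository.
# /var/www/git/project.git is an assumed path served statically.
git push --quiet --mirror /var/www/git/project.git
git -C /var/www/git/project.git update-server-info
```

`git update-server-info` rewrites `info/refs` and `objects/info/packs`, which is all a plain static web server needs in order to serve clones over dumb HTTP.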
One thing I'd still like to see, and am wondering whether I (as a nobody with no connections) might somehow be able to organise, is a global search. One of the main advantages of GitHub is discoverability, both as a project owner and as a project seeker. What if there were a server that distributed searches across the different self-hosted instances of Gitea/Gogs/GitLab/etc.? (Opt-in, of course.)
The search would be distributed amongst all those setups, which hopefully keeps it lightweight even at what would probably be a large query volume, and people would then have no reason to stay with a centralized company for git and bug-tracker hosting. You could even have a client that connects to the instances individually, so even the search-distribution server wouldn't have to be centralized; you'd just need a DHT (ahem, I mean blockchain!) or some other shared list of contributing nodes. And for very lightweight setups that still want to appear in results, one could perhaps have different levels of indexing, from a full code index down to repository names and descriptions only.
Or adding a code search feature to Software Heritage.
Edit: removed a part about ForgeFed, which I thought was what I had in mind. They do a lot of things that would be nice to have (slightly more convenient than just "login with GitHub")... except search, which is the one thing you can't work around simply by creating an account on each git server where you want to contribute. I'm not sure which part of their project counts as discovery, or why NLnet granted them money under the "Next Generation Search and Discovery" fund, which from its description is very clearly aimed at search.
I see that things like the Plume blogging platform have cross-instance search, but whenever I ctrl+F for "search" on any of their documentation pages there's nothing. For example, the docs say the four protocols Plume uses are ActivityPub (the link goes to the W3C spec, which never mentions the word search), WebFinger (its spec page, an RFC, also never mentions it), HTTP Signatures (just for verification, it seems), and NodeInfo ("not part of the federation itself, but that gives some metadata about each instance"). I've looked in more places, but this was one of the more concrete examples where there clearly is search yet I can't find the code or protocol for it anywhere.
Edit: checked another; PeerTube does not seem to search across instances.
In professional environments with Jenkins and SonarQube, I would prefer GitLab, though.
I'm keeping my eye on it though and if the situation ever changes, I will certainly take another look at it. Gitea seems like a great tool.
On the other hand, installation is copying a single Go binary to the desired location and running the configurator. Going from zero to fully set up and running took me about 15 minutes. At that point, Docker doesn't seem to offer much on top of that simplicity.
The project is here https://github.com/jonashaag/klaus and you can see a preview of it here: http://klausdemo.lophus.org/
My Fossil server, on the other hand, has had no problems at all. Unless your project requires Git, you could save yourself some pain by considering Fossil instead.
The method in the post is pretty lightweight for getting a shared git server up and running without Docker. Most of the guide consists of extra tweaks for web-based comforts, like converting README.md files to HTML.
All that said, I use a Docker-based install of Gitea for this. It doesn't have as many features as GitLab, but for an on-site option it is pretty good, and I find its simplicity to be a feature.
You'll still get spam users but they won't show up in search.
In fact, you could probably set up a honeypot with user profiles left publicly accessible. Then, when the spammers show up, search Google for the email addresses they use. Up should pop all the open-source GitLab instances that don't require a login to view user profiles. :)
Also, it's fully featured and WAY simpler to set up on your run-of-the-mill VPS. Personally, I host mine at Linode for $25/mo with backups enabled, plus a $10/mo worker for simple Docker-based builds. This keeps all of my git repos private and 100% in my ownership, which is how I prefer to work (sorry, GitHub).
GitLab is amazing and I'm a huge advocate.
On that note, however, I would encourage anyone with a passing interest in an SCM GUI to check out fossil-scm. I'm helping a colleague with a project, and that's what he is using. Really incredible piece of software with a lot of possibilities.
So the new plan is Google's source repo product, plus a mirror somewhere safe as a non-Google backup.
I have a dual-core box running about six containers, including GitLab, with nginx as a reverse proxy.
I use Let's Encrypt for TLS, basic auth for access control, and DuckDNS to point CNAMEs at my DSL connection's dynamic IP.
I've been running my dev environment like this for several years. Works great.
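For anyone curious what that looks like, here's a rough sketch of the nginx side; the hostname, certificate paths, and container port are all invented placeholders, not my actual config:

```nginx
server {
    listen 443 ssl;
    server_name gitlab.example.duckdns.org;

    # Certs issued by Let's Encrypt (paths are illustrative)
    ssl_certificate     /etc/letsencrypt/live/example/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/example/privkey.pem;

    # Basic auth in front of everything
    auth_basic           "restricted";
    auth_basic_user_file /etc/nginx/.htpasswd;

    location / {
        # Proxy to the GitLab container (assumed to listen on 8080)
        proxy_pass http://127.0.0.1:8080;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-Proto https;
    }
}
```

Each container gets its own `server` block like this, so one box can front several services behind a single IP.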