[dupe] Gitea – Alternative to GitLab and GitHub (gitea.io)
207 points by philonoist on June 6, 2018 | 121 comments



Notable previous discussions:

- https://news.ycombinator.com/item?id=17006503 (has a lot on why it was forked from Gogs and whether using the fork is still a good idea)

- https://news.ycombinator.com/item?id=13451783

- https://news.ycombinator.com/item?id=13296717


Git is a distributed version control system. Every repo can be the master repo. 'Master' is purely a function of convention. It is worth stressing that you need no software to 'host' git. Every git repository is 'self hosting', by design.
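
For example (host and path names here are illustrative), a bare directory on any machine you can ssh to is already a fully functional remote, no server software involved:

    ssh somehost 'git init --bare ~/project.git'   # create an empty 'hosted' repo
    git remote add origin somehost:project.git     # point an existing clone at it
    git push origin master                         # that's the whole setup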

Tools like this primarily provide a web client to a repository not intended as a working copy, with some optional non-git code collaboration tools, such as issues, and an inbox of pull requests (i.e. suggested patches). These are not unique tools; there are many other options for issue tracking and email.

I make this point because these solutions are trying to replicate smaller or larger chunks of github rather than provide alternative ways to use git.

That said, Gitea/Gogs is pretty, and might hold the hand of people only used to github. It is great that it is self-contained as a binary, so it can be used with minimal configuration, like other front-ends.

But I worry that the emphasis on tools like this suggests we have accepted that github-flavoured centralised version control ('gitversion', maybe, or 'gcs'), rather than git, is now the de facto standard for version control. Am I paranoid? Does it even matter?


For the vast majority of use cases centralization is an essential feature. You want that single source of truth, then you can have tons of branches etc. outside but without that single source everything falls apart if you are a team of more than one person.

I'd argue that even a single developer really benefits from a "centralized" repository. It helps with maintainability, backups, syncing between machines (oh, my workstation wasn't on, can't update my laptop...).

Pulling code from random developers' machines might sound pretty neat in isolation, but that is the exception, an edge case to something else. And that something else should most likely be a centralized repository.

When people say they love decentralized version control I often get the impression that what they really love is local commits. There is nothing that says that you can't have local commits in a centralized version control system; it is just that being decentralized is a neat solution for many problems. But that is just an implementation detail that very few would have cared about, had it not been hyped to death.

Now this is something entirely different from the whole developer community putting everything in one basket (github); that's the only type of centralization worth worrying about.


> Pulling code from random developers machines might sound pretty neat

Isn't that just a straw man argument?

To the extent that it applies to distributed version control, it seems like an invented problem (i.e. I don't know anyone who uses a dvcs that pulls from 'random' developers). To the extent that it is true (we are all probably guilty of using code from developers we don't know and have not properly vetted), it seems like just as much of a problem, if not more, with github-esque projects.

> I often get the impression that what they really love is local commits

That's a really good point I have not considered before. So would you say that git's main contribution to the development community is showing the power of a VCS with local commits? And that's why github is dominating core git?


By random developers' machines they meant the actual machines. You can technically pull from another developer's (random or not, but anyone with the repo) actual laptop rather than going through an intermediary.
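
Concretely, with a hypothetical teammate's laptop:

    git remote add alice ssh://alice-laptop.local/home/alice/project
    git fetch alice                 # pull her branches straight off her machine
    git merge alice/feature-x       # assuming she has a feature-x branch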

That was in fact one of the arguments a few true believers used in favor of DVCS when it was still a novelty.

The thing I think truly gives developers comfort, in addition to all the nice features it makes simple, is LOCKSS (Lots of Copies Keeps Stuff Safe). When everyone has a full copy of the repo, even if you generally work through an intermediary, the central repo being blown away or taken control of by a bad actor doesn't mean you need to reconstruct the history of the code from whatever single snapshot checkouts individuals happen to have.

That said, I will agree that the workflows enabled, as well as the power of a common platform, are probably what really let git and github become the new default.

edit: Also I imagine being able to paw through the entire history and bisect entirely locally are vital features for a small (but, especially in open source, essential) minority of developers.
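
This is also why a mirror clone makes a complete offsite backup, e.g. (URL illustrative):

    git clone --mirror https://example.com/project.git   # all refs, full history
    cd project.git && git remote update                   # refresh it later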


I interpreted “random” in the parent comment to mean “ad hoc, but planned”. Meaning, not random, untrusted or unknown engineers, but engineers from an obvious trusted group (like my team), just accessed in an ad hoc manner.

The point being that short of choosing a convention for which person’s copy (or which machine, etc) counts as the primary source of record (which would then just be poor man’s centralization with the admin costs and maintenance of any centralization tools now a burden on the team rather than a third-party product), then interacting in an ad hoc way with even just my known team’s set of distributed clones of the project turns into a bookkeeping problem nobody wants.

I can also imagine a non-centralized model can lead to many more complicated workflow failure modes. I’m just thinking about how often novices get stuck with rebasing errors or squashing commits incorrectly, and bikeshedding arguments over whether it’s ok to ever revert master or if you should intentionally keep mistakes in the history and correct with new commits. I can imagine it being even worse when there are fewer conventions, since conventions are often the only way to avoid these bikeshedding debates about preferred git hygiene methods.


Heck, even torrents, which are distributed, needed a central server to connect through for ages. There's definitely a benefit to having a central server that aids the sea of clients / servers to communicate.


Aren't torrents using DHT?


For which they need a number of central servers. These servers do not do anything with the content, but they serve to announce to any new client what other clients are there, so that these new clients can start setting up their network. Once that's done, the new clients can discover the other nodes through the nodes they just connected to, and the servers have done their job.

But the servers are necessary, as far as I know, unless you want to enumerate IP addresses and scan for open ports. Which is going to be interesting with IPv6.


You are mistaken. DHT is decentralized.

https://en.wikipedia.org/wiki/Distributed_hash_table


The bootstrap problem isn’t generally solved though - a new client needs to know a set of other clients to connect to, same with a client that’s been turned off for a while and whose cached set of other clients aren’t online. This is solved by an always-on set of clients that these clients are statically configured with.

In theory, yes, if they go down, you could configure your client to point at another well-known client, though.


AFAIK the Linux Kernel doesn't have a single source of truth.


Yes it has - https://git.kernel.org/pub/scm/linux/kernel/git/torvalds/lin...

Of course some (many) people are using forks, but all sync directly or indirectly with this dude's version.


You are mistaken.

Linux kernel dev branch single source of truth is https://git.kernel.org/pub/scm/linux/kernel/git/torvalds/lin...

The stable branches have their single source of truth repositories, too.


I think replying to "it doesn't have a single source of truth" with "actually there are multiple single sources of truth" kind of makes the OP's case for them.


What are you saying??

There is a single source of truth for a particular version of Linux.


I'm saying, you're wrong, and trying to wriggle out of being wrong by redefining the statement.


Before I read the other responses, I assumed it was just Linus’ personal repo


Same here.

A few days ago I read about the multiple Linux distributions that all have their own forks with special patches, etc.


Independent of Gitea (which I have nothing to do with, nor any relationship with anyone involved): your comment "It is worth stressing that you need no software to 'host' git. Every git repository is 'self hosting', by design." worries me. Yes, it is technically accurate, just as notepad, webstorm, atom, vi, emacs, etc. also don't need any 'host'. If you aren't working with multiple authors, a local laptop with git plus perhaps a USB flash drive for backup is all you need.

But if you plan to actually have a project that multiple people will work on, it must be hosted whether it is technically required or not IMHO :) Let's say there is no Github or Gitlab type solution... how do you send me a pull request? How do I clone your git repo? In reality, there is always a hosted repo.

If you want to open your firewall and share your public IP address so that you are the "hosted" solution that I push commits to or send pull requests to, then fine :) But there is always a hosted solution if there are at least two people working on the same project... whether that's gmail.com hosting your emailed zip attachments, or you opening your firewall on your laptop so other people can connect to it, etc.


I took the comment as referring to hosting just a git repo on a public server. If you are working on a private project a simple option is to get a cheap VPS, give all your developers user accounts, and clone the repo into a shared folder. Everyone can pull and push with ssh and you can use Trello or whatever for issue tracking. Certainly that doesn't cover every use case, but it's worth being aware of the option and not feeling that you can't use git unless you pay github or spend a lot of time setting up a self-hosted clone of github.
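
A minimal sketch of that setup, assuming a 'developers' group and illustrative host/paths:

    # on the VPS
    sudo git init --bare --shared=group /srv/git/project.git
    sudo chgrp -R developers /srv/git/project.git
    # on each developer's machine
    git clone vps.example.com:/srv/git/project.git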


How is getting a VPS and running a git server, then setting up everyone's SSH keys and accounts, and then setting up accounts on trello and whatever other services you need (how are you doing CI in this setup?) meant to be easier than dropping gitea on a server and having everything you need in one place, with a simple GUI?

Sure, you CAN use git without a web UI. We had that at work for a while and it was not a good experience at all compared to using gitlab, which we have now, and we now have so many features that let us get things done faster.


Sending patches to mailing lists is a popular way of working that is well supported by git.
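
The whole round-trip is built into git itself; roughly (list address illustrative, and git send-email needs SMTP configured):

    git format-patch origin/master                 # one .patch file per local commit
    git send-email --to=list@example.org *.patch   # mail them to the list
    # the maintainer applies them with:
    git am 0001-some-change.patch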


Playing the devil's advocate: then you still need somewhere that people can get the current repository state from. This place might not need to do anything with pull requests or any other write action on the repository, but somewhere to clone from is necessary. This would then be the "hosted" repo, and the patches sent via mail are just that, patches.


Sounds like you're arguing semantics. Yes, any server you connect to as a client to clone a repo is "hosting" that repo. But that can be practically any computer; it doesn't need to be a _dedicated_ host in a server farm, it could be a dev's laptop.


Exactly :)


Fair enough, but there is still a hosted solution, it's just hosted email instead :) I know Linux used that approach, which is seemingly ironic since Linus made git, and previously used BitKeeper. There was seemingly some problem that emailing patches to a mailing list wasn't solving, which motivated the invention of git.


You can email git patches though (which is essentially what pull requests are, really, just using a different interface).


You could use ngrok to expose yourself temporarily for distribution.
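
An untested sketch of how that might look (the forwarded host/port comes from ngrok's output):

    git daemon --base-path=. --export-all --reuseaddr &
    ngrok tcp 9418                          # tunnel the default git:// port
    # peers then clone via whatever address ngrok prints, e.g.:
    git clone git://0.tcp.ngrok.io:12345/project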


I agree; additionally, git has a built-in server for sharing repos over the network.

What I'd personally like to see is a repo viewer similar to Github but implemented purely in JavaScript, cloning repos on the fly just like native git does. Git.js (https://github.com/yonran/git.js) has something like that but it looks old and ugly.


There are some real world problems with that:

* Many networks limit access between wireless clients, preventing connections between developer machines.

* Host-based firewalls often prevent services from listening on client machines. Getting security exceptions for this can be difficult to impossible in some organizations.

* Discovery and initial remote config can be painful. While largely trivial from a technical perspective, this kind of thing can cause resistance from many end users.

I'm not arguing against your suggestion, but these are things that need to be solved if this workflow is ever going to see mainstream adoption. Network infrastructure and security has been geared towards centralization for a long time. That's going to need to change if we want to empower distributed applications of this nature (which I believe we do).


> I'm not arguing against your suggestion

Well, actually I wasn't suggesting that, just emphasizing git's distributed nature.

What I do suggest is a federated model, a large number of small hosts akin to Mastodon. The real issue would be, like you noted, discovery. Centralized solutions have it nicely solved but it's hard to do in general with decentralized ones.


What is this built in server git has? I thought sharing git repos over a network was depending on OpenSSH or similar.



Probably thinking about git-daemon (for git:// urls).


Yes, and actually git can be served via http(s) too, through git-http-backend behind a regular web server.
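
For the git:// case, a minimal invocation looks something like (path illustrative):

    git daemon --reuseaddr --base-path=/srv/git --export-all
    # read-only clones then work with:
    git clone git://server.example.com/project.git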


The power of DVCS is in that each developer who clones a repository has a fully-functioning local repository complete with history, which can be committed to freely and seamlessly merged into the upstream later.

As opposed to the traditional version control model where, e.g. every commit effectively requires a rebase against the remote and the history cannot be retrieved without a connection to the remote.

The "D" in "DVCS" is about having many copies of the repository, not about having _no_ central repository, which is still a core part of having an effective delivery workflow and very much encouraged by the baked-in concept of a default remote repository.

It's a distinction of technology, not of workflow.

This is really a common misunderstanding about what makes DVCS an effective concept.


The more I think about it the more I think Fossil[1] has the right approach by bundling bug tracking and wiki within the VCS. If git did the same thing (and the tools were usable) then nobody would worry about github "going bad" or anything like that. Changing your host would be as simple as changing your 'origin', anybody could replicate all the information easily. When I first heard about Fossil it sounded a bit like feature bloat but I think that was a bit of a kneejerk reaction.

In retrospect I think it makes complete sense to tie code, documentation and issues. They evolve together, maybe they ought to be versioned together.

[1] https://www.fossil-scm.org/
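
The "changing your 'origin'" half of that is already true today for the code itself; it's the issues and wiki that don't come along (new URL hypothetical):

    git remote set-url origin git@newhost.example.org:project.git
    git push --all origin && git push --tags origin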


I feel like asking a project manager to fiddle with some SCM system to get through grooming could be a bit unreasonable. I can't see how Fossil does ticketing, so I can't tell if it's PM-friendly…

I feel like code reviews naturally should be stored alongside the code they're reviewing, like the PoC shown by git-appraise from Google[0].

0: https://github.com/google/git-appraise


I've mused on this in relation to github's pages sitting on the gh-pages branch: a feature I like in the abstract, though I have problems with their implementation (it seems designed to increase lock-in rather than increase flexibility). I'm not sure the VCS needs to build in those tools, but having an established branching convention for associated data could allow many flowers to bloom.


You’re not paranoid, and it does matter. Distributed systems definitely struggle to stay distributed. Git is a great example, as are most cryptocurrencies, which despite being distributed by design, generally come to rely on centralised wallets and trading platforms.

The web itself seems to be becoming more centralised: more people spend more time on fewer websites. Websites are increasingly being hosted on a tiny number of mega cloud services, rather than millions of independent server rooms.

Centralisation is a real concern.


Isn't 'origin' (not 'master') the convention you're referring to?


Both "origin" and "master" are conventions. They are promoted by the tooling that gives them a sort of default status.


In the context of the original post, he is referring to origin (the convention for a default remote repository), not master (the convention for a default branch).


Use ngrok to share repos.


Entertainingly the code is hosted on github. :-)

I'd have more confidence in it if it could self-host and I was able to see gitea inside of a gitea instance and that was the main workflow. Like this, it feels like maintainers aren't prepared to eat their own dog food just yet. That's fine, but I'll take a pass until that's fixed.


IIRC they have an active bug/feature ticket to that effect so they are on the case. The issues holding them back are not core features of source control but the other integrated features that a significantly sized project finds useful: CI support, certain discussion features, and so on.

So for small projects I wouldn't let this put you off, and for larger ones, or those where for some other reason you want more than just good source control, make a judgement based on the features you expect to need.

Of course it is git in the back-end, so if you start needing just an interface onto source control but later desire missing features that gitlab/github/other has, migrating the repositories should be almost no effort.


See https://github.com/go-gitea/gitea/issues/1029 - self-hosting is almost there.


An insightful comment linked to from that issue: https://lobste.rs/s/gokjbo/gitea_1_1_0_released#c_dg9pwe


This is exactly the right idea. Just like Golang was written in C until ~1.5, you can't develop something "right" if you're relying on it as your tool. Use something that works until you're stable, and have all the features you think it needs in the "core" functionality, then by all means, eat your dogfood.


Well, it had to start out somehow. The original project started off on GitHub so it just kinda stayed there. It does cost money to host these types of applications, and it's easier to have contributions on GitHub since it doesn't require most people to register a new account. It certainly has trade-offs for sure.


> It does cost money to host these type of applications

I'm hosting Gitea (together with around 10 other services) on a 1/1 instance (1 CPU, 1 GB RAM) at Hetzner. Those can be had for € 2,50 per month.

> its easier to have contributions on GitHub since it doesn't require most people to register a new account

That's the big issue.


>> its easier to have contributions on GitHub since it doesn't require most people to register a new account

> That's the big issue.

I don't think so: Github can work as OAuth provider, Gitea supports OAuth integration, so the self-hosted instance can still authenticate people using their Github accounts.


> I'm hosting Gitea (together with around 10 other services) on a 1/1 instance (1 CPU, 1 GB RAM) at Hetzner. Those can be had for € 2,50 per month.

But can the Gitea devs scale it to support potential growth once that starts costing more than $10 a month? This is the reality too.


Gitea is not going to have hundreds of developers working on it, and users will not visit their project page all too often, so I don't see any scaling issues (besides, maybe, network bandwidth when they put out a new release and everyone comes in to grab the tarball).


We (a company of 4 people) run gitlab on 4 vCPUs / 4 GB RAM (which is slow when browsing repos) vs Gitea on 1 vCPU / 1 GB RAM, which is still ridiculously fast.


How do you find the performance compared to Github?


I think the backend is really fast, but I believe that is because when I self-host it is on a local server with extremely low latency.

The UI is somewhat slower though; code highlighting shows up a second after page render.


> Like this, it feels like maintainers aren't prepared to eat their own dog food just yet.

Sorry, but that's dumb. Go offer them a decent free server and paid hosting and I'm sure they'll accept your offer.


You could open a well-worded issue for it (“Gitea code base should be hosted on gitea instance”) and track that. There is a chance that the developers will close it as a WONTFIX of course.


That issue already exists and it's being worked on: https://github.com/go-gitea/gitea/issues/1029


> well-worded issue

Your example is not well-worded; it's a demand. I'd expect a well-worded issue to share some story about how you were working to achieve some goal, so you tried setting up gitea to self-host the gitea repo, and were surprised that it didn't work (and share the errors etc. that you saw). Then the devs can prioritise your issue by understanding how blocked you are by the missing behaviour, and even propose a workaround or alternative way of achieving your goal.


The phrase between parentheses was meant as an example of the succinct title of such an issue, not the hopefully polite and well-worded body explaining why.


> the succinct title of such an issue

'Title: do this thing' is not polite or well-worded.

"Let's self-host gitea so potential users can easily see how awesome it is" is more polite, far easier to see the benefit of, and a trash-fire of proper composition.


In addition to what other replies have said, it could also be a matter of time and expertise. Operating a highly-available, backed up service requires time, effort, and skills that could be put to better use writing code.


Like if you ran couch-to-10k and only let people on the program who posted certified 10k times ...


Time for a decentralized blockchain of code commits...


Gogs, the project Gitea was forked from, works just as well. As far as I can see there's no technical reason to prefer the fork over the original.

Judging from Github stars, Gogs is also vastly more popular.


I compared them yesterday. There seems to be more development activity and more contributors for Gitea (check the pulse tool on github for the last month). I haven't got a stake in this whatsoever, but I recall the rationale for the fork being that PRs would remain unmerged for months without comment due to the inactivity of the original author, who nevertheless wanted to retain control?


Gogs has many issues that no one wants to fix.


Find the 7 differences...

- https://github.com/go-gitea/gitea/commits/master

- https://github.com/gogs/gogs/commits/master

gogs is mostly maintained by a single developer, and has less activity than gitea.


A frantic commit free for all isn't necessarily better. I count nine different developers committing to the project's master branch in a single day. That has tradeoffs too.


Comparing commits is mostly pointless anyway. Some people commit very often any little change (like some indentation change) while others commit only when a feature is done.

Compare releases and features.


Gitea requires peer review for pull requests, a healthy process that promotes quality. Gogs is managed exclusively by its BDFL.


The best way to compare the features could be https://docs.gitea.io/en-us/comparison/ which is a really fair comparison.

Edit: This list has been created on a best-effort basis; if you find any wrong information, a pull request is always welcome.


Online Gogs demo here: https://gogs.io/


It is an awesome product; I really want to try it a bit more at home later in the evening. Don't you think it would be a good idea, considering it is meant to self-host git projects, to self-host your own project? I mean, why don't you use your own product?


I'm hosting Gitea on a $5/month DigitalOcean box; I had to set up disk snapshots and db backups for security.

It's been running for a year and I've never had a problem with it. It only consumes 20MB of RAM and is extremely fast. Easy to deploy (I used ansible).


They don't use it because it doesn't have all the features yet: https://github.com/go-gitea/gitea/issues/1029


It's probably a cost question; to my knowledge gitea.io is a github pages website. The only running cost is really the domain.

Running the git service is probably a bit expensive and you lose the network effect of github.

They could probably migrate to gitlab though.


gitea.io is run on Digital ocean. You can see the expenses here: https://opencollective.com/gitea


This is the announcement of Gitea, which is a fork of Gogs.

https://blog.gitea.io/2016/12/welcome-to-gitea/

"We’re a growing group of former Gogs users and contributors who found the single-maintainer management model of Gogs frustrating and thus decided to make an effort to build a more open and faster development model."

This is a guide to run it with Docker https://docs.gitea.io/en-us/install-with-docker/
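
The guide essentially boils down to something like this (data path and host port mapping are illustrative):

    docker run -d --name=gitea \
      -p 3000:3000 -p 2222:22 \
      -v /srv/gitea:/data \
      gitea/gitea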


Why is it considered to be an alternative? Github is all about social networking for developers, not code hosting. Or am I wrong?


I'd be interested to see some numbers about projects that actually leave Github. Other than FUD about the Microsoft acquisition, has something changed on Github that is making people talk about leaving? Why run away before there's a concrete reason?


I don't have numbers, but my opinion is that the serious big/popular projects with some user base (contributors) will not bother, at least not yet.


You are wrong.

GitHub is a hosting platform for git repositories, which has collaboration tools (such as issue tracking and code review) built into it.


True, indeed it has all the features you mentioned. But do you think people stick to GitHub because these features are so hard to replicate?


What I like about the landing page is that all the info is there. I don't have to scroll down through whole screens at a time for each titbit of info.


I was about to comment the exact same thing. Excellent seeing this kind of marketing page in a world full of noise.


I wonder whether the user-friendly features of sites/services like GitHub might eventually be displaced by feature-rich git clients like Tower[0].

There are still plenty of things you might want centralized on a server somewhere, but it seems like a lot of the value add of GitHub, GitLab, and now Gitea is in making git repos easier to manage and interact with.

It's interesting to think about how far you could decentralize that, ideally with a "cambrian explosion"[1] of OSS and indie-software clients.

[0]: https://www.git-tower.com/

[1]: https://en.wikipedia.org/wiki/Cambrian_explosion


Somewhat related, I've been thinking of how to speed up git on hosts with limited resources. As an example, the firefox repo has a 275MB index file which has to be loaded whenever you want to read the repository. On a host with 1GB of ram this doesn't work so well.

I think that storing repositories in loose format would make them much faster to read, but maybe I'm missing something. Any thoughts?


The index is just a cache of the metadata of files in the working directory; changing how the objects in the repo are stored won't speed up a 'git status'.

When you say 'read the repo', it makes me think you're more interested in the behaviour of cloning from a remote.

Loose objects would avoid the need to inspect packfiles, but… that code's all written in C and mmaps the contents & does fast seeks. Most likely the slow parts are reconstituting objects from packs (also mmapped C with fast seeks) or delta-compressing objects for git-upload-pack to send to clients. Going to loose objects doesn't help if the remote still burns CPU creating a pack: try using a dumb remote instead of a smart one? You're trading CPU for network now.

If you're more interested in improving performance in a clone: loose objects avoids the need to (fast mmapped C) read a packfile. The index still has to track if you've changed any checked-out file in the working directory, and if there are a lot of files, it's going to be big.
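
If you want to poke at what is actually in the packs, git ships tooling for that:

    git count-objects -v                                      # loose vs packed object totals
    git verify-pack -v .git/objects/pack/pack-*.idx | head    # list objects in a pack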


I was thinking about this from the perspective of a server like Gitea. What I meant by 'read the repo' is retrieve objects like commits to display them to the user.

So on the server you should only ever have packfiles, and in order to efficiently read packfiles you read the index (idx) file. I'm not positive, but I think that this file needs to be read in its entirety in order to access an object. Even if you don't have to read the whole file, it's probably best because you generally read more than one object at a time (e.g. if you display a list of files in HEAD you read the commit pointed to by HEAD, read its tree, and read all of the blobs in its tree).

My thought with using loose files rather than packfiles is that you wouldn't suffer the memory overhead of lookup; you just open the file at `objects/some/object` and parse it.

The real solution here is probably to get a server with more RAM and cache repositories. I'd be interested to hear what GitHub does.


GitHub uses DGit to (effectively) get loose objects on demand and cache them locally.

Parsing the packfile indexes is ridiculously fast; even in a memory-constrained environment the OS will manage loading data from disk so you only use a few pages. Inflating objects from packs is slower & will trash your memory; rendering to HTML will be even worse.

Perhaps 1GB is not enough RAM to host a webviewer of the firefox repo? Maybe if you generate a static site version of it…


> DGit

I was looking for this yesterday, glad to see I wasn't misremembering.

You're right, 1GB is not enough but it's all that I have so I have to make do.


What operations do you want to do? Have you considered a shallow clone?
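
For reference (URL illustrative), that's just:

    git clone --depth 1 https://example.com/big-repo.git   # history truncated to one commit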


I forgot to mention that I'm thinking about it from the perspective of a server like Gitea. So I want to retrieve objects like commits and blobs from the repository.


Is there a hosted version of Gitea? Would it be possible to host it and offer unlimited repos for let's say $5/mo as a service?


I don't know about Gitea specifically, but notabug.org uses Gogs which Gitea is a fork of.


I offer Gitea hosting. Storage is the most costly part of it.


> unlimited repos for $5/month

Plus storage, sure


I have always seen GitHub as the social Git. Of course I use Gitea for personal & private projects, but if you want visibility and to gain from the network effect, GitHub is where you need to be.

With this MS acquisition, that proposition is starting to become a problem and an uncool dependency on MS.


Gitea is working on federation, so lock-in wouldn't be a problem


Just wondering if it's time to start to decouple the client and the server. I did that with mail and use a native client only, and hosting a git repo on DO or similar should not be too big of an issue either.


For a solo dev, running a remote is a great idea and way cheaper than paying for GH private repos. You're giving up some availability and disaster recovery, but as a solo dev those aren't big problems (spin up a new host, configure your SSH keys, push). I did this years ago!

When the remote is shared (with other devs or tools), then you have the hassle of provisioning accounts, updating keys when they get lost, implementing ACLs, setting them, recording who changes what refs for audit trails, availability/backups becoming an actual problem, managing disk space + garbage collection, etc. The time you spend on those interruptions is time (and concentration) that you're not spending on what you want to do. That's where the value proposition of GitHub comes in…

GitHub then has the value-adds/lock-in of easy webhook integrations, gh-pages branch, issues, wiki, and the web UI.

If all you want is somewhere to push code that's always available & private to you, then I'd look into using some cloud-based object store to host your repo. If you want to share the repo with other devs/tools… let me know where to find 99 other people willing to spend $5/month on this kind of thing.


Gitea is working on federation so there wouldn't be lock-in.


HN discussion on available alternatives to GitHub. https://news.ycombinator.com/item?id=17241487


Does Gitea still not have a good mechanism for backup and restore? I wanted to move away from Gogs but this is a show stopper for me.



Well, I think besides gitea dump there isn't really another way of backing up.
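
For what it's worth, that one command does bundle everything into a single archive (config path may differ per install):

    gitea dump -c /etc/gitea/app.ini   # writes gitea-dump-<timestamp>.zip with repos + db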


I use ZFS snapshots with send & receive, works very well!
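
A sketch of that, with hypothetical dataset and host names:

    zfs snapshot tank/gitea@2018-06-06
    zfs send -i tank/gitea@2018-06-05 tank/gitea@2018-06-06 | ssh backuphost zfs receive backup/gitea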


I'd be more convinced to give it a try if Gitea's repository itself was hosted on Gitea rather than on GitHub.


The project is working on that, see this issue for status: https://github.com/go-gitea/gitea/issues/1029


> Open Source: It’s all on GitHub!

There's some level of irony in a GitHub alternative that's hosted on GitHub.


It is currently something that is being worked on. See this issue for status: https://github.com/go-gitea/gitea/issues/1029

Self-hosting also takes resources such as time and money, and not all open source projects have the latter.


GitLab was initially hosted on github too


> Gitea is a community managed fork of Gogs

Why fork the project instead of keeping a single one?


There was an issue with the Gogs maintainer going AWOL for long periods of time. An earlier fork was merged back in when he returned, but a subsequent incident resulted in a permanent fork.

Also there were some governance issues IIRC.


I stop at self-hosting


lol microsoft just announced it's going to buy GH and GH is already down :p


Looks great, but the fact they won't dogfood means I'm gonna stay away for now. Definitely have my eye on this as an option though.



