
The GitHub registry public beta is live - talal7860
https://help.github.com/en/articles/about-github-package-registry
======
carapace
FWIW, Software Heritage already has your github repos:
[https://www.softwareheritage.org/](https://www.softwareheritage.org/)

[https://hn.algolia.com/?query=Software%20Heritage&sort=byPop...](https://hn.algolia.com/?query=Software%20Heritage&sort=byPopularity&prefix&page=0&dateRange=all&type=story)

And GNU Guix at least will transparently fallback to them:

> Since Software Heritage archives source code for the long term, Guix can
> fall back to the Software Heritage archive whenever it fails to download
> source code from its original location. The way this fallback has been
> designed, package definitions don’t need to be modified: they still refer to
> the original source code URL, but the downloading machinery transparently
> comes to Software Heritage when needed.

[https://www.softwareheritage.org/2019/04/18/software-heritag...](https://www.softwareheritage.org/2019/04/18/software-heritage-and-gnu-guix-join-forces-to-enable-long-term-reproducibility/)

~~~
jolmg
You know what'd be really cool? For either Nix or Guix to transparently
support installation of any version of any program without hacks like putting
an alternate version under a different name or some such. I wish programs
didn't require continued maintenance for dependency updates or risk being
uninstallable without putting them and a number of dependencies under
different package names. I wish I could install Firefox or Chrome from a
decade ago as easily as I can install the latest versions.

I wonder if it'd be too crazy to add git-awareness to the Nix utils. Like tell
nix to install version X of some package, so in the nixpkgs repo it would
check the history of the corresponding file to find the commit where it last
was that version, checkout that commit, build the package and its
dependencies, and then return to the branch it was at.
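A rough sketch of the lookup step that idea needs. Everything here is an assumption about how such a tool might work: it presumes you can walk a package file's history as (commit, contents) pairs (e.g. from `git log -p` over the package's default.nix) and that the file declares its version with the conventional `version = "...";` attribute. Actually checking out and building would still be nix's job.

```python
import re
from typing import Iterable, Optional, Tuple


def last_commit_with_version(
    history: Iterable[Tuple[str, str]], version: str
) -> Optional[str]:
    """Return the newest commit whose package file still declares `version`.

    `history` is (commit_sha, file_contents) pairs ordered newest-first,
    e.g. gathered by walking the git history of a nixpkgs package file.
    """
    pattern = re.compile(r'version\s*=\s*"' + re.escape(version) + r'"')
    for sha, contents in history:
        if pattern.search(contents):
            return sha  # newest commit that still carried this version
    return None
```

Given that commit, the tool could check it out, build the package and its dependency closure from that tree, then return to the original branch.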

~~~
matthewbauer
I think a tool like that could be very useful! The main issue with basing it
off of Git commits is that there is no guarantee that the last commit with a
given version is actually a good version. Consider the case where a-1.0
depends on b-1.0 and b is updated to b-2.0 in a commit so that a is not
compatible with b-2.0. Even though a-1.0 is still around, it's not going to
work until we update it to a-2.0, so you need some more complex constraint
solving on top of your commits to figure out what works.

I would prefer doing it based on the 6-month release channels, so you get
multiple versions for every 6 months. You end up with some gaps between
versions, but also have more guarantees everything actually works together.
Basically "nix search" with multiple channels.

I actually had to do something similar with GHC versions for a project of
mine. It turns out you can run Nixpkgs all the way back to the first release
in 13.10 (LC_ALL=C is needed). Obviously not that long ago right now, but it
should continue to work as time goes on & give us 10+ years.

[https://gist.github.com/matthewbauer/ecdb665f84b7f6100f3f6e3...](https://gist.github.com/matthewbauer/ecdb665f84b7f6100f3f6e378951dfd4)

~~~
asymmetric
Wouldn’t it be enough to check out the commit where a-x.0 was updated, rather
than the last known commit where a is at x.0? That way, if the package built
successfully at that commit, you know it’s a good version.

We do something similar (though in a far more manual way) here[0].

[0]:
[https://github.com/dapphub/dapptools/blob/master/overlay.nix...](https://github.com/dapphub/dapptools/blob/master/overlay.nix#L102)
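A minimal sketch of what pinning at that commit looks like in Nix. The commit hash and attribute name are placeholders, not a real pin:

```nix
let
  # Hypothetical pin: the nixpkgs commit where a-x.0 was introduced.
  # Everything in the dependency closure then matches what was
  # built and tested at that point in history.
  pinned = import (builtins.fetchTarball
    "https://github.com/NixOS/nixpkgs/archive/COMMIT_SHA.tar.gz") { };
in
  pinned.somePackage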

~~~
matthewbauer
That would work better. It probably still requires some curation. For instance
CVE patches, configuration changes, and added platform support usually won’t
include a version change. Just using the first commit may mean you miss those
changes.

------
danShumway
> _is a software package hosting service, similar to npmjs.org, rubygems.org,
> or hub.docker.com, that allows you to host your packages and code in one
> place. You can host software packages privately or publicly and use them as
> dependencies in your projects._

I am... really confused by this.

Isn't this just Github? _Github_ is a hosting service that allows you to host
your packages and code in one place. It has testing and publishing pipeline
support, you can add artifacts/releases, make your packages private or public,
host different types of software at the same time, and it's compatible with
most existing dependency systems, including NodeJS.

I can see this has more download statistics, which is nice. And it has a
policy that artifacts can't be deleted, which is very nice.

Is that it though? I know I have to be missing something; what can I do now
that I couldn't already do with Github as is?

~~~
chrisseaton
> what can I do now that I couldn't already do with Github as is?

Before this new service how would you use GitHub as a source for installing,
for example, Maven packages?

~~~
danShumway
I guess I'm not sure how Maven works then -- I thought it was just downloading
package binaries? I would use Github releases for that and link to the binary
directly. I'd use a CI to auto-build and publish a new release binary whenever
I pushed to master.

Does Maven do something more complicated like automatically figure out which
platform binary to pull?

~~~
chrisseaton
It's the file layout, for just one thing. You can't just point Maven (or many
package managers) at a simple HTTP server without the correct layout.

If you could... they wouldn't have built this.
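For the curious, the layout Maven expects is groupId-as-directories, then artifactId, then one directory per version, plus a metadata index it consults to answer "what versions exist?" Roughly (names invented here):

```
com/example/mylib/maven-metadata.xml          <- lists available versions
com/example/mylib/1.2.0/mylib-1.2.0.pom      <- dependency metadata
com/example/mylib/1.2.0/mylib-1.2.0.jar
com/example/mylib/1.2.0/mylib-1.2.0.jar.sha1
```

None of that maps onto the flat naming GitHub releases give you.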

~~~
danShumway
Can't you?
[http://repo.maven.apache.org/maven2/](http://repo.maven.apache.org/maven2/)
looks a lot to me like a simple HTTP server.

I'll take your word on it though. I don't know much about how Java package
management works, and like you said, I assume the Github team wouldn't waste
their time building something that wasn't necessary.

I guess if nothing else it would be a pain in the neck to have to know in
advance how release files had to be laid out.

~~~
chrisseaton
> ...without the correct layout

How are you going to recreate that directory structure with GitHub releases?
You can't even have any custom directories - they're just release-name/file-name.

I mean just try recreating it yourself and see how far you get.

You could try to use GitHub Pages instead, but GitHub very actively pesters you
if it even looks like you're distributing binaries there.

------
jrochkind1
Looking at the ruby docs, my interpretation is that if a gem is published only
on github registry, there's no good way to use it as an indirect dependency
(no good way for a gem to list it as a dependency) -- any app using such a
thing would have to know the list of all of these indirect dependencies on
github registry, and list them individually in the top-level Gemfile, along
with their correct github source.

This seems to limit the utility for ruby. I'm not sure if other supported
platforms have similar issues?

You could already do a lot of what the github registry for ruby does by using
an existing feature that lets you point to a git repo (not just GH) in your
`Gemfile`. What this adds is just the ability to resolve multiple versions from
github using ordinary rubygems resolution. The existing feature forced you to
manually specify a tag (hoping there was a predictable tag for a version) or
SHA, or use whatever is on master HEAD.
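To make the contrast concrete, a hypothetical Gemfile sketch. The gem and owner names are invented, and the scoped-source URL is my reading of the linked help article, not something I've verified:

```ruby
# Existing feature: pin a git repo at a tag or SHA yourself
gem "somegem", git: "https://github.com/someuser/somegem.git", tag: "v1.2.0"

# GitHub registry: ordinary rubygems version resolution, but each
# account/organization is its own scoped source
source "https://rubygems.pkg.github.com/someuser" do
  gem "somegem", "~> 1.2"
end
```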

~~~
tehbeard
Other platforms (maven/java comes to mind) benefit somewhat due to the
compiled nature of artifacts.

The immutability of the packages is also handy: as you pointed out, a tag
staying static is just hope and a prayer.

Is there not a global config for rubygems that would specify a list of
registries to search for a package instead of having to add them to each
project?

~~~
jrochkind1
The way they have set things up, every github account/organization (the first
thing after a slash) is its own separate 'source' to rubygems. (I am sure
they have done this because it would be inconvenient to integrate with
rubygems/bundler any other way.)

So you'd still need to add a separate source for each dependency hosted on
github to your own project Gemfile. Including for each indirect dependency,
knowing which indirect dependencies exist that need a github repo source.

If you could list this for the entire project... it'd probably be a
performance issue as rubygems/bundler check every repo source you list for
every dependency (including every indirect dependency; a Rails app has
hundreds, still an order of magnitude or two less than a react JS project
heh).

Even if you could only list "github's ruby registry" once (per project? for
your account? and keep in mind this is hypothetical, you can't), it would
still mean any gem expressing a dependency on another gem hosted on github
would have to include in its instructions "oh, if you use this, you need to
manually make sure to add github to your sources. Or you'll get an error that
says some gem you've never heard of can't be found, and have no idea how to
fix it." Unless it's a bid to get _everyone_ to do that, and basically make
github ruby registry a standard part of the ecosystem that everyone just
always adds to every project.

I don't think there's enough/any value added by the github ruby registry to
get the ecosystem to shift like that. It's unclear what it does that the
'standard' rubygems.org gem source doesn't do already (unless rubygems.org
can't solve their recent severe compromised account security problems... but
as it is, with the indirect dependency problem, I think github registry will
be too painful to use even if you'd like to escape rubygems.org security
issues).

[https://help.github.com/en/articles/configuring-rubygems-for...](https://help.github.com/en/articles/configuring-rubygems-for-use-with-github-package-registry)

------
psadauskas
Deja vu: [https://github.blog/2008-04-25-github-s-rubygem-server/](https://github.blog/2008-04-25-github-s-rubygem-server/)

And then removed about 18 months later: [https://github.blog/2009-10-08-gem-building-is-defunct/](https://github.blog/2009-10-08-gem-building-is-defunct/)

Hopefully this one lasts longer.

~~~
rogerkirkness
As a Product Manager, I find it hard to overstate customers' ability to 1.
punish innovation and 2. punish a lack of innovation. Experimentation = bad, no
experimentation = also bad. It's like how Google makes some of the best
software ever, and people still savagely denounce them every time they kill a
failing product. As if they would have learned as fast had they either not made
the product to begin with or kept it around to languish and be maintained.

~~~
tidepod12
It isn't a customer's responsibility to be a company's guinea pig, and it's
not a secret that customers would be unhappy that tech companies treat them as
such. This is especially true when Product Managers intentionally implement
features that take advantage of users by monetizing their data and then
implementing high switching costs that make it even more painful for the
customer once the Product Manager ends their "experiment". If tech companies
want to perform market research by experimenting on customers, they should do
the same thing that other industries do and compensate the experiment
subjects, not take advantage of them.

If you want to disrespect customers by treating them like disposable guinea
pigs (and not even giving them the courtesy of notifying them they're part of
an experiment), don't be surprised if they start to catch on and treat your
company as if it's disposable, too.

~~~
rogerkirkness
The world would have almost no innovative technology if not for a period where
customers tolerated "not good enough".

------
hprotagonist
There's no immediate mention of this on the site, but -- why did they select
the package formats that they did?

I'd love to be able to host wheels for my python projects, or {rpm, deb,
flatpack, etc...} for effectively arbitrary code. Is that in the works?

~~~
alexbecker
Running a python package registry has some unique challenges, so it makes
sense not to start with it (I run such a registry:
[https://pydist.com](https://pydist.com)).

For example, Python has a distinction between distributions (the actual file
downloaded, e.g. a tarfile or a manylinux1 wheel) and versions that doesn't
exist in most other languages.

~~~
takeda
All of these concerns are handled on the client side. In the end, all Python
needs is an HTTP server; it can actually be hosted on S3.

------
hobofan
At least in the case of NPM (I don't know as much about the other ones):
Doesn't that create a huge opportunity for hijacking attacks, where someone
publishes a malicious NPM package in the default NPM registry under the scope
identical to a Github organization/username?
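For context, the routing lives in npm config, so two people running the same install command can hit different registries depending on a file they may never have looked at. A sketch with an invented scope name:

```ini
# .npmrc -- route only the @someorg scope to GitHub's registry.
# Without this line, a package published under the same scope on
# npmjs.com resolves instead.
@someorg:registry=https://npm.pkg.github.com
```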

~~~
quickthrower2
That is an interesting idea, playing on people's confusion as to where to
install from. And someone is going to put the super terse `npm i -g mytool` on
their README.md page (because it's all about the easy installs isn't it!) and
forget to say "change your registry to github" and boom!

~~~
hobofan
Not even "someone". Exactly that command is available to copy to clipboard on
the page of this new feature. Yeah, a small link to the instructions is
printed underneath it, but most users - especially the ones that are new to
package managers and the most vulnerable - will ignore that.

------
hiccuphippo
Any word on trying to tackle package build verification/reproducibility so
users can be guaranteed that the package was built from the source code?

Problems like the rubygems one from yesterday and the npm one a few weeks back
would be gone with something like that.

------
mvanbaak
Deleting packages is not supported. So how do you handle a compromised package?
Looks like you have to contact GitHub and hope they act fast.

Oh, and no pip registry :(

~~~
gkoberger
The alternative is that critical infrastructure can just... disappear. Like
"left-pad", but worse.

GitHub is already really great about alerting you to critical issues.
Whenever there's a security bug, it pops up in our repo (and with Dependabot,
it's become automatic).

~~~
jmb12686
I have appreciated the automated notifications from GitHub for projects that
have known vulnerable dependencies in my package.json(s).

I just looked up Dependabot and linked it with a repo that I already have
robust testing and CI pipeline for. Preliminarily Dependabot is great!

It automatically updates my dependencies to the latest versions and submits
individual PRs. Since I have TravisCI hooked up to this particular repo, I can
see all the test results for each PR and can (confidently) merge the changes
into master without manually firing up my personal dev machine(s) and manually
performing what Dependabot just did.

Anyway, thanks for the tip!

------
nickjj
Any word on what the price will be for private repos after the beta ends?

Would be interesting to see how it compares to Docker Hub for hosting private
images.

~~~
andyfleming
I think if it's at all comparable, GitHub will win out. It just seems
convenient not to have one more subscription with another provider. Plus,
hopefully, it will all integrate well workflow-wise with the repository, with
actions that publish to the registry.

~~~
devmunchies
if docker loses, then who will maintain docker?

~~~
andyfleming
There's more to Docker than Docker Hub, but I'm not sure what % of their
revenue is from Docker Hub.

Even if GitHub "wins" there will still be a lot on Docker Hub.

Also, it's already possible to use other registries like Amazon's ECR or
Google Cloud's GCR, or even your own private one.

------
wbillingsley
I came across this one the other day, which looks like it does this plus
producing the binary package for you:

[https://jitpack.io/](https://jitpack.io/)

(No, I'm not related to that company in any way. I just saw it yesterday and
thought it seemed like a neater solution.)

------
craigds
Still no python support :(

~~~
ageofwant
A bit disappointing, yes, but pip has had support for git repos for many
years. In requirements.txt:

    git+ssh://git@bitbucket.org/foo/bar.git@fixit/atemp69#egg=hotshit

So perhaps that's why. Still, I would like to have a GitHub-hosted devpi.

~~~
craigds
git urls are tricky to use with many tools (like pip-compile) though. At best
they're slow, since things like "what's the latest version?" require
downloading the repo.

We forked some things into a private DevPI instance at present for that reason
(well, also for latency)

~~~
orf
Pipenv locks the VCS dependency to the commit, making pulling very fast.

~~~
andyfleming
Does it do a shallow pull or pull a tarball of the source?

~~~
orf
Shallow pull. You can specify the github tarball URL if you’d prefer.

------
andyfleming
I hope they add robots accounts like Quay.io
([https://docs.quay.io/glossary/robot-accounts.html](https://docs.quay.io/glossary/robot-accounts.html)).

~~~
justincormack
It says that permissions are the same as the github repo, so you can create
github accounts and grant access, or use tokens.

~~~
andyfleming
It's nice to be able to manage machine/robot accounts more directly though.

On top of that, for orgs, you'd be paying monthly for each user you add.

------
gotts
I believe it's going to affect the whole developer community in a bad way.

Right now, all the major package managers are indirectly making each other
better: they experiment, improve, and borrow good ideas from each other. It's
open source and there is little barrier for developers to contribute.

If n years from now GitHub becomes the de facto standard for package managers
and replaces all the existing ones, further innovation will be much slower.

It might transform into "Want to improve package managers? You have to work
for Microsoft"

~~~
txcwpalpha
I think you're conflating package managers and package registries here. This
GitHub product has almost no overlap with package managers like pip, gem,
maven, or npm. It is not a replacement for docker.

This is a replacement for npmjs.com (the _hosting service_, which is not the
same thing as npm the package manager), or for rubygems.org (again, the
_hosting service_, not the gem tool), or for Docker Hub.

If anything, this may actually _improve_ the collaboration between package
manager developers because there will now be a large development team that
will be working with the backend of various package registries and will have
better insight into what each one is doing wrong and what each one is doing
right.

------
andyfleming
I think this beta sign-up has been up for a bit already. It still just adds
you to a wait list as far as I can tell, unless I missed something.

~~~
max23_
I signed up for this when they first announced it but I still haven't gotten an
invite to access it. I am not sure how they choose who gets early access.

------
samcat116
Weird that they announced support for SPM packages, but I don't see that
mentioned anywhere on this page.

~~~
twodayslate
Perhaps after the Xcode 11 beta?

------
thefounder
I like Go more. The git/hg/svn/bzr repository is the "package". No custom (and
central) "registry".

~~~
sooheon
This is not unique to Go, every language I work with can use dependencies in
this way.

~~~
thanatos_dem
How about java projects? How can I add a dependency to a project I’m building
without it being published to a maven repo, for instance?

~~~
dragonwriter
> How about java projects? How can I add a dependency to a project I’m
> building without it being published to a maven repo, for instance?

You can use Gradle, which supports source repositories, instead of Maven:

[https://blog.gradle.org/introducing-source-dependencies](https://blog.gradle.org/introducing-source-dependencies)
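From the linked post, the shape is roughly this in settings.gradle (the coordinates and URL here are invented for illustration):

```groovy
// Map a module's coordinates to the git repo that produces it;
// Gradle clones and builds it instead of fetching from a Maven repo.
sourceControl {
    gitRepository("https://github.com/someorg/somelib.git") {
        producesModule("com.example:somelib")
    }
}
```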

------
catern
The lack of "the" makes this read a bit weirdly:

>GitHub Package Registry allows you to develop your code and host your
packages in one place. You can use packages from GitHub Package Registry as a
dependency in your source code on GitHub.

"Package registry" is a fairly generic term, so to me it would be natural to
refer to this product as "the Github package registry" (capitalized or not).

Is there a name for deliberately avoiding "the" in this way?

~~~
dang
I don't know, but have a 'the' above.

------
needusername
They do not have a published list of requirements for Maven artifacts. This
does not give a good first impression.

------
tasogare
> limited public beta

"Limited" should have been in the title, because it makes it a not so public
beta.

------
no_wizard
For all the features GitHub has, this is the only one that has made me and the
people I know personally care, and watch _very_ closely what GitHub does with
it.

We've been looking for a simple way to streamline releases. Right now
everything we have at my job is on GitLab and I use GitLab personally (though
I have a github account, of course).

I prefer GitLab in every way, but this feature alone might be a good enough
reason to switch. It would make releases just _so darn easy_. The only thing I
hope (which is not made clear) is that the stipulation that you can't easily
delete a package from the registry (according to the link, it's only for GDPR
requests and legal reasons) is something that, for instance, an Enterprise
account wouldn't have. I already have our purchasing team looking into it;
that's how serious this is.

If the API for hitting these packages is any good, it's going to be so hard to
resist.

I really hope GitLab has a good response to this.

Also, since GitLab can be self-hosted, I wonder how hard it would be to add
this to the CE edition....

With all that said, I wonder what the hidden limits will be. Imagine if,
instead of NPM maintaining all of its servers, it were just a thin database
with routing to GitHub releases. Would that fall afoul of GitHub?

I mean, what's the point of maintaining your own distribution server when
GitHub can front all the hosting costs and all you have to do is map the name
of a package to its GitHub Package Release URL? I could see NPM, PyPI, et al.
just doing that, instead of having their own servers. Maybe it's a good idea to
run additional cache nodes, but GitHub being the main place where release code
lives for your package index would cut the bills significantly, no?

~~~
itslennysfault
This is a feature Azure DevOps (formerly Visual Studio Team Services) has had
for at least 3 years now. Its repositories support Maven, Gradle, Pip, and
NuGet in addition to NPM. I'm always surprised more people don't use it. It's a
full-featured ticket system, git (PRs / etc.), package feeds, and CI/CD in one
neat package.

~~~
no_wizard
I did not know that. Though, we aren't on Azure for anything at all (AWS for
some HIPAA stuff, Google Cloud or our own Proxmox cluster for the rest).

I know Azure Pipelines is becoming the sort of de facto automated CI/CD
pipeline though (it used to be Travis for so long) and I've heard nothing but
good things about it. Might have to take a look.

~~~
GordonS
It's called Azure DevOps, but beyond technically being (transparently) hosted
in Azure data centers, the "Azure" part of the name is pretty meaningless.

------
codingslave
Github is going to be the source of the code and data that neural networks use
to write software. There is so much data on there, and it will only increase.
There are only so many coding patterns. Get ready to be a fill-in-the-blanks
developer.

------
baybal2
No RPM or DEB support

------
penagwin
Any word on pricing?

~~~
bdcravens
> GitHub Package Registry is free for all repositories during the beta. And it
> will always be free for public and open source repositories.

[https://github.com/features/package-registry](https://github.com/features/package-registry)

~~~
penagwin
Perfect thanks!

------
jonny383
Honestly, GitHub has been going downhill for about 18 months now. It all
started with the ":D Set status" feature. I give it another two years before
Microsoft has officially turned GitHub into a 2021 version of Skype.

~~~
cranky_coder
I’m confused, is this an example of one of those “:D Set status features”? It
seems useful to me...

~~~
jonny383
This is a glorified wrapper written around existing package managers. I
wouldn't call that useful.

~~~
dstaley
No, this is a reimplementation of basically every major package server. Now,
instead of hosting your Ruby gems on rubygems, NPM packages on the NPM
registry, and Python packages on PyPI, you can host them directly alongside
your source code. The tools by which you access these package registries are
the same, but they can now be backed directly by GitHub.

~~~
jonny383
Which is a glorified wrapper. Why on earth would you make the decision to put
all of your eggs in the Microsoft basket? Diversity is a good thing. Just
look at what's happened to the web industry with Chrome, and now it's
basically too late.

~~~
gotts
I couldn't agree more with jonny383. Why do so many people consider
centralizing everything as always and universally a good thing? Some people
just don't seem to want to learn from history.

Some short-term conveniences come at a very high price in the long run.

