
Github experiencing major service outages across all services. - experiment0
http://status.github.com/#
======
jtchang
This is why I love git (and distributed version control systems in general).
For the most part a short downtime isn't the end of the world. When it comes
back up I'll push my changes and that'll be that.

~~~
silverbax88
I would point out that both TFS and Mercurial do the same thing (as does
ClearCase, but ClearCase is awwwful), but with much less memory consumption on
the local PC. Not trying to add snark (I upvoted the above comment), just
adding that it's not a GitHub feature.

~~~
msbarnett
> I would point out that both TFS and Mercurial do the same thing (as does
> ClearCase, but ClearCase is awwwful), but with much less memory consumption
> on the local PC. Not trying to add snark (I upvoted the above comment), just
> adding that it's not a GitHub feature.

Well, no, obviously it's a feature of Distributed Version Control Systems in
general.

In fact, that's _probably_ why OP wrote "This is why I love git (and
distributed version control systems in general)" and didn't in any way claim
it was a feature of GitHub?

And do you have some kind of reference for git being less memory-efficient
(are you talking RAM-wise?) vs Mercurial or TFS? On its face it's pretty hard
to believe; git is pretty lightweight.

~~~
silverbax88
Git for large projects is not lightweight, nor does it claim to be. In fact,
that's the basis of its philosophy: it stores full versions of the files, as
opposed to just storing the changes, because data storage is considered
inexpensive. The only time Git uses pointers is when the files have not
changed.

I'm certainly not slamming Git for working exactly the way it was intended.

~~~
bencevans
I think you may be thinking of a different VCS such as SVN, as Git actually
just stores the metadata (changes/deltas) and not full versions of the file
in each commit.

As an example, when you clone a repository, it downloads it, then goes
through a process of 'Resolving Deltas'. This is it going through the history,
applying the changes/deltas to the files so that they are up to the most
recent revision.

The only files that are added in full are binary files.

Some reading: <http://git-scm.com/book/en/Git-Internals-Packfiles> :)

~~~
Cogito
To be fair, it uses the full blob until you pack, which does not happen
immediately and depends on your settings. Large repositories spend a
significant amount of effort determining optimum pack window sizes etc. to
optimise the packing of their history.

So, git can be quite heavy on disk usage until all the loose objects are
packed.
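To make the loose-versus-packed distinction concrete, here is a throwaway-repo sketch (`git gc` is what packs loose objects; normally it runs lazily as `git gc --auto`):

```shell
# Create a scratch repo with one commit, then compare object storage
# before and after packing.
tmp=$(mktemp -d); cd "$tmp"
git init -q demo && cd demo
echo hello > file.txt
git add file.txt
git -c user.email=a@b.c -c user.name=a commit -qm initial

git count-objects -v    # "count:" = loose objects, "in-pack:" = packed ones

git gc --quiet          # packs the loose objects into a packfile
git count-objects -v    # "count:" drops to 0; objects now live in the pack
```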

------
rubynerd
I'm amazed at how much this has knocked me on my arse.

I first attempted to redo the README for a service I've just open-sourced,
before realising Github is down.

Then, I attempted to fix the company CI server (OOM errors because of
Carrierwave needing more than 500MB of memory to run 1 spec in, for some
unknown reason), which failed because it couldn't check out the code.

After giving up on that, I attempted to install Graphite to a company server,
where I hit another roadblock because the downloads are hosted on Github, and
so I had to use Launchpad, which I had an allergic reaction to.

Also, when I was shelling into the server, oh-my-zsh failed to update because,
you guessed it, Github was down.

Still, shouts to the ops team in the trenches, we're rooting for you.

~~~
tinco
Your company runs its source revisions through Github without a backup
solution? Do you really put all your eggs in a basket you have no control
over?

I know that in theory a cloud solution should have higher uptime than an
amateurishly set-up private server, but cloud solutions have a certain
complexity and coherence that make them very vulnerable to these kinds of
'full failures' where nothing is in your control.

Maybe you should take this time to learn from this, and analyze what you could
do to reduce the impact of this failure. For example, you could research what
it would take for your company to move to another Git provider, perhaps even
on your own server or a VM slice at some cloud provider.

I'm not saying you should drop github, because obviously they have great
service, but be realistic about cloud service.

Cloud service is like RAID: it is not a backup.

Just as RAID is nice for recovering from errors without downtime, but there
is a chance something bigger happens and you still lose your data, the cloud
is nice for offering scalability and availability, but there's a chance
everything goes down and you still can't run your operations.

~~~
ghshephard
Add up your downtime over a three-year period from relying on GitHub (or
Gmail, or AWS) versus the cost of trying to engineer some local backup
system, and the downtime associated with that going awry.

Outages happen. As long as we're talking hours a year, pretty much everyone
but life-safety systems, critical infrastructure, and payment/high-traffic
commerce sites is probably better off just letting third-party cloud vendors
manage their systems. Take the downtime and relax.

(Now, if downtime consistently gets into the tens of hours per year, it's
time to look for a new cloud provider.)

~~~
kitsune_
It's not about downtimes and outages. It's incomprehensible to me how lax
businesses are with their backups, especially businesses whose clients' data
is everything. Yes, the brave new world of the cloud seems tantalizing, but
even there, data can and will be lost. Don't use only one way / provider /
service / mechanism for backing up your data.

A tape/LTO backup system doesn't cost the world. Yes, it introduces overhead
and maintenance, but I'd rather be safe than sorry.

At my place of work we currently use a lot of virtual servers, hosted by
external providers. We use their backup and snapshot mechanisms. But we also
pull all data to our local backup server. From there we back up to tape on a
daily basis.

~~~
Xylakant
I do have backups of all my (relevant) GH repos since that's just a "git pull"
away and can be automated nicely. But I'd probably be out of luck running my
regular CI-process with github down or do a deployment. Both rely on having a
public-facing git server - having a backup does not imply that I have a full
git server running. I could set one up and administer it, but it's just too
much effort given GH uptime.
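That pull-based backup can indeed be a short scheduled script; a minimal sketch (the `mirror_repo` helper is made up, and a local throwaway repo stands in for a GitHub clone URL):

```shell
# Mirror-clone on the first run, then just fetch updates on later runs.
tmp=$(mktemp -d); cd "$tmp"

# Stand-in for a GitHub repository:
git init -q upstream
(cd upstream && echo hi > README && git add README \
  && git -c user.email=a@b.c -c user.name=a commit -qm init)

BACKUP_DIR="$tmp/backups"
mkdir -p "$BACKUP_DIR"

mirror_repo() {   # usage: mirror_repo <clone-url>
  name=$(basename "$1" .git)
  if [ -d "$BACKUP_DIR/$name.git" ]; then
    (cd "$BACKUP_DIR/$name.git" && git remote update --prune > /dev/null)
  else
    git clone -q --mirror "$1" "$BACKUP_DIR/$name.git"
  fi
}

mirror_repo "$tmp/upstream"   # real use: mirror_repo git@github.com:you/repo.git
mirror_repo "$tmp/upstream"   # second run only fetches
```

In real use you'd loop over a list of clone URLs and run it from cron.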

------
robomartin
I did a bunch of work on GitHub today just before it went down. Talk about
getting lucky.

I'm sure we will all learn a bunch from the post-mortem. These high-profile
and very openly discussed failures are always good for learning all kinds of
things.

No issues at all on how GitHub is handling it so far. Eager to learn what
happened. Hitting refresh on the status page every so often. Better than
watching underwater basket-weaving competition at the olympics.

~~~
syncsynchalt
I created a new org on GitHub today (through all the unicorns) and was about
to push an existing repo to them so we could all start pulling it. Talk about
unlucky.

~~~
kh_hk
The good part is you can do it yourself without GitHub (which, after the
downtime, could come into the equation quite easily):

    
    
      -- Remote machine, or SSH port-forwarded machine --
        > adduser git (add each pusher's/puller's ssh pubkey to its
          ~/.ssh/authorized_keys, or use a shared password)
        > su git
        > cd ~
        > git init --bare myproject.git
    
      -- Your local repo --
        > git remote add <name> git@aforementionedmachine:myproject.git
        > git push <name>
    

Setting up a git repo for private pushing and pulling is quite easy.

~~~
syncsynchalt
Already had the repo up and running elsewhere, thanks. Just needed to do some
non-git stuff on github (create a new repo on GH to push it to, add team
members, and so on).

I've been doing private DVCS for years (mercurial) but this is my first
project that's on git and I've been looking forward to the opportunity to host
it on github and see what I've been missing.

------
rolleiflex
Well handled, minus the unicorns. Tangentially relevant: I just wish GitHub
offered some sort of academic plan for students. No private repos means I
cannot use GitHub at all, not because I'm building closed-source software,
but because I (obviously) can't put my assignment work up for public viewing
before the assignment deadline. So I've been using Bitbucket, which is fine
and all, but I would have loved to be a part of the GitHub community.

~~~
csense
If you want free Git hosting, you can DIY with Gitlab [1] if you want a web-
based GUI like Github's.

If you prefer command-line usage, I'd recommend Gitolite [2]. It allows you to
give people access to git and only git (as opposed to git's built-in system,
which requires granting ssh shell access); and it only uses one OS-level
user/group regardless of how many people it's managing.
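For a sense of what Gitolite administration looks like: access control is just a few lines in the config file of its admin repo. A sketch (user and repo names are made up):

```
# conf/gitolite.conf
@students   = alice bob

repo assignment1
    RW+     = alice         # alice can push (and rewind) branches
    R       = @students     # everyone in @students can clone and fetch
```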

Either of the above solutions works if your compsci professors are clued-in
enough to be comfortable with the CLI in general and Git in particular.

If you're trying to give files to technically clueless humanities professors,
I'd suggest only using Git privately, to develop your paper or whatever, then
using a plain old email attachment, or hosting on an HTTP server, to submit
the assignment. Or going _really_ old-school by printing out an old-fashioned
dead-tree hardcopy.

Of course, all of these solutions (except email attachments and printouts)
require running your own server, which is actually a great learning
experience. I'd recommend prgmr.com for hosting; their smallest plans should
be able to fit even an undergrad's budget, and you have full root access to
your (Xen VM) system, so you can do all kinds of fun and exotic experiments.
It's not necessary for basic usage, but you can install any version of any
Linux distro, use LVM, even use a custom-compiled kernel or FreeBSD (the only
requirement is guest Xen patches). It's great because if you have problems,
they give you access to an ssh-based out-of-band console, rebooter, and rescue
image so you can fix them yourself. (By contrast, many other hosts require you
to make changes through some half-baked web UI that lacks half the tools you
need, require you to install only approved distros and only do OS upgrades on
an approved schedule, and require you to file tickets with lengthy turnaround
times and/or fees in order to do the most routine troubleshooting or
maintenance tasks.)

Disclaimer: My only relationship with prgmr.com is that I've been their
hosting customer for a long time (and very happy with them given the nonsense
I've had to put up with from other hosts, in case you couldn't figure that
part out from my above rant).

My only relationship with Gitolite is a project user. (I've created and
maintained three small-scale Gitolite installations.)

I haven't used Gitlab, but I've heard good things about it.

[1] <http://news.ycombinator.com/item?id=4957145>

[2] <https://github.com/sitaramc/gitolite>

~~~
h2s
How fitting that one of the links you've posted is unavailable at the moment
due to the very downtime that caused this discussion! Evidence of our over-
reliance on GitHub in the open source community, perhaps.

~~~
csense
Here I took it as evidence of Github's generous-to-the-point-of-being-
ridiculously-foolish attitude toward their customers: They'll even give free
public Git hosting to products that directly compete with their core business
at a lower price point (free)!

Even if you Do No Evil (R), some people will still complain about you.

That being said, Git hosting would be better for everyone if Github had a
bigger competitor in their niche.

~~~
zalew
> They'll even give free public Git hosting to products that directly compete
> with their core business at a lower price point (free)!

unless you are providing paid accounts and expensive enterprise solutions with
support, you are not competing with them in any way.

~~~
csense
> unless you are providing paid accounts and expensive enterprise solutions
> with support, you are not competing with them in any way.

Yes, you are. If you _would have_ bought their product (paid Git hosting), but
you used somebody else's product instead (Gitolite), then that other product
(Gitolite) is competing with Github for your business.

I agree that there is a subset of the market that (a) won't or can't figure
out Git hosting on their own, or (b) decides that paying for a Github account
will actually be less expensive. But I never said that Gitolite will ever
_replace_ Github.

------
csense
It should be pretty far down their list of priorities at this point, but I
just noticed the "Exception Percentage" value at
<https://status.github.com/graphs/past_day> is saying 483.704%. The fact that
they're measuring this in percentages implies the maximum is 100%, but this
isn't so.

~~~
chuhnk
I imagine it's a percentage measured against successful responses; the
exceptions now far exceed the successes. You are correct that it should top
out at 100%, but people have a habit of scaling beyond it.

~~~
csense
If this is in fact how they're measuring...I was going to make this post a
long rant about how ridiculous it is to measure it in that way, but then I
realized that people _do_ sometimes measure probabilities by quoting the win-
to-loss or loss-to-win ratio.

This is called "odds" and frequently used in gambling. Usually, though,
someone says "4.7 to 1" or "47 to 10" (abbreviated 4.7:1 or 47:10) instead of
470%. Usually the larger number is stated first, and the direction is usually
indicated by a word like "favorite" or "longshot." So one would say "Errors
seem to be a 4.7:1 favorite today."
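For concreteness, odds of a:b convert to a probability of a/(a+b); using the 4.7:1 figure, the arithmetic is a one-liner:

```shell
# Convert odds to a probability: odds of a:b mean probability a / (a + b).
awk 'BEGIN {
  a = 4.7; b = 1
  printf "odds %.1f:%d -> probability %.1f%%\n", a, b, 100 * a / (a + b)
}'
# prints: odds 4.7:1 -> probability 82.5%
```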

It's slightly complicated by the fact that odds can measure one of several
things:

A. A probability ratio ("Red" is a slight underdog in the game of roulette
[1]; the odds against hitting it are 20:18 since there are 20 non-red spaces
and 18 red spaces)

B. A payout ratio ("Red" pays 1:1, meaning the prize if you win this bet is
equal to the amount of the bet)

C. The current payout of a parimutuel pool [2]

Odds are seldom used outside of a gambling context.

[1] <http://en.wikipedia.org/wiki/Roulette>

[2] <http://en.wikipedia.org/wiki/Parimutuel_betting>

------
greggman
I'm really curious how much longer github can offer so much free service. It's
not just git but effectively free web hosting as well, at least for statically
served pages.

It seems like it's only a matter of time before something has to give: either
they'll have to start throttling web serving, or cover the site in ads like
SourceForge, or something.

I guess I'd better sign up and start paying ASAP to help be part of the
solution.

~~~
robin_reala
Plenty of companies pay GitHub multiple thousands of dollars a year for their
services. It’s not just personal accounts.

------
niggler
What's funny is the extent to which Github is being used as a centralized
repository for many projects. I don't just mean for project discovery; the
issues and gists and other services aren't replicated as often or easily as
the code.

In fact, a lot of services depend on github for various reasons, all of which
are probably borked now ...

------
experiment0
A little bit more info here:

<https://status.github.com/messages>

------
zemanel
What about a script/service to mirror Bitbucket and GitHub (or others)
through webhooks etc.?

Was just getting my hands on Homebrew after a fresh OS install when I hit the
Octobummer :-/

~~~
zalew
.git/config

    
    
        [remote "bitbucket"]
    	url = git@bitbucket.org:you/repo.git
    	fetch = +refs/heads/*:refs/remotes/bitbucket/*
    

$ git push bitbucket

//edit after tinco's comment:

    +refs/heads/*:refs/remotes/origin/* -> +refs/heads/*:refs/remotes/bitbucket/*

~~~
tinco
That snippet overrides your remotes/origin branch with what you pull from
bitbucket.

This might not be what you want. Instead you probably should do:

    
    
      fetch = +refs/heads/*:refs/remotes/bitbucket/*
    

Unless you really know what you're doing and are just crazy like that :)

~~~
zalew
I thought he had a push-and-forget backup in mind, so I didn't care. But
yeah, you're right.
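For a true push-and-forget backup, `git push --mirror` sends every ref (all branches and tags) in one command; a sketch with a local bare repo standing in for Bitbucket:

```shell
# Set up a working repo and a bare "backup" repo, then mirror everything.
tmp=$(mktemp -d); cd "$tmp"
git init -q --bare backup.git
git init -q work && cd work
echo hi > f && git add f
git -c user.email=a@b.c -c user.name=a commit -qm init
git tag v1

git remote add backup "$tmp/backup.git"   # real use: git@bitbucket.org:you/repo.git
git push -q --mirror backup               # pushes all branches and tags
```

Note that `--mirror` also deletes refs on the remote that no longer exist locally, which is exactly what you want for a backup and exactly what you don't want for a shared repo.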

------
pjungwir
It's easy enough to set up a backup repo so a team can keep collaborating
when GitHub is down, but does anyone have a way to deploy Rails apps with
Capistrano + Bundler? It's terrible not being able to deploy; sometimes
deploying can be really urgent.

With Cap I can just repoint config/deploy.rb to the backup git repo, but what
about bundler?

~~~
pjungwir
To answer my own question:

    
    
        http://stackoverflow.com/questions/8693319/where-can-i-install-gems-from-when-rubygems-org-is-down
    

Of course if you have private patches of gems with bundler's `:git` option,
you'd need to repoint all of those, too (as well as keep them in two places).

It'd be awesome if there were some simpler way, maybe a separate declaration
in your Gemfile so that any gems added to your project also get installed on
a corporate git server, and then Bundler uses that as a fallback if `source`
doesn't work.
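One low-tech approach (a sketch only; the gem name, URLs, and `GEM_MIRROR` variable are all made up) is to exploit the fact that a Gemfile is plain Ruby and pick the `:git` source from the environment:

```ruby
# Gemfile (sketch) -- fall back to a corporate mirror when GitHub is down
source 'https://rubygems.org'

some_gem_source = ENV['GEM_MIRROR'] || 'git://github.com/you/some_gem.git'
gem 'some_gem', :git => some_gem_source
```

Then something like `GEM_MIRROR=git@git.internal:mirrors/some_gem.git bundle install` repoints the deploy without editing the Gemfile.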

------
tronicron
Solution is to self-host your critical files. If you have SSH already on a
server (ha!) it is pretty easy:

<http://www.verbosity.ca/hosting-your-own-git-based-shared-repositories-using-ssh>

------
vacipr
Good. I was starting to miss "Github is down" submissions. Quick, let's look
for alternatives.

------
timothybone
I feel like such an uber goober! I was installing a package that relied on a
GitHub file, which for obvious reasons failed... little did I realize that
that was the problem; all I saw was some such-and-such Python error
callback... doh!

------
revskill
Most developers in my country use SVN. I don't use SVN at all, and I'm
failing to convince my partner that Git is better than SVN, just because I
don't believe in hardware stability.

~~~
kcbanner
What does hardware stability have to do with it?

~~~
revskill
I mean the hard disk or mainboard could go wrong at any time. The distributed
model seems more stable.

------
darkchasma
I never forget to pull before going remote, but today I did, and the one time
I actually need GitHub to be up, it's not. But I can't be upset; even if this
wasn't my fault, it's GitHub.

------
fomojola
Interesting: Github Pages is still up (got a blog on there and it doesn't
appear to have experienced any outages). Static site generators for the win.

~~~
buttscicles
Pages was out briefly a couple of hours ago (certainly some repos, anyway)

Seems to be fine now though.

------
tzaman
I guess Murphy is strong with this one; I was right in the middle of bringing
up a new server (to host a Rails app). Oh well, I'll go to bed early :D

------
rossjudson
Can't read the clojure 1.5 RC1 release notes. Damn.

Looks like services implemented in terms of github are not reliable (but there
was no guarantee of that anyway).

------
cjbprime
It's back now.

------
pcerioli
GitHub outage is now over.

------
rdl
I'm curious what actually went wrong.

------
23david
Time to fork github. Bad bad sign that they can't get their operations in
order. I guess $100 Million can't buy you uptime...

------
cagenut
That's a really nice status dashboard; tracking and publicly displaying your
98th percentile is really cool. On the other hand, stuff like this:

"13:17 UTC We are seeing unicorns ..."

comes off as unprofessional at exactly the wrong moment.

~~~
pragone
Likewise - I haven't the slightest idea what "unicorns" are in this context.

~~~
syncsynchalt
The GitHub error page is an angry unicorn, similar to Ars Technica's
moonshark or Twitter's fail whale.

------
lacosaes0
And this is why I use Google Code: supports git and is more reliable than
Github.

It's a shame that they don't offer a paid service for closed-source software.

~~~
tzaman
Where do you live, under a rock? Of course they offer paid service for closed-
source software.

~~~
cmelbye
I'm assuming by "they" he was referring to Google Code. (Wishing that Google
Code offered a paid service.)

