Really quick pushes, too :-)
> kops update cluster
error reading channel "https://raw.githubusercontent.com/kubernetes/kops/master/channels/stable": unexpected response code "500 Internal Server Error" for "https://raw.githubusercontent.com/kubernetes/kops/master/channels/stable": 500: Internal Server Error
I've since grown to strongly dislike how much of the entire ecosystem seems to depend on random unversioned Github gists or direct links into a raw file in a repo somewhere.
Composer is useless on native IPv6-only servers.
curl -sSL https://raw.githubusercontent.com/python-poetry/poetry/master/get-poetry.py | python
edit: though their historical data doesn't back it up. It's just a 2020 thing it seems; coincidentally picked up right near the middle of February which correlates way too closely with the beginning of COVID.
The issue is with the meaning of the word "imply": when used in the formal sense, as it appears in Classical Logic, correlation does indeed not imply causation.
In common parlance however, "imply" is often used to mean "provides evidence for", and correlation can indeed provide (potentially strong) evidence for a hypothesised causal link; the problem lies in people reading "correlation does not imply causation", assuming the informal meaning of "imply", and then going on to reject any notion of causation which uses observed correlation as evidence.
Pretty much every empirical science uses notions of correlation (in its various formal statistical guises) to provide support for causation, indeed to reject such reasoning would be to invalidate huge swathes of mainstream accepted science; half the battle in these instances is making the leap from correlation to causation in a manner which is considered scientifically sound.
But, they are moving much faster in recent years. They've added a lot of new features.
To GitHub SRE/oncall people: hang in there, you're awesome.
Lots of incidents lately, but it's becoming increasingly hard to get away from Github.
GNOME, Xfce, Redox, WireGuard, KDE and Haiku all self-host, on cgit, GitLab, Phabricator or Gitea.
I'm not going to say it was hard, but it was work, and it was work that took time (time that was volunteer free time) away from working on Xfce itself. When I did the final git imports of our svn repos in mid-2009, GitHub had been around for about a year, maybe a year and a half, and wasn't that popular yet. And I suppose back then I had the bull-headed "must self-host because that's the only thing a respectable open source developer should do".
That was a foolish attitude that took time away from the actual goal, which was making Xfce better. If it were today, I would have moved us to GitHub in a heartbeat, or perhaps GitLab (their hosted offering; I wouldn't self-host), instead. I haven't been involved with Xfce (still a happy user though!) in nearly a decade, but I suspect they still self-host out of inertia, with probably a little of that bull-headedness mixed in (that I myself can't claim to be fully free of either).
While GitHub's uptime isn't perfect, it's pretty damn good, and better than most volunteers will get running their git server off a single box someone had racked somewhere in Belgium, which I believe was what we were doing at the time. Tools for that sort of thing are better now, and if I had to self-host today, I'd use EC2 or something like that, and automate the hell out of everything, but it's still a lot more work than just using somebody else's infra.
Meanwhile, I expect many forgotten repos to remain online with Github one way or another for a very long time.
If you do go the self-hosting route, add a mainstream host as an additional remote and push your commits to both.
In fact, with all these free services, I'd probably say it's well worth automatically making remotes (at least) for GitLab, GitHub and having a local Gitea for everything you do. This should be resilient against specific outages, or GitHub simply deciding they don't like your project name, or some other disaster.
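The multi-remote setup described above can be sketched like this (the URLs are placeholders for your own GitHub/GitLab/Gitea projects):

```shell
# One local repo, several push destinations.
git remote add origin git@github.com:you/project.git
# Re-add the origin URL as an explicit push URL, then stack the mirrors on:
git remote set-url --add --push origin git@github.com:you/project.git
git remote set-url --add --push origin git@gitlab.com:you/project.git
git remote set-url --add --push origin git@your-gitea.example:you/project.git
```

After this, a single `git push origin` updates every mirror in one go. Note that as soon as any explicit push URL exists, git pushes only to push URLs, which is why the original URL has to be re-added first.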
I do appreciate how easy git and other DVCSes make it to mirror repositories, though. I find myself using GitHub mirrors of projects that are otherwise self-hosted, just because the code search on GitHub works pretty well.
When people try to self-host and get lazy or hit by a bus, the content just disappears unless it was lucky enough to get archived.
And should your project go unmaintained it'll probably disappear if you don't have managed hosting.
Might be good to have a self hosted backup that's automatically synced with Github
git remote add origin https://email@example.com/myrepo
git remote set-url --push --add origin https://firstname.lastname@example.org/myrepo
git remote set-url --push --add origin https://email@example.com/myrepo
Seriously, though, any repo that I work on regularly will be cloned to my local dev environment, so it’s not a hard blocker.
That said, a cron job on a cheap VPS would probably do the trick.
git init --bare
git remote add alternate firstname.lastname@example.org:project-1.git
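A sketch of that cron-on-a-VPS approach, going the other direction (the VPS pulls from the hosted copy; URL and paths are examples):

```shell
# One-time setup on the VPS: a full mirror of the hosted repo
git clone --mirror https://github.com/you/project.git /srv/git/project.git

# crontab entry: re-sync every 30 minutes, pruning deleted refs
# */30 * * * * git -C /srv/git/project.git remote update --prune
```

`--mirror` sets up a `+refs/*:refs/*` refspec, so branches, tags, and deletions all propagate on each sync.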
If more privacy is required, you can use something like gocryptfs to only send encrypted data to these services.
Syncthing is a similar option that doesn't require an external service, also doesn't require local encryption since all data in transit is always encrypted.
This has a big caveat though: there's no locking mechanism that will work reliably on the bare git repo, so you may have to resolve some conflicts manually if two separate devices push to the same git repo at the same time. This is why you should only use this method if you work alone on these repos.
We are trying to migrate to https://www.sonatype.com/nexus-repository-oss (self-hosted); it caches the git tags, and you just have to point the dependency manager at Nexus instead of the direct git links.
If you want something simpler, you can try Satis for PHP, or Sinopia/Verdaccio for npm packages. You will find plenty of similar tools for other languages.
GitLab is starting to look good (or even Gitea self-hosted).
also Pi4 now supports boot from USB: https://www.tomshardware.com/how-to/boot-raspberry-pi-4-usb
I bought one to run Windows on my Retina MacBook Pro. I only need Windows for gaming when visiting friends, and it works flawlessly for that purpose.
Check speeds when you get it to see what works for you best.
Pi is 100% not suitable for 100% duty work. It’s just a toy.
SD cards are quite frankly horrible boot media as well.
That's an additional 145kg of CO₂ per year (2018 US average).
For that you gain:
- Decent thermal design
- A decent quality enclosure
- A power supply (ThinkPad brick)
- Two DisplayPort outputs
- A SATA interface (M.2 form factor) for an SSD
- Built-in WiFi
- A RAM slot you can chuck 16GB in
- 2 more USB ports
There's no competition. I paid 79 GBP each for mine (I own 3). Pi is 57 GBP bare board.
Pictures. Mac mini for scale: https://imgur.com/a/jXjLusb
And on CO2, perhaps you should just do without it if it's a problem.
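The 145 kg/year figure above can be roughly sanity-checked, assuming ~0.45 kg CO₂ per kWh for the 2018 US grid average (that intensity value is an approximation, not from the original comment):

```python
# Back out the continuous power draw implied by 145 kg CO2/year.
grid_intensity = 0.45           # kg CO2 per kWh (assumed 2018 US average)
hours_per_year = 24 * 365       # always-on machine

extra_kwh = 145 / grid_intensity                  # kWh/year behind 145 kg
extra_watts = extra_kwh / hours_per_year * 1000   # convert kW to W
print(round(extra_watts))                         # ~37 W continuous draw
```

So the claim corresponds to the thin client drawing on the order of 35-40 W more than a Pi around the clock, which is plausible for an x86 desktop-class box.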
Don't forget that change in software is inherently risky and will result in bugs, etc. I'd rather have a platform that is always looking to make things better, at the risk of a bit of downtime, than a stale platform that we all know we depend on.
So is there any actual pressure to move fast and break things?
Github started doing availability reports. Last month's details in the blog post below with summary of the issue.
Stay tuned till next month for the current outage.
I am using https://github.blog/category/engineering/feed/ for the engineering category.
- A different central server
- A shared on-filesystem copy, e.g. local network drive
- HTTP or SSH between developer computers (put your repository somewhere your nginx or Apache serves it; the other developer can then "git remote add chvid http://chvid.example.com/repository").
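For the HTTP option, git's "dumb" protocol only needs a static file server plus one bit of metadata upkeep; a minimal sketch, assuming the repo lives under a path your web server already serves:

```shell
# Inside the served bare repository (hypothetical document-root path):
cd /srv/www/repository
git update-server-info      # writes info/refs so dumb-HTTP clients can fetch

# Enable the stock post-update hook, which re-runs update-server-info
# automatically after every push:
mv hooks/post-update.sample hooks/post-update
```

No CGI or special server module is required for fetching this way; plain static file serving is enough.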
Make GitHub a mirror (at least source-wise) and you can benefit from its outreach without being held hostage. I'm happy with that, e.g. https://notabug.org/mro/ShaarliOS/src/master/doap.rdf Inspired by https://indieweb.org/POSSE
There have been some other annoyances/changes in behaviour that have bugged me too, but I've mostly stopped keeping track of them because I'm resigned to it now.
You can opt in to this now. It is a preview feature; I guess it will be GA in a couple of weeks.
What's the best way to keep my own copy of the packages my software needs (and their dependencies), so that my build process is less fragile? Ideally, I'd only have to rely on those 3rd party platforms to download new versions or have them as a backup.
When relying on my own copy of required packages - can I expect much faster builds?
A day wrecker!
Or, are you guys all devops geniuses better than those who work at GitHub?
When my own side-project has more uptime than Github, there's something wrong somewhere.
Though really, if it is that important for it to be up, you should mirror it to at least one other provider (e.g. self-hosted and GitHub).
One day, for whatever reason, he couldn't bring the VM back up. Self-hosted GitLab was out for the rest of the day. I found this pretty funny.
Really, git is designed to be serverless and decentralised; this centralised GitHub malarkey is probably wrong.
Isn't the only reason it would go down a network issue or a power failure? What other possible causes of system failure could there be?
I have been running HA, Pi-hole, MariaDB and my own API instances on my Raspberry Pi, and in the 22 days of uptime so far there have been 0 failures.
At that point you'd have to create and manage a cluster.
You have to update servers, etc., and if you count the hours it is many times more work than just working locally and waiting a couple of hours until the technicians at GitHub fix the problem for you.
Also, most people don't need to provide access to hundreds of thousands of users, so they won't ever need a cluster of Git servers. (Some bloated UI like GitLab may need more computing to host even for a moderate number of users, though.)
Self-hosting Git is easy. Besides power failures there is not much reason for it to go down unless you actively cause it. If you don't touch it beyond OS updates, it can run 24 hours a day for many years without any effort. (The biggest issue is actually backup, but you have to think about that anyway if you run anything on your own.)
For larger projects where you have the resources to have dedicated infra people, I guess it depends.
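For the single-box case described above, the whole setup can be as small as this (host name and paths are placeholders):

```shell
# On the server, under any account you can SSH into:
git init --bare ~/repos/project.git

# On your workstation: SSH is the entire protocol stack
git remote add backup yourhost:repos/project.git
git push backup --all
```

No daemon, no web UI; anyone with SSH access to that account can clone and push.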
Do you call IBM to maintain your pi-hole?