I just tried this myself by creating an issue, attaching a file in a comment, copying the link, and not submitting the issue.
It seems to work initially, but about five minutes later the file gets deleted and the link leads to a dead S3 asset page.
So I believe this is fixed. The solutions suggested below are crafty, but trying to reproduce this myself shows me it has been addressed by the GH team.
Maybe. Haven't tried it. Though that does make the attack vector a little less severe than persisting even without an issue. At least the attack vector can be tracked.
FWIW, I tested this out as well: I clicked "open issue", uploaded a file, and then didn't actually submit the issue. The file is still accessible two days later.
Just want to point out that GitHub removing the asset after 15 minutes is actually worse than leaving it up. The least appetizing aspect of this scheme for adversaries is that the payload is forever available to anyone with the logs. Draft-only deletion makes it the adversary's choice: submit the issue and it stays; only draft the issue and it gets wiped, good riddance to the evidence. A phenomenal C2 stager indeed!
Back in the day, when I worked in this field, malware writers regularly used things like YouTube as blob storage and Instagram comments as a C2 mechanism.
I think the core issue is that anyone can easily create a URL for a file that looks like it was published by the Microsoft team.
Instead, if GitHub uploads used the uploader's GitHub account in the URL path, scammers would need to resort to other means, like punycode, to mislead users.
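For illustration, here's roughly the difference (the second path is purely hypothetical, not anything GitHub offers):

```python
# Today the path names the *repo* the comment was drafted in, so a file
# uploaded by anyone inherits the project's apparent authority:
current = "https://github.com/microsoft/vscode/files/123456/installer.zip"

# A hypothetical uploader-scoped scheme would name the *account* that
# uploaded the file instead, so it can't masquerade as a Microsoft release:
proposed = "https://github.com/u/scammer123/files/123456/installer.zip"
```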
> looks like it was published by the Microsoft team
Right, a simple way to make these URLs seem less associated with the project in question would be to host them at www.githubusercontent.com (a domain they use for other user-generated content, IIRC). I guess the current setup gives them an easy way to authorize access to the files: if you have access to the repo that's right there in the URL, you can download the file; if it's a private repo you don't have access to, you get a 404/403. That might save them a separate metadata lookup.
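A minimal sketch of that path-based check, with made-up names and toy ACL data (GitHub's real implementation obviously isn't public):

```python
PRIVATE_REPOS = {("acme", "secret-repo")}  # toy ACL data for the sketch

def user_can_read_repo(user: str, owner: str, repo: str) -> bool:
    """Stand-in for the real repo access check."""
    return (owner, repo) not in PRIVATE_REPOS or user == owner

def serve_attachment(path: str, user: str) -> int:
    """The owning repo is right there in the path, so the attachment can
    reuse the repo's own ACL with no separate metadata lookup."""
    owner, repo, _files, attachment_id, filename = path.strip("/").split("/")
    if not user_can_read_repo(user, owner, repo):
        return 404  # private repo you can't see: pretend the file doesn't exist
    return 200  # would stream the blob for attachment_id here

print(serve_attachment("/microsoft/vscode/files/123456/installer.zip", "alice"))  # 200
print(serve_attachment("/acme/secret-repo/files/99/notes.txt", "mallory"))        # 404
```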
This is one of those things that seems obvious now, but I could easily see how this could happen, and that kind of URL structure is something I could see myself devising, thinking it's an elegant way to tie the file to other repo access rules.
But really, it's surprising to me that GH keeps files around even after you cancel or delete the comment. Perhaps nowadays they don't need to care that much about storage costs, but back when they added that feature I imagine they might have been more cost-conscious.
When do you do this? After an hour? A day? A week? There have been cases where it took me a few days to write a comment or PR message because I was still working on things.
If you set it too short (hours), it'll be pretty annoying and confusing for people. If you set it too long (more than a day), scammers will just keep generating new links.
It's not that simple to fix, certainly not with these kinds of timeouts. The only real solution is to not have a "trusted" URL like that.
What if a comment is updated to remove the link? Or maybe a new comment has no link, an edited version adds a link, then the latest version removes the link again, etc.
The point is that by doing so, you can smuggle the attachment past anyone who's reading the comment section, yet it still appears to the system as a legitimate attachment.
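In toy-timeline form (purely illustrative), the trick looks like this:

```python
# The link only ever exists in an intermediate revision: readers of the
# current comment never see it, yet a garbage collector that asks "was
# this attachment ever referenced by a comment?" still answers yes.
revisions = [
    "Steps to reproduce: click the button.",                               # v1: clean
    "Steps to reproduce: see https://github.com/owner/repo/files/1/p.zip", # v2: link added
    "Steps to reproduce: click the button. (typo fixed)",                  # v3: link removed
]
readers_see = revisions[-1]                             # no link visible
gc_keeps_file = any("/files/" in r for r in revisions)  # True: upload persists
```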
> This seems simple to fix - deactivate all links that didn’t become part of a published comment.
Currently, a major convenience for the malware spreader is being able to put malware on the server and get a usable URL back without ever submitting the comment. Deactivating links from comments that were never submitted is orthogonal to identifying malware, and it wouldn't make comment readers any more vulnerable than they already are.
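A toy model of that fix (all names made up); the DRAFT_TTL constant is exactly the knob the thread above is arguing about:

```python
import time

DRAFT_TTL = 24 * 3600  # the contested knob: hours annoy slow writers, days give scammers a window

attachments: dict[str, dict] = {}  # id -> {"published": bool, "uploaded_at": float}

def upload(attachment_id: str) -> None:
    """Files dropped into the comment box start out as unsubmitted drafts."""
    attachments[attachment_id] = {"published": False, "uploaded_at": time.time()}

def submit_comment(attachment_ids: list[str]) -> None:
    """Submitting the comment promotes its attachments; they persist as today."""
    for a in attachment_ids:
        attachments[a]["published"] = True

def sweep(now: float) -> None:
    """Periodic job: delete drafts that never made it into a submitted comment."""
    for a, meta in list(attachments.items()):
        if not meta["published"] and now - meta["uploaded_at"] > DRAFT_TTL:
            del attachments[a]
```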
"Depends" might be a strong word, but this would affect things like screenshots submitted in bug reports as well, which would be less than ideal at the very least.
They could probably even just parse all comments and rewrite them, then remove the redirects. If this feature is being used as intended, these files should all be linked from comments, and they parse the markdown anyway.
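Something along these lines, assuming the attachment URL shape and the target domain (both are guesses for illustration):

```python
import re

# Matches the repo-scoped attachment URLs discussed in this thread.
OLD = re.compile(r"https://github\.com/([\w.-]+)/([\w.-]+)/files/(\d+)/(\S+)")

def rewrite(comment_md: str) -> str:
    # Re-point attachments at a user-content domain so the repo name in
    # the path no longer lends them false authority.
    return OLD.sub(r"https://attachments.githubusercontent.com/\3/\4", comment_md)

before = "Crash log: https://github.com/microsoft/vscode/files/123/crash.txt"
print(rewrite(before))
# Crash log: https://attachments.githubusercontent.com/123/crash.txt
```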
While they're at it, maybe they can expire some old files that aren't referenced...
I could have sworn that githubusercontent was used for these uploads. I’m not at my computer now so I can’t double check, but I’m guessing there is some special case involved for these repos.
No, githubusercontent.com is used for image uploads, but file attachments have always been on github.com under the repo path. I recall using this as a makeshift distribution mechanism years ago; I was surprised and thought about the security implications even back then, so this abuse can't possibly be new.
I’ve been using it as a free image host for years now (you don’t even need to actually post the comment, just drop the image in the box and grab the URL)… hopefully they don’t kill it :)
Using the repo prefix allows them to put the same auth on uploads. So if you don’t have access to a repo, you can’t access files uploaded in its issues either.
I'm not saying they don't need to check auth by repo. They can still check auth when they look up the metadata for the file; just don't include the repo in the URL.
Auth may make moving the files to a different domain harder, but a different path on the same domain shouldn't matter.
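Concretely, the lookup being described could be as simple as this (hypothetical names, toy data):

```python
ATTACHMENT_META = {"f8a31c": ("microsoft", "vscode")}  # opaque id -> owning repo
PRIVATE_REPOS = {("acme", "secret-repo")}              # toy ACL data

def can_read(user: str, owner: str, repo: str) -> bool:
    return (owner, repo) not in PRIVATE_REPOS or user == owner

def serve_opaque(attachment_id: str, user: str) -> int:
    # One metadata lookup replaces the repo-in-the-path trick; the URL
    # itself (e.g. /attachments/f8a31c) names nobody and lends no authority.
    meta = ATTACHMENT_META.get(attachment_id)
    if meta is None or not can_read(user, *meta):
        return 404
    return 200
```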
This is yet another example of hosting hostile content on an authoritative site.
I still have a service running to track this.[1] It's a join of PhishTank and a somewhat dated list of major sites. Google is by far the worst offender. Hosting phishing sites in Google Sheets, etc. is not unusual. Yahoo and Microsoft used to be on that list, but they got better at kicking off hostile content. Adobe (via Adobe Express) has quite a few entries.
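The join itself is simple; roughly this (assuming a locally downloaded copy of PhishTank's verified-phish CSV feed, whose "url" column name I'm taking on faith here):

```python
import csv
from urllib.parse import urlparse

# Sites whose user-generated-content features get abused for phishing.
MAJOR_SITES = {"docs.google.com", "sites.google.com", "github.com", "express.adobe.com"}

def offenders(phishtank_csv_path: str) -> dict[str, int]:
    """Count verified phish URLs hosted on each major site."""
    counts: dict[str, int] = {}
    with open(phishtank_csv_path, newline="") as f:
        for row in csv.DictReader(f):
            host = urlparse(row["url"]).hostname or ""
            if host in MAJOR_SITES:
                counts[host] = counts.get(host, 0) + 1
    return counts
```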
People have also been using the fact that commit data inside PRs can never be deleted (you must keep the URL, though) to distribute pirated content (or links to it) for years now.