
Show HN: One click image optimization for GitHub - mwarkentin
https://shrinkray.io/
======
perlgeek
I hate giving out write access to my repos to somebody I don't know. How about
creating pull requests automatically instead?

~~~
mwarkentin
Yeah, understood that this could be an issue for people. I'd like to at least
be able to split out granting access to public / private repos - although
write to public could still be an issue..

What do you mean by creating PRs automatically instead?

~~~
jrochkind1
It sounds like creating PRs _is_ what you're doing, from other people's
reports? Which is great.

But Github tells me I'm about to grant you permissions to make commits
directly to all my repos.

Can you ask for fewer permissions from github?

I haven't tried it myself yet, because it gave me a bad feeling giving you
such permissions. (the permissions sound like they give full git commit
access, which would of course include rewriting history. To a random third
party? That seems like a terrible idea. If someone wanted to release such a
service and use it to inject security vulnerabilities in everyone's (or just
some specific targets) code....)

~~~
mwarkentin
Yeah, in order to create a pull request, I first need to commit to a branch on
the repo. I suppose for public repos, I could fork and send the PR from there,
but I don't know if there's a way that makes sense to do that for private
repos.

I'll take another look at Github's permissions, but I don't think there's any
other way around it right now.
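For public repos, the fork route could look roughly like this - a sketch
against the GitHub v3 REST API, where the bot account name and branch name are
made up for illustration:

```python
# Sketch of a fork-based PR flow (GitHub v3 REST API):
#   1. POST /repos/{owner}/{repo}/forks      -> fork into the bot's account
#   2. push the optimized images to a branch on that fork
#   3. POST /repos/{owner}/{repo}/pulls with a cross-repo head ref

def pull_request_payload(fork_owner, branch, base="master"):
    """Payload for opening a PR whose head branch lives on a fork."""
    return {
        "title": "Optimize images",
        # A cross-repo head is written as "owner:branch"
        "head": f"{fork_owner}:{branch}",
        "base": base,
    }

payload = pull_request_payload("shrinkray-bot", "optimize-images")
print(payload["head"])  # shrinkray-bot:optimize-images
```

This only needs read/fork access to the source repo, which is why it sidesteps
the write-permission problem for public repos - but as noted above, forking
doesn't translate cleanly to private repos.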

~~~
jrochkind1
Ah, I see.

There must be SOME way to do it -- there are other tools that make PRs
without getting commit rights. (Although I can't think of them now, ha, I'm
pretty positive I've seen it.) But it might be cumbersome, or as you say, not
work well with private repos.

You should contact Github support and show them your site and explain what
you're doing and ask advice. Github support is pretty good, the people
responding seem to generally know what they're talking about. They'll probably
think it's a cool service, as well as agree that full commit rights are
inappropriate. If there's no good way to do it now, perhaps your query will
spark some interest at Github in making better ways to do it.

I think this service is really neat, but I'm really uncomfortable with giving
full write access to all my repos to a random third party.

It would be slightly less bad (but still bad), perhaps, if I could grant you
permission on one repo at a time, and then I'd go in and revoke it after the
PR.

~~~
mwarkentin
If you can find any other tools which _create_, not just comment on, pull
requests, please send them my way.. maybe there is a better way for me to do
this.

I just took another look, and I am asking for too many permissions - I should
really be restricting it to these:
[https://s3.amazonaws.com/snaps.michaelwarkentin.com/Authoriz...](https://s3.amazonaws.com/snaps.michaelwarkentin.com/Authorize_shrinkray.io_staging_2014-12-23_11-40-02.png)

Although that is probably still too much to make you comfortable.

~~~
jrochkind1
I am _sure_ I've seen a pull request from a bot I had never given permissions
to before, but I can't recall what it was.

But it may have actually been a screen-scraping bot -- when I google around,
ironically the first thing I find is something that was doing _exactly_ what
you are, and ended up getting banned by Github for being annoying. (Don't
worry, by being opt-in, you won't be).

[http://www.wired.com/2012/12/github-bots/](http://www.wired.com/2012/12/github-bots/)

I really encourage you to contact Github with your use case, I think they'll
like what you're doing, and maybe it will implant the seed to make it possible
to do easily without write permissions that ought not to be necessary.
Services like this are good for Github's business, as well as probably being
personally gratifying to githubbers, and they've built out their APIs in
response to real use cases.

------
lordbusiness
I'm assuming this is file size optimization for website use? It would be nice
to see a video demo of this before I hand over my GitHub credentials. :-) Call
me Mr. Paranoia if you wish; I apologize. I'm sure there are best intentions
here.

Nice idea to be honest. Wish I'd thought of it. :-)

~~~
mwarkentin
Exactly - well, anywhere you have images which you want to make sure are as
small as possible (without losing any quality). We've had some success running
this against repos for iOS apps and reducing the size of the bundle.

Definitely planning on making a short demo video soon!

~~~
lordbusiness
Sounds good to me! Keep up the happy hacking. :-)

~~~
mwarkentin
Thanks, looking forward to spending some time on this over Christmas break. :)

------
domedefelice
Gaining the confidence of the user is not easy:

1) I think you should state explicitly that the compression is lossless (if it
is);

2) You should provide some kind of demos/examples to look at.

Good job anyway!

~~~
mwarkentin
Image compression is lossless.. I really should call that out on the home
page. Demo / examples are definitely on my list.. here's an example PR for
you: [https://github.com/shrinkrayio/demo-images/pull/6](https://github.com/shrinkrayio/demo-images/pull/6)

~~~
sauere
Also, do you preserve metadata (EXIF etc.)?

~~~
mwarkentin
I believe metadata gets stripped out, which is how I get some lossless size
reduction for jpgs. In my experience, this is not something that we've been
interested in preserving on our images (use case - websites).

Would you be looking at using this to compress some photos for personal use,
or for a photography website? What would be your use case?

~~~
sauere
JPGs can be optimized without losing metadata. Google "jpegtran". It does some
optimizing of the data representation while staying lossless.
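For reference, an invocation along those lines might look like this - a sketch
that shells out to jpegtran, with the filenames being placeholders:

```python
import shutil
import subprocess

# jpegtran (from libjpeg) recompresses the JPEG entropy coding losslessly.
# "-copy all" preserves EXIF and other markers; "-copy none" strips them
# for extra savings, which is the tradeoff discussed above.
cmd = ["jpegtran", "-optimize", "-progressive", "-copy", "all", "photo.jpg"]

if shutil.which("jpegtran"):  # only run if the tool is installed
    optimized = subprocess.run(cmd, capture_output=True).stdout
    with open("photo_opt.jpg", "wb") as f:
        f.write(optimized)
```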

------
mcmillion
Awesome idea, but this should really be a build step, not something people run
manually, and definitely not touching the source images.

~~~
mwarkentin
Yep, if you've got something like this set up as a build step, good on you! A
lot of places don't, so this was made as an easy way to reduce your image
footprint. It's amazing how much can be saved sometimes.

------
gingerlime
Interesting idea and quite clever way to integrate it with github. Very nice.

Does it cost money? (if so, how much? if not, would it cost in future?) Does
it continuously monitor all images or can be launched as a one-off task? I
think some details are still missing.

~~~
mwarkentin
At the moment it's free for both public and private repos, but the plan is to
follow the standard "Github tool" pricing (free for open source, paid for
private / organization support). Happy to hear what people think they or their
business would be willing to pay for a tool like this. :)

Right now it's a one-off task, but I've done some work on using Github
webhooks to trigger image optimizations directly on a branch while you're
working on it. Not sure when this will be available though.

~~~
colinbartlett
I just used it and I really like it. What would I pay for it is a harder
question. Recurring charges are more difficult for me to swallow, but I would
be willing to pay the equivalent of a few coffees or beers every time I used
it.

~~~
mwarkentin
Thanks for the feedback. I was definitely leaning towards the subscription
model, but am open to other ideas.

------
benbristow
It would be nice to see a BitBucket version as well. Mainly because I keep my
private repos on BitBucket since it's free and then do any open-source stuff
on GitHub.

~~~
mwarkentin
It's something I may look at eventually, but not a priority right now - Github
has a lot more users, and it would increase the complexity quite a bit to deal
with two separate systems for permissions, APIs, etc.

Thanks for the feedback though!

------
Rygu
Fun side-project, kudos. Also for humans:
[https://imageoptim.com/](https://imageoptim.com/)

~~~
mwarkentin
Yep, there are definitely a bunch of ways to do this. A few of the reasons I
made this rather than just using a CLI tool:

* Speed - spinning up a bunch of workers to do this can optimize large repos a lot faster than my laptop

* CPU - optimizing images on my laptop can really max out the CPU, turning on the fan, using up battery

* Stats - I've wanted to make reports to send around to the company on the optimizations that were made - was quite hacky to pull this from a CLI report into something that could be shared

------
occam65
Seems like a pretty useful tool, but I don't have any repos with images. Can
anyone report on actual usage?

~~~
mwarkentin
You can check out an example of the pull request you'd get here:
[https://github.com/shrinkrayio/demo-images/pull/6](https://github.com/shrinkrayio/demo-images/pull/6)

~~~
occam65
Nice. Does it report total filesize reduction across the entire pull request?

~~~
mwarkentin
It doesn't show you that info directly in the PR (as each worker doesn't know
about the rest of what's going on), but I do track stats for each repo and
make some (questionably useful) graphs available through the app.

Here's an example screenshot:
[https://s3.amazonaws.com/snaps.michaelwarkentin.com/Shrinkra...](https://s3.amazonaws.com/snaps.michaelwarkentin.com/Shrinkray_2014-12-23_10-27-07.png)

------
mwarkentin
Sorry to anyone for whom this seems to be hanging - it looks like my workers
have gone over their capacity, and things are pretty queued up. I'm looking
into increasing capacity, but in the meantime things may take longer than
expected.

~~~
mwarkentin
Ok, it looks like the queue has cleared up for now. There were a few errors,
so if one of your requests didn't complete, feel free to retry.

------
jbox
Cool tool!

I gave it a shot on one of our private repos this morning and it did what it
said on the box.

In the process it also generated ~150 comments on the repo.

For folks watching the repo, this is a fair amount of email.

Is there any way to aggregate the comments? Perhaps into the Pull Request
description?

~~~
mwarkentin
Thanks for trying it out.. sorry about the spam. When I added the
parallelization of the tasks, I kind of lost the ability to manage things
centrally on Github. I'll think about if there's some way to make this better.

Increasing the number of images per task would reduce the number of commits /
comments, but could also increase the amount of time it takes to complete.

Did the commits themselves cause email issues for you, or was it just the
comments? I could look into appending to the PR description instead of adding
comments - I'm just worried about a race condition when multiple tasks are
trying to update the description at the same time.

~~~
jbox
No worries! It worked, we just got a lot of emails :)

Could you aggregate the messages and post an update when the entire task is
complete?

Seeing the commits come seems to be a good progress indicator.

If you are "watching" a repo, you get a notification for every comment, not
every commit.

You could also just create the PR with a link to the job on ShrinkRay and have
the progress updates there...

~~~
mwarkentin
I've added this to my issue list.. thanks for the suggestion!

------
MichaelTieso
After clicking on "shrink all images" it says "warming up", but nothing
happens.

~~~
mwarkentin
Yeah, looks like I'm out of capacity on my workers right now - this is the
first real burst of traffic I've had.

As far as I can tell, all jobs are either queued or running, so it should
complete for you at some point.. not sure if I can get an ETA for you though.

------
Raphmedia
What happens exactly when I click the button? It optimizes 50 images and puts
them in a separate commit?

~~~
mwarkentin
Basically! When you click on the button, it will spin up a bunch of workers (1
for every 10 images). These will optimize your images in parallel, and then
push the changes back up to a branch and open a pull request for you to
review.

There's no limit on the number of images - it can optimize hundreds quite
quickly - 50 is the number of parallel workers you can have.
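The batching scheme described above (one worker per 10 images, capped at 50
parallel workers) can be sketched roughly like this - the two numbers come
from the comment, everything else is illustrative:

```python
BATCH_SIZE = 10   # images handled by each worker
MAX_WORKERS = 50  # cap on workers running in parallel

def plan_batches(image_paths):
    """Split a repo's images into per-worker batches of 10."""
    batches = [image_paths[i:i + BATCH_SIZE]
               for i in range(0, len(image_paths), BATCH_SIZE)]
    # With a real job queue, batches beyond the worker cap would just
    # wait their turn rather than being dropped.
    return batches

batches = plan_batches([f"img/{n}.png" for n in range(137)])
print(len(batches), len(batches[0]))  # 14 batches, 10 images in the first
```

Each batch then optimizes its images, pushes the results to the shared branch,
and the pull request collects everything for review.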

------
mrbig4545
i'm not sure what it is, or why i would want it. or even what kind of
optimising it does

i'm not complaining, just suggesting that you should work on your homepage to
make that crystal clear, it'll help with people who might want to use it but
don't know it yet

~~~
mwarkentin
Thanks for the feedback - definitely know that there's lots of work to do.
Planning on adding a short 30 second video running from sign up through to
viewing the pull request.. surprisingly difficult to do smoothly without a
bunch of "um" and "ah".

More work on clarifying things will definitely be happening.. I've just put
off showing this to people for too long already. :)

~~~
mrbig4545
Even one small paragraph will help, something like, "When you click on the
button, it will spin up a bunch of workers (1 for every 10 images). These will
optimize the filesize of your images in parallel, and then push the changes
back up to a branch and open a pull request for you to review.

There's no limit on the number of images - it can optimize hundreds quite
quickly - 50 is the number of parallel workers you can have"

(i stole this off another of your comments and added filesize after optimise)

~~~
mwarkentin
Definitely.. thanks for the suggestion!

