Excellent explanation! I came here thinking “why use ML for this?” and this is specifically addressed:
> You might ask at this point, if we can enumerate all possible solutions like this, why not just generate colors from first principles - why is machine learning even needed? The answer is basically that translation and rotation in colorspace might be contrast-invariant, but they're not preference invariant. Humans have subjective preferences for certain color combinations over others, so to generate pleasing color combinations we need to quantify which areas in the configuration space people generally prefer.
> To quantify which color combinations graphic designers prefer, I started by scraping design thumbnails from the web.
> At this point we have a decent sized dataset to train our ML model. The overall approach is to treat the problem as conditional image generation - the color contrast graph is the input and the corresponding color palette is the output.
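For anyone wondering what the conditional-generation framing looks like concretely, here is a toy sketch (illustrative only, not the author's actual architecture): the condition is an n x n matrix of target contrast ratios between palette slots, and the output is n RGB colors.

```python
import torch
import torch.nn as nn

N_COLORS = 5  # palette size, chosen arbitrarily for this sketch

class PaletteFromContrastGraph(nn.Module):
    """Toy conditional model: contrast-requirement matrix in, RGB palette out."""
    def __init__(self, n_colors=N_COLORS, hidden=128):
        super().__init__()
        # The contrast graph is encoded as an n x n matrix of target
        # contrast ratios between palette entries, flattened into a vector.
        self.net = nn.Sequential(
            nn.Linear(n_colors * n_colors, hidden),
            nn.ReLU(),
            nn.Linear(hidden, n_colors * 3),
            nn.Sigmoid(),  # RGB channels in [0, 1]
        )

    def forward(self, contrast_matrix):
        flat = contrast_matrix.flatten(start_dim=1)
        return self.net(flat).view(-1, N_COLORS, 3)

# One forward pass on a random "contrast graph"
model = PaletteFromContrastGraph()
graph = torch.rand(1, N_COLORS, N_COLORS)
palette = model(graph)  # shape (1, 5, 3): five RGB colors in [0, 1]
```

The real system is presumably much richer than this, but the input/output contract is the interesting part: the contrast structure is fixed, and the model only has to fill in colors people tend to like.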
It's interesting to think what would happen if a tool / technique like this one became super popular. Would we see more variation, as more people could choose schemes that look good? Or would we see convergence as there are no longer people actively making decisions (similar to what may be happening in the stock market due to the rise of index funds: [1])?
This is really nice! It'd be nice if the colors picked were saved in the URL, so that you could use browser history to navigate the different patterns generated.
I love this suggestion (and hope the OP implements it), but in the meantime, there is a link button on the toolbar right next to the palette that provides a link to the generated palette.
I’ve used multiple color palette generators and this is definitely one of the best. Besides the ML, showing the colors on images like a phone and real web pages is really impressive.
My one suggestion: since you already have Bootstrap support, maybe add TailwindCSS utility support as well.
I opened it with a skeptical eye and walked away completely impressed. Despite the odd dissonances that pop up at intervals, many of the palettes produced are not only trendy and pleasing but also unexpected and creative in some ways.
Nice. I have daydreamed about something like this for years, and this is not it.
I'd imagine a series of left/right comparisons, like a visit to an eye doctor, where the machine learning is rewarded for its ability to predict my preferences. Eventually (a time commitment for me) it will be able to build from scratch color designs that I love.
This is like an early application of machine learning: What are the odds of victory for this backgammon position? Here, instead, we're estimating a preference function on color triples. Is RGB even the right domain, or do we want to work in some frequency transform, to capture the equivalent of musical chords? This is an empirical question that can only be answered by trying to estimate this preference function and noticing ripples better resolved by a different parametrization.
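Concretely, that preference function could be estimated from left/right comparisons with a Bradley-Terry-style model: score each color triple with a small network, and train on which of two triples the user picked. A hypothetical sketch:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TripleScorer(nn.Module):
    """Scores a color triple (3 RGB colors = 9 numbers); higher means more preferred."""
    def __init__(self, hidden=64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(9, hidden), nn.ReLU(), nn.Linear(hidden, 1))

    def forward(self, triple):  # triple: (batch, 9), RGB values scaled to [0, 1]
        return self.net(triple).squeeze(-1)

scorer = TripleScorer()
opt = torch.optim.Adam(scorer.parameters(), lr=1e-3)

# One training step on a fake comparison where the user preferred `left` over `right`.
left, right = torch.rand(1, 9), torch.rand(1, 9)
# Bradley-Terry / logistic loss: P(left preferred) = sigmoid(score_left - score_right)
loss = F.softplus(scorer(right) - scorer(left)).mean()  # equals -log sigmoid(diff)
opt.zero_grad()
loss.backward()
opt.step()
```

Trying the same setup with the triples re-encoded in HSL, CIELAB, or some frequency-like basis, and seeing which encoding learns fastest from the fewest comparisons, would be one way to answer the parametrization question empirically.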
This would be easy, compared to the Riemannian geometry used in medical imaging. There's more money there.
For commercial use one cares what others think. There's the speciation question: You won't synthesize deep jazz tracks and deep blues tracks without separating the advice into species. Identifying clusters in data is something statisticians have worried about since the dawn of statistics.
This is fantastic, and I will definitely use it for personal projects in the future!
I see a few requests for practical features here, but I have one incredibly silly request: how plausible would it be to restrict the colors to a set list of RGB values, so one could, say, generate color palettes for physical mediums based on medium color -> RGB conversion lists, such as painting, cross stitch thread[0], or yarn[1]?
Yeah, that should be possible - you just need to quantize the resulting palettes to the closest available thread color. You could possibly do it with the API (there are some instructions at the bottom of the about page).
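Something like this, as a rough sketch - assuming you've already built a list of thread colors and their RGB values from a conversion chart (matching in a perceptual space like CIELAB would work even better than plain RGB distance):

```python
# Hypothetical thread chart: (name, (r, g, b)) entries taken from a conversion list.
thread_colors = [
    ("Snow White", (255, 255, 255)),
    ("Black", (0, 0, 0)),
    ("Christmas Red", (199, 43, 59)),
    # ... the rest of the chart
]

def nearest_thread(rgb):
    """Return the thread entry whose RGB value is closest to the given color."""
    r, g, b = rgb
    return min(
        thread_colors,
        key=lambda t: (t[1][0] - r) ** 2 + (t[1][1] - g) ** 2 + (t[1][2] - b) ** 2,
    )

generated_palette = [(18, 32, 47), (240, 235, 216), (199, 50, 60)]
print([nearest_thread(c)[0] for c in generated_palette])
```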
I am a little bit confused - why does the scheme change every time I switch pages? I would like to see a single scheme applied everywhere. Or maybe I am not understanding how this works?
This is nice. One of my favorite palette tools is coolors.co, and they have a nice feature that lets you hit the spacebar for a new palette. That would be useful here too, I think.
It would also be nice to "apply" the colors from an uploaded image to the various scenarios, i.e. grab X colors from the image, and generate would cycle through various forms of those (possibly adding extra complementary colors as needed).
Hmm, I realize the image upload works backwards from how I expected – I think you actually intend for me to upload a screenshot of a design and then swap out palettes. That is super cool, but on first guess I thought you were extracting colors from mine to make a palette. Might need a little helper text, but cool feature!
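For what it's worth, the direction I expected (pulling a palette out of an uploaded image) is usually done with k-means over the pixels. A rough sketch, assuming Pillow and scikit-learn are available:

```python
from PIL import Image
import numpy as np
from sklearn.cluster import KMeans

def dominant_colors(path, n_colors=5):
    """Return the n most dominant colors of an image as RGB tuples."""
    img = Image.open(path).convert("RGB").resize((128, 128))  # downsample for speed
    pixels = np.asarray(img).reshape(-1, 3)
    km = KMeans(n_clusters=n_colors, n_init=10).fit(pixels)
    return [tuple(map(int, c)) for c in km.cluster_centers_]

print(dominant_colors("my_design.png"))  # path is just an example
```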
But this project creates some pretty good ones on the fly. I'd be interested in knowing what color features it has uncovered that it uses to generate new swatches.
This is really cool, I'm excited to see more ML applied to design like this.
One project I wish someone would build is an ML-powered algorithm for perceptually even saturation, drawing on crowdsourced data to help pick colors that most people would perceive as equally colorful.
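In the meantime, a non-ML baseline is to pick colors with identical lightness and chroma in a perceptual space like OKLCH, so only the hue varies. A rough sketch (the matrix constants are from Björn Ottosson's OKLab reference implementation):

```python
import math

def oklch_to_srgb(L, C, h_deg):
    """Convert an OKLCH color (lightness, chroma, hue in degrees) to 8-bit sRGB."""
    h = math.radians(h_deg)
    a, b = C * math.cos(h), C * math.sin(h)
    # OKLab -> LMS
    l_ = L + 0.3963377774 * a + 0.2158037573 * b
    m_ = L - 0.1055613458 * a - 0.0638541728 * b
    s_ = L - 0.0894841775 * a - 1.2914855480 * b
    l, m, s = l_ ** 3, m_ ** 3, s_ ** 3
    # LMS -> linear sRGB
    lin = (
        +4.0767416621 * l - 3.3077115913 * m + 0.2309699292 * s,
        -1.2684380046 * l + 2.6097574011 * m - 0.3413193965 * s,
        -0.0041960863 * l - 0.7034186147 * m + 1.7076147010 * s,
    )
    def gamma(c):  # clamp out-of-gamut channels, then apply the sRGB transfer curve
        c = min(1.0, max(0.0, c))
        return 12.92 * c if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055
    return tuple(round(255 * gamma(c)) for c in lin)

# Five hues at the same lightness and chroma -> roughly equally "colorful"
print([oklch_to_srgb(0.75, 0.12, h) for h in range(0, 360, 72)])
```

Crowdsourced preference data could then be used to learn per-hue corrections on top of this, since even OKLCH isn't perfectly uniform.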
I love the ML angle but honestly struggled to use the tool. I tried locking a few of the swatches and selecting a different tool on the left, but all swatches were replaced with new hues. What does locking really do?
Each template on the left is basically a separate thing. When you lock a color and then click "generate", the next palettes take your locked colors into account - if you lock the background to white and then click generate, the foreground colors should make sense for the white background.
I got that after commenting, but I would prefer if the locked swatch stayed in place so I can see how the hue (or a combination of hues) works across different media.
This is incredible. I've been struggling with a color palette for my most recent project so I uploaded a screenshot of it, clicked generate and I instantly got a really nice palette! Amazing work - thank you.
Try increasing the "creativity" slider in the options. Setting it to a higher value will increase diversity at the cost of accuracy. (This controls the sampling temperature like in language models)
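If you're curious what that knob does under the hood, temperature scaling roughly looks like this (an illustrative sketch, not this tool's actual code):

```python
import numpy as np

def sample_with_temperature(logits, temperature=1.0):
    """Higher temperature flattens the distribution: more diverse, less 'safe' picks."""
    rng = np.random.default_rng()
    scaled = np.asarray(logits, dtype=float) / max(temperature, 1e-6)
    probs = np.exp(scaled - scaled.max())  # softmax, shifted for numerical stability
    probs /= probs.sum()
    return rng.choice(len(probs), p=probs)

logits = [2.0, 1.0, 0.2]  # model scores for three candidate colors
print(sample_with_temperature(logits, temperature=0.5))  # conservative
print(sample_with_temperature(logits, temperature=1.5))  # more "creative"
```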
When I use the tool myself, I usually start with the transformer model to lock the first few colors, then switch to the diffusion model if I see a repeat. When all colors are locked except one or two, the "random" mode starts working if you still need more variations.
You can do this by setting the background to black and then locking it. When you click generate again, the subsequent palettes will take the dark background into account.
there's also a dark mode preset in the (gear icon) menu.
I concede this is not totally obvious just from the UX; maybe it needs a tutorial or something...
You can guide the ML model by locking one or more colors, then clicking generate again. (click the circular swatch on top to lock a color)