You need to go gamma -> linear before doing any linear algebra on your color values.
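A rough Python sketch of what that means in practice (this is the standard piecewise sRGB transfer function; the helper names are mine):

```python
def srgb_to_linear(c):
    # Inverse sRGB transfer function, c in 0..1 (encoded -> linear).
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c):
    # Forward sRGB transfer function (linear -> encoded).
    return c * 12.92 if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

# Averaging black and white: do the math in linear space, then re-encode.
a, b = 0.0, 1.0                    # encoded sRGB values
naive = (a + b) / 2                # averaging encoded values: too dark
correct = linear_to_srgb((srgb_to_linear(a) + srgb_to_linear(b)) / 2)
# correct comes out noticeably brighter than naive (~0.735 vs 0.5)
```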
There is some debate over whether the alpha should be stored linearly or not in low-precision (8-bit) formats. I think the standard practice is linear, but there are good arguments it should be gamma encoded.
And, from there it's important to point out that while premultiplied alpha is great for the math, storing a premultiplied color in an 8-bit-per-channel image doesn't leave many possible values in the low range, so you can get a lot of banding; dithering becomes important. You should probably store the image non-premultiplied and do the premultiplication as part of the math at runtime. Unfortunately, almost all image editors and encoders make it impossible to control the RGB values of low- or zero-alpha pixels. They all assume those colors conceptually "don't exist" and don't matter. But they matter here, because the filtered transition between solid and transparent pixels is exactly the issue we're struggling with.
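A minimal sketch of the store-straight, premultiply-at-runtime idea (assuming 8-bit straight-alpha sRGB input; the function name is mine, not from any particular API):

```python
def premultiply_at_runtime(r8, g8, b8, a8):
    # Decode 8-bit straight-alpha sRGB to linear floats, then premultiply.
    # Doing this in float at runtime avoids the quantization you'd get by
    # baking premultiplied values into 8 bits: at a8 = 16, a stored
    # premultiplied channel could only take ~17 distinct values.
    def decode(c8):
        c = c8 / 255.0
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
    a = a8 / 255.0  # alpha is coverage; no transfer function applied
    r, g, b = decode(r8), decode(g8), decode(b8)
    # Linear, premultiplied values, ready for filtering/compositing math.
    return (r * a, g * a, b * a, a)
```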
If you create a sRGB texture on a modern GPU, the R/G/B channels are stored in sRGB (functionally a lot like gamma correction in this case) while the alpha channel is always linear 0-255. So while the GPU's texture samplers will convert the R/G/B channels to linear before filtering or sending the result to a shader, nothing gets done to the alpha channel.
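A toy emulation of that sampler behavior for a two-texel lerp (my own sketch, not actual GPU code; assumes the standard sRGB decode for R/G/B and raw linear interpolation for alpha):

```python
def sample_srgb_texel_pair(t0, t1, w):
    # t0, t1: 8-bit (r, g, b, a) texels from an sRGB+A texture; w: lerp weight.
    # The hardware converts R/G/B sRGB -> linear *before* filtering;
    # the alpha channel is interpolated exactly as stored.
    def decode(c8):
        c = c8 / 255.0
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
    rgb = [decode(a) * (1 - w) + decode(b) * w for a, b in zip(t0[:3], t1[:3])]
    alpha = (t0[3] / 255.0) * (1 - w) + (t1[3] / 255.0) * w
    return (*rgb, alpha)
```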
Gamma should not have any effect on the alpha channel, which is linear by definition.
The alpha channel represents the average coverage of (infinite) subpixels within the pixel. The color encoding of an object should not have any impact on the coverage within that pixel.
I recommend reading the technical memo "Alpha and the History of Digital Compositing" by Alvy Ray Smith [1] to get a better intuition on the matter.
Gamma does, however, have an effect on blurring. And I suspect that because of gamma correction, applying a blur filter to a premultiplied image is (very subtly) incorrect.
Applying a blur filter to a premultiplied image is (very subtly) _correct_, at least if the goal is to emulate what happens if you used a physical lens to blur the same image. Not only does straight (non-premultiplied) alpha mess up the correct pixel values (as the original post shows), but even without alpha, you get a "halo" that is weird and unphysical.
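A tiny numerical illustration of the difference (my own sketch; a 1-tap "box blur" over two linear-space pixels stands in for a real filter):

```python
def box_blur_premul(pixels):
    # pixels: list of (r, g, b, a) in *linear, premultiplied* form.
    # Blurring premultiplied values is just a weighted average, so color
    # and coverage stay consistent, like light spreading through a lens.
    n = len(pixels)
    return tuple(sum(p[i] for p in pixels) / n for i in range(4))

opaque_red  = (1.0, 0.0, 0.0, 1.0)  # premul == straight here since a = 1
transparent = (0.0, 0.0, 0.0, 0.0)  # RGB of a zero-alpha pixel is arbitrary

# Premultiplied blur gives (0.5, 0, 0, 0.5); un-premultiplying recovers
# pure red at 50% coverage.
r, g, b, a = box_blur_premul([opaque_red, transparent])
color_premul = r / a                       # 1.0: still pure red

# Blurring the straight RGB directly averages in the transparent pixel's
# (meaningless) black, darkening the edge: the halo.
color_straight = (1.0 + 0.0) / 2           # 0.5: darkened
```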