So this doesn't really surprise me. Looking up the numbers says Safari is used by 0.46% of Windows users -- and it doesn't seem to have ever gotten much higher than that. Why waste resources on it?
Apple has never really pushed Safari for Windows at all. The theory I've read that always made sense to me was that it was created to let Windows developers test pages without an iPhone (or, later, an iPad).
But here we are a few years later and the cost of getting a Mac is lower than it was. Apple already has a ton more developers thanks to the iPhone and iPad. And if you just want to see "will my site look good on an iPhone" you can buy an iPod touch for ~$200.
I'm actually a bit surprised it took them this long. It clearly wasn't an important product for them.
This is where dropping Windows support may harm Safari. Yes, we can test against WebKit via Chrome now, so in theory anything that works on the current PC trinity (FF, Chrome, IE9+) should work just fine in Safari. But any bugs raised by Safari users that can't be reproduced in Chrome are just going to get labelled "NONREPRO" or "WONTFIX". Chrome and Safari are not completely in step, so there could easily be the odd issue that affects one and not the other.
Admittedly this isn't a big problem for Apple users, so the above concern isn't a killer: they can use FF or Chrome if Safari won't cooperate (though good luck convincing some users that switching browsers is a valid workaround for any given bug).
Turns out they've bet on ultrahigh resolution displays, where their rendering is perfect.
Of course, I want 12px or smaller unless a display is really high DPI... ;-)
I'd use it as a primary on those systems except for the fact that some of my essential extensions don't work on the Windows version.
I now run a Mac VM to test, which has the added benefit of an iOS emulator.
Can you tell me whether Safari on Windows was a reasonable way to test for Safari compatibility -- or was it so far off from the 'real thing' that running it on a Windows-based dev machine meant you were really testing 'Safari on Windows, or devices that haven't been turned on in years'?
Safari on Windows was pretty often criticized for having ported over the entire OS X font rendering engine (so it looked wildly out of place on Windows), but that was part of what made it a fairly accurate rendition of how pages looked under OS X.
There were still discrepancies even then, sadly.
I don't see how the existence of Chrome has any relevance to the creation of Safari as a whole. First, the timing might be off (I'll leave that for others to research). More importantly, although I don't know exactly what the point of Safari was (presumably Apple wanted their own browser to include with the OS, just as Microsoft did), I would have to think that if Chrome could have satisfied that need, then the latest Mozilla browser would have as well.
Edit: I did a little more digging, and it looks like Chrome did start out as a (not very) secret project for a few years. I'm not sure when they started submitting their own patches back to the WebKit core (as required by the LGPL?), but they did maintain their own fork of WebKit until 2009.
They did. While the LGPL doesn't mean you need to release all your project's code to users if they request it, it does mean you need to release any modifications to the parts covered by the LGPL.
They didn't have to release the changes in any way that made them easy to reintegrate with the mainline version, but they did have to release them somehow if requested to do so by a Chrome user.
Even the full GPL doesn't require you to automatically release your code back to the original source; it only states that you must provide the source to the people you distribute your product to, if they request that you do.
Edit: Some of my memories are a bit fuzzy about what was going on exactly when in relation to everything else so feel free to correct any mistakes in the above.
Pretty sure Safari from one year ago worked better on Windows than it does now. :/
On the positive side, maybe Apple won't try to force-feed users the browser when they install and update iTunes anymore.
And then they went and made the app store and all that went out the window.
I'm surprised they didn't kill it a couple years back.
Hopefully, they'll refocus any Windows Safari people on WebGL.
From the history of WebKit, I also got the feeling that Apple did some of the most important work to modernize it to compete with IE, Firefox, and Opera at the time it was released. AFAIK, KHTML was not nearly as far along.
Under this view, I don't really see your point. This work by Apple (WebKit/Safari) and by KDE is now part of the base of one of the most important browsers, Chrome.
Code may be important, but it's not the only thing that matters. You could write the most beautiful, efficient, useful code in the world and it wouldn't matter if nobody ever saw it.
Besides, this is straying further and further from the point that Apple ships and maintains a web rendering engine as a core component of the system. It's so firmly embedded that it actually powers most text rendering on iOS. In light of this, saying Apple should discontinue their browser which is built upon this engine is preposterous.
By that standard UC Berkeley "created" BSD and Tim Berners-Lee "created" HTML.
Somehow Apple managed to make Safari 5 slower than previous versions on Windows.
I can't overstate how unstable it was; it crashed all the time and for no apparent reason.