Ultimately, different strokes for different folks. While you're not wrong that Cleartype is empirically better for most people for text content, it is objectively worse, by design, for those for whom positional precision matters.
> This occurs because Apple didn’t dare go near any ClearType patents that Microsoft got for their rendering techniques. For decades OS X remained a very ugly baby, until in 2015 they just gave us courage HiDPI in form of Retina. This was a bid to make all hinting technology obsolete and put everyone else to shame.
Is just wrong. Apple intentionally chooses to display fonts the way they do. This is quite the extraordinary claim -- that a company that is otherwise known for design would not focus on their fonts.
With regard to the FreeType font size rendering: you claim that FreeType doesn't scale fonts smoothly, but then you also claim Windows does it correctly. Just FYI, Windows most certainly does not scale fonts linearly. OS X does, of course, given its focus on publishing. Windows ClearType actually changes the font size in order to make text readable. If linearity of font scaling is a metric by which you measure font engines, then ClearType is a failure.
In case anyone doesn't believe the claim above, this article (https://damieng.com/blog/2007/06/13/font-rendering-philosoph...) goes into it in depth.
I strongly prefer Windows to Mac for font rendering. As you say, it's subjective. However, in today's world, the overwhelming majority of my computer use is screen-oriented. Printing accounts for a vanishingly small set of use cases for me, easily lower than 1%. And if the fonts were a few pixels off when printed, would I even notice? No. I'm not doing typesetting when I print; I'm merely creating a hardcopy of something so that I can read it when I'm not at a computer or printing a legal document to sign.
In 2019, it seems odd to prioritize print layout at the expense of screen legibility. When I use a Mac (on a normal ~100 dpi screen), the text looks softened with bad anti-aliasing and I feel like I've stepped through a time machine to the amazing year 2000.
On high-DPI displays the difference is less noticeable. But even there I think Windows does a better job at legibility; I find text more readable on-screen on a modern Surface than on a comparable Mac laptop.
That shape doesn't necessarily line up on the pixel grid, so you can smush it around to get sharper edges on a computer screen at the expense of changing the shape/size/spacing of the font. Or you can follow the designed shape more correctly at the expense of smearing some stuff across multiple pixels.
There are pros and cons to each, and which one you prefer is where it gets subjective.
Some comparisons and discussion over here: https://damieng.com/blog/2007/06/13/font-rendering-philosoph...
The Windows text has sharper edges but it looks funny to me in comparison.
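If you want to poke at that trade-off yourself rather than squint at screenshots, here's a minimal sketch using the freetype-py bindings; the font file, size, and DPI are my own assumptions, and which load target you pick (hinted vs. unhinted) is exactly the knob being argued about:

```python
# Minimal sketch: load the same glyph hinted vs unhinted and compare metrics.
# Hinting grid-fits the outline to whole pixels (sharper, but shape/advance can
# change); no hinting keeps the designed outline and relies on antialiasing.
import freetype  # pip install freetype-py

face = freetype.Face("DejaVuSans.ttf")   # assumed font file
face.set_char_size(11 * 64, 0, 96, 96)   # 11 pt at 96 dpi, 26.6 fixed point

for label, flags in [
    ("hinted (grid-fit)",   freetype.FT_LOAD_RENDER | freetype.FT_LOAD_TARGET_NORMAL),
    ("unhinted (faithful)", freetype.FT_LOAD_RENDER | freetype.FT_LOAD_NO_HINTING),
]:
    face.load_char("e", flags)
    glyph = face.glyph
    # advance.x is in 26.6 fixed point; /64 gives pixels. At small sizes the two
    # advances and bitmap sizes often differ, which is the debate in a nutshell.
    print(f"{label:22} advance={glyph.advance.x / 64:5.2f}px "
          f"bitmap={glyph.bitmap.width}x{glyph.bitmap.rows}")
```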
I can understand how years of Windows will make the Mac sample look fuzzy, but I’m convinced the correct and harmonious word shapes on the Mac are much more important for reading flow.
Years later, I stumbled across Jeff Atwood's series on the topic.
In 2019, what seems most odd is insisting that there is 'one best way'. The fact is that I fall into that vanishingly small base of users that notices the colors in windows fonts. I find Mac and Linux easier. I like the Retina displays. I'm not sure how you can argue against my preference for them. I did not say you should not have fonts you can read; merely that there is no one way that ought to be said to be better.
Retina officially happened in 2013, not 2015, and I distinctly remember playing with scale factors in 2008 and being frustrated by a search for competitive panels on non-Apple laptops in 2009, so they've been pushing high resolution panels far earlier than 2015, which is exactly what you would expect from a company in the premium niche.
So yeah, OP is doing some sketchy revisionist history to dunk on Apple, but if Apple thought it was OK to release dark themes without updating SPR, hinting, etc to work on non-white backgrounds (which I suspect is the real issue here) then they do need a wakeup call.
EDIT: Did some googling. This is a bigger, older, juicier drama than I knew. It even looks like Nathan Myhrvold and Woz were involved!
> Cleartype was suspiciously similar to approaches developed many years earlier by researchers at Apple Computer, IBM, Xerox Corp. and elsewhere.
> Gibson decided to stake his own claim to history by creating a Web page to document what he calls the original pioneering work in sub-pixel technology by Steven Wozniak, designer of the Apple II computer.
I believe this is the site: https://www.grc.com/ctwho.htm
I remember downloading that GRC demo to prove to a friend it should work on any laptop, not just the Mac.
I have a vague memory that SPR on Windows XP was very oddly integrated when it did arrive. You didn't download an update or utility, you went to some Microsoft page that turned it on, and tuned it for your monitor. So for most people it may as well have not existed.
It shouldn't be _that_ extraordinary. Much of my life experience has taught me that people/groups known for a 'thing' reinforce their prestige via a sort of circular reasoning. This can cause people to gloss over frayed edges. Sometimes dubious decisions are even taken as the "right way" moving forward! If the paragon of cheese says acrid, fishy cheese is great cheese, there's probably something to it, right?
OSX, depending on the version, has an uneven margin here and there, or a weird aliased image during setup... It's pretty clean all things considered, but Apple's reputation is a force multiplier and a floor on how their designs are appraised.
The touch bar is goofballs.
Going from a magnetic connector to USB-C is a usability regression.
Removing the headphone jack is user hostile.
"Courage" is an interesting way to describe those decisions. It's either that, or they're taking their reputation defense shield for granted. Eh, if cracks start to appear, they have enough money to do things right for a while.
It's a trade-off that I'll take for having an industry standard power supply and a replaceable cable. Used to be if your power cord frayed at either the MagSafe end or the brick end you were out $80 for a replacement.
Now you can buy a 60W USB-C power delivery brick with 3 bonus USB-A charging ports for $25 and if your cable ever goes bad it's a cheap replacement from Anker or similar. And I doubt you'll ever need to replace that cable because Anker is apparently better at making them than Apple ever was.
I have one of these under my desk with cords for charging my laptop and phone/tablet: https://www.monoprice.com/product?p_id=24425 Monoprice has frequent sales for even cheaper.
These are takes a lot of Apple users would, to varying degrees, agree with, so I'm not sure how "hot" they are. :) You can make a case for all of those choices, too, but despite the reputation us Apple fans seem to have, most of us just aren't going to say "I hate this, but because Apple did this, I must actually love it." In fact, we tend to be very, very vocal about things Apple does that piss us off, and most of us have a list of such things that goes back decades.
(Personally, I think removing the headphone jack is user hostile on the iPad Pro but not the phone; I think MagSafe is superior from a UX standpoint to USB-C, but USB-C has some advantages when you have USB-C ports on opposite sides of the laptop, a test several of Apple's USB-C-only gadgets fail; and the Touch Bar is a solution in search of a problem.)
Arguably there’s a place for each of these approaches. If you have an extremely low-resolution screen or an underpowered speaker, it makes sense to accept the distortion to be able to use the screen/speaker at all. But if you have fairly good speakers or a high-resolution screen it just doesn’t, because the volume is loud enough as it is, and the blur is actually pretty sharp.
It can be annoying to tweak scaling in Windows at first, but once you get the right setting, you never have to change it again. It takes about 10 min of setup (if it takes any setup at all).
Windows handles fonts correctly. And Cleartype is an option you can turn on and off. It makes fonts look "more correct" on many types of displays. Are you saying it's better not to have that option available at all?
Correct, but when printed on paper (using movable type, for example), the width of left-aligned text set in the same font at a continuously increasing point size typically has a nice 'linear' slope along the ragged right edge. ClearType does not have this: it changes the text width drastically between point sizes close in magnitude.
The author criticized freetype for having a non-linear apparent font weight as the font point size is increased linearly. The author said this was bad without much justification, while simultaneously claiming cleartype is better, despite having the same issue of font metrics being discontinuous over a continuous point size.
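If anyone wants to check the linearity claim themselves instead of trusting screenshots, here's a rough sketch with Pillow; note it rasterizes through FreeType, so it shows FreeType's behavior rather than ClearType's, and the font file name is an assumption:

```python
# Rough check of how text advance width scales with point size.
# With linear scaling the width/size ratio stays flat; engines that snap
# advances to whole pixels show jumps between nearby sizes.
from PIL import ImageFont  # pip install Pillow

SAMPLE = "The quick brown fox jumps over the lazy dog"

for pt in range(8, 21):
    font = ImageFont.truetype("DejaVuSans.ttf", pt)  # assumed font file
    width = font.getlength(SAMPLE)                   # advance width in pixels
    print(f"{pt:>2} pt  {width:7.1f} px  width/pt = {width / pt:.2f}")
```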
> It makes fonts look "more correct" on many types of displays. Are you saying it's better not to have that option available at all?
Not at all. Options are great. However, saying something is objectively better when in fact there are multiple use cases for which it is better and many for which it is worse is disingenuous, and that is what the author is doing.
Regardless of my personal opinion, it's clear that other people share this opinion. For instance, Word doesn't use ClearType; they had problems with text on different colored backgrounds.
Uhmmm.... ?
I don’t have a single Mac app that assumes anything about 96 dpi and pretty much the entire Internet is 2x now.
Windows on the other hand... yes. That’s a clusterfk in HiDPI.
Yeah, this is the point where I was like: I'm sorry, but this seems so wrong that it casts the whole article into doubt for me. Anything written in the OS X era doesn't have any assumptions about DPI baked into it -- and if any application in the "classic" MacOS era baked in an assumption about DPI, it would have been 72, not 96, because Apple stuck with a "1 pixel = 1 point" mantra in the (very) early years. One of the entire points of OS X's Quartz rendering system, which was basically "Display PDF," was to be effectively device-independent. There were no enforced assumptions about resolution.
There are arguments to be made that ClearType is a better technology for rendering text on displays than Quartz, but it's folly to pretend that there's not subjectivity to it. Personally, I've long preferred Quartz's rendering approach in nearly all cases; the text is arguably softer, but particularly on non-HiDPI displays, the letter spacing with ClearType often seems subtly off in ways that I find more distracting.
All cases? Have you ever compared them side by side on a low DPI monitor? I have an old but high-end 24" 1080p monitor that I bought for its color fidelity. When I'm in OS X, the fonts are atrocious compared to Win 10. Yet, I prefer OS X's fonts when they are displayed on the 4K screen.
Apple moved to HiDPI long before Windows PCs, so you need a very old Mac or an external monitor to see what he's talking about.
> Since Windows never had the font rendering problems that Apple did
Uhm I cannot reliably use windows with any sort of scaling. It's not as bad as linux, but it's still very bad.
> Software maintainers too didn’t buy into the idea of massive code rewrites and UI tinkering to support DPIs other than 96
That seems like a massive exaggeration. While there are still rough edges, running Windows on my MacBook's Retina screen has been just fine for the majority of popular apps, including mixed HiDPI/normal-DPI use cases.
Take QGIS for example. It doesn't properly scale. On Windows you get a minuscule interface; on the Mac it's the same size as all other apps, just blurry. I'll take that any day.
Why? Bad font rendering, or bad app scaling? I've found most if not all apps scale well. Some of them require you to go to compatibility settings and tell Windows that yes, the app can resize by itself. After you do that, they won't look blurry anymore. An example that comes to my mind is eMule.
There are a few apps that won't scale well and widgets will look misaligned, but they are very, very rare.
This almost reads like satire.
It's well-known that Windows prefers to distort letterforms for the sake of crispness, while Macs preserve letterforms for the sake of fidelity.
Saying that one is better than the other is entirely subjective -- there are many, many articles on the subject. There's absolutely nothing objective about it.
If it were so "objective", then it seems quite odd that Macs would be the predominant choice of graphic designers and type designers, who would be expected to care about this the most...
I am still amazed at how crappy MS Word is at typesetting. I write these beautiful papers in LaTeX with incredible math typesetting, and then sometimes I am asked to convert to Word -- it almost makes me cry to ruin my "art" by converting it to Word.
It is an affront to my aesthetic taste to look at a Windows UI.
Regardless of which font you use, TeX will produce a much higher quality document (esp. math) than Word, at the expense of an upfront time investment in learning to use it.
After I spent a big part of my day with adjustments trying to make Windows font rendering suck less so I could get some work done, I also thought it might be satire, but it feels more like reading someone's antivaxx conspiracy theory.
- Some apps are DPI unaware and just get scaled by the OS at the pixel level.
- Some apps are DPI aware but only at startup. If you move screens they need to be restarted.
- Some apps can dynamically switch their DPI and respond to the OS prods to change.
Why is Windows such a mess about this and how did Apple manage to pull it off with virtually no fanfare?
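For the curious, the opt-in behind those three categories looks roughly like this; a minimal Python/ctypes sketch against the documented shcore/user32 calls (real apps declare this in their manifest or via their UI toolkit, and the hwnd below is a hypothetical window handle):

```python
# Sketch of the Windows DPI-awareness opt-in (Windows 8.1+ / 10 APIs).
import ctypes

PROCESS_PER_MONITOR_DPI_AWARE = 2   # 0 = unaware (OS stretches bitmaps), 1 = system aware

# Must be called before any windows are created; otherwise the process stays in
# whatever bucket it started in and gets bitmap-scaled on mismatched monitors.
ctypes.windll.shcore.SetProcessDpiAwareness(PROCESS_PER_MONITOR_DPI_AWARE)

def scale_for_window(hwnd):
    # GetDpiForWindow (Windows 10 1607+) returns the effective DPI of the monitor
    # the window currently sits on; 96 means 100% scaling, 120 means 125%, etc.
    dpi = ctypes.windll.user32.GetDpiForWindow(hwnd)
    return dpi / 96.0
```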
For a while I was connecting a Retina MBP to two non-Retina monitors. Sometimes, when dragging a window from the laptop display to a non-Retina monitor, I'd see a flash of everything scaled up. Other than that, I've seen none of the problems that seem to still plague Windows 10.
I'm sure the answer has to do with the history of Windows - I'd love to see someone like Raymond Chen dig deep into this.
On macOS, to get anything above 100%, you render the whole interface at 200% (including layout), and the OS scales the resulting pixel output to the display's scale.
On Windows, every application has to detect the scale of the monitor it is being displayed on and do layout at that exact scale. (Actually, that isn't entirely true: if you wrote an app before the high-DPI additions, you always render at 100% and Windows scales it up for you, which results in a blurry mess.) It is very easy for layouts at 125% to have gaps and varying-width lines depending on where on the device pixels the coordinates land.
Adding mixed high DPI support to Sublime Text for Mac was very simple compared to adding it for Windows. On Mac the OS tells us when to render at 200%, and everything in regards to layout is then doubled. On Windows we have to re-layout and snap dimensions to device pixel boundaries to get something that looks crisp when moving a window from one display to another. The code changes were far more invasive for Windows.
We support high dpi on Linux, but haven't added mixed high DPI support yet. Gnome limits users to integer scales (1.0, 2.0, 3.0), but with tweak tools it is possible to get fractional scaling. All of those are fine, it is just when you need to re-layout in fractional scales on the fly that you have to expose the scale of the current display to the entire UI library, or sometimes have gaps and overlaps.
I used to have a fun way to demo this where you dive through the archaeological UI and rendering eras (including XP-style control panel, with a hideous sidebar link hover rendering bug that I don’t think ever got fixed) until you end up at the iSCSI control panel, which I think may be literally unchanged since the year 2000.
(5K ultrawides to replace my 34" monitor simply don't exist yet, and the current 12" MacBook can't drive a regular 5K monitor.)
I hope the anti-vaxxing comparison is only for the "eyesight damage" part, otherwise this paragraph is bullshit in its entirety. From my own personal experience, I vastly prefer antialiasing disabled, assuming the fonts used have hinting that supports it. Or even better, bitmap fonts made to be clean and crisp (as opposed to just being rasterized versions of vector fonts). Of course, if you try to use fonts that disregard hinting and aren't compatible (or even tested at all) with antialiasing disabled, like the font the author's site uses, which looks like total crap on my PC, then, yeah, compared to that I prefer AA with ClearType.
I find it annoying that you need to mess with the registry to disable antialiasing in Windows 10, but at least the option is still there.
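For what it's worth, the same setting can also be flipped programmatically instead of via the registry; here's a rough ctypes sketch using the documented SPI_GETFONTSMOOTHING/SPI_SETFONTSMOOTHING actions (it persists to HKCU\Control Panel\Desktop, and DirectWrite apps that pick their own rendering mode may ignore it):

```python
# Sketch: query and disable Windows font smoothing (antialiasing) via
# SystemParametersInfo instead of hand-editing the registry.
import ctypes

SPI_GETFONTSMOOTHING = 0x004A
SPI_SETFONTSMOOTHING = 0x004B
SPIF_UPDATEINIFILE   = 0x0001   # persist the change
SPIF_SENDCHANGE      = 0x0002   # broadcast WM_SETTINGCHANGE so apps notice

user32 = ctypes.windll.user32

enabled = ctypes.c_uint(0)
user32.SystemParametersInfoW(SPI_GETFONTSMOOTHING, 0, ctypes.byref(enabled), 0)
print("font smoothing currently enabled:", bool(enabled.value))

# Pass 0 as uiParam to disable smoothing, 1 to re-enable it.
user32.SystemParametersInfoW(SPI_SETFONTSMOOTHING, 0, None,
                             SPIF_UPDATEINIFILE | SPIF_SENDCHANGE)
```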
Nonetheless their use of the word objective is very annoying since I greatly prefer grayscale with no hinting even on Linux at 96 DPI. It's a bit blurry, but it consistently looks right. No kerning issues.
Also, it is laughable to suggest that Windows never had issues with font rendering. Just look at Microsoft Word struggling to balance between accurately displaying documents and rendering the fonts with Cleartype as shown in the AGG article.
I have no idea. I want it to look good. With macOS I don't have to choose.
Whatever the defaults are in Firefox on OpenBSD look superb on my monitor.
That does seem to be the case. Per Wiki (https://en.wikipedia.org/wiki/Core_fonts_for_the_Web):
> The latest font-versions that were available from Microsoft's Core fonts for the Web project were 2.x (e.g. 2.82 for Arial, Times New Roman and Courier New for MS Windows), published in 2000. Later versions (such as version 3 or version 5 with many new characters) were not available from this project. A Microsoft spokesman declared in 2002 that members of the open source community "will have to find different sources for updated fonts."
So while Windows and MacOS both have up-to-date versions of these fonts (Windows because Microsoft owns them, and MacOS because Apple licenses them from MS), the best Linux distributors can do is to package the last versions released before the 2002 re-licensing. (Or at least, that's the best they can do without paying Microsoft.)
The author, while his opinion should be taken lightly, is trying to get subpixel positioning into Chrome and GTK. This is something positive, and something Windows and MacOS already have. Qt can do it, too, but as of a few years ago KDE/Plasma was actively disabling it because of inconsistencies in Xlib and image rendering backends.
> OS X objectively has the absolute worst font rendering.
Nobody will do this.
s/font rendering/sleep mode
s/font rendering/battery life
I've been using Linux for 15 years now (started on Mandrake), but those issues haven't been fixed in a decade.
I really want to support free software, so I keep using it, reporting bugs and donating again and again. Many do. We hope that like with Firefox, if we support it consistently, it will eventually overcome the situation.
But damn it's hard sometimes.
On non-thinkpad/xps machines (which I'm not saying you're using or talking about, I'm just gonna assume), it's a crapshoot. But I think it's no better than running macos on a pc laptop or windows on a mac laptop (at least the last time I tried years ago, via their dual-boot thing).
I'm writing this from Ubuntu 16.04 on an XPS 13. Sometimes it wakes up from sleep with no wifi or bluetooth, and I have to force a reboot. Bluetooth doesn't work out of the box; I have to use blueman just to have a working mouse. But even with that, headphones won't work.
I had an XPS 15 before. It was no better.
I think this bug report is tracking the issue: https://bugs.launchpad.net/ubuntu/+source/linux/+bug/1799988
For the record, I'm current using Ubuntu on an X1 carbon 4th gen and it has worked literally flawlessly since day 1, no issues with wifi, bluetooth, or sleep, and it has great battery life. My X220 before that was 98% perfect.
Thank you for reporting bugs, though!
And I'm not even using ThinkPad or Dells.
People talk about those issues like they happen to everyone or are very common but they aren't in my experience.
Battery life has been a consistent problem, historically. However, I've moved to Thinkpads again and my T480, with the big external battery, gets a solid nine hours when working (running dev stuff in Docker, VSCode, a browser with a ton of tabs, etc.). It gets maybe ten and a half or eleven hours doing similar work on Windows, and that delta isn't perfect, but it's not that bad anymore.
Sure, it sucks that wifi, bluetooth, and sleep mode don't work for you - so maybe Linux isn't a good choice for you if those are important. And the Linux community should definitely work towards fixing those issues, but the same stuff happens on every operating system.
I've had hardware just straight-up not work in Windows, then I head into Linux and it works immediately - and I'm not saying that means Linux is necessarily better, it means different hardware works better with different software.
And yes, the "works on my machine" comments are useless, but IMO no more useless than "it doesn't work for me," as if that somehow means... anything? My touchpad doesn't work on Windows, does that say anything about the viability of the OS? No, my touchpad just doesn't work well with it.
Though it is admirable you continue to use Linux, most people would just go back to what works at that point.
There's also the fact that Linux likes to tout how it is made by the community. If I have a problem with Windows, I blame Microsoft. If I have a problem with a driver on Windows, I blame the company that made the driver. But if I have a problem on Linux, who is to blame other than the community? And then the community turns around and blames everyone else.
...or maybe his hardware's not well-supported. And what's well-supported is by any metric significantly improved from, say, five years ago. If you buy fairly mainstream hardware, at least on the Intel side of things, I think it tends to be fine. My experience has led me to think that a lot of these ongoing complaints (the ones that aren't actually stale) are some weird hardware somewhere along the line.
For neither of the last two laptops I bought (HP Spectre x360, Lenovo ThinkPad T480) did I do any sort of Linux compatibility check. For the ThinkPad I figured it'd be fine because Lenovo historically has had solid support, but I kind of expected there to be problems with the x360 that never manifested. I got rid of it for other reasons, mostly around HP's support being pretty lousy and demanding I go without a work machine for two weeks because a keycap had come broken from the factory, but the Linux support? Just fine.
Desktops have been fine--not great, depending on your DE and your preferences, but fine--for a while now. (I'd say that Ubuntu 14.04 was the first widely stable Linux distro I ran into.) Laptops are getting there. There's room to improve, but a lot of these complaints are getting stale.
So it’s not all that bad and all OSes have their warts.
I'm always in dual or triple boot. This month I had 5 laptops just in my home.
The last time I had wifi problems on Windows was with XP, my friends with Macs always have better battery, wifi, and sleep than I do, and battery life on the same machine is always better on the Windows partition.
You are just repeating what you can read on so many forums. Something that I used to repeat to encourage people to keep trying.
But it's delusional. And if we want to solve those problems, we need to acknowledge they exist.
The problem is a combination of priorities (server functionality takes precedence) and the fact that it is good enough to get work done. It will continue to slowly improve over time, and the only way that will change is if someone decides it's important enough to spend the time to work out all the issues with each of these subsystems. Given how much fun that sounds (not much), it will probably have to wait for someone with $$$ to pay to have it addressed.
Either way Linux still does the best out of the competition that I've seen. The BSDs don't do as well as Linux and there are no other real non-Proprietary options at this point.
This has nothing to do with Linux, but with WiFi makers who can't make drivers. So do some research before picking your hardware.
The big linux fallacy.
Firstly, there is no hardware that can guarantee that everything will work. None. No Dell. No System76. There are just some that have fewer problems than others.
I've many friends saying they never have any problems. I also know they lie, because I come to their home, and I see them using workarounds every time for issues they don't even see anymore as they became part of their workflow.
Secondly, there is no hardware that is guaranteed to keep working. Each update may break your machine in some imaginative way. The last time it did, I lost ethernet support over Thunderbolt. It's like a personal chaos monkey, checking that my work ethic is resilient to random havoc.
And lastly, "This has nothing to do with Linux, but with WiFi makers who can't make drivers" is not true anymore, with most phones using Linux and having a wifi chip. The problem comes back with a different chip anyway.
It's possible they weren't junk originally, but stopped being maintained. Or some other reason.
Also, WiFi drivers can rely on firmware, which Linux developers have no control over. Which is a bad thing, but it's quite common unfortunately.
In general, when choosing WiFi hardware, find a chipset whose manufacturer actively maintains the driver. Intel is a good example of such.
> Just make a stable driver ABI they can interface with and be done with it.
That will also hold the kernel back. So it's better for a driver to stay compatible within some version window, but not forever, even if it's provided with dkms. That's a good thing, not a bad one. Freezing driver interfaces forever will turn the kernel itself into a mess. No one does it, including Windows.
Alternative perspective: the lack of any kind of stability in the driver ABI causes all the problems with the nVidia driver.
> That will also hold the kernel back. So let the driver better be compatible for some version window, but not forever [...]
Who said it had to be forever?
Intel and AMD maintain their drivers upstream, and don't have anything resembling Nvidia's problems (general lack of integration). The latter are very clearly the result of them refusing to upstream their driver.
To put it differently, Nvidia could make their driver integrated; they just don't want to, and it has nothing to do with how fast kernel interfaces are evolving. Case in point: Nvidia do update the dkms-exposed parts of their driver to address kernel changes. If they didn't, their driver wouldn't work at all, even through dkms.
> Who said it had to be forever?
You said "be done with it", which means no one will maintain it expecting it to be forever compatible? Otherwise I'm not sure what your point was.
> You said "be done with it", which means no one will maintain it expecting it to be forever compatible?
I don't know why you'd immediately jump to "forever". Nothing is forever. I just meant "a sufficiently long time". Like with Windows APIs, which still have a lot of compatibility with applications compiled 20 years ago.
They don't, because there is nouveau already and Nvidia are being rather nasty in preventing it from fully working.
> Like with Windows APIs, which still have a lot of compatibility with applications compiled 20 years ago.
Try using Windows drivers from older Windows kernels with newer Windows. They won't work. So it's not anywhere close to 20 years.
So do Linux dkms drivers. Windows is just using "dkms" like approach for everything. But it doesn't have any advantage in compatibility time period.
> There's no reason Linux couldn't have stable ABIs for major kernel versions
Not sure what you are talking about. Linux does have such stable interfaces.
Not sure what you're talking about either. I've never had to write a Linux driver, but it is a frequent complaint that one has to recompile drivers for new kernel versions. If the ABI was stable, that wouldn't be necessary.
dkms requires compilation, but it happens automatically with kernel upgrade. Read more about its design:
So not sure where you saw "frequent complaints", those are likely from people who have no clue what they are talking about.
So not a stable ABI at all then.
What was this all about again? Oh right: Linux has bad support for some hardware, you blame the "driver manufacturers" for writing bad drivers, I blamed the kernel for accepting bad drivers.
Here's the crux of my argument: If it is the manufacturer's responsibility to create and maintain drivers, then they shouldn't have to open source them and mainline them because they did all the work and took responsibility.
However, the kernel devs refuse to provide a stable ABI for drivers, making this nigh impossible. They claim that this is because it ensures higher quality drivers that can be maintained without the manufacturer, but that is clearly false because many drivers are crap and not maintained despite being mainlined.
So instead of taking any responsibility for bad drivers, people like you just blame everyone else. A tried and true practice of Linux Desktop evangelists.
I already mentioned both requirements. Only those who don't upstream or don't maintain their drivers have such problems. Remove either one and problems are guaranteed. A stable ABI is only going to cause issues, because developers won't care to fix anything, thinking that they are "done with it" like you said. And review by the kernel maintainers is a good thing. There should be no trust for blobs.
TL;DR: things are working well with responsible hardware makers. So use their hardware, and avoid irresponsible ones. There is no reason for Linux maintainers to change their approach which works.
We really should start to take this karma thing seriously, because I keep getting answers like yours saying I'm just not seeing things properly. So I surely must be wrong.
However, Windows and OSX have their fair share of problems as well. One of the differences is that in the Linux world folks largely try to find fixes themselves, so they have more postings on the internet, whereas in the OSX/Windows world the discussion basically ends at “wait for an update”.
(This was a gigabyte aero 15x with a 94(?)Wh battery)
I was stunned.
This is far too broad of a statement to make, hence why you get all the "works on my machine" posts. Your post is just as anecdotal as others', with no actual numbers or anything to back up the claim.
Anecdata - I own a Lenovo y50-70 running arch with an Intel igpu, hidpi display, and Intel wifi. Everything worked flawlessly from day one. Surprisingly, the hidpi display worked far better than on Windows and I could leave behind the telemetry, updates, and poor input latency.
The thing is, even among all those comments' authors, I know all of them are at best only talking about the times when it worked for them. They all lived the problems I'm talking about, and are filtering.
Take Arch, for example. 3 years ago, I was at a friend's flat. He was using Arch because "it just works, never have a problem". And of course, when we sat down to watch a video, it didn't work. "Codec problem. Just a quick fix he had to do later. Usually it works fine.".
I do training all the time. Half of my job is training people, so I see many laptops with many OSs. I always see the same issues.
It feels like a recurring joke in a TV show.
> Take Arch, for example. 3 years ago, I was at a friend's flat. He was using Arch because "it just works, never have a problem". And of course, when we sat down to watch a video, it didn't work. "Codec problem. Just a quick fix he had to do later. Usually it works fine.".
This has nothing to do with Arch. And presumably if that codec issue even happened, if you want everything under the sun installed for you out of the box you should be using something like Mint or Ubuntu. I've had zero codec problems myself, things always just worked with VLC and mpv.
> They all lived the problems I'm talking about, and are filtering
No one's saying that Linux works perfectly for all hardware. But it's not broken for everyone either as you claim.
Besides, I'd much rather spend extra 5 minutes installing tlp and tweaking sysctl for battery life than sit through hours of Windows Update installing candy crush and blocking me from doing work.
Google infinality. Exact advice depends on distro.
Sleep is legitimately a pain.
It doesn't help to solve the problem at all.
At best, sometimes, it encourages you to spend a lot of time trying to find a solution that may work, or wreck your system (Exact result depends on distro).
I cannot imagine how I could break my system. It's probably true that there is a ton of bad advice a quick Google away, especially since it's easy for advice to persist for many years after its use-by date.
I guess the lesson is don't enter random commands you don't understand from guides from 2008.
Oh boy, ever since I just can't use anything else because of the fonts. It visually hurts my eyes.
> It turns out that font rendering in the Linux ecosystem has been wrong since scalable fonts were introduced to it. Text must be rendered with linear alpha blending and gamma correction, which no toolkit or rendering library does by default on X11, even though Qt5 and Skia (as used by Google Chrome and other browsers) can do it.
The primary effect of no gamma correction is that light-on-dark text is too thin (and dark-on-light is thick but readable).
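To make that concrete, here's a tiny sketch of what blending glyph coverage in gamma-encoded sRGB (what most X11 toolkits do) does to light-on-dark text, versus blending in linear light:

```python
# Why gamma matters for antialiased text: blend a 50%-coverage pixel of white
# text over a black background two ways and compare the resulting light output.

def srgb_to_linear(c):
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c):
    return 12.92 * c if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

def blend_naive(fg, bg, coverage):
    # Blending the gamma-encoded values directly (no gamma correction).
    return fg * coverage + bg * (1 - coverage)

def blend_linear(fg, bg, coverage):
    # Convert to linear light, blend, convert back (gamma-correct).
    return linear_to_srgb(srgb_to_linear(fg) * coverage +
                          srgb_to_linear(bg) * (1 - coverage))

cov = 0.5  # 50% glyph coverage, white text (1.0) on black (0.0)
print(srgb_to_linear(blend_naive(1.0, 0.0, cov)))   # ~0.21: only 21% of the light, stems look thin
print(srgb_to_linear(blend_linear(1.0, 0.0, cov)))  # 0.50: the intended half coverage
```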
Surprisingly, I got used to this and never switched back. The most legible font is the one you're most used to, but if there's any objective measure of quality, it has to be sharpness. No anti-aliasing = maximum sharpness. It might look ugly at first, but you get used to it quickly. I recommend trying it.
And yet OS X's font rendering being by far the best for me is a key thing that has kept me on the Mac despite all the other warts. I find text elsewhere horribly blocky. It's all very subjective, it seems. Also, I think the grayscale antialiasing is a new thing, it certainly didn't used to be the case, but maybe they switched how it worked once retina screens became the norm.
Most browsers have their own take on ClearType font rendering when rendering web fonts. While some make web fonts look quite good (Firefox, Internet Explorer/Edge), Chrome has had issues with using settings which make fonts harder to read; I had to increase the weight of the font I use to compensate for this. ClearType, on the default settings Chrome used for a long time, is really great if you're rendering a Windows font like Calibri or Cambria. For anything else, the results are uneven. (I think Chrome finally started tweaking things on Windows to look better.)
In terms of the linked webpage, his comparison is unfair: he is comparing how Arial, a Microsoft font, looks in Linux compared to how it looks in Windows. Liberation Sans has the same metrics as Arial, so it is not a good comparison font; he should have used something more OS-agnostic, such as Bitstream Vera Sans (DejaVu Sans if you want more languages).
On my Dell U3415W (109ppi) the issues he pointed out are very obvious. An equal sign (=) for example has a much thicker and blurrier bottom bar than the top. The rendering of the H in "History" is different than the H in "Help" in the menu bar.
Just my 2c
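On the metrics-compatible point, here's a quick fontTools sketch for anyone who wants to verify the "same metrics as Arial" claim rather than take it on faith; the file paths are assumptions (Arial isn't shipped with most Linux distros, Liberation Sans usually is):

```python
# Compare per-character advance widths of two supposedly metric-compatible fonts.
from fontTools.ttLib import TTFont  # pip install fonttools

def advances(path, text):
    font = TTFont(path)
    upm = font["head"].unitsPerEm   # normalize so unitsPerEm differences don't matter
    cmap = font.getBestCmap()       # codepoint -> glyph name
    hmtx = font["hmtx"]             # glyph name -> (advance width, left side bearing)
    return [hmtx[cmap[ord(ch)]][0] / upm for ch in text]

sample = "Hamburgefonstiv 0123456789"
arial = advances("Arial.ttf", sample)                       # assumed path
liberation = advances("LiberationSans-Regular.ttf", sample) # assumed path

for ch, a, b in zip(sample, arial, liberation):
    mark = "same" if abs(a - b) < 1e-3 else "differs"
    print(f"{ch!r}  {a:.3f}  {b:.3f}  {mark}")
```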
Linux seems to have a lot of detractors around these parts. This person obviously cares more about fonts than freedom, privacy, and respect for the end user.
There might be inaccuracies and subjectivity in the article, but if really all engines have that issue that's a real problem.
I didn't notice it at first, I had to look at it very closely.
(Debian Stable, 3x 1920x1200 monitors, very likely I have tweaked most tweakable things at some point or another -- this system is 15 years old.)
But what do you consider 'perfect' font rendering? Maybe it's a lower bar than the author.
Additionally, their font hinting looks good out of the box and it can be configured in GNOME Tweak Tool.
I’m impressed at the Fedora team’s attention to visual detail.
-some websites render with a very strange font, where quote symbols (") come out tiny (I'm not sure if that's actually intended or a Linux-specific problem), and sometimes all letters are rendered in such an ugly way as to be hard to read
-symbols in mathematical formulas (some types of arrows, ...) are rendered as colored emoji, even though they shouldn't be in that context according to the Unicode spec, since it's not a chat program
-sometimes Unicode characters become a square box, even with tons of fonts installed
Does the Unicode specification cover how glyphs should look and what colour they should be?
Umm... yes? Unicode includes “reference glyphs” in its character charts. They’re called “reference” for a reason: implementations of those glyphs should appear substantially similar.
I read the whole post but could not tell whether what he mentions applies to both Xorg and Wayland.
I do agree that macOS fonts are nearly unreadable without a Retina/4K display. However I’ve never noticed any scaling issues on macOS.
Skip the "Removing the infinality-bundle" section if you don't currently use Infinality.
Linux on the desktop is death by a thousand cuts.
I just tried a three month experiment with using Linux as the primary OS on my laptop. That experiment came to an end last week, and I'm back to macOS.
What killed me was:
- wake-on-open working about 33% of the time. The other 66% of the time required a reboot.
- substantially worse battery life
- clunky handling of Exchange calendars. The major Linux applications are fine for personal use, but they really struggle in an enterprise environment.
- buggy rendering of Word docs (both OpenOffice and Abiword, but the bugs were different)
- no easy way to change screen resolution over VNC. This turns out to be important if you actually try to use screen sharing to get stuff done, and use two or more different platforms as the client.
- the straw that broke the camel's back: my Linux email client worked well (for emails - not calendars, etc) for almost the entire three month experiment, until last Friday it somehow decided that my password was different from what it actually was, and insisted on interrogating the enterprise email server several times a second using the bad password. The Exchange server here is configured to lock an account after N incorrect passwords. So effectively the client auto-locked my account every time I fired it up. I still have no idea why it's using an incorrect password, and debugging it would require my sysadmin to essentially stay on the phone with me and keep hitting the "unlock account" button, or whatever he has to do. My relationship with him would not survive this, and using Linux isn't worth pissing him off.
So, back to the Mac for me. I'll try again in another five years and see if anything's improved. It wasn't any one big thing. It was a lot of little things. Bugs that probably were work-around-able, but not worth it. Little bits of friction in the office applications interfaces that made using them just a little more painful. Multiply by a thousand, and you have a substantially worse productivity environment (unless you're a software developer 100% of the time, which I'm not).
Of course, if you want to use your existing X server it might not work as well, I haven’t tried it.
For most of the rest of your complaints, I use web apps for everything, even on OS X / Windows. Sad at the state of native apps on both Windows and Linux.
Linux is a lot easier on g-suite enterprises than in exchange/office ones. Considering how much of the latter's going online now, maybe it won't matter at all later.
If you want a system that can be customized heavily (for example, you don't like using the mouse and want to use the keyboard for everything), Linux is a godsend.
If you want perennial consistency in your work env, it's very helpful too. For example, I've been running the same setup for 12 years, bringing it over from the old laptop every time I got a new one.
In exchange for that convenience, I have to wrestle with wifi/ethernet settings once every few years (when I change my hardware); for me it's a small price to pay.
I think Windows is not inherently worse, though; it is probably great for how some programmers work or think, but to me Linux distros feel like a better match.
Computers are tools, endpoints these days. I have been happy with Linux but then again my needs are not super-complex.
I have a recent freetype with a defaultish configuration and the results on my small by today's standards monitors are readable.
Because just a few days ago I read that fonts, as well as UI elements in general, are a complete mess in both Mac and Windows, two paid/closed-source OSes, when you try to use them with Hi DPI displays.