Alas, DirectWrite doesn't support ClearType, so many applications including Microsoft Word (!!) no longer use it – they just antialias with grey pixels.
https://en.wikipedia.org/wiki/ClearType#ClearType_in_DirectW...
---
Some interesting comments from the article that didn't work out this way in the end:
> Everybody’s favorite face will be Constantia by John Hudson.
> Cambria will be the default font in the next Microsoft Word, taking over the spot long owned by Times.
> I’m not sure how much need there is for a rounded sans [Calibri]
In the end, the new default font in Word 2007 was Calibri, which was surely by far the most used of these new fonts. It was easy to switch to Cambria (and it was the default heading font for a while) so that was fairly well used, while Constantia is essentially unknown.
Let us not forget about Bill Hill, co-inventor of ClearType technology: https://www.geekwire.com/2012/remembering-bill-hill-importan...
Of all the C-named fonts introduced by Microsoft at that moment in time, I think Consolas was the one which made the greatest difference from what was available already.
Let's see whether it will also be the one with the most lasting impact.
Microsoft did a lot of great work on fonts in the past. Recently it looked like they abandoned per-monitor subpixel rendering?! In which direction are they heading?
Pixel density continues to rise, but Microsoft might be engaged in… premature de-optimization?
It’s duals/mirrors all the way down. Or up.
But the angular resolution of the eye doesn't rise. For a desktop monitor, 100 ppi is practically already at the limit. Anything beyond that is just additional burden for the GPU and a waste of bandwidth. Sure, you can increase resolution just to make font rendering easier, but you also pay the price in energy consumption or speed, without any visible improvement.
At the traditional 96 dpi, you have to be 3 ft away before you exceed the retinal density. Personally, I sit at half that distance, so something around 200 would be more ideal. With laptops you might sit even closer.
Mobile devices, unless you get really close to the screen, have matched the retinal density for a while. Most people hold the device at about 8 inches, so 450 dpi is the value to hit.
Edit: These measurements assume 20/20 vision, which is about average. Many people exceed that, so you'd need slightly higher values if you're feeling pedantic.
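The distance figures above follow from the usual 20/20 benchmark of about one arcminute of angular resolution. A quick sketch to check them (the function name and the exact one-arcminute assumption are mine):

```python
import math

def retina_ppi(distance_in: float, arcmin: float = 1.0) -> float:
    """PPI at which one pixel subtends `arcmin` arcminutes at `distance_in`
    inches. 20/20 vision resolves roughly one arcminute, so this is the
    density beyond which a typical eye stops seeing individual pixels."""
    pixel_size = distance_in * math.tan(math.radians(arcmin / 60.0))
    return 1.0 / pixel_size

for d in (36, 18, 8):  # desk at 3 ft, half that distance, phone at 8 in
    print(f"{d:2d} in -> {retina_ppi(d):.0f} ppi")
```

This gives roughly 95 ppi at 3 ft, 191 ppi at 18 in, and 430 ppi at 8 in, which lines up with the 96 / 200 / 450 figures above.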
Having the focal point up close for a long time isn't that good for the eyes, so sitting closer than an arms length to a desk monitor isn't an idea that lasts well.
100 dpi with subpixel rendering already maxes out the horizontal angular resolution. It doesn't max out everything (retinally), so you still see some artifacts, but in practice this is not that relevant. The price in energy/bandwidth rises quadratically for very little gain.
To get the equivalent of 4K at 100 ppi, at 200 ppi you have to put the burden of 8K onto the GPU... For now this is not a good trade. High ppi is fine for small monitors and handheld devices, but for a decent desk with several good monitors, GPUs just aren't ready yet.
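The quadratic cost is easy to make concrete: doubling the linear pixel density quadruples the pixel count, and raw framebuffer bandwidth scales with it. A rough sketch, assuming uncompressed 32-bit scanout at 60 Hz (illustrative only; real display links use compression and have blanking overhead):

```python
def framebuffer_gbps(width: int, height: int, hz: int = 60, bytes_px: int = 4) -> float:
    """Raw bandwidth in GB/s to scan out a framebuffer (no compression)."""
    return width * height * bytes_px * hz / 1e9

uhd = framebuffer_gbps(3840, 2160)    # a "4K" panel
uhd8k = framebuffer_gbps(7680, 4320)  # same panel size at double the ppi
print(f"4K: {uhd:.2f} GB/s, 8K: {uhd8k:.2f} GB/s, ratio: {uhd8k / uhd:.0f}x")
```

Doubling the ppi costs 4x the bandwidth, and the same factor hits compositing, rasterization, and video memory.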
The difference between my 27" 4k and 1440p screens is still quite obvious and I don't consider myself particularly sensitive to these things.
For rendering text/video even an underpowered integrated GPU can handle it fine; the only issue is using a bunch more RAM.
For reference, my very underpowered desktop AMD iGPU, on a three-generations-old GPU architecture (2 CUs of RDNA 2), only has trouble with the occasional overly heavy browser animation.
A few years ago, I replaced my 24" 1080p monitors (~96 ppi) with 27" 4k monitors (~157 ppi), and the increased pixel density was very noticeable, and I'd probably notice an increase over that. I sit about 3 feet away from them.
300 ppi matches printed books, which looks nice. On notebook computers, a 3840x2160 panel might not be worth the reduced battery life.
I hate subpixel rendering. It's impossible to turn it off for displays that don't need it. It looks absolutely awful. I wish it was never invented.
Hate seems a bit strong for an increase in perceived horizontal resolution on low DPI displays, but to each their own. That said, I'm not sure what you mean by it being impossible to turn off. On Windows you can just disable ClearType per monitor, and on Linux it's configurable either through your DE, fontconfig, or sometimes at the application level.
MacOS went the other direction and removed subpixel rendering entirely, which is partly why low DPI external displays tend to look worse there.
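For context, the "perceived horizontal resolution" gain works roughly like this: on an RGB-stripe panel the three color subpixels of each pixel are horizontally offset light sources, so glyph coverage can be sampled at 3x the horizontal rate. A toy sketch, assuming an RGB stripe layout and omitting the color-fringe filter that real implementations apply:

```python
def subpixel_downsample(row_3x: list[float]) -> list[tuple[float, float, float]]:
    """Map a glyph coverage row sampled at 3x horizontal resolution onto
    RGB subpixels: each output pixel takes three adjacent samples as its
    R, G and B coverage. Real ClearType also low-pass filters across
    neighboring samples to tame color fringing; omitted here for clarity."""
    assert len(row_3x) % 3 == 0
    return [tuple(row_3x[i:i + 3]) for i in range(0, len(row_3x), 3)]

# A vertical stem one sample wide lands on a single subpixel, so its
# horizontal position is resolved to a third of a pixel.
print(subpixel_downsample([0.0, 1.0, 0.0, 0.0, 0.0, 0.0]))
```

That third-of-a-pixel positioning is the whole benefit, and it only exists when the renderer's idea of the subpixel layout matches the physical panel.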
> That said, I'm not sure what you mean by it being impossible to turn off.
You can try to configure it to be off, and while that almost works, many applications will still simply not respect the setting. This is particularly apparent (and infuriating) with apps that don't render in high-resolution mode, because their rendering then no longer has anything to do with actual subpixels.
I imagine this behavior came from ClearType having been a special case, and therefore non-native widget toolkits getting explicitly programmed to render with it on Windows, forgetting that the user should be able to turn it off!!
> MacOS went the other direction and removed subpixel rendering entirely, which is partly why low DPI external displays tend to look worse there.
Subpixel antialiasing is a compromise. Once every Mac shipped with a Retina display, there was no need to retain that compromise, because you already get high resolution so you may as well get color accuracy too.
I will note macOS still enables by default a feature called "stem darkening" (incorrectly called "font smoothing" in macOS Settings) that also looks fairly awful to my eye, and seems itself a legacy from the low-DPI days.
> I imagine this behavior came from ClearType having been a special case, and therefore non-native widget toolkits getting explicitly programmed to render with it on Windows, forgetting that the user should be able to turn it off!!
I see, that is indeed frustrating.
> Once every Mac shipped with a Retina display, there was no need to retain that compromise, because you already get high resolution so you may as well get color accuracy too.
I believe that is Apple's position, and it may be valid for their own high-DPI displays. However, it overlooks the fact that most external monitors, especially typical office displays, are still far from retina pixel densities. Even on a relatively good 27" 4K panel, text on macOS looks noticeably worse than on Windows or Linux. Then again, that's likely compounded by the lack of fractional scaling. Unless you're using a 5-6K external display, you aren't hitting the 250+ PPI needed for crisp text at all.
> I will note macOS still enables by default a feature called "stem darkening" (incorrectly called "font smoothing" in macOS Settings) that also looks fairly awful to my eye, and seems itself a legacy from the low-DPI days.
Yeah, I've seen quite a range of stem-darkening implementations. Many of them skip proper gamma-correct blending, which doesn't help.
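To make the gamma point concrete: antialiasing coverage has to be blended in linear light, because lerping gamma-encoded sRGB values directly makes black-on-white text come out too dark. A minimal sketch (helper names are mine; the constants are the standard sRGB transfer function):

```python
def srgb_to_linear(c: float) -> float:
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c: float) -> float:
    return c * 12.92 if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

def blend(fg: float, bg: float, alpha: float, gamma_correct: bool) -> float:
    """Blend glyph coverage `alpha` of `fg` over `bg` (sRGB values in 0..1)."""
    if not gamma_correct:  # naive: lerp the encoded values directly
        return fg * alpha + bg * (1 - alpha)
    f, b = srgb_to_linear(fg), srgb_to_linear(bg)
    return linear_to_srgb(f * alpha + b * (1 - alpha))

# Half-covered black-on-white pixel: naive blending gives mid-grey 0.5,
# gamma-correct blending gives a lighter ~0.74 -- naive text looks too heavy.
print(blend(0.0, 1.0, 0.5, False), round(blend(0.0, 1.0, 0.5, True), 2))
```

Stem darkening is arguably a workaround for the opposite problem: fully gamma-correct blending makes thin stems look too light at small sizes, so renderers fatten them back up.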
The really annoying thing nowadays is renderers attempting to apply subpixel rendering to panels that aren't even RGB/BGR in the first place.
I really want to use better fonts on my sites, but the double page render, showing one font while the actual font loads, just looks so unprofessional and jarring.
I thought this site, being a typography-focused site, would have a better way to deal with it, but it's still as bad as I remember.