Bill Dog's Journal: it's a tablet's world

Last weekend I upgraded to IE9. This weekend I downgraded back to IE8. Maybe I've been living under a rock or something but I hadn't heard. The way fonts are being rendered is undergoing a change on computing devices. And it's being driven by the tiny pixel sizes of current and future portables.

When I bought my Vista system mid-2009, all the fonts looked blurry, and I had to discover that they'd added something called "ClearType". Altho the setting to turn it off is buried deeper in Windows 6.0 than in the 6.1 I have at work, at least on Vista it's a simple switch, whereas on Win7 you have to click Next thru a bunch of steps of a tuning wizard just to turn it off. I don't think MS really wants the user to.
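For reference, the knob that wizard fronts can also be flipped programmatically. Here's a minimal Win32 sketch, using only the documented SystemParametersInfo calls; the commented-out lines show keeping smoothing on but falling back to the standard grayscale flavor instead of ClearType:

```cpp
#include <windows.h>

int main()
{
    // Disable font smoothing entirely, persist it, and broadcast the change.
    SystemParametersInfo(SPI_SETFONTSMOOTHING, FALSE, NULL,
                         SPIF_UPDATEINIFILE | SPIF_SENDCHANGE);

    // ...or keep smoothing on but use standard (non-ClearType) grayscale AA:
    // SystemParametersInfo(SPI_SETFONTSMOOTHING, TRUE, NULL,
    //                      SPIF_UPDATEINIFILE | SPIF_SENDCHANGE);
    // SystemParametersInfo(SPI_SETFONTSMOOTHINGTYPE, 0,
    //                      (PVOID)FE_FONTSMOOTHINGSTANDARD,
    //                      SPIF_UPDATEINIFILE | SPIF_SENDCHANGE);
    return 0;
}
```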

Similarly with IE. IE8 was coded to be a rogue app in the sense that it ignores the OS-level setting for this and provides an "Always use ClearType for HTML" checkbox in Internet Options. Apparently IE9 dispensed completely with allowing the user to turn this new rendering mode off.

Some kind programmer made a DLL that hooks into the new rendering and alters how it's called so that ClearType is not used. But MS added another twist: "subpixel" rendering, and subpixel positioning of the glyphs themselves. Here's basically what was happening.

IE9 always wants to call the new DirectWrite API, as opposed to GDI+, for text rendering. I think that's largely because GDI+ is processed by the CPU, and the future is web page rendering being processed by the GPU. The hack alters the call into DirectWrite so that the result doesn't have all the color fringing of ClearType's poor attempt at anti-aliasing. But characters are still rendered with the spacing as if those extra fractional widths were there.
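To give a flavor of the API surface involved (this is only an illustration, not the actual hook DLL): DirectWrite lets a caller build custom rendering parameters, and forcing the ClearType level to zero while asking for the GDI-classic mode is roughly the kind of substitution such a hook makes.

```cpp
// Sketch only: DirectWrite rendering params with no ClearType color
// fringing and classic GDI-style rasterization.
#include <dwrite.h>
#pragma comment(lib, "dwrite.lib")

HRESULT MakeNonClearTypeParams(IDWriteFactory* factory,
                               IDWriteRenderingParams** out)
{
    return factory->CreateCustomRenderingParams(
        1.8f,  // gamma
        0.5f,  // enhanced contrast
        0.0f,  // ClearType level: 0 = grayscale AA, no colored sub-pixels
        DWRITE_PIXEL_GEOMETRY_FLAT,                  // treat the pixel as one unit
        DWRITE_RENDERING_MODE_CLEARTYPE_GDI_CLASSIC, // rasterize like classic GDI
        out);
}
// A Direct2D render target would then pick these up via
// SetTextRenderingParams(params).
```

Note that the layout/measuring mode is chosen separately from these rendering params, which is consistent with what I was seeing: the hook can tone down the rasterization, but glyph spacing can still be computed with the "natural" fractional advance widths.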

So the result was tons of web pages with what looked like randomly distributed gaps and squeezed spacings, making it look even worse than MS Word. That word processor introduces extra space between words as needed to make a line of text on screen approximate where it will end up on the printed page. But in the browser (with the hack added) I was getting different spacings between words and letters, and it was just too distracting: trying to read something, the flow of absorbing information kept getting interrupted by, for example, not immediately noticing that a word had ended.

Here's the other reason for wanting to migrate us to the newer, I guess vector-based, font renderer and off the pixel-grid-based one. We're supposedly entering "the post-PC era". I don't own a "smartphone" yet, but my understanding is that on web sites that don't have (or where you don't use) a special mobile version, everything's too small to use as is, so you have to zoom in a bunch. But traditionally zooming can cause text to be re-flowed differently, as it's rounded to the nearest suitable pixel size for the font in use, and that can "break" layouts.

So it seems the idea is: let's assume that most LCD screens are, construction-wise, physically laid out a certain way (and I guess furthermore that no one uses CRT's anymore), and to smooth out the jaggies we'll turn on just the red subpixel to the right of this "black pixel" here, a blue or green subpixel to the left of that one, etc. I also noticed, and IIRC read it's intentional, that another purpose was to make the difference in contrast between bolded and unbolded text less "harsh".
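A toy sketch of that first idea, assuming the common RGB-stripe panel layout: render glyph coverage at triple horizontal resolution, then treat each pixel's red, green and blue stripes as three separately addressable thirds. (Real renderers, ClearType included, also run a color filter over the result to tame the fringing; that step is left out here.)

```cpp
#include <cstdint>
#include <vector>

struct RGB { uint8_t r, g, b; };

// coverage: (width*3) x height glyph coverage samples, 0..255,
// rendered at triple horizontal resolution.
std::vector<RGB> ToSubpixel(const std::vector<uint8_t>& coverage,
                            int width, int height)
{
    std::vector<RGB> out(width * height);
    for (int y = 0; y < height; ++y) {
        for (int x = 0; x < width; ++x) {
            const uint8_t* c = &coverage[(y * width + x) * 3];
            // Black text on a white background: more coverage means a
            // darker (less lit) stripe.
            out[y * width + x] = { uint8_t(255 - c[0]),   // red stripe
                                   uint8_t(255 - c[1]),   // green stripe
                                   uint8_t(255 - c[2]) }; // blue stripe
        }
    }
    return out;
}
```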

So what went wrong for me with this new rendering in general, as it did for so many others who were bitching about it on a bunch of Windows-related forums? First of all, I get headaches from bright screens, so I set mine very low. ClearType is like a haze over all the text, like in the original Star Trek series when the camera was on a close-up of a pretty woman's face. Not blurring it so much as smearing it, and reducing perceived and actual contrast.

Secondly, I've always preferred more of a .30 dot pitch than anything smaller. So for example 1024 x 768 on the 15" CRT's at my first job (and I think I even used the "Large Fonts" Win95 setting then), 1152 x 864 on my super-expensive old 17" NEC CRT, 1280 x 1024 on my cheapo 19" Samsung LCD, and 1920 x 1200 on my somewhat-expensive Dell 27". I.e. I like relatively big pixels, and am more satisfied with a text display the more sharp and contrasty the dots and lines are.

But this is an older-school monitor, a kind you prolly can't get anymore. Not only are the 16:10's becoming less available (so it's 16:9's 1920 x 1080 now), but they're also coming in smaller and smaller diagonal sizes, down to 21.5" I see from a quick search. And pixel sizes are smaller still on notebooks and netbooks, and presumably tablets and phones. It's rumored that next year's Apple tablet (the iPad 3) will have a 9.7" display at 2048 x 1536. That's less than a .10 mm dot pitch!
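(Quick sanity check on that number, using just the rumored panel specs:)

```cpp
// Back-of-the-envelope pixel pitch: diagonal length divided by the
// pixel count along the diagonal, for a 9.7" 2048x1536 panel.
#include <cmath>
#include <cstdio>

int main()
{
    double diag_in = 9.7, w = 2048, h = 1536;
    double diag_px = std::sqrt(w * w + h * h);   // = 2560 pixels
    double pitch_mm = diag_in * 25.4 / diag_px;  // ~0.096 mm
    std::printf("pitch = %.3f mm\n", pitch_mm);
    return 0;
}
```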

And I think things will keep going that way. We prolly have the technology, and an emerging affordability, for much higher resolution screens than we've had before. It wasn't that long ago when the "full 1080p" TV sets were a significant premium over the standard LCD's and plasmas of the time.

Historically when you bought an LCD computer monitor, you looked at its "native resolution" and ran it in that mode, so that your text was sharp and you didn't have pixel approximation going on. I think it's understood by many now, and I'm just late to the party, that that is going away, and in the near future we'll have such high resolution displays that rendering single-pixel-wide curves to make up characters would leave them too tiny, and too delicate and faint, to read sentences of comfortably. Intelligently-applied "slop" is the future in font rendering.

Apparently Apple was the first to go this way. And MS resisted a while, until they jammed some of this into their new Windows GUI programming technology, WPF. (To Win32 API programmers, think "owner draw", for everything! Or for Java devs, Swing (vs. the prior AWT).) I hadn't heard at the time, cuz I haven't taken on learning WPF yet (and maybe never will), but apparently tons of people complained about the blurry fonts in WPF and MS did something to scale that back. But it looks like with IE9 they aren't budging. And I would expect the same from Windows 8.

And I read that this font blurring is in Chrome's nightly builds, to be coming soon. After all, Google doesn't want to be left behind on precise positioning and possibly hardware-accelerated rendering in the browser. And the programmer who made the DLL for IE9 ported it from someone who made a plugin for Firefox to turn that shit off in whatever version Mozilla started putting it in. It will be in OS's and browsers, the new ClearType-oriented fonts are already the defaults in the Office 2007 apps I've noticed, it will be everywhere.

Unfortunately I think that means I'll have to abandon this big beautiful monitor, prolly prematurely in its usable lifetime, and buy some super hi-res replacement, to be able to run Win8 and IE10 in 2013 I guess. And then use the OS setting of 120 DPI (up from the Windows default of 96 DPI; cf. Mac's standard of 72 DPI) to get the text big enough for my liking and my by-then-47-year-old eyes.
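The DPI bump works out simply enough: a font size in points maps to pixels as points x DPI / 72, so going from 96 to 120 DPI enlarges every logical size by 25%. (The 9 pt below is just an example size, not necessarily what the UI font actually uses.)

```cpp
#include <cstdio>

int main()
{
    double pt = 9.0;  // hypothetical example font size, in points
    std::printf("%.0f px at  96 DPI\n", pt *  96 / 72);  // 12 px
    std::printf("%.0f px at 120 DPI\n", pt * 120 / 72);  // 15 px
    return 0;
}
```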

p.s. http://uncleartype.com/ has an image showing the problem. And altho I rolled back most of the workarounds from that guy's Help page when I rolled back my IE version, I decided to keep the font substitution technique there for the "Segoe UI" (system) font. MS-content-related web sites now seem to specify a CSS font-family of "Segoe UI, Verdana, Arial, Sans Serif" IIRC, so I chose Verdana instead of his choice of Arial. So now my dir listings in Windows Explorer and the Start Menu and almost everywhere in the OS itself are sharp. Then I set the Office apps to default to that from within them, Notepad to Lucida Console, and Visual Studio 2010's code editor to Courier New. Blessed crispness.
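If that page's substitution trick is the standard one (I'm assuming it is), it boils down to a string value under the FontSubstitutes registry key, which you could also set with a few lines of Win32. Sketch only: it's an HKLM key, so this needs elevation, back the key up first, and log off/on for it to take effect.

```cpp
#include <windows.h>
#pragma comment(lib, "advapi32.lib")

int main()
{
    HKEY key;
    const wchar_t* path =
        L"SOFTWARE\\Microsoft\\Windows NT\\CurrentVersion\\FontSubstitutes";
    if (RegOpenKeyExW(HKEY_LOCAL_MACHINE, path, 0, KEY_SET_VALUE, &key)
            == ERROR_SUCCESS) {
        const wchar_t sub[] = L"Verdana";   // render "Segoe UI" as Verdana
        RegSetValueExW(key, L"Segoe UI", 0, REG_SZ,
                       reinterpret_cast<const BYTE*>(sub), sizeof(sub));
        RegCloseKey(key);
    }
    return 0;
}
```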


Comments Filter:
  • The glorious thing about our economy: If you don't want it, you don't have to buy it.

    I think I hear you, though. It's as though we're finally seeing real divergence between "computing for personal use" and "computing for professional use". You used to be able to use the same hardware for any given task - just load up the right software. But, now, we're seeing greater and greater hardware differences between consumption and production uses.

    It might be for the best...so long as it supports FreeBSD! ;)

    • For me in this particular case it's: If I (think I) need it to maintain my career, I have to get exposure to it somehow.

      I see that divergence you mention actually going away. Sure, right now there seems to be one, where most of the activity occurring on computing devices, surfing and socializing, is fragmenting off onto these mobile platforms, leaving traditional desktop computing devices relegated to only the kinds of uses that require a mostly stationary operational model and a fairly controlled environment

      • by ThorGod ( 456163 )

        Yeah, I see where you're headed there. The cellphones of today will be the laptops of tomorrow. (i.e. At some point, the computational power afforded by cell-sized computers will be more than enough for 100% of the average, 'home' users' use.)

        That, at least, is a situation I would dig. My only misgiving is where mobile OSes provide less functionality for 'power users'. But that should change in time, perhaps.

        • Hopefully as cellphones and tablets become more capable, the notion of having to "jailbreak" them will go away. Sure grandma and the teenagers don't need root access to their playthings, but when the day comes when they're all powerhouses of computing resources and capacity, I think the market will demand that they be accessible as general purpose computing devices are today.

          Mobile OS's themselves might be just a temporary stopgap thing while these devices are still relatively low-powered. When we get to th

    • by rk ( 6314 )

      The problem with that is if the "professional"-level gear schism grows to include differences between developer hardware, commercial hardware, and consumer hardware, the developer hardware will lose economy of scale and will become prohibitively expensive for most people to buy for personal use. Hobby programming will go the way of amateur radio, with a small cadre of aging practitioners doing things that are perceived as increasingly irrelevant to the populace at large. To me, that sounds like a shitty world.
