GUI

Journal: Steaming Piles

[Cross-posted from the Scriptorum.]

Sometimes you have to destroy the document in order to save it....

I give up. I can't support OpenOffice Write any more, and it's nobody's fault but their own. For anything more than simple tasks, the application is terrible. Their only saving grace is that Microsoft Office has its own brand of polished turd, named Word. Collectively, they are racing to the bottom of a decade-long decline in usability.

No, that's too generous. The thing is, they're at the bottom. They are useless for any but the most trivial tasks, and the most trivial tasks are better accomplished elsewhere, anyway.

Yes, I'm ranting. Let's put this into a proper context:

I hate word processors. For any but the simplest tasks, their interfaces are utterly ridiculous. I haven't liked a word processing interface since WordPerfect circa version 5, and if I had my own way, I'd author all my documents in either emacs or vi, depending on the circumstances.

Why do word processors suck so badly? Mostly, it's because of the WYSIWYG approach. What You See Is What You Get, besides being one of the most ghastly marketing acronyms to see the light of day in the digital era, is ultimately a lie. It was a lie back in the early 1990s when it first hit the mainstream, and it remains a lie today. The fact of the matter is that trying to do structuring, page layout and content creation at the same time is a mug's game. Even on a medium as well understood as paper, it's just too hard to control all the variables with the tools available and still have a comprehensible interface.

But the real sin that word processors are guilty of is not that they're trying to do WYSIWYG - okay it is that they're trying to do WYSIWYG, but the way they go about it makes it even worse. Rather than insisting that the user enter data, structure it and then lay it out, they cram everything into the same step, short-circuiting each of those tasks, and in some cases rendering them next to impossible to achieve.

Writing, then structuring, then formatting a document (or even just doing each through its own interface) is easier to learn and easier to accomplish than the all-in approach we use today. For whatever reason, though, we users are deemed incapable of creating a document without knowing what it's going to look like right now, and for our sins, that's what we've become. And so we are stuck with word processors that are terrible at structuring and page layout as well as being second-rate text authoring interfaces. They do nothing well, and many things poorly, in no small part because of the inherent complexity of trying to do three things at once.

It doesn't help that their technical implementation is poor. The Word document format is little better than a binary dump of memory at a particular moment in time. For our sins, OpenOffice is forced to work with that as well, in spite of having the much more parse-worthy ODF at its disposal these days.
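To make the "parse-worthy" point concrete: an ODF text document is just a ZIP archive whose body lives in content.xml, so a few lines of standard-library Python are enough to pull real structure back out of it. A minimal sketch, nothing more - the filename is a placeholder:

    # An .odt file is a ZIP archive; the document body lives in content.xml.
    import zipfile
    import xml.etree.ElementTree as ET

    TEXT_NS = "urn:oasis:names:tc:opendocument:xmlns:text:1.0"

    def list_headings(path):
        """Print every heading in an ODF text document with its outline level."""
        with zipfile.ZipFile(path) as odt:
            root = ET.fromstring(odt.read("content.xml"))
        for heading in root.iter(f"{{{TEXT_NS}}}h"):
            level = heading.get(f"{{{TEXT_NS}}}outline-level", "?")
            print(f"h{level}: {''.join(heading.itertext())}")

    if __name__ == "__main__":
        list_headings("report.odt")   # placeholder filename

Try recovering the same heading outline from a binary .doc file and see how far the standard library gets you.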

There's no changing any of this, of course. The horse is miles away, and anyway the barn burned down in the previous millennium. The document format proxy war currently underway at the ISO is all the evidence I need to know that I'll be dealing with stupid stupid stupid formatting issues for years to come. I will continue to be unable to properly structure a document past about the 80th percentile, which is worse than not at all. I will continue to deal with visual formatting as my only means to infer context and structure, leaving me with very little capacity to do anything useful with the bloody things except to print them out and leave them on someone's desk.

Maybe I'll just stop using them at all. Maybe I'll just start doing everything on the web and never print again. I'm half serious about this, actually. At least on the Web, the idea that content and presentation are separate things isn't heresy. At least on the Web, I can archive, search, contextualise, comment, plan, structure and collaborate without having to wade through steaming piles of cruft all the time.

At least on the Web, I can choose which steaming piles I step into.

I'm going to start recommending people stop using Word as an authoring medium. There are far better, simpler tools for every task, and the word processor has been appropriate for exactly none of them for too long now. Sometimes you have to destroy the document in order to save it.

Security

Journal: Trust Works All Ways

[Cross-posted from the Scriptorum.]

The Debian OpenSSL vulnerability apparently went unnoticed for well over a year. Why is it that crackers and script kiddies never found it and/or exploited it?

Over the weekend, I've been thinking about last week's disclosure concerning Debian's OpenSSL package, which in effect stated that all keys and certificates generated by this compromised code have been trivially crackable since late 2006.

There's a pretty good subjective analysis of the nature of the error on Ben Laurie's blog (thanks, Rich), and of course the Debian crew itself has done a fairly good job of writing up the issue.

The scope of this vulnerability is pretty wide, and the ease with which a weak key can be compromised is significant. Ubuntu packaged up a weak key detector script containing an 8MB data block which, I'm told, included every single possible key value that the Debian OpenSSL package could conceivably create.
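I haven't looked inside Ubuntu's detector, but the general shape of such a tool is easy to sketch: precompute a fingerprint for every key the broken generator can possibly emit, then flag any key whose fingerprint turns up in that list. A rough Python illustration - the blocklist filename and its one-hex-digest-per-line format are my own stand-ins, not the real package:

    # Sketch of a weak-key check: hash the public key material and look it up
    # in a precomputed blocklist. The file name and format are assumptions
    # made for illustration, not Ubuntu's actual tool.
    import hashlib
    import sys

    def load_blocklist(path):
        """Read one hex fingerprint per line into a set for fast lookups."""
        with open(path) as f:
            return {line.strip() for line in f if line.strip()}

    def fingerprint(pubkey_bytes):
        """Fingerprint the raw public key material."""
        return hashlib.sha1(pubkey_bytes).hexdigest()

    def is_weak(pubkey_path, blocklist):
        with open(pubkey_path, "rb") as f:
            return fingerprint(f.read()) in blocklist

    if __name__ == "__main__":
        # Usage: python check_keys.py id_rsa.pub server.pem ...
        weak = load_blocklist("weak-key-fingerprints.txt")   # hypothetical file
        for key_file in sys.argv[1:]:
            print(f"{key_file}: {'WEAK' if is_weak(key_file, weak) else 'ok'}")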

The question that kept cropping up for me is: This one-line code change apparently went unnoticed for well over a year. Why is it that crackers and script kiddies never found it and/or exploited it? Numerous exploits on Microsoft Windows would have required far more scrutiny and creativity than this one. Given the rewards involved for 0-day exploits, especially in creating platforms for cross-site scripting attacks, why is it that nobody bothered to exploit this?

My hypothesis - sorry, my speculation is this: People at every stage of the production process and everywhere else in the system trusted that the others were doing their job competently. This includes crackers and others with a vested interest in compromising the code. I should exclude from this list those who might have a reasonable motivation to exploit the vulnerability with stealth and to leave no traces. If, however, even they didn't notice the danger presented by this tiny but fundamental change in the code base, well, my point becomes stronger.

The change itself was small, but not really obscure. It was located, after all, in the function that feeds random data into the encryption process. As Ben Laurie states in his blog, if any of the OpenSSL members had actually looked at the final patch, they would almost certainly have noticed immediately that it was non-optimal.
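The consequence is easiest to see with a toy model. With the patch applied, the process ID was left as essentially the only input that varied from one run of the generator to the next, and Linux PIDs default to the range 1 to 32768, so the whole keyspace can be enumerated in seconds. This is an illustration only, not OpenSSL's actual code:

    # Toy illustration: if a generator's only varying seed is the process ID,
    # every key it can ever produce is enumerable.
    import random

    def toy_keygen(pid):
        """Stand-in for a key generator whose only varying seed is the PID."""
        rng = random.Random(pid)        # seed fully determined by the process ID
        return rng.getrandbits(2048)    # "key" derived from that seed

    # An attacker does not need to guess: the whole keyspace can be precomputed.
    all_possible_keys = {toy_keygen(pid) for pid in range(1, 32769)}
    print(f"entire keyspace: {len(all_possible_keys)} keys")

Thirty-two thousand possibilities is not a keyspace; it's a lookup table.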

In all this time, apparently, nobody using Debian's OpenSSL package has actually (or adequately) tested to see whether the Debian flavour of OpenSSL was as strong as it was supposed to be. That level of trust is nothing short of astounding. If in fact malware authors were guilty of investing the same trust in the software, then I'd venture to state that there's a fundamental lesson to be learned here about human nature, and learning that lesson benefits the attacker far more than the defender:

Probe the most trusted processes first, because if you find vulnerabilities, they will yield the greatest results for the least effort.

P.S. Offhand, there's one circumstance that I think could undermine the credibility of this speculation, and that's if there's any link between this report of an attack that compromised not less than 10,000 servers and the recent discovery of the Debian OpenSSL vulnerability.
