The major Linux distributions, like Red Hat, would probably chip in. Part of the reason that Linux has any desktop market share at all is because Firefox runs on it, and many major sites support it. If people couldn't access their banking sites, YouTube, etc. with their Linux browser, they would replace their Linux desktop with Windows. Or, in the case of netbooks, buy the Windows version instead of the Linux one.
A better term might be "discoverable". If you can play with it and figure out what it does without consulting the manual or asking someone else, then it has high "discoverability". Combine that with "consistent": knowledge of one part of the system helps you use other parts of the system that you haven't tried yet. Those two terms together get at what many people mean when they say "intuitive".
From the time I've spent playing with demo iPhones and Touches, the interface was pretty easy to understand. When you turn the phone sideways, it goes into landscape mode, and it pretty much does that everywhere, so it is consistent. It is also consistent with what I would expect in the real world; if I'm orienting the screen sideways, I probably want to use it so the long edge is the top now. You can also learn that pretty easily just by trying it, so it is also discoverable. When the iPhone breaks consistency, like the lack of a landscape keyboard in some apps, people complain, which indicates that consistent behavior is part of what we think of as "intuitive".
Zooming in and out works by pinching and pulling, which isn't very discoverable, but it makes a certain amount of real-world logical sense (I'm stretching a photo to make it bigger, squishing it to make it smaller). Once you learn it, you can try that same action in other places and it will do pretty much what you expect (discoverable and consistent). Of course, you can get away with some of those things on a media player because many operations aren't really destructive; you can play with the device to see how it works. If stretching a word processing document ripped it in half and deleted it, that would probably be a different story.
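The "stretching makes it bigger, squishing makes it smaller" mapping is easy to sketch: the zoom factor is just the ratio of how far apart your fingers are now to how far apart they were when the gesture started. This is a hedged illustration, not Apple's actual implementation; the function and touch representation here are made up for the example.

```python
import math

def pinch_scale(start_touches, current_touches):
    """Return a zoom factor from two-finger touch positions.

    Content scales by the ratio of the current finger spread to the
    spread when the gesture began: pulling fingers apart gives a
    factor > 1 (zoom in), pinching them together gives a factor < 1
    (zoom out). Touches are (x, y) tuples; names are illustrative.
    """
    def spread(touches):
        (x1, y1), (x2, y2) = touches
        return math.hypot(x2 - x1, y2 - y1)

    return spread(current_touches) / spread(start_touches)

# Fingers move from 100 px apart to 200 px apart: 2x zoom.
print(pinch_scale([(0, 0), (100, 0)], [(0, 0), (200, 0)]))
```

The same ratio works anywhere the gesture is supported, which is exactly the kind of consistency that lets you carry the learned behavior from photos to maps to web pages.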
I've tried the Android emulator a bit, so I have some familiarity with the interface. I think I could pretty much figure out how to do most things I'd want to do with it, but it definitely has the feeling of a computer interface shrunk to fit a phone. I think it is discoverable, but from playing with it and reading the reviews, it isn't consistent, so it ultimately isn't as discoverable as the iPhone is.
The iPhone software, on the other hand, feels purpose-built for the phone, like a part of the device rather than software running on it. Even the main screen evokes the keypad layout of a touch-tone phone rather than the desktop metaphor that Android shoehorns in.
Ultimately, I think that is Android's major challenge. It can't easily feel like part of a device out of the box because it has to run on a range of hardware, while the iPhone software only has to support the iPhone and can blend smoothly with the hardware experience. This is in some ways more important than the relationship of Windows and OS X to their various hardware, since we have certain expectations about how a phone should perform that PCs don't carry. There is potential for Android to become more discoverable and consistent; personally, I'm going to wait for the next Android phone to see if it has improved.