I am a big fan of Linux in technical terms, but not a big fan in terms of UX (essentially, the social end of computing, where collaboration across large teams is required for a high-quality product).
Android is illustrative of what Linux *can* be, and of what the Linux desktop has never managed to become, because of the obvious differences between the social infrastructure (i.e. the people and the hierarchy) behind Android and the one behind the Linux desktop.
I used Linux from 1993 through 2010. Early on I used TWM with the same .twmrc files I used on my HP-UX and SunOS boxes in CS school. At the time, the Linux desktop was *light years* ahead of the Windows desktop: 16-bit color, high resolutions, speed, lots of very powerful applications from the Unix world, and experimental desktop projects like InterViews that seemed very promising. People with MS-DOS or GEM or Windows 1/2.x computers were envious.
Later on I used FVWM. Then I switched to KDE in the KDE Beta 3 era. But by then (the mid-to-late '90s), Linux on the desktop had already been outrun by Windows 95 and Mac OS. The level of integration among services and components wasn't that of a coherent system, as it was for Mac OS and Windows; the Linux "computing is a network" philosophy, very good for things like business and scientific computing, showed plainly in comparison.
When KDE 4 was released, I tried to use it for a while, but it got in my way. I had to rebuild my entire desktop over and over again as desktop objects disappeared or lost their properties. After about two weeks on KDE 4, during which I mostly nursed KDE along rather than doing my actual work, I switched to GNOME 2.x. I see that as something of a golden age for desktop Linux: basic parity with what was going on in the Mac and Windows worlds, if you used a polished distribution like Fedora. Installation was different but equally demanding of skill; the actual install and setup process for the desktop OS on a bare machine involved approximately the same amount of work as it did for Windows, and the result was basic feature and experience parity.
Then the bottom fell out. I suspect that much of the demand for a Linux desktop with experience parity to Windows was absorbed by an increasingly revived Mac OS, and users flocked there. Myself included, in the end.
GNOME 3 came out, KDE 4 was finally becoming usable, and there was something of a battle between them, but both were behind the curve relative to the stability and seamlessness of OS X, and OS X already had end-user application developers. They screamed and moaned during the transition from legacy Mac OS, but most of them hung on and redeveloped their applications for OS X, and there were a bunch of new application developers to boot.
On top of that, the major applications of the business and academic worlds made their way to OS X as it became a viable platform. You now had a seamless desktop OS that offered all the big brands in user applications, plus stability, plus easy access to a *nix environment and command line if you wanted it.
I was busy fighting Linux during that "instability era," just as KDE 4 and GNOME 3 arrived and duked it out. Things were changing very quickly in many facets of the Linux base installs, in hardware, etc., and every update seemed to break my ThinkPad T60, which at the time ran Fedora. I was spending a lot of time fixing dotfiles and scripts and trying to solve dependency problems. Meanwhile, lots of things that were becoming commonplace needs (cloud services, mobile devices, etc.) didn't yet work well with Linux without lots of command-line hacking and compiling of alpha-quality stuff from source.
A couple of fellow academics kept telling me to try Mac OS. Finally I did: I installed a Hackintosh partition on my T60. By mid-2010, I realized that I was using my OS X boot, along with the GNU tools environment from MacPorts, far more than I was using the Linux partition, and that there were Mac applications I was *dying* to start using on a daily basis but hadn't purchased yet because "I'm not a Mac user, I'm a Linux user, this Mac partition is just to play around with."
Well, I finally bought one of them. And then I started using it all the time. And then another. And soon enough, most of my serious workflow was stuck on my Mac partition and the Linux partition was fading into the background, unused and unmaintained.
By the end of 2010, I'd bought a MacBook Pro and didn't have a Linux installation at all, after 17 years of exclusive Linux use. I'm still on OS X. I use the shell environment extensively. My old scripts and files and removable media from the Linux era still work, including ext2-formatted stuff (there are free extensions that support the filesystem). Basically, I don't feel like I lost a thing.
But I gained a HUGE amount of time that I used to spend hacking dotfiles, moving packages around, and trying to get source code that hadn't yet been packaged for binary distribution (along with its many dependencies) to compile properly. And I no longer worry about whether a particular piece of tech or software will "work for me" on compatibility grounds. I just buy the applications or hardware that meet the specs I need, presuming they will work with OS X. And so far, they always have.
Desktop Linux is basically over. It's not that it couldn't catch up; it's that I don't see any initiative anywhere in the Linux world that is likely to deliver results competitive with the OS X desktop experience before the era of the desktop is entirely over anyway.
Linux has found its niches (as others have pointed out: scientific computing, data centers, mobile/embedded), and there is basically no push any longer to compete for the general desktop, because that is a shrinking market anyway.