Comment Matches my observations (Score 4, Funny) 190

Over the last couple of months, when I cut through one of the local parks on its bike trail, it's looked like the Night of the Living Dead: A bunch of zombies obliviously wandering around, staring down into their phones and cluelessly blocking the path.

Lately, the zombie outbreak seems to have abated somewhat, and the bike path isn't so much of an obstacle course.

Comment This is the problem—Linux is inherently unfriendly (Score 0) 306

to the kinds of development that UX needs.

In the commercial world, there is a hierarchy whose basic job is to say "no" to everyone's pet idea. To refuse to adopt an initiative proposed by someone, and instead to allocate their resources, against their will, to the *single* direction that the team has been ordered to take. Good or bad. Because even if bad, a single bad direction properly executed by a sizable team with enough labor to complete it well is better than a thousand bad directions each executed by a single individual or a small handful of individuals who lack the resources to complete it, yet chuck it out there alongside all of the other 999 incomplete bad directions.

But the whole *point* of OSS *is exactly* that if you don't like what everyone else is doing, you can do your own thing. That is the basic philosophy. And that's why Linux UX never improves in the free and open space. Because there is nobody with the authority to say, "No, the product will *not* include that, and you *will* dedicate all of your labor to what it has been decided *will* be included."

So the bazaar happens. But the problem with the bazaar as opposed to the cathedral is that the bazaar is only a single story high. You can't build seriously tall stuff without an organized, managed collective of labor. Sure, you get lots of interesting stuff. But very little of it, if any of it, is epic. It's all the size that one single bazaar shopkeeper can build, to man their own little shop.

The Linux kernel avoided this problem because of the cult of personality (not meant in a bad way, but in the technical sense) surrounding Linus. People defer to him. He decides what's in and out, and he does a reasonable amount of labor allocation even if in an interesting, socially backhanded way that's not common. But it works—he is "in charge" enough in everyone's minds that there ends up being one kernel, with leadership.

Nobody similar has emerged in Linux userspace, and it would seem that Linus-like people are a rare enough phenomenon that it's unlikely that one will emerge at any point before the question is irrelevant. The pent-up demand just isn't there now for good Linux UX the way it was for a sound kernel and a high-capability OS that didn't cost a fortune during the late '80s/early '90s boom. The social mechanics just aren't there to generate it.

The Linux desktop as a really sound piece of tech and UX engineering... will never happen. That era has passed, and the problems have been solved—by other platforms. And Android is a very good counterexample. There *was* enough emerging demand for a mobile operating system that wasn't iOS but that offered the same capabilities, and voila—Android. When there is enough demand, there is space for one shopkeeper at the bazaar to emerge as a champion for the needs of others, and to accumulate sufficient influence by acclamation that a cathedral structure can emerge organically.

The bazaar is merely an incubator of ideas. The cathedrals are the epic and actually useful accomplishments. It takes demand and allegiance-pledging at the bazaar from many attendees to lead in the end to a cathedral. This means that the bazaar has to be big, and that the shopkeeper in question has to have an idea that many, many are not just interested in, but willing to work toward—enough to sacrifice their own autonomy and submit to leadership.

This just doesn't exist for desktop Linux any longer. It got close during the height of Windows dominance, but there was never quite enough demand to make it happen organically. And now the time has passed. The desktop Linux people are running little shops at the bazaar that don't get a lot of foot traffic, and nobody is seeking them out. They are the kings of very tiny, forgotten kingdoms without enough labor resources or wealth to even maintain their castles any longer—and as a result, there is nothing but infighting, strange hacks to maintain castles on the cheap, and lots of started-but-never-to-be-finished foundations of castles for historians to pick through (or, more likely, forget).

I predict that Linux will continue to be a significant part of whatever new "booms" in technology happen, so long as Linus is significantly involved in kernel development. But the window for desktop Linux has just plain passed.

Comment I can't tell you how many times (Score 1) 306

I had this exact conversation with family and friends in the '90s. The answer was always "nothing."

Q: What do you see?
A: Nothing.
Q: I mean, what's on the screen?
A: Nothing.
Q: There is nothing at all on the screen?
A: No.
Q: So the screen is entirely blank. No power?
A: Pretty much.
Q: Pretty much? Is there something on it or isn't there?
A: There's nothing on it.

I go over... And sometimes there would be words ("Operating system not found" or similar), sometimes even a complete desktop but hard-locked or similarly hung.

Me: That's not nothing (pointing).
Them: I don't see anything.
Me: Don't you see words? and/or Don't you see windows?
Them: Not any that mean anything.
Me: If they didn't mean anything, I wouldn't have asked you about them. If you'd told me, I wouldn't have had to drive all this way.
Them: What was I supposed to tell you?
Me: I asked for the words on the screen. Next time, read me the words on the screen!
Them: Okay. Sorry.

Next time...

Q: What does the screen say?
A: Nothing...

Comment Use Android and Chrome OS at times. (Score 2) 306

I am a big fan of Linux in technical terms, but not a big fan in terms of UX (basically, the social end of computing, where collaboration across large teams is required for a high-quality product).

Android is illustrative of what Linux *can* be—but on the desktop has never managed to be—because of the obvious differences between the social (i.e., people and hierarchy) infrastructure behind Android and that behind the Linux desktop.

I used Linux from 1993 through 2010. Early on I used the same .twmrc files with TWM that I used on my HPUX and SunOS boxes at CS school. At the time, the Linux desktop was *light years* ahead of the Windows desktop. 16-bit color, high resolutions, fast, lots of very powerful applications from the Unix world and experimental desktop projects like InterViews that seemed very promising. People with MS-DOS or GEM or Windows 1/2.x computers were envious.

Later on I used FVWM. Then I switched to KDE in the KDE Beta 3 era. But by then (the mid-to-late '90s), Linux on the desktop had already been outrun by Windows 95 and Mac OS. The level of integration among services and components wasn't that of a coherent system the way it was for Mac OS and Windows; the Linux "computing is a network" philosophy—very good for things like business and scientific computing—showed through plainly by comparison.

When KDE 4 was released, I tried to use it for a while but it got in my way. I had to rebuild my entire desktop over and over again as objects were lost, lost their properties, etc. After about two weeks on KDE 4 during which I mostly nursed KDE along rather than doing my actual work, I switched to GNOME 2.x. I see that as something of a golden age for desktop Linux—basic parity with what was going on in the Mac and Windows worlds if you used a polished distribution like Fedora. Installation was different but equally demanding of skill; installing and setting up the desktop OS on a bare machine involved approximately the same amount of work as it did for Windows, and the result was basic feature and experience parity.

Then, the bottom fell out. I suspect that much of the demand for a Linux desktop with experience parity to Windows was met by an increasingly revived Mac OS, and users flocked there. Myself included, in the end.

GNOME 3 came out, KDE 4 was finally becoming usable, and there was something of a battle—but both were behind the curve relative to the stability and seamlessness of OS X, and OS X already had end-user application developers. They screamed and moaned during the transition from legacy Mac OS, but most of them hung on and redeveloped their applications for OS X, and there were a bunch of new application developers to boot.

On top of that, the major applications of the business and academic worlds made their way out for OS X as it became a viable platform. You now had a seamless desktop OS that offered all the big brands in user applications, plus stability, plus easy access to a *nix environment and command line if you wanted it.

I was busy fighting Linux during that "instability era" just as KDE 4 and GNOME 3 happened and duked it out. Things were changing very quickly in many facets of the Linux base installs, in hardware, etc., and every update seemed to break my ThinkPad T60, which at the time ran Fedora. I was spending a lot of time fixing dotfiles and scripts, trying to solve dependency problems, etc. Meanwhile, lots of new things that were starting to become commonplace needs (cloud services, mobile devices, etc.) didn't yet work well with Linux without lots of command-line hacking and compiling of alpha-quality stuff from source.

A couple of fellow academics kept telling me to try Mac OS. Finally I did: I installed a hackintosh partition on my T60. By mid-2010, I realized that I was using my OS X boot, along with the GNU tools environment from MacPorts, far more than I was using the Linux partition, and that there were Mac applications that I was *dying* to start using on a daily basis, but hadn't purchased yet because "I'm not a Mac user, I'm a Linux user, this Mac partition is just to play around with."

Well, I finally bought one of them. And then I started using it all the time. And then another. And soon enough, most of my serious workflow was stuck on my Mac partition and the Linux partition was fading into the background, unused and unmaintained.

By the end of 2010, I'd bought a Macbook Pro and didn't have a Linux installation at all, after 17 years of exclusive Linux use. I'm still on OS X. I use the shell environment extensively. My old scripts and files and removable media from the Linux era still work, including ext2-formatted stuff (there are free extensions to support the filesystem). Basically, I don't feel like I lost a thing.

But I gained a HUGE amount of time that I used to spend hacking dotfiles, moving packages around, and trying to get source code that hadn't been packaged yet for binary distribution—and its many dependencies—to compile properly. And I no longer worry about whether a particular piece of tech or software will "work for me" on compatibility grounds. I just buy the applications or hardware that meet the specs I need, presuming they will work with OS X. And so far, they always have.

Desktop Linux is basically over. It's not that it couldn't catch up, it's that I don't see any initiative anywhere in Linux-world that is likely to deliver competitive results to the OS X desktop experience before the era of the desktop is entirely over anyway.

Linux has found its niches (as pointed out—scientific computing, data centers, mobile/embedded) and there is basically no push any longer to compete for the general desktop, because it is a shrinking market anyway.

Comment Re:Third choice (Score 1) 275

Easy enough to say, but last time I checked, if you want to do anything with the current VR headset boom, you're pretty much going to have to use Windows. Steam's OpenVR initiative makes it sound like you don't, but a few months ago when I checked, their Linux examples wouldn't even build.

Comment Re:Meh (Score 1) 179

Is there ever any particular need to limit them, though? A couple of decades ago it was uncommon to have more than one sound device on a machine. Now it's unusual not to have two or three. Designs and requirements change over time, and having to factor out singleton behavior that was never really necessary in the first place is kind of a pain in the ass. You could easily just create those things with thing factories when the program starts up, and pass them around to objects that need them. No artificial limits, and you don't have to factor out singleton behavior when you decide you want two things where you used to only have one.
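
To make that concrete, here's a minimal C++ sketch of what I mean—AudioDevice and Mixer are hypothetical stand-ins for "those things," not anybody's real API:

    #include <memory>
    #include <string>
    #include <utility>
    #include <vector>

    // Hypothetical device class -- the thing that used to be a singleton.
    class AudioDevice {
    public:
        explicit AudioDevice(std::string name) : name_(std::move(name)) {}
        const std::string& name() const { return name_; }
    private:
        std::string name_;
    };

    // A consumer receives whatever devices it needs; it never reaches out
    // to a global instance, so adding a second or third device costs nothing.
    class Mixer {
    public:
        explicit Mixer(std::vector<AudioDevice*> devices)
            : devices_(std::move(devices)) {}
    private:
        std::vector<AudioDevice*> devices_;
    };

    int main() {
        // "Thing factory" at startup: build however many devices exist today.
        std::vector<std::unique_ptr<AudioDevice>> devices;
        devices.push_back(std::make_unique<AudioDevice>("onboard"));
        devices.push_back(std::make_unique<AudioDevice>("usb-headset"));

        std::vector<AudioDevice*> handles;
        for (auto& d : devices) handles.push_back(d.get());

        Mixer mixer(handles);  // pass them in; no global, no artificial limit
    }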

I've found that design review boards are becoming increasingly hostile toward singletons, too. There was a narrow window where they'd at least consider one, back when people started talking about design patterns. These days it's next to impossible to get one approved, even if there's pretty good justification for it. You can always design around the need for a singleton, and usually the system design will be better without them.

Comment Meh (Score 5, Interesting) 179

I've yet to see a computer science professor with particularly excellent code, either. I regularly run across assignments and example code from courses that fall into the "Never, ever do that" category of programming. Case in point: a relative of mine recently had some questions about a CS programming assignment. Part of the assignment description talked about design patterns and predictably went straight for the Singleton as an example. I'm pretty sure that's the only pattern about 90% of programmers ever actually learn when reading about design patterns, and it's so abused in the industry right now that you can basically never get one past a design review board.
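
For anyone who hasn't had the dubious pleasure, the pattern in question boils down to something like this—my own from-memory C++ sketch, not the assignment's code:

    #include <cstdio>
    #include <string>

    class Logger {
    public:
        static Logger& instance() {
            static Logger the_one;  // constructed on first use, lives for the whole program
            return the_one;
        }
        void log(const std::string& msg) { std::fprintf(stderr, "%s\n", msg.c_str()); }
    private:
        Logger() = default;                          // nobody else can construct one
        Logger(const Logger&) = delete;              // or copy one
        Logger& operator=(const Logger&) = delete;
    };

    int main() {
        // Any code anywhere can do this, which is exactly the problem:
        // every caller is welded to one global object, and there can never be two.
        Logger::instance().log("hello");
    }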

Anywhoo, back in the '90's I worked for a company that was getting a B2 Certification for its operating system. My job basically consisted of reading the entire AT&T C standard library code, finding potential security flaws, writing tests for those flaws and then writing a report with the tests which would be delivered to the NSA. I found the remote buffer overflow in the AT&T telnet daemon a couple years before the same overflow was discovered in the Linux telnet daemon. So the NSA basically outsourced the hard work of finding all those exploits to the companies that were trying to get security certifications. It took three or four guys just a few months to go through all the stuff we had to look at. I'm sure we missed a bit, but I was much more confident in the security of their OS at the end of all that. Too bad they eventually went out of business, were acquired by IBM and their products were killed. You know, progress!
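
To give a flavor of the kind of thing those audits hunt for—a made-up miniature in C-flavored C++, not the actual AT&T code: a fixed-size buffer filled from input whose length nobody checks.

    #include <cstdio>
    #include <cstring>

    // The classic flaw: caller-controlled input, fixed-size buffer, unbounded copy.
    void greet(const char* name) {
        char buf[32];
        std::strcpy(buf, name);               // overflows if strlen(name) >= 32
        std::printf("hello, %s\n", buf);
    }

    // What the audit report would recommend instead: bound the copy.
    void greet_safe(const char* name) {
        char buf[32];
        std::snprintf(buf, sizeof buf, "%s", name);  // truncates rather than smashing the stack
        std::printf("hello, %s\n", buf);
    }

    int main(int argc, char** argv) {
        if (argc > 1) greet_safe(argv[1]);
    }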

Comment Re:ALT+LEFT (Score 4, Insightful) 141

Because 'backspace to go back' is default behavior in a lot of programs, not just web browsers. Try it in File Explorer, for example.

Just like F1 being a nearly universal shortcut for 'help', F2 for 'rename', F3 or CTRL+F for 'search', and so on. I shouldn't have to relearn shortcuts for common behaviors in every program I want to use.

I thought that Alt+Left and Alt+Right *are* the standard shortcuts for going backward and forward in program histories. It's worked that way in every web browser I can remember using back to the 1990s, and it works that way in Windows Explorer. The backspace key doesn't even have an obvious corresponding "forward" key.

I wasn't aware that backspace was used to go back in history in any program. I always expect it to erase one character, or do nothing.

Comment Re:No internal structure? (Score 1) 189

By the time either a blimp or this thing deflates enough to make the engines flop around, there isn't going to be nearly enough lift of any kind to keep it in the air.

But let's ignore that: You want to do this to a rigid airship.

Look at their history. Excluding the ones that burst into flames, many if not most of the major airships ever built ended up lost due to failure of their internal structures. They got shredded like pretzels under the slightest adverse aerodynamic forces. (Even the Hindenburg disaster probably initially involved the snapping of an internal bracing wire due to overzealous steering.)

If I had to ride in one of these white elephants, I'd still go with the inflatable version.
