Comment This is the problem—Linux is inherently unfr (Score 0) 306

Linux is inherently unfriendly to the kinds of development that UX needs.

In the commercial world, there is a hierarchy whose basic job is to say "no" to everyone's pet idea. To refuse to adopt an initiative proposed by someone, and instead to allocate their resources, against their will, to the *single* direction that the team has been ordered to take. Good or bad. Because even if bad, a single bad direction properly executed by a sizable team with enough labor to complete it well is better than a thousand bad directions each executed by a single individual or a small handful of individuals who lack the resources to complete it, yet chuck it out there alongside all of the other 999 incomplete bad directions.

But the whole *point* of OSS *is exactly* that if you don't like what everyone else is doing, you can do your own thing. That is the basic philosophy. And that's why Linux UX never improves in the free and open space. Because there is nobody with the authority to say, "No, the product will *not* include that, and you *will* dedicate all of your labor to what it has been decided *will* be included."

So the bazaar happens. But the problem with the bazaar as opposed to the cathedral is that the bazaar is only a single story high. You can't build seriously tall stuff without an organized, managed collective of labor. Sure, you get lots of interesting stuff. But very little of it, if any of it, is epic. It's all the size that one single bazaar shopkeeper can build, to man their own little shop.

The Linux kernel avoided this problem because of the cult of personality (not meant in a bad way, but in the technical sense) surrounding Linus. People defer to him. He decides what's in and out, and he does a reasonable amount of labor allocation even if in an interesting, socially backhanded way that's not common. But it works—he is "in charge" enough in everyone's minds that there ends up being one kernel, with leadership.

Nobody similar has emerged in Linux userspace, and it would seem that Linus-like people are a rare enough phenomenon that it's unlikely that one will emerge at any point before the question is irrelevant. The pent-up demand just isn't there now for good Linux UX the way it was for a sound kernel and a high-capability OS that didn't cost a fortune during the late-'80s/early-'90s boom. The social mechanics just aren't there to generate it.

The Linux desktop as a really sound piece of tech and UX engineering... will never happen. That era has passed, and the problems have been solved—by other platforms. And Android is a very good counterexample. There *was* enough emerging demand for a mobile operating system that wasn't iOS but that offered the same capabilities, and voila—Android. When there is enough demand, there is space for one shopkeeper at the bazaar to emerge as a champion for the needs of others, and to accumulate sufficient influence by acclamation that a cathedral structure can emerge organically.

The bazaar is merely an incubator of ideas. The cathedrals are the epic and actually useful accomplishments. It takes demand and allegiance-pledging at the bazaar from many attendees to lead in the end to a cathedral. This means that the bazaar has to be big, and that the shopkeeper in question has to have an idea that many, many are not just interested in, but willing to work toward—enough to sacrifice their own autonomy and submit to leadership. This just doesn't exist for desktop Linux any longer. It got close during the height of Windows dominance, but there was never quite enough demand to make it happen organically. And now the time has passed. The desktop Linux people are running little shops at the bazaar that don't get a lot of foot traffic, and nobody is seeking them out. They are the kings of very tiny, forgotten kingdoms without enough labor resources or wealth to even maintain their castles any longer—and as a result, there is nothing but infighting, strange hacks to maintain castles on the cheap, and lots of started-but-never-to-be-finished foundations of castles for historians to pick through (or, more likely, forget).

I predict that Linux will continue to be a significant part of whatever new "booms" in technology happen, so long as Linus is significantly involved in kernel development. But the window for desktop Linux has just plain passed.

Comment I can't tell you how many times (Score 1) 306

I had this exact conversation with family and friends in the '90s. The answer was always "nothing."

Q: What do you see?
A: Nothing.
Q: I mean, what's on the screen?
A: Nothing.
Q: There is nothing at all on the screen?
A: No.
Q: So the screen is entirely blank. No power?
A: Pretty much.
Q: Pretty much? Is there something on it or isn't there?
A: There's nothing on it.

I go over... And sometimes there would be words ("Operating system not found" or similar), sometimes even a complete desktop but hard-locked or similarly hung.

Me: That's not nothing (pointing).
Them: I don't see anything.
Me: Don't you see words? and/or Don't you see windows?
Them: Not any that mean anything.
Me: If they didn't mean anything, I wouldn't have asked you about them. If you'd told me, I wouldn't have had to drive all this way.
Them: What was I supposed to tell you?
Me: I asked for the words on the screen. Next time, read me the words on the screen!
Them: Okay. Sorry.

Next time...

Q: What does the screen say?
A: Nothing...

Comment Use Android and Chrome OS at times. (Score 2) 306

I am a big fan of Linux in technical terms, but not a big fan in terms of UX (basically, the social end of computing, where collaboration across large teams is required for a high-quality product).

Android is illustrative of what Linux *can* be, but on the desktop has never managed to be because of the obvious differences between the social (i.e. people and hierarchy) infrastructure behind Android vs. behind the Linux desktop.

I used Linux from 1993 through 2010. Early on I used the same .twmrc files with TWM that I used on my HPUX and SunOS boxes at CS school. At the time, the Linux desktop was *light years* ahead of the Windows desktop. 16-bit color, high resolutions, fast, lots of very powerful applications from the Unix world and experimental desktop projects like InterViews that seemed very promising. People with MS-DOS or GEM or Windows 1/2.x computers were envious.

Later on I used FVWM. Then I switched to KDE in the KDE Beta 3 era. But by then (the mid-to-late '90s), Linux on the desktop had already been outrun by Windows 95 and Mac OS. The level of integration amongst services and components wasn't that of a coherent system the way it was for Mac OS and Windows; the Linux "computing is a network" philosophy—very good for things like business and scientific computing—showed in comparison.

When KDE 4 was released, I tried to use it for a while but it got in my way. I had to rebuild my entire desktop over and over again as objects were lost, lost their properties, etc. After about two weeks on KDE 4 during which I mostly nursed KDE along rather than doing my actual work, I switched to GNOME 2.x. I see that as something of a golden age for desktop Linux—basic parity with what was going on in the Mac and Windows worlds if you used a polished distribution like Fedora. Installation was different, but equally demanding of skills; putting the desktop OS on a bare machine involved approximately the same amount of work as it did for Windows, and the result was basic feature and experience parity.

Then, the bottom fell out. I suspect that much of the demand for a Linux desktop with experience parity to Windows was instead met by an increasingly revived Mac OS, and users flocked there. Myself included, in the end.

GNOME 3 came out, KDE 4 was finally becoming usable, and there was something of a battle, but both were behind the curve relative to the stability and seamlessness of OS X, and OS X already had end-user application developers. Those developers screamed and moaned during the transition from legacy Mac OS, but most of them hung on and redeveloped their applications for OS X, and there were a bunch of new application developers to boot.

On top of that, the major applications of the business and academic worlds made their way out for OS X as it became a viable platform. You now had a seamless desktop OS that offered all the big brands in user applications, plus stability, plus easy access to a *nix environment and command line if you wanted it.

I was busy fighting Linux during that "instability era," just as KDE 4 and GNOME 3 arrived and duked it out. Things were changing very quickly in many facets of the Linux base installs, in hardware, etc., and every update seemed to break my ThinkPad T60, which at the time ran Fedora. I was spending a lot of time fixing dotfiles and scripts and trying to solve dependency problems. Meanwhile, lots of new things that were becoming commonplace needs (cloud services, mobile devices, etc.) didn't yet work well with Linux without a lot of command-line hacking and compiling of alpha-quality stuff from source.

A couple of fellow academics kept telling me to try Mac OS. Finally I did: I installed a hackintosh partition on my T60. By mid-2010, I realized that I was using my OS X boot, along with the GNU tools environment from MacPorts, far more than I was using the Linux partition, and that there were Mac applications I was *dying* to start using on a daily basis but hadn't purchased yet, because "I'm not a Mac user, I'm a Linux user, this Mac partition is just to play around with."

Well, I finally bought one of them. And then I started using it all the time. And then another. And soon enough, most of my serious workflow was stuck on my Mac partition and the Linux partition was fading into the background, unused and unmaintained.

By the end of 2010, I'd bought a MacBook Pro and didn't have a Linux installation at all, after 17 years of exclusive Linux use. I'm still on OS X. I use the shell environment extensively. My old scripts and files and removable media from the Linux era still work, including ext2-formatted stuff (there are free extensions to support the filesystem). Basically, I don't feel like I lost a thing.

But I gained a HUGE amount of time that I used to spend hacking dotfiles, moving packages around, and trying to get source code that hadn't been packaged yet for binary distribution—and its many dependencies to compile properly. And I no longer worry about whether a particular piece of tech or software will "work for me" on compatibility grounds. I just buy the applications or hardware that meet the specs that I need, presuming that it will work with OS X. And so far, it always has.

Desktop Linux is basically over. It's not that it couldn't catch up; it's that I don't see any initiative anywhere in Linux-world that is likely to deliver results competitive with the OS X desktop experience before the era of the desktop is entirely over anyway.

Linux has found its niches (as pointed out—scientific computing, data centers, mobile/embedded) and there is basically no push any longer to compete for the general desktop, because it is a shrinking market anyway.

Comment Smaller market, too. (Score 4, Interesting) 75

Just as importantly, the market has shifted. There is still a stable market for computing and it will continue to exist, but it no longer includes the home/casual user segment. Those people have gone over to tablets and phones (almost all of the non-tech folks that I know now have an older laptop sitting dusty on their top closet shelf, unused for years, and don't plan to replace it; only about half have even bothered to get a Bluetooth keyboard for their tablet, while the rest are perfectly satisfied with the onscreen keyboard).

Business, tech-oriented people, the self-employed, creatives, and so on will continue to buy full-fledged computing hardware and to upgrade it over time, but this is a much smaller market than once existed for computing, where the market included basically every home and individual in developed societies. So some correction in sales was (and probably remains) inevitable over time.

Comment Here in the US (Score 1) 622

you will find that many dealerships do the same.

Only there are precious few protections for the consumer.

So when your used car breaks down catastrophically in a couple of months, it's "Oh, we must have missed that!" and when you try to use your warranty to cover it, it's excluded on one of 57 different technicalities, with a maximum coverage for that particular type of repair that covers only a fraction of the typical cost, all spelled out in tiny print in a massive rulebook of which you do not actually get a copy when you purchase your vehicle and your "warranty."

And even if you manage to find something that is covered, and want to get your fractional pittance, you still have to pay out of pocket yourself for the repairs, then submit the receipt to the third-party corporation that provides the "warranty," who will scour your receipt for more technicalities on which to exclude your claim, and if they can't, will ultimately send you your fractional pittance in 8-12 months and after several letters from your attorney.

In short, U.S. "dealer checks" and "warranties" are worth less than lavatory tissue, which is why every reputable U.S. publication strongly advises used car buyers to pay to have their own favorite mechanic go over the vehicle (at a cost of $50-$200) before buying. So you can easily burn up $2k or more having cars vetted by a mechanic before you find one that he or she will actually tell you is a reasonable bet. Yet all of those cars come happily "checked," "warranted," etc.

Comment Re:Arogance (Score 5, Interesting) 247

I used to teach a pretty decent load of Chinese students in my classes in Manhattan (I taught at both NYU and CUNY). By the '00s, they were significantly more creative, sophisticated, well-rounded, and learned (I make no claims about "intelligence") than my American students, who were really sort of "decadent" in the worst, stereotypical ways—they knew only a few things about a few things but a lot about consumer goods and fashion, and didn't seem to think they needed to work; they just didn't feel the global pressure from competing workers. Very entitled.

The Chinese students tended to cluster in 'A' territory and always approached me after class to talk about class topics until I had to leave, then followed up with serious questions by email. The American students always had one or two in the 'A' group and the rest clustering around low B and high C, and it was a struggle just to learn their names, as they had nothing at all to say to me unless I called on them in class. Ironically, many of the Chinese students had better formal English as well, though about half of them were clearly 'winging it' and needed ESL—yet they were killing it in class performance anyway, managing to learn and to get through books by relying on a dictionary, a study group, and sheer determination.

Comment I started using Linux in 1993 (Score 1) 211

And I had no real driver trouble that couldn't be worked around. Winmodems and winprinters weren't actually all that common in the grand scheme of things. Maybe for a year or two in the mid-'90s. But there was a wealth of used hardware available in those days that was the real deal.

Anyway, I always used external modems, including for a while a very weird Telebit modem with a steel case, a flip-open front door, and a non-AT command set, which meant that I had to log into it via a terminal emulator and execute commands myself, because my software only supported AT command sets.

On the printing front, very early on I was able to get ahold of a secondhand Apple LaserWriter, and then set up Netatalk and a bunch of adapters to print to it over Ether/RS-422 or something like that. It made everything on Linux a thousand percent easier because you could just dump PostScript directly to it, and Linux print drivers weren't really sorted out for many years.

In fact, there was even a really reasonable (for the period) WYSIWYG office suite called InterViews that ran under X and dumped out PostScript files for printing. The text editor was called 'ez' and I still have a bunch of non-CS homework from that era saved as '.ez' files somewhere. For the CS homework, I would just dial my university's SLIP pool and then telnet over to the Sun systems in the department where we had logins and used gcc for everything.

The actual hard part, as I recall, was getting Linux in the first place, which took me several months. There were no dial-up BBS systems I could find that had actual complete Linux distributions of any kind. The distributions that did exist at the time (I remember Slackware, Yggdrasil, Trans-Ameritech or something like that, and a couple of others, though maybe my memory is off) were set up as a series of dozens of 1.2MB or 1.44MB floppy disk images.

Not only was there no BBS that seemed to host a complete distro, but those were actually pretty sizable downloads at the time—they represented many hours of downloading even if a complete set could be found. At school, the systems on the actual 'net via 10BASE2 and AUI at the time (our so-called 'smart hosts' that were in the DNS system) could download such things quickly from other smart-host FTP sites with complete sets, but they were Sun workstations with no floppy drives, and our filesystem quotas were not big enough to hold a complete set.

And before I had Linux actually installed, there was no way for me at home to log into those accounts and download the files from the Unix machines anyway; otherwise I could have used FTP over dial-up to move a few images at a time through the pipeline to home.

In the end, I managed to find a local ISP that would set me up at home with a UUCP feed, and a vanilla UUCP dial-up binary set that was a massive bear to configure on a non-Unix system. Then I spent many weeks laboriously pulling images down over UUCP nightly from Usenet.

Once I finally had the complete binary set downloaded, I got ahold of many boxes of floppies, wrote the images, and did my first Linux install.

That bootstrapping was the hard part. Once Linux was actually installed, the entire non-BBS online universe of the Internet became massively easy to navigate (at the time, it was not easy to do Internet on PCs—there was little if anything on http:// but that was the only protocol supported by DOS-based systems or by Macs) because now I had gopher, wais, archie, veronica, ftp, and so on. It was like bootstrapping your home computing universe into the Internet age.

The drivers were really of secondary importance once you got your hands on a complete distro. You'd just note which graphics hardware was supported by the X binaries, for example, and then go out and buy that card for $50 or $100. That was easy compared to actually getting your hands on a complete distro stored on the right machine and OS (DOS to write the images) and then getting it all written out and ready for install from floppies.

Comment "Proprietary?" (Score 2) 154

I find this use of the term "proprietary" to be significantly different from the usual intended meaning of the term.

Usually, "proprietary" means intellectual property belonging to a private organization, with a harsh hand taken to prevent reverse engineering and the stated assertion (either in EULAs or otherwise) that no use can be made in any way of reverse engineered output without being subject to legal action.

Here, "proprietary" apparently means "hard to understand" since everything else does not apply—not a private organization, no need to reverse engineer since it's an interpreted language, etc. By this standard, all of the perl and assembly code in the universe is "proprietary" since it's not written with forty character variable names.

Seems a stretch.

Comment Re:It's been a while since I was a CS student. (Score 1) 173

I think there are two fundamental concepts here:

1) Understanding information as representation (cryptography here) and eliminating conceptual ambiguity/unpredictability in algorithmics (security "flaws" here). These fundamentals are absolutely part of the basics of CS, but the emphasis is more on correctness: understanding the nature, reversibility, and properties of the representation and the invariants and rigor of the algorithmics. This is good CS in general, for all cases, and you're right, it's also the fundamentals of security.

But I think when they say "teach security" they actually mean:

2) Hardening designed and deployed systems against common attack vectors in real-world situations.

This requires not just the items from (1) but also a familiarity with particular architectures, implementations, protocols, languages, and conventions and conditions of user thought and behavior. So while there is overlap, I suspect that calls for "teaching security" aren't going to be satisfied with cryptographic theory and parsimonious, sound, and unambiguous algorithmics with strong assumption and bounds checking. Most policy people wouldn't consider that to be "teaching computer security."
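
To make the distinction concrete, here's a minimal, purely illustrative sketch in Python (the function names and the toy length-prefixed format are my own invention, not anything from the article or any curriculum). The first decoder reflects the correctness emphasis of (1): violated preconditions are treated as programming errors. The second reflects the hardening emphasis of (2): malformed input from an untrusted source is an expected, hostile condition to be rejected rather than asserted away.

    # Illustrative only: contrasting correctness checks (item 1) with
    # deployment-facing hardening (item 2). The format is hypothetical.

    def decode_length_prefixed(buf: bytes) -> bytes:
        """Item 1: correctness of a representation.
        Invariant: a 4-byte big-endian length header, then exactly that
        many payload bytes. Violations are treated as programming errors."""
        assert len(buf) >= 4, "precondition: header present"
        n = int.from_bytes(buf[:4], "big")
        assert len(buf) == 4 + n, "precondition: length matches payload"
        return buf[4:]

    MAX_PAYLOAD = 1 << 20  # hardening: cap attacker-controlled sizes

    def decode_untrusted(buf: bytes) -> bytes:
        """Item 2: the same decoder hardened for hostile, real-world input.
        Malformed data is expected and rejected with an error."""
        if len(buf) < 4:
            raise ValueError("truncated header")
        n = int.from_bytes(buf[:4], "big")
        if n > MAX_PAYLOAD:
            raise ValueError("declared length exceeds limit")
        if len(buf) != 4 + n:
            raise ValueError("length field does not match payload")
        return buf[4:]

The first function is exactly the sort of thing a traditional CS curriculum already emphasizes; only the second is what the "teach security" crowd is actually asking for.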

Comment Re:It's been a while since I was a CS student. (Score 1) 173

Er, "to the level of."

After reading this thread, I think a lot of Slashdot posters have no idea what "computer science" as a science actually is.

They think AI researchers or computer vision folks started by sitting down at an Apple IIe and banging out:

"10 REM This is my first crack at an AI program. Let's see how it goes."

Comment Re:It's been a while since I was a CS student. (Score 1) 173

This is what I think.

It should be required for anyone getting a degree that attests to their ability to write code and create systems.

It should not be required just because someone is getting a *computer science* degree, because (unless the science has completely disappeared) a lot of those guys would have to *first* learn how to code, *then* learn about deployment and networks and users, *then* learn security, and it would take serious time away from the actual thrust of their degree.

People are posting upthread about things being different now, about the engineering wing now being a part of many CS programs, which was not the case when I was in the field. So maybe a dual-track degree is in order. To get the degree with the software engineering emphasis, yes, you have to learn about security. But to get the degree with the computational-theory emphasis? It makes no sense to me, except in the case of those that want to work on cryptographic theory/information theory and so on.

For a lot of the guys I went to school with, it would just make no sense. They'd have to first learn to code before they learned about coding securely. Or they'd have to first learn about networks before they could learn about securing them. And so on. They were busy with problems totally unrelated to implementation and deployment, and it would turn the degree that they got into another degree entirely as they spent years learning how to build stuff, instead of how to rigorously conceptually represent stuff and how to rigorously prove stuff.

When I think "computer science," programmers are the farthest thing from my mind. I imagine blackboards and chalk and lots and lots of scribbling. Not writing well-formed PHP+SQL code. Like, totally separate universes.

Comment Re:It's been a while since I was a CS student. (Score 2) 173

I have no problem with the idea that there ought to be courses on security, just not in CS where (at least when I was a student) that's not really what they do. They're in the business of figuring out/proving/disproving whether things *can be computed in theory* and how, in theory.

Security just isn't a question that has anything to do with that, and these are people that write comparatively little code. It's not what the discipline is about.

There *are* people that spend their time learning how to code, and how to code properly for real-world situations and deployment (which is precisely where security becomes an issue). That's my point. Security ought to be taught where people are actually learning to code, deploy, and operate. It's a serious, rigorous field of its own. It just doesn't happen to be computer science (which, if it helps you, could just as easily be called "computation theory").
