
Comment Working with state agencies in the '90s (Score 1) 189

I saw a lot of EISA systems. EISA was a reasonable performer and physically robust (cards weren't as sensitive to positioning in their slots as PCI cards, etc.). I'd say that EISA hardware was generally of very good quality, but high-end enough that most consumers wouldn't run into it despite its being a commodity standard, sort of like PCI-X.

The systems I had experience with were running Linux, even then. :-)

Comment Seconded. (Score 1) 93

For a very long time, tape drives and media gave tape backup a bad name.

Consumer QIC: about 1% of tapes actually held any data; total snake oil that took 10 days to "store" 10 megs (immediately unreadable in all cases)
4mm: tapes good for one pass through the drive; drive good for about 10 tape passes
8mm: tapes good for maybe 10 passes through the drive; drive good for about 100 tape passes before it starts eating tapes

For all three of the above: Don't bother trying to read a tape on any drive other than the one that wrote it; you won't find data there.

Real QIC: somewhat more reliable, but vulnerable to dust and magnetic fields; drive mechanisms not robust, finicky about door closings

Basically, the only tapes that have ever been any damned good are 1/2 inch or wider and single-reel for storage. The problem is that none of these has ever been particularly affordable at contemporary capacities, and they still aren't. Any non-enterprise business should just buy multiple hard drives for their rotating backups and replace the lot of them once a year.
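(For the curious, here's a minimal sketch of the rotating-drive idea in Python: pick one of several backup drives by ISO week number and mirror a source tree onto it with rsync. The mount points, drive count, and source path are made up for illustration, so adjust for your own setup.)

    #!/usr/bin/env python3
    # Sketch: rotate weekly among several backup drives, then mirror
    # a source tree onto the chosen drive with rsync. All paths below
    # are hypothetical placeholders.
    import datetime
    import subprocess

    DRIVES = ["/mnt/backup-a", "/mnt/backup-b", "/mnt/backup-c"]
    SOURCE = "/home/"  # trailing slash: rsync copies contents, not the dir

    def this_weeks_drive() -> str:
        """Pick a drive based on the ISO week number."""
        week = datetime.date.today().isocalendar()[1]
        return DRIVES[week % len(DRIVES)]

    def run_backup() -> None:
        target = this_weeks_drive()
        # --archive preserves permissions and timestamps; --delete makes
        # the copy a true mirror (files removed at the source disappear
        # from the backup too).
        subprocess.run(["rsync", "--archive", "--delete", SOURCE, target],
                       check=True)

    if __name__ == "__main__":
        run_backup()

Run something like that from cron once a week, swap in fresh drives annually, and you've got the rotation scheme above with no tape anywhere in sight.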

Comment Experts are busy. (Score 2) 84

And they ALREADY have expertise.

A computing expert already has decades of highly detailed experience and familiarity with a bunch of paradigms, uses, and conventions.

Experts are the LAST people that want to read manuals for basic things they already have extensive experience with, like desktop environments. Again, they're busy. Being experts.

So, reading the manual on new tech that needs to be implemented in a complex system—great. Reading the manual on a desktop environment? Seriously? That's the last thing an expert wants to be bothered with. "I've used ten different desktop environments over thirty years. Can't you pick one set of conventions I'm already familiar with and use it, so that I can apply my expertise to the actual problems I'm trying to solve? Why reinvent the wheel in such a simple, basic system?"

DEs should leverage existing knowledge and use habits to enable experts to get their real work done quickly. For an expert, using the desktop is NOT the problem at hand requiring a solution. It's not what they're being paid for and not what they care about. Experts love to learn new things—in their area of expertise.

So sure, desktop environment developers probably love to poke around in KDE's front end, code, and docs. But anyone else? People that are not DE specialists are not so excited about the new learning project that is "my desktop," I assure you. The desktop is the last thing they want to be consciously focusing on over the course of the day.

Comment In the very first image... (Score 4, Interesting) 84

The tree widgets on the left are mismatched: some use solid lines, others spaces with alphanumeric characters. The alphanumeric characters are black, yet the lines are gray; the result is visual noise that creates processing and cognitive load for no reason, adding nothing.

The parenthetical text at the top has a title whose left margin (whitespace relative to the other widgets) is significantly different from that of the text below it. There are also spaces between the parentheses and the text they enclose, which no text or print style guide in the world endorses: it separates the parenthetical indicators from the parenthetical text, when they should be tightly bound for clarity.

The window title preserves the absurd convention of using both the binary name and a descriptive title together, separating them with a typographical element (an em-dash) that is inappropriate in a label or design element because it is asynchronous—it indicates a delay in interpretation and pronunciation (as the em-dash just a few words ago in this paragraph does) and thus suggests long-form reading, which is not the intent for at-a-glance window titles (unless you don't want them to be very usable).

The title of the list widget, "Information Modules," is superfluous; a user opening an "About" dialog expects to see "information" from the start, and they do not need to know about implementation ("modules").

The resize handle contrasts significantly with the window background, drawing undue attention to this particular area of the window above others (why is it "louder" than the window title, for example?). Window controls should be secondary to window content, and all at the same visual "volume," for usability.

In short: they still don't get it. They are signaling, in conventional ways that most users process subconsciously, thought habits and forms of attention that don't contribute to efficiency and use but rather detract and distract from it. This is the same old KDE, with poor, unprofessional design that leads to cognitive clutter. It's not that KDE has "too much going on" but rather that KDE has too much going on that isn't actually functional and adds nothing to users' ability to get things done.

Yuck.

Comment Nope, their work isn't shit. (Score 1) 153

But they can earn 3x as much by going into the non-academic private sector and doing their research for profit-driven corps that will patent and secret the hell out of it rather than using it for the good of all. Because the general public doesn't want to own the essential everyday technologies of the future; they'd rather those be kept inside high corporate walls so that they're forced to pay through the nose for them to wealthy billionaires.

And because bright young researchers actually have to eat, and actually want a life, they grudgingly go where the money is, knowing full well they're contributing to deep social problems to come. Myself included.

But why would I settle for a string of one-year postdoc contracts that pay like entry-level jobs and require superhuman hours and commitment when I can go earn six figures at a proper nine-to-five, with revenue sharing, great benefits, and job security? Yes, the company owns everything I do. But I get to pay my bills and build a personal future. Of course, society's future is much dimmer as a result of so many people making the same choice that I have, and so much good work ending up in private hands rather than public ones.

But them's the breaks. If you want to own the future, public, you've got to be willing to pay for it.

Comment I think this is pretty much it. (Score 3, Insightful) 598

In terms of revenue, Apple is following the money. iOS has made Apple the wealthy powerhouse that it is today, not OS X. They don't want to lose the installed base or be perceived as just a phone company; OS X gets them mindshare and stickiness in certain quarters that matter for future iOS revenue (e.g., education and youth).

But they don't actually want to invest much in it; it's increasingly the sort of necessary evil that is overhead, so it makes sense for them to shift to an iOS-led company. In the phone space, where the consumer upgrade cycle is tied to carrier contracts, it's important to have "new and shiny" every single year; consumers standing in AT&T shops are fickle people that are easily swayed by displays and by sales drones that may or may not know anything about anything.

So the marketing rationale at Apple is (1) follow the revenue, which is mobile and iOS, (2) do what is necessary to stay dominant there, which means annual release cycles at least, and (3) reduce the cost of needed other business wings as much as possible so as to focus on core revenue competencies without creating risk, which means making OS X follow iOS.

It makes perfect business sense in the short and medium terms. In the long term, it's hard to see what effect it will have. It's entirely possible that they could wind down the OS X business entirely and remain dominant and very profitable on the strength of their other product lines. It's also possible that poor OS X experiences and the loss of the "high end" could create a perception problem around one of their key value propositions, that of being "high end," and ultimately hurt their mobile sales down the road as well.

I'm a switcher from Linux (just over five years ago now) that was tremendously frustrated with desktop Linux (and is still dubious about its prospects) after using Linux from 1993 to 2009, but that has also, in the last couple of months, considered switching back. I switched to OS X largely for the quality of the high-end applications and for the more tightly integrated user experience. Now the applications business is struggling (the FCP problem, the Aperture events, the joke that is the iOS-synchronized iWork suite) and third-party applications have declined in quality (see: MS Office on OS X these days) as other developers have ceded the central applications ground to Apple. Meanwhile, the user experience on iOS remains sound, but on OS X it has become rather less so as a result of the iOS-centricity of the company.

What to do? I've considered a switch back to Linux, but the Linux distros I've tried out in virtual machines have been underwhelming to me; the Linux desktop continues, so far as I can tell, to be in a worse state for my purposes than it was in 2008. I have no interest in Windows (I have Win7 and Win8 installations in VMs for specific applications, and even in a VM window they make me cringe; just complete usability nightmares).

It's a frustrating time for desktop users in general, I think; the consumer computing world has shifted to mobile/embedded devices and taken most of the labor, attention, and R&D with it. The desktop, needed by those of us that do productive computing work, has been left to languish on all fronts. It's completely rational in many ways at the macroeconomic level, but at the microeconomic level of individual workers and economic sectors, it's been a disaster.

Comment Um, they just want to use Netflix. It adds value (Score 3, Insightful) 121

to the media by making it easy to browse through, search, access, and stream.

And they're paying regular price.

We live in a very strange world when "piracy" has gone from "armed crews of criminal specialists seizing tonnage shipments of goods on the high seas with cannon and sword" to "a regular schmo paying the regular price to use a regular product in the regular way in his regular living room."

Hard to believe that the word still retains any of its negative connotation at all.

"Piracy" these days sounds an awful lot like "tuesday afternoon nothing-in-particular with tea."

Comment No, this is dumb. It should be shorter. (Score 1) 161

Very little useful learning goes on in school. And the top students need time outside of school to visit libraries, pursue intellectual hobbies, do independent reading, and generally do all the academic stuff that will actually matter in their lives later on (and matter to society later on).

By continually extending the school day and the school year, we increasingly ensure that we lock our best and brightest into mediocrity by tying up all of their time in institutionally managed busywork designed to ensure they don't deviate from the mean, which is pretty piss-poor.

Comment Ph.D. is NOT a career move (Score 1) 280

An English major is NOT getting into a STEM Ph.D. program, no matter what.

Even if they were, job prospects are worse for STEM Ph.D. holders than for MS/BS holders; there are far fewer jobs that require Ph.D.-level qualifications outside the professoriate and academics, and for Ph.D. holders in particular, employers are absolutely loath to hire overqualified people.

Inside the professoriate and academics, the job market is historically bad right now. It's not "get a Ph.D., then become a lab head or professor"; it's "get a Ph.D., then do a postdoc, then do another postdoc, then do another postdoc, then do another postdoc, really do at least 6-7 postdocs, moving around the world every year the entire time, and at the end of all of that, if you've managed to stay employed at poverty wages in highly competitive postdocs that you may not even get, while not flying apart at the emotional seams, you may finally be competitive enough to be amongst the minority of 40-year-old Ph.D. holders that get a lab or a tenure-track position, at which point the fun REALLY begins as you are forced onto the grantwriting treadmill with little job security, since universities increasingly require junior faculty to 'pay their own way' with external grants or be budgeted out."

And that's INSIDE STEM, for which this person, as a B.A. holder trying to get into graduate programs, is almost certainly uncompetitive.

Much more likely is that with great grades and GRE scores they'll be admitted to a humanities or social sciences Ph.D. program, with many of the same problems but with CATASTROPHICALLY worse job prospects due to the accelerating collapse of humanities budgets and support on most campuses.

Ph.D. is absolutely not the way to go unless you are independently wealthy and are looking for a way to "contribute to the world" since you don't actually have to draw a salary.

For anyone with student loans, it's a disastrous decision right now, and I wouldn't recommend it.

I say this as someone with a Ph.D. who is on a faculty and is routinely approached by starry-eyed top students looking to "make the world a better place" and "do research." Given the competition out there right now, only the superstars should even attempt it, and then only if they're not strapped for cash. Hint: if you don't know whether or not you're a superstar, you're not.

I think in a decade I've strongly recommended that someone enter a Ph.D. program once, and greeted the suggestion favorably maybe three times total, out of thousands of students, many of them with the classic "4.0 GPA" and tons of "book smarts."

In short, I disagree strongly with the suggestion. Unless you absolutely know that you're competitive already on the academic market, DO NOT GO. Don't listen to the marketing from the schools; it's designed to drive (a) your enrollment and tuition, and/or (b) your cheap labor as a teaching assistant/research assistant forever once you're in the program. It's a win for the institution, not for you.

The easiest sanity checks: Do you know exactly what your dissertation will be about, what you'll need to do, in broad strokes, to conduct your research, and what resources you'll need? Do you already have personal contact with faculty on a well-matched campus, in a well-matched department, that are championing you and want to bring you in as one of their own students/assistants?

If your answer to either one of these questions is "no," then while you may be offered a position somewhere, you will be on the losing end of the deal and would be naive to take it.

Submission + - Personal Drones Coming to Dominate the Hobbyist Radio Control Market (terapeak.com)

aussersterne writes: Drones continue to be in the news, with more and more "personal drone" incidents making headlines. It's easy to think of such stories as aberrations, but a well-known market research company has studied the radio control hobbyist market on eBay and found that sales of radio control helicopters and, more importantly, "quadcopters" (which are most often understood to be the "personal drone" category of items) are now—when taken together—the dominant form of radio control items sold on eBay. Radio control quadcopters in particular are growing much more quickly than the rest. Are we poised to see personal drones become much bigger influences on news and popular culture? Is it time for regulation?

Comment Not so much winding down as becoming moot. (Score 1) 60

The Linux desktop wars mattered when Linux was the future of the desktop.

Now that the desktop has a much smaller future, and Linux clearly doesn't play much of a role even in that drastically reduced future, KDE and GNOME just don't matter much.

Desktop Linux is a niche product, and it behaves like one—adoption is vendor-driven, and clients use whatever the vendor supplies.

For individual Linux users, things haven't moved in half a decade or more. Linux is still a mostly complete operating system with mostly working desktops. None of it is very polished (polish, as always, is just a couple of years off in the future). Significant time and customization are required to make any stock distro+DE combination work well; things are generally cluttered, kludgy, and opaque; and the hobbyist that fits the profile (the sort of person that will actually put up with and use this kind of computing environment) already has a clear favorite in KDE or GNOME, and this isn't likely to change.

Of course there is also the developer group, probably Linux's largest cohort of "serious" users on the desktop, but they just plain don't care much about which DE is installed. They're much more concerned with toolchains and versions of environment components.

So the KDE vs. GNOME thing is just plain...not that big a deal any longer, for most anyone.

The only possibly interesting development in a very long time is Elementary OS, which appears to have adopted a different philosophy from the one traditionally associated with Linux development/packaging groups. But whether this will ultimately translate into an important operating system and user experience, with a brand that supersedes the branding of the desktop environment itself, remains to be seen.

Comment You're mistaking "we" in "we need." (Score 5, Insightful) 283

You mean study something that enhances profits for the very, very wealthy.

Academic research works on an awful lot of problems that *the world* needs to solve, but it makes no money for the propertied class, so there is no investment or funding available to support it.

Many fighting this fight aren't fighting for their pocketbooks; they're fighting to do science in the interest of human goods, rather than in the interest of capitalist kings.
