
Comment How about just don't buy a phone from the carriers (Score 4, Interesting) 100

in the first place?

There are some FABULOUS devices coming out of China these days, readily available on eBay and Amazon, with high specs, Android KitKat or Lollipop, and sold at half the price or less vs. offerings from the carriers.

Just got a Huawei Honor X1 and am using it with an MVNO in the US. The retail price of the new off-contract phone from China, purchased on eBay, was about what the two-year on-contract retail price of a similarly specced Android device is in the U.S. The MVNO contract, with "unlimited" data (throttling to HSPA+ after the first several GB every month) is less than half the price of a similar contract at a major carrier.

There's no reason to buy on-contract phones any longer.
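For what it's worth, the two-year math is easy to run yourself. A rough sketch in Python, where every price is a placeholder assumption rather than a number from my actual bills:

# Back-of-the-envelope two-year total cost of ownership.
# All prices are placeholder assumptions for illustration only.
months = 24

unlocked_phone = 250    # assumed price of an off-contract import (e.g., via eBay)
mvno_monthly = 35       # assumed MVNO plan with throttled "unlimited" data

subsidized_phone = 200  # assumed on-contract, up-front price at a major carrier
carrier_monthly = 80    # assumed postpaid plan that funds the subsidy

unlocked_total = unlocked_phone + months * mvno_monthly
carrier_total = subsidized_phone + months * carrier_monthly

print(f"Unlocked import + MVNO, {months} months: ${unlocked_total}")
print(f"Subsidized phone + major carrier:       ${carrier_total}")

Plug in whatever your local numbers are; the gap between the two totals is the point.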

Comment Yup, bewildering management. (Score 2) 294

They seem to have decided a number of years ago to try to be Best Buy, only in 1/20th of the floor space and at higher prices, while rebadging major-brand products under their own woefully antiquated and little-known house brands, all but ensuring that consumers would gravitate to Best Buy instead, where the major brands they actually recognized remained on display.

It started to make zero sense sometime in the late-1980s and it just got worse and worse from there.

I still buy parts, diagnostic equipment, and accessories for many tech items in the house. Just now I buy them on Amazon.com. I just bought a pack of about 30 DPDT switches the other day for $5.00 or so. I don't need 30, I just need one. I'd have just as well paid Radio Shack $2.99 for a switch and had it the same day—only the local store doesn't carry that stuff any longer.

Comment Mom-and-Pops don't survive in America (Score 3, Insightful) 294

because suburbanites and flyover folks won't shop in them. When a mom-and-pop and a competing national chain open on the same block, the entire crowd flocks to the national chain, particularly in smaller communities. Hell, they're even proud to have them. Getting a Wal-Mart means they've arrived; it puts them on the map.

The only places where mom-and-pop shops still survive are heavily blue urban areas, where they continue to do well. That's no accident.

Comment Same, with Olympus. (Score 1) 422

I've been shooting with Four Thirds since it was released, and I have the same great lenses, still as perfect as the day I bought them.

This year I finally upgraded my body (to an E-3) for the first time in years. Logged over 150k actuations on my E-1 previously.

So I bought one body and zero lenses in a decade.

Once all of the pros and semi-pros and serious shooters have made the switch from film to digital, and are fully satisfied with the quality they're getting, and once all of the snapshot shooters have a camera that is automatically included and upgraded each time they get a phone (which everyone has), there's just not a lot of growth market left.

The switch from film to digital was a one-time boom until parity was reached in quality, and now it's done.

Comment That's the point. Nine times out of ten, you don't (Score 1) 422

WANT greater depth of field. You want LESS.

That's what the non-photographer public senses when they talk about the difference between "professional photos" and "snapshots."

In a snapshot (small camera), everything in the picture is in sharp focus, which makes the photo about the "scene" and distracts eyes from any one particular subject.

Shooting at f/2 on a tiny sensor, you get only snapshots.

Shooting at f/2 on a DSLR, only the subject (the person, the face, the rock feature, whatever) is in focus, and everything else is slightly blurred, which brings attention to the subject of the image, and at the same time blurs out distracting, unimportant details in the background.

Here's a good example from Google Images: http://ns12.sovdns.com/~nich61...

On a small camera or a smartphone, only the photo on the left is possible. In fact, on the smallest phones/cameras, you won't even get that much blur in the background; nearly everything can be razor sharp.

Generally, that's not good for subject work—only for scene work.
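If you want to put rough numbers on it, the standard thin-lens depth-of-field approximations tell the story. A minimal Python sketch, using assumed focal lengths and circles of confusion for a small phone sensor and a full-frame DSLR (neither figure is from the post above):

# Rough depth-of-field comparison at f/2: tiny sensor vs. full frame.
# Uses the common approximations:
#   H    = f^2 / (N * c) + f           (hyperfocal distance)
#   near = s * (H - f) / (H + s - 2f)
#   far  = s * (H - f) / (H - s)       (infinite when s >= H)

def dof_limits(focal_mm, f_number, subject_mm, coc_mm):
    """Return (near, far) limits of acceptable sharpness in millimetres."""
    h = focal_mm ** 2 / (f_number * coc_mm) + focal_mm
    near = subject_mm * (h - focal_mm) / (h + subject_mm - 2 * focal_mm)
    far = float("inf") if subject_mm >= h else subject_mm * (h - focal_mm) / (h - subject_mm)
    return near, far

subject = 2000.0  # portrait subject at 2 m

# Assumed values: ~4.3 mm lens and 0.005 mm circle of confusion for a phone
# sensor; 50 mm lens and 0.03 mm circle of confusion for full frame.
for label, focal, coc in [("phone, f/2", 4.3, 0.005), ("full-frame DSLR, f/2", 50.0, 0.03)]:
    near, far = dof_limits(focal, 2.0, subject, coc)
    far_str = "infinity" if far == float("inf") else f"{far / 1000:.2f} m"
    print(f"{label}: acceptably sharp from {near / 1000:.2f} m to {far_str}")

With those assumed numbers, the phone at f/2 holds everything from about a meter to infinity in acceptable focus, while the DSLR at f/2 holds only about 20 cm around the subject. That 20 cm is the "professional photo" look.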

Comment Working with state agencies in the '90s (Score 1) 189

I saw a lot of EISA systems. It was a reasonable performer and physically robust (not as sensitive as PCI cards to positioning in slots, etc.). I'd say that EISA hardware was generally of very good quality, but high-end enough that most consumers wouldn't run into it despite being a commodity standard, sort of like PCI-X.

The systems I had experience with were running Linux, even then. :-)

Comment Seconded. (Score 1) 93

For a very long time, tape drives and media gave tape backup a bad name.

Consumer QIC — about 1% of tapes actually held any data, total snake oil that took 10 days to "store" 10 megs (immediately unreadable in all cases)
4mm — Tapes good for one pass thru drive; drive good for about 10 tape passes
8mm — Tapes good for maybe 10 passes thru drive; drive good for about 100 tape passes before it starts eating tapes

For all three of the above: Don't bother trying to read a tape on any drive other than the one that wrote it; you won't find data there.

Real QIC — Somewhat more reliable but vulnerable to dust, magnetic fields; drive mechanisms not robust, finicky about door closings

Basically, the only tapes that have ever been any damned good are 1/2 inch or wider and single-reel for storage. Problem is that none of these have ever been particularly affordable at contemporary capacities and they still aren't. Any non-enterprise business should just buy multiple hard drives for their rotating backups and replace the lot of them once a year.
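If you do go the rotating-hard-drive route, the rotation itself can be dead simple. A minimal sketch, assuming three mounted drives and rsync as the copy tool; the mount points and source path are illustrative assumptions, not a recommendation of any particular setup:

# Pick the backup drive for this week by ISO week number, then sync to it.
# Mount points, source path, and the rsync invocation are assumptions.
import datetime
import subprocess

DRIVES = ["/mnt/backup-a", "/mnt/backup-b", "/mnt/backup-c"]

week = datetime.date.today().isocalendar()[1]
target = DRIVES[week % len(DRIVES)]

subprocess.run(["rsync", "-a", "--delete", "/home/", target + "/"], check=True)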

Comment Experts are busy. (Score 2) 84

And they ALREADY have expertise.

A computing expert already has decades of highly detailed experience and familiarity with a bunch of paradigms, uses, and conventions.

Experts are the LAST people that want to read manuals for basic things they already have extensive experience with, like desktop environments. Again, they're busy. Being experts.

So, reading the manual on new tech that needs to be implemented in a complex system—great. Reading the manual on a desktop environment? Seriously? That's the last thing an expert wants to be bothered with. "I've used ten different desktop environments over thirty years. Can't you pick one set of conventions I'm already familiar with and use it, so that I can apply my expertise to the actual problems I'm trying to solve? Why reinvent the wheel in such a simple, basic system?"

DEs should leverage existing knowledge and use habits to enable experts to get their real work done quickly. For an expert, using the desktop is NOT the problem at hand requiring a solution. It's not what they're being paid for and not what they care about. Experts love to learn new things—in their area of expertise.

So sure, desktop environment developers probably love to poke around in KDE's front end, code, and docs. But anyone else? People that are not DE specialists are not so excited about the new learning project that is "my desktop," I assure you. The desktop is the last thing they want to be consciously focusing on over the course of the day.

Comment In the very first image... (Score 4, Interesting) 84

The tree widgets on the left are mismatched: some use solid lines, others spaces with alphanumeric characters; the characters are black while the lines are gray, visual noise that creates extra visual processing and cognitive load for no reason and adds nothing.

The parenthetical text at the top has a title whose margin (left whitespace to other widgets) is significantly different from the text below it; there are spaces between the parentheses and the text, which no text or print style guide in the world endorses because it separates the parenthetical indicators from the parenthetical text, when they should be tightly bound for clarity.

The window title preserves the absurd convention of using both the binary name and a descriptive title together, and separates them with a typographical element (an em-dash) which is inappropriate in a label or design element because it is asynchronous—it indicates a delay in interpretation and pronunciation (as the em-dash just a few words ago in this paragraph does) and thus suggests long-form reading, which is not the intent for at-a-glance window titles (unless you don't want them to be very usable).

The title of the list widget, "Information Modules," is superfluous and redundant; a user opening an "About" dialog expects to see "information" from the start, and they do not need to know about implementation details ("modules").

The resize handle contrasts significantly with the window background, drawing undue attention to this particular area of the window above others (why is it "louder" than the window title, for example? Window controls should be secondary to window content and all at the same visual "volume" for usability).

In short—they still don't get it; they are signaling, in conventional ways that most users process subconsciously, thought habits and forms of attention that are not contributing to efficiency and use, but rather detracting/distracting from it. This is the same old KDE with poor, unprofessional design that leads to cognitive clutter. It's not that KDE has "too much going on" but rather that KDE has "too much going on that isn't actually functional and adds nothing to users' ability to get things done."

Yuck.

Comment Nope, their work isn't shit. (Score 1) 153

But they can earn 3x as much by going into the non-academic private sector and doing their research for profit-driven corporations that will patent and secret the hell out of it, rather than using it for the good of all. Because the general public doesn't want to own the essential everyday technologies of the future; they'd rather have it kept inside high corporate walls and pay through the nose to wealthy billionaires for it.

And because bright young researchers actually have to eat, and actually want a life, they grudgingly go where the money is, knowing full well they're contributing to deep social problems to come. Myself included.

But why would I settle for a string of one-year postdoc contracts that pay like entry-level jobs and require superhuman hours and commitment when I can go earn six figures at a proper nine-to-five, with revenue sharing, great benefits, and job security? Yes, the company owns everything I do. But I get to pay my bills and build a personal future. Of course, society's future is much dimmer as the result of so many people making the same choice that I have, and so much good work ending up in private hands rather than public ones.

But them's the beans. If you want to own the future, public, you've got to be willing to pay for it.

Comment I think this is pretty much it. (Score 3, Insightful) 598

In terms of revenue, Apple is following the money. iOS has made Apple the wealthy powerhouse that it is today, not OS X. They don't want to lose the installed base or be perceived as just a phone company; OS X gets them mindshare and stickiness in certain quarters that matter (e.g., education and youth) for future iOS revenue.

But they don't actually want to invest much in it; it's increasingly a necessary evil, pure overhead, so it makes sense for them to shift to being an iOS-led company. In the phone space, where the consumer upgrade cycle is tied to carrier contracts, it's important to have "new and shiny" every single year; consumers standing in AT&T shops are fickle people that are easily swayed by displays and sales drones that may or may not know anything about anything.

So the marketing rationale at Apple is (1) follow the revenue, which is mobile and iOS, (2) do what is necessary to stay dominant there, which means annual release cycles at least, and (3) reduce the cost of the other business wings it still needs as much as possible, so as to focus on core revenue competencies without creating risk, which means making OS X follow iOS.

It makes perfect business sense in the short and medium terms. In the long term, it's hard to see what effect it will have. It's entirely possible that they could wind down the OS X business entirely and remain dominant and very profitable as a result of their other product lines. It's also possible that poor OS X experiences and the loss of the "high end" could create a perception problem that affects one of their key value propositions, that of being "high end," and that will ultimately also influence their mobile sales down the road in negative ways as a result.

I'm a Linux switcher (just over five years ago now) who was tremendously frustrated with desktop Linux (and still dubious about its prospects) after using Linux from 1993-2009, but who has also, in the last couple of months, considered switching back. I switched to OS X largely for the quality of the high-end applications and for the more tightly integrated user experience. Now the applications business is struggling (the FCP problem, the Aperture events, the joke that is the iOS-synchronized iWork suite) and third-party applications have declined in quality (see: MS Office on OS X these days) as other developers have ceded the central applications ground to Apple. Meanwhile, the user experience on iOS remains sound, but on OS X it has become rather less so as a result of the iOS-centricity of the company.

What to do? I've considered a switch back to Linux, but the Linux distros I've tried out in virtual machines have been underwhelming to me; the Linux desktop continues, so far as I can tell, to be in a worse state for my purposes than it was in 2008. I have no interest in Windows (I have Win7 and Win8 installations in VMs for specific applications, and even in a VM window they make me cringe; just complete usability nightmares).

It's a frustrating time for desktop users in general, I think; the consumer computing world has shifted to mobile/embedded devices and taken most of the labor, attention, and R&D with it. The desktop, needed by those of us that do productive computing work, has been left to languish on all fronts. It's completely rational in many ways at the macroeconomic level, but at the microeconomic level of individual workers and economic sectors, it's been a disaster.

Comment Um, they just want to use Netflix. It adds value (Score 3, Insightful) 121

to the media by making it easy to browse through, search, access, and stream.

And they're paying regular price.

We live in a very strange world when "piracy" has gone from "armed crews of criminal specialists seizing tonnage shipments of goods on the high seas with cannon and sword" to "a regular schmo paying the regular price to use a regular product in the regular way in his regular living room."

Hard to believe that the word still retains any of its negative connotation at all.

"Piracy" these days sounds an awful lot like "tuesday afternoon nothing-in-particular with tea."

Comment No, this is dumb. It should be shorter. (Score 1) 161

Very little useful learning goes on in school. And the top students need time outside of school to visit libraries, pursue intellectual hobbies, do independent reading, and generally do all the academic stuff that will actually matter in their lives later on (and matter to society later on).

By continually extending the school day and the school year, we increasingly ensure that we lock our best and brightest into mediocrity by tying up all of their time in institutionally managed busywork designed to ensure they don't deviate from the mean, which is pretty piss-poor.
