Comment Branding matters, both for consumers and for (Score 5, Insightful) 293

project management.

The product is called "Windows." Windows are static things. They are embedded into walls. They provide an unmoving portal into another space.

A monitor on your desktop behaves like a window in some sense. It is always in the same place. You sit and you look at it.

Windows Phone and Windows RT just don't make sense as names for mobile devices; they breed complacency in the project vision and give consumers shopping for mobile devices the wrong (unpalatable) idea.

MS should call the mobile product something mobile:

MS Pathways
MS Journeys
MS Passages
MS Ways
MS Compass
MS Latitude

Then they should focus relentlessly on small-screen, long-battery, mobile UX for the mobile system; design toward the lightweight, mobile ethos of the new name; and market it relentlessly, not as "the same as Windows" but as something deliberately different from it.

MS Windows in your office
MS Compass for going places
"Because you're not always sitting still."
"Busy people do more than sit by Windows."

I'm not saying that the marketing is the product; we all know that's ridiculous and leads straight to product failure (mismatched expectations vs. reality). I'm saying that if MS were as marketing-led as it ought to be, it would do the field research to learn what mobile users need (field research it clearly hasn't done well) and target both the product and the marketing campaign to those needs.

Who needs Windows in their pocket on the street? Nobody. Windows belong inside walls.

Same thing goes for the hardware product. "Surface?" Sounds static and architectural. The opposite of mobility. You can see that they themselves imagined the product this way based on what was shipped out the door. Come up with something lightweight and mobile.

The Microsoft Dispatch.
The Microsoft Portfolio.
The Microsoft Movement (tablet) and Microsoft Velocity (phone).

These are not great ideas yet, but they're light years ahead of "Windows" and "Surface" for a mobile device that ends up acting just like a "Window" or a "Surface."

Submission + - Console gaming is dead? How about $14,600 for a launch-day PS4 (terapeak.com)

aussersterne writes: Seven years after the release of the PS3, Sony released the PS4 on Friday to North American audiences. Research group Terapeak, which has direct access to eBay data, finds that despite claims to the contrary, console gaming still opens wallets. Millions of dollars' worth of PS4 consoles were sold on eBay before the launch even occurred, with prices for single consoles reaching as high as $14,600 on Friday. Would you be willing to pay this much to get a PS4 before Christmas? Or are you more likely to have been the seller, using your pre-orders to turn a tidy profit?

Comment Does it work without nursing bluetooth (Score 1) 365

connectivity along? This was my big gripe with the Sony MN2 (see my other comment in this story). I wanted it to do some basic things: notably, to give me a buzz about events (messages, calendaring, calls). It failed miserably at this task, because keeping it charged and connected to the phone all the time in the daily flow of life turned out not to be possible without making "Sony MN2 management" a new part-time job for myself. A distant second reason for failure (but still deserves mention) is that the touchscreen was so worthless that when it did manage to buzz me, I spent a comical ten minutes tap-tap-tapping on my screen just trying to get my taps to register well enough to see what the buzzing was all about.

It was much faster and less labor-intensive in the end to continue doing what I'd been doing, and what so many others do: fish the phone out of my pocket every ten minutes to see if anything was going down.

I thought about Pebble, but the Sony product made me gun-shy about smartwatches for the general consumer market at this point (though I'd still give an Apple product a look—but without much hope that it would work for me, since I use an Android phone now).

Comment Had a Sony MN2 briefly; problem was VERY familiar. (Score 1) 365

I got a smart watch (Sony MN2) last year because I kept missing the vibrate on my phone for meetings and calls, because my phone isn't always in my pocket. I thought that if I had a device on my wrist, I'd always get the buzz and never miss anything important.

SIMPLE task for the device, no? But it failed miserably.

Reason: Same as Windows CE back in the day. The device wasn't up to the job, because it was busy trying (miserably) to do a hundred other things that it simply wasn't suited for AT ALL.

- There were multiple "apps" on the watch, including for things like Twitter and Facebook
- But the screen was by nature so tiny and the device so limited that these were laughable rather than usable
- Rather than focusing like a laser on doing tiny-device things well, this led to compromises:
- Unusable touchscreen (inaccurate, insensitive)
- Useless battery life (lucky to make a day, often less)
- Worst of all, the device had to be tethered to be useful; lose tether, and it is effectively a bracelet

Compare to Windows CE:

- There were multiple applications on the devices, copying most MS desktop applications of the day
- But the device was by nature so tiny and so limited that these were laughable rather than usable
- Rather than focusing like a laser on doing mobile-device things well, this led to compromises:
- Crappy display, crappy resistive touchscreens, inexact and unpredictable input methods
- Useless battery life (lucky to make a few hours, often less)
- Worst of all, CE devices had to be synced to be useful; fail to sync several times a day and they were a data prison or data corrupter, rather than a data aid

The experience with the Sony MN2 was much the same as what I remember from CE: constantly nursing the device along, excessive time spent trying to "make it work" for the simplest tasks, paying WAY TOO MUCH ATTENTION all the time to connectivity (your body and its attention are pressed into service as the mechanical tool that keeps the data flowing) to ensure that it was regular and sound, no intention of even trying any of the laughable features, and continuous frustration (oh god, the battery went dead / I lost bluetooth sync / something went wrong and I can't tell what on this tiny-screen device with no error reporting, I didn't get buzzed about that meeting/call, WTF IS THE POINT OF THIS SHITTY DEVICE AND ALL THE TIME I SPEND NURSING IT ALONG ANYWAY?)

Wrongheaded.

I presume that if Apple decides to build one of these, they will have better success, given their reasonable HCI and design and decision-making chops.

Comment Re:She will have to find out more than this. (Score 2) 189

I actually don't know. I have the luxury of having institutional access to a full range of print and electronic subscriptions. But even if they do, think about what you're asking a busy professional to do.

People are suggesting that she should just pony up $thousands annually, that she should dedicate days to travel and research (time taken from patients or family) when there's no real technical reason to do so, and now, with ILL, that she should stay with a research project about a case or two for the many weeks it takes to make ILL work.

Sure, there's ILL, and it may well work as it used to (though I doubt it for electronic resources, based on the ways that licenses right now are written). But we're asking her to stick to a project for $thousands and $weeks of constant attention. She's a professional. She is busy. And she ought to have access. The point is not to ask, "can it, plausibly, be done?" but rather "what is science for, and is this the way that it ought to work?"

We made society, as human beings. We can make it better. I'd suggest that this is a case in which it can be made to function much, much better than it currently does. The goal behind having therapists of all stripes is to help people overcome real problems, not to test the therapists to see whether they can navigate arcane social structures and processes. We should make their jobs as easy as possible. Hell, this applies to virtually every job title. Jobs exist for a reason: because there is demand for what they do, because we value it. Why not, then, make the jobs of professionals as plausible and as easy as possible, rather than risk their doing a much worse job simply so that a few corporations that produce little of value (the value in academic publishing is produced by the academics and the researchers, not by the publishers, in this era of easy print-on-demand and easy online access) can earn a decent chunk of change?

Comment She will have to find out more than this. (Score 1) 189

She will have to find out:

1) Which libraries have _print_ as opposed to _electronic only_ subscriptions, and
2) Amongst those that do not (I'm guessing the majority), which allow access to electronic resources by non-students/non-faculty (this kind of access is expressly forbidden, at any cost, by many subscription packages offered to universities).

Even if she is able to identify a library that offers non-affiliated individuals access, she will have to pony up whatever the library charges the public for access, and then, at that stage, she will have access to _one_ journal. It is unlikely that everything she needs is in that _one_ journal; far more likely, relevant material is spread across several or even several dozen journals, in which case all she has to do is grill library personnel for 20-30 minutes with a detailed list in each phone call, then pony up the access fees (and the transportation, and the Saturday mornings) to jump from one library to another on a wild goose chase over many weeks, piecing together the materials that an academic can assemble over a cup of coffee without leaving their screen. Just who, pray, are the academics producing their research _for_? Surely those who might actually be able to use it practically?

All of this stuff can technically be accessed from her office, too, in the space of 10 minutes, but for the profit-oriented restrictions (that do not reflect costs, see my previous post) imposed by journal "publishers."

Comment Two further things— (Score 1) 189

"irritating," not "irritable," my apologies for the misuse of the word (it's late where I am); and I should note that the department had to change the name of the journal and all of its graphics as they brought it entirely in-house and severed the Springer relationship, since Springer held the rights to everything, including all past issues, meaning that the new journal is just that—a clean slate, post-Springer (and good riddance).

Comment Having worked for a Springer journal, (Score 5, Informative) 189

as a managing editor, I can tell you that they do not incur substantial expenses, and that academics provide the important parts of the service essentially for free, in the cases of most journals. It's not like putting out a magazine; we didn't even have copy or layout editors for our journal, the least expensive components of editorial labor. Springer paid the university department that hosted the journal a mere thousands (single digits) per year. There were two "paid" staffers, myself and one other person. The rest of the "editorial board" consisted of faculty from our university and several others doing the work for free, under the auspices of the "professional duties" of the academics involved (paid not by Springer but by their respective institutions). Peer reviewers: free. Editorial labor (copy, layout to production files according to specs, the submissions queue, even rough line editing and style work): graduate students looking for a title to add to their emerging CVs.

Essentially, Springer's total cost for putting out the journal amounted to the several thousand (again, single-digit thousands, split between myself and one other individual) that they (usually belatedly) paid our department annually for the entire journal in its substance, plus printing and distribution (a pittance given the circulation size of academic journals and the cost per print subscription, not to mention the increasing number of electronic-only subscriptions). They had one liaison handling our entire "account," and the level of labor involved allowed this one person to be "over" several _dozen_ journals. That's as much of a labor footprint, in its entirety, as our journal actually had inside the "publisher."

And for this, they held onto the reprint/reuse rights with an iron fist, requiring even authors and PIs to pay $$$ to post significant excerpts on their own blogs.

Seeing the direction the wind has been blowing over the last half-decade, the department decided (rightfully so) that it's basically a scam, that academic publishing as we know it need not exist any longer, and wound down both the print journal and the relationship with Springer several years ago, instead self-publishing the journal (which is easy these days) at much higher revenue to the department, with the ability to sensibly manage rights in the interest of academic production and values rather than in the interest of Springer's oinking at the trough on the backs of academics.

Oh, and many university libraries (particularly in urban areas) do not admit just anyone off the street; you must generally hold an ID that grants access to the library (often student or faculty, plus a paid option for the general public, either monthly or annually, that can vary from somewhat affordable to somewhat expensive). Not to mention that for many people, yes, it is a significant professional hardship to lose a day or two of work to be trekking into foreign territory and sitting amongst the stacks—and that this hardship is made much more irritable by the fact that the very same articles are sitting there online, in 2013, yet can't be accessed at reasonable cost.

As an academic, I have the same frustration. We bemoan the state of science in this society, yet under the existing publishing model we essentially ensure that only a rarefied few scientists and the very wealthy elite have access to science at all. $30-$60 is not a small amount for the average person, and that is the cost to read _one_ article: usually very narrowly focused, of unclear utility until the money has already been paid, and borderline unreadable for the layperson (or for the magazine author hoping to make sense of science _for_ the layperson) anyway. Why, exactly, would we expect anyone beyond university walls to know any science at all under this arrangement?

Comment Make a good enough game (Score 2) 272

and even DRM is merely an obstacle to be overcome to get to the Game . That . You . Must . Play . Now .

The problem is that the games suck. Right now in the 'AAA' space we have an orientation something like:

85% production values
5% compelling and entertaining story and writing
10% gameplay
0% replay value

Show me a game like this, and I'll spend rather a lot, and even suffer DRM for it:

10% production values
20% compelling and entertaining story and writing
50% gameplay
20% replay value

When the technology didn't allow for production values to matter, everything was tied up in gameplay, writing, and replayability. Games had to be entertaining to sell.

Now, particularly given the ways that games are marketed (and the synergy between this kind of marketing and the marketing that happens on the hardware side), everything is about jaw-dropping renderings. It feels like the late '80s and early '90s, when everyone in CS departments was printing out raytraced scenes at 24x36 and hanging them on the wall.

At first, it was "omigod thassocool" to see a bunch of floating cones and spheres and rendered bolts with clearly articulated threads reflecting the image of the chessboard on the other side of the picture. But by the mid-'90s, it was like, "humf, what else you got, I am no longer amazed by the fabulousness of this technology."

That's how I feel about games now. A decade or a decade and a half ago, game engines and triangle count and an asymptotic approach to "photorealistic realtime" rendering were enough to make a person shell out $$$ just to "have the experience."

But now it's old hat. Someone else posted in this story about games being all about showing you sliding your car sideways into a flock of sheep. That pretty much sums it up; how many hours do they spend on tableaux like this? It's plots of shiny raytraced scenes on department walls all over again. I had occasion to play a few games (Silpheed, a few Sonics, etc.) on someone's Sega CD setup not long ago. I was like, "Shit, this is fun!" And then, shortly after, I realized why I had abandoned gaming in the early 2000s: I just preferred to spend my money on more entertaining things.

I find crossword puzzles to be as fun as many of the 'AAA' titles of the last half-decade.

Comment Or rather, I should say— (Score 1) 204

I use markdown + Daedalus for long-form (i.e. dozens of pages) content that will go to print.

I use markdown + Mou for short-form content that will go online.

If you write for online distribution at all and you don't know about Mou, you should definitely check it out. The same goes for those who write books but don't know about Daedalus.

Comment ABSOLUTELY (Score 2) 204

As someone that writes a great deal both for online and offline distribution, I use markdown *extensively*.

It's fabulous for the grunt work of formatting: headers, italics, links. The rest can be done by tossing in HTML, XML, or whatever other markup code is needed. It's fabulously lightweight and fast and unobtrusive.
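None of this is from the original post, but the "grunt work" subset described here (headers, italics, links, with raw HTML passing through untouched) is small enough to sketch in a few lines of Python. The function names are mine, and a real Markdown parser handles far more (nesting, escaping, paragraphs, lists); this is just an illustration of why the format stays out of your way:

```python
import re

def inline(s):
    """Handle the inline 'grunt work': links and italics.
    Anything already written as HTML passes through untouched."""
    # Links: [text](url) -> <a href="url">text</a>
    s = re.sub(r'\[([^\]]+)\]\(([^)]+)\)', r'<a href="\2">\1</a>', s)
    # Italics: *text* -> <em>text</em>
    s = re.sub(r'\*([^*]+)\*', r'<em>\1</em>', s)
    return s

def md_to_html(text):
    """Convert a tiny Markdown subset (ATX headers, italics, links) to HTML."""
    out = []
    for line in text.splitlines():
        m = re.match(r'(#{1,6})\s+(.*)', line)  # one to six leading #'s
        if m:
            level = len(m.group(1))
            out.append('<h%d>%s</h%d>' % (level, inline(m.group(2)), level))
        else:
            out.append(inline(line))
    return '\n'.join(out)
```

So `## See *also* [Mou](http://mouapp.com)` becomes an `<h2>` with an `<em>` and an `<a>` inside it, while an embedded `<table>` block is left exactly as written, which is the "toss in HTML for the rest" workflow.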

In fact, for all those "I wrote my dissertation in LaTeX, *sniff* *sniff*" people here on Slashdot, how about this:

I wrote my dissertation on an iPad 2, in Daedalus, in markdown, embedding HTML or other kinds of markup as necessary, then formatting it all in a final pass through a couple different parser/formatters. Sometimes the right tool for the job is the one that you have to think about the least—the one that stays out of your way—and for me, that's markdown+Daedalus.

(Yes, I'm prepared for the onslaught of accusers, ridiculers, and doubters here—prepared to ignore them.)

Comment I see a weird parallel in academia (Score 4, Interesting) 356

with beginning grad students. In papers, they often feel like they have to cite every . last . factual . assertion . and . word . that . they . use, to the point of having unreadable paragraphs with 20 citations in them. But they're so terrified of "plagiarism," and have heard that lecture so many times at the beginning of so many classes, that it's hard to talk them out of citing Pythagoras (or some writing about him) when using the Pythagorean theorem, Perskyi when using the word "television," and so on. Exhausting.

As an analog to this, they often hesitate to say anything new (i.e. anything they can't find a citation for). It's as though they feel that only institutions and the famous have license to make new things in the world, and then be cited. It recalls for me the similar divide between creators and consumers, with a hard territorial border between the two camps, that the RIAA/MPAA/BSA et al. have tried to inculcate into the cultural consciousness.

Comment Reminiscent of the Sony (Score 1) 244

I bought a Sony smartwatch just to have the experience and on the off chance that it would be fabulous.

It was anything but.

You're absolutely right: the "smart watch" is a dead end.

Smart — fine.
Wearable — excellent.
Watch — stupid.

Packing a third of a smartphone (it can't do most of what a smartphone can do, at least not directly and independently) into a device with a crazy-small display size and a battery that might last you a day before needing to be charged (and remember, when you think about charging, that a "watch" is something strapped to your wrist that you rarely want to think about in logistical terms in your everyday life) just plain doesn't make anything about life better.

Wearable tech sounds great, but it'll be something other than "bluetooth device running your phone's OS that you need to charge all the time and that does less than your phone."

Comment Same mistake going all the way back to WinCE (Score 1) 246

Remember in the '90s when Windows CE "laptops" that were basically just the equivalent of a Palm Pilot with a giant screen were selling for the same price as laptops? Yes, you could pay $1,000 for a machine with:

"Pocket" Outlook
"Pocket" Office
"Pocket" IE
8 or 16MB of built-in storage, with few, if any, additional storage options
A tiny-ass MIPS CPU
A 640x480 (if you were lucky) passive matrix screen that was barely legible

Of course, none of the applications actually worked—

"Pocket" Outlook couldn't connect to 90% of infrastructure servers
"Pocket" Office was basically just "plain text with bold and italic" and couldn't open any Office documents with even 10% fidelity
"Pocket" IE required that you drop another $200 on a WiFi card and then wouldn't render any modern (post-HTML 3.0) web pages anyway

And all were slow as sin, and there was precious little in the way of additional applications of any kind, because the machines were so underpowered vs. laptops running full Windows, developer support sucked, and there was no market for apps anyway. Not to mention that if you did happen to sell an app to a consumer, they had to go through the convoluted process of installing the CD on their PC first, then plugging their CE device into the PC's serial port and doing an "ActiveSync" (always unstable, always failing in non-transparent ways).

And yet Microsoft kept the "Windows" brand on the CE devices and intentionally marketed them as roughly equivalent to a laptop only lighter and with much longer battery life, OMG!

I had more than one acquaintance come to me asking for help with their new "laptop," only for me to find that the problem was that their new "laptop" was a CE device they'd been duped into buying, and while it was incredibly light (for its time) and had massive battery life (for its time), it couldn't, and wasn't meant to, actually run all of the software they had on CD on their office shelf, as they'd imagined.

A decade and a half later, Microsoft is still playing the same game, and has been all this time, never with any mainstream success.

I've heard people say, "well, gosh, these are corporate devices for vertically integrated workflows, that's the market," but that never explained (nor does it today) how they then end up at big box stores being sold to consumers, and in mass market Microsoft advertising.

Me, I think Microsoft's just kinda dumb.
