
Comment Answers: (Score 1) 93

It has a fully autonomous assistant that adjusts it twice a year for daylight saving time.

It has a GMT hand that can be set to alternate time zones.

It has a little window that shows the current date.

When it is rainy, it shows droplets of water on the crystal. When it is snowy, it shows snow on the crystal. When it's cold outside, it gets cold on your wrist.

Thankfully, it never lets me know when I "get a message" and refuses to help other people interrupt my life.

Comment This. And there's more out there than Timex. (Score 1) 93

I have a bunch of mechanical watches. They last and last and last. I just finished servicing a '70s-vintage Slava 2427. Two mainsprings, built rough and coarse, like a tank. Keeps on ticking, and keeps reasonably good time.

Good, high-quality mechanicals from the likes of Seiko and Orient are readily available on Amazon, eBay, or a bunch of other places for $100-$150. They don't need a battery, will run for 20 years without any attention, and when they do need attention, oils and parts to service them are out there by the millions. You can leave them to your kids.

And for the hacking crowd, you can also build your own—cases, dials, hands, and movements are plentiful. If you want to go the new, Japanese route, Miyota (Citizen) automatic mechanical movements are running about $35 right now from Hong Kong. Cases with sapphire crystals run something like $40-60. A hefty solid stainless steel bracelet will only set you back $10 or so these days, and the same goes for a set of hands. You build your own gaming PC to look just the way you want it? Why not build your own mechanical watch?

Selecting your own dial, hands, case, bracelet, and movement and assembling them yourself is a hell of a lot more personal than choosing a face from a watch app store... that goes dark every day without an overnight charge.

Just IMO.

Comment It's the price. (Score 1) 181

I just picked up a Huawei Mate 9 that was being sold overstock for $249 unlocked. Great 5.9" screen, blazing-fast 8 cores and 4GB of RAM, zero lag, three days on a charge, a dual-lens rear camera setup with very decent raw files, dual-SIM, an SD card slot, built like a tank, Gorilla Glass 3, and remarkably thin and light at the same time.

There just isn't enough difference between "last year's overstock" and "the latest and greatest" any longer to justify paying 4x as much. When the first iPhone launched, everyone laughed at $499 for a phone, but it was the only game in town doing what it was doing. Now Apple is trying to get people to pay twice that much for phones that are at best 5-15% better than competitors priced at a fraction of the cost.

The actual apps are all the same for the most part, and for most use cases. It's all a bunch of third-party SaaS and e-commerce. The only differences are imperceptible screen differences (That one has more PPI than the human eye can detect and this other one has twice as many pixels again! Wooo?) and imperceptible processor differences (That one starts Chrome in 0.8 seconds and this other one in 0.7 seconds! Wooo?).

They're going to have to price more in line with the market or come out with something that is "revolutionary" once again (actually is, not just claims to be) if they're going to drive sales at the levels they want.

Comment Yes. Try Nuance products (e.g. Dragon, etc.) (Score 2) 93

I have a friend who is a quadriplegic and gets around in an electric wheelchair. He is also a software engineer and very active on Facebook. Last time I knew the details of his setup, he was using Dragon, I believe. As I understand it, it's fully customizable, i.e. you get to tie particular voice commands that you choose to particular actions, widgets, keystrokes, etc.
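To give a rough flavor of the idea (this is not Nuance's actual API, just a minimal Python sketch assuming the third-party speech_recognition and pyautogui packages), a voice-command-to-keystroke mapping might look like this:

```python
# Toy sketch of tying spoken phrases to keystrokes.
# Not Dragon's implementation; illustrative only.
import speech_recognition as sr
import pyautogui

# User-chosen phrases mapped to the actions they should trigger.
COMMANDS = {
    "save file": lambda: pyautogui.hotkey("ctrl", "s"),
    "next window": lambda: pyautogui.hotkey("alt", "tab"),
    "new line": lambda: pyautogui.press("enter"),
}

recognizer = sr.Recognizer()
with sr.Microphone() as mic:
    recognizer.adjust_for_ambient_noise(mic)
    while True:
        audio = recognizer.listen(mic)
        try:
            phrase = recognizer.recognize_google(audio).lower()
        except sr.UnknownValueError:
            continue  # unintelligible; keep listening
        action = COMMANDS.get(phrase)
        if action:
            action()
```

A real product like Dragon layers dictation, per-application grammars, and scripting on top of this, but the core dispatch loop is conceptually that simple.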

It took him a year or three to get it all customized to his liking, but at this point he basically rolls around and uses the laptop attached to the deck on his chair in front of him nonstop. He's got a bunch of IoT/smart home stuff set up at home and in his office as well; he provided directions and his wife set it all up under his supervision.

The result is that he basically has a workable voice interface to the Internet, his IDE, Windows, and most of his immediate physical surroundings, so he lives a fairly normal life, apart from bodily functions and eating, which he obviously needs help with. But most everything else, from rolling around/chair control to lights, blinds, doors, windows, and locks to television, computer, and work, he does by himself, without any movement in his limbs, using voice.

All off-the-shelf stuff as far as I know. They're very middle class, with bog-standard insurance: no huge budget, just a lot of his expertise and his wife's hands to set it all up over the years.

Comment Not just that. They're incredibly loaded symbols. (Score 2) 233

Literally networks of power, whose center is somewhere else.
Modernity and (in their proliferation and sagging) the collapse/end/postscript/decay of modernity.
Consumerism, consumer technology, technological encroachment.
Utopian ideals (energy, artificial light) and their mundane failures to transform human life for the better.
The loss (as you point out) of the rural in the face of the urban.
Environmental destruction.
Utilitarianism and rational-instrumentalism at the expense of beauty.
Clutter and the "wreckage of history" that Walter Benjamin famously described.
Setting out to aspire to highs, inevitably sagging back down to lows.
The ravages of time.
Technological debt.
etc.

Comment The problem is the missing noun. (Score 1) 336

"Release new products early, then release updates to them often" is great for users.

"Release new products early and release their complete replacements often" is bad for users.

The Linux community does the second, rather than the first.

Instead of this for a single tool, library, application, or environment:

10.0 this month
10.1 next month
10.2 the month after that
10.3 the start of next year

Linux does this for a single tool, library, application, or environment:

10.0 this month
2017.3.51 next month
Project Congo Free the month after that
Project Riga 29.6 Base the start of next year

The releases are often enough... but sadly each release often has so little to do with the last that users, packagers, and integrators are starting over each time.
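As a minimal sketch of why this matters downstream (the helper names here are hypothetical, not any distro's actual tooling), consider what a packager's automated update check can do with predictable version strings versus codenames:

```python
# Minimal sketch: predictable version strings let packagers automate;
# arbitrary codenames force a human to start over each release.
def parse_semver(version: str) -> tuple[int, int, int]:
    major, minor, patch = (int(part) for part in version.split("."))
    return major, minor, patch

def is_drop_in_upgrade(installed: str, candidate: str) -> bool:
    """True if the candidate keeps the same major version (no breaking changes)."""
    try:
        old, new = parse_semver(installed), parse_semver(candidate)
    except ValueError:
        return False  # "Project Riga 29.6 Base" can't be reasoned about
    return new[0] == old[0] and new >= old

print(is_drop_in_upgrade("10.0.0", "10.1.0"))         # True: safe minor bump
print(is_drop_in_upgrade("10.0.0", "2017.3.51"))      # False: scheme changed
print(is_drop_in_upgrade("10.0.0", "Project Congo"))  # False: unparseable
```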

Comment Lack of integration. (Score 1) 158

They are different companies; that's the basic problem. Google over here with the OS, the phone maker over there with the device.

Release cadences and roadmaps are subject to all kinds of practical constraints and pressures, from labor and labor turnover to revenue/financials to other partnerships to strategic mission and vision.

Even when the two organizations know something of each other's timelines, that doesn't mean it's practical to synchronize them without significant work and significant negatives. I don't think you can avoid this problem so long as the hardware is made over here and the OS is made over there by two completely separate organizations, often ones with two completely separate parent umbrellas with their own roadmaps and priorities.

Comment Because the market doesn't value security. (Score 2) 335

The pressure to release early and often is extreme. "MVP" rules the day, and no one in most senior roles has the granular perspective necessary to be aware of "security" as a concept. Is it checked into the testing repo and does it run? PUSH IT OUT. We'll fix any bugs as we "iterate."

Oh wait, we won't actually iterate. Because existing features don't get us as much as releases of new ones. We'll just keep pushing out new ones as fast as we can.

Security? Hell, often even basic functionality doesn't work. Release it broken, then declare that part of the code deprecated in favor of new versions with new features in six months. Even if security is flawed, that's okay, it was only out for a few months "that way." Anyone still using it should have upgraded. If they don't, it's their fault.

There are flaws in the new version/new features as well? Well they've only been out for eight weeks. It was an MVP. We're agile. We'll fix any bugs as we "iterate..."

etc.

Comment If Microsoft recreates the Newton in modern form (Score 1) 87

and succeeds before anyone else, I am going to be sick.

I am still waiting on a replacement for my Newton 2100 with a lighter weight and SIM slot. Hell, I don't even need color or an OS update. Just give me Newton OS running on a Kindle with a digitizer.

But I seriously hope I don't have to buy it from MS.

Comment Decades later the Newton 2100 is still king. (Score 1) 223

Yes, the early devices were too slow and underdeveloped. The 2000 solved these problems but then didn't have quite enough memory. But the 2100 was transcendentally spectacular. It was a quantum leap forward in personal computing. From the mid '90s through the mid '00s a Newton 2100 unit was my constant companion. I'd still be using them today in place of this naff smartphone/tablet stuff if Newton devices were able to meaningfully synchronize with anything any longer. As I wound down my Newton use in the mid-'00s, mine were outfitted with WiFi, MP3 playing, GPS, etc. Amazing for a device from essentially before the consumer internet era.

A decade-plus ago, when the iPhone and iPad were just rumors, I crossed my fingers each time, hoping Apple was about to resurrect the Newton OS and the Newton 2100, only now thinner and with color.

I still have two Newton 2100 devices here in my office. There is nothing out there that compares to them in terms of personal productivity and UX. The handwriting recognition that people made fun of in the early Newtons became absolutely perfect by the last model. There is not a single handwriting app on the app store that even comes close. The way that Newton handled information and information objects was a revelation. We're still stuck with clunky files and indistinct "sharing" on current mobile devices.

The Newton wasn't just ahead of its time, its OS remains best-of-breed today and apart from things like color display and modern WiFi, the hardware remains highly usable, too. So far as I'm concerned, Newton still hasn't been reinvented. The iOS and Android experiences are superficially similar (slate form factor, touchscreen), but the actual usability and capabilities for *personal* computing lag far, far behind. Steve Jobs called the iPad "intimate" at launch, but in comparison to the Newton, the iPad is your stuffy boss.

Newton was a different kind of computing, one that was highly personal and highly integrated and integrate-able into your individual workflows and thinking habits. I miss it. I still use mine every now and then for brainstorming and certain personal record-keeping tasks (with the Notion database) as nothing else can duplicate the experience. The thing that comes closest is good old pen and paper, but pen and paper lack some key features in comparison.

Imagining a Newton OS with modern processing and memory, an iPad Mini-like form factor and display, and cloud support to synchronize and socialize information across devices literally makes me want to fling current tablets against the wall and shatter them to smithereens. Having been a longtime Newton user, I literally hate iOS and Android.

For those who are not getting it: remember Windows CE and how out of place and clunky it felt in comparison to iOS on mobile devices? Now imagine that same distance between iOS and Newton OS for mobile personal computing. Even now, even with all the missing modern features, when you use Newton OS for ten minutes and then have to return to iOS/Android, it's painful. You get that same feeling that on iOS/Android, the microminicomputing paradigm has been inappropriately shoehorned into a mobile personal device.

But it's all we've got.

*sigh*

Comment Every time there's a story about a big data (Score 1) 180

service, half of the /. comments are bemoaning its existence and sarcastically remarking on more data "slurping," how everyone is more owned by company X, and so on. But (1) they don't have to use the service, and (2) the service exists that way intentionally, i.e. it's part of the model and intrinsic to the service, which many people find valuable. I *love* that my tech "knows" me and can increasingly deliver what I want and help me find/remember stuff that is important, etc.

As far as basic privacy goes, just a couple thoughts:

(1) Privacy is not an on/off thing. Most of the time, people do not want something to be "fully private" (i.e. only for themselves). If they do, they don't put it into technology items in the first place—or at least, they shouldn't. Write it down on paper and keep it in a safe. This is not an appropriate use case for modern tech, at least not at the individual level. Most of the time, people want something to be "private from" a particular someone else (we're talking about Elsie getting fired, so we don't want it to get back to Elsie; we're talking about a product we're building, so we don't want our competitor to find out). I don't see cloud and big data services as doing much damage there unless you are clueless enough to post your conversation on Facebook or have bad passwords and poor team security. In the end, it's not the system that's the risk here; it's bad choices by individuals as they use the system.

(2) If what someone is after really is "secure 1:1 communication, with no chance of the information reaching any third party," then again—it's 2017?! This need is orthogonal to the very functionality of comm tech. Your communication will have to pass through other parties and systems. Forget Microsoft; everyone knows about the NSA these days; it's an open "secret." If you really want "1:1 privacy" protected against all ears, you tell someone to meet you at a particular small coffee shop in a random neighborhood eight miles from either of your regular haunts, and you go in the back and have a couple coffees and talk in low voices.

It just seems to me that in 2017 anyone who's upset about Google or Microsoft and the ways in which they collect and leverage communication data is fundamentally misconstruing the nature of the technology ecosystem right now. Even that secure Debian phone Kickstarter project is still going to need a closed-source baseband to comply with FCC rules, and will ultimately pass packets over the same wireless networks that everyone else uses. Hello, NSA.

You have to decide—are you doing something private, or are you just doing something average, for which the only compelling interest you have in your privacy is a kind of paranoia about third parties, future totalitarian states, etc.? If the former, you'd better get your a$$ off of networks that serve the public in any way. If the latter, you have a choice to make. If you use infrastructure that the general public uses yet think you can make it secure, you're dreaming. Accept that at a very fundamental level, "public" and "private" are *orthogonal* quantities (it is a fool's game to try to do something private in public, no matter how fancy you get). So you'll either need to concede your publicness and live with it, or actually take what you're doing *private*, which means—don't do it in public.

The mobile networks are public. Google is public. Microsoft is public. Facebook is public. These are public places. Even if you put a bag over your head to try to disguise who you are, being in public is being in public and there's always the chance that someone will yank that bag off. That doesn't mean that being in public is a bad thing, or that we ought to culturally forbid or decry publicness.

It means that you should take your private shit private, or at least understand that there is a public/private spectrum correlated to risk that is not going away. Less public? Sure, more private. Also, less useful. At a fundamental sociological level, this is part of the social condition and the division of labor in society. You can either choose to do *everything* your own damned self, or you can choose to delegate some things (like carrying messages), which means that whatever you delegate is vulnerable as a matter of its having passed out of your immediate control.

It's not a new problem; there just seems to be a new presumption that we've solved it for some reason, and that the Big Bad Corporations are keeping us from enjoying our solution. There is no solution. You either do everything yourself, and get less done and enjoy less productive potential as an absolute abstract matter, or you're vulnerable on account of everyone to whom you've delegated things. As it has always been.

Comment But you're only inundated (Score 1) 180

if you try to handle it yourself. That's what the emerging round of services is FOR—the systems can handle the data for you, and do it well. But only if they know you and know what you care about.

Google Now + Google Inbox, for example. Between these two services, about 90-95% of what I care about is surfaced for me automatically from the noise. Places, times, patterns in my schedule, traffic reports and weather reports for places I'm likely to go just now, events that are happening that are "my kinds" of events, things that I might like to read based on my habits, etc.

I don't have to log into Amazon to track my packages, or onto the community calendar to see what's going on, or into Google Maps or Waze to check traffic, or into email to see if my boss emailed me. All of those things are pushed to me in a contextual way, all in one place, and it's pretty magically accurate nearly all of the time.

Sure, I still have a Gmail box that gets hundreds of messages per day, but I never have to see 90% of them. Same for the other apps. Just figure out what I need right now based on what I'm doing and what I typically do, and then push it to me.

The big data SaaS vision is actually evolving into a reality, and it's working better than I'd ever have thought it would if you'd asked me 20 years ago. I don't want to manage my own digital life; I want services to do it for me. It's like spam filtering x10, with an actual PDA (remember "personal digital assistant" as a concept?) on top that functions *like* an assistant. Proactive, reasonably smart, and very capable.

The last thing I do, or want to do, is process data and multi-task. These are exactly the things that today's combination of services can increasingly take off your plate, if you'll let them.

Comment No sh*t. And? (Score 1, Interesting) 180

I really don't get the crowd who's always on about security/privacy here. Sure, you don't want the inconvenience of stolen data. But as Equifax (latest in a long line) demonstrates, it's *going* to happen, and it doesn't require Skype or Google to be compromised. And as it happens more and more, the culture becomes more and more forgiving of individuals who may have been compromised. It's not a life-ending problem.

Meanwhile, the life-efficiency benefits from having good data vacuuming and processing are incredible. They make you into Person+ in terms of getting things done and done quickly, and over time the benefits compound.

On some story here the other day there were a bunch of people pushing a Debian phone whose big calling card was apparently that—thanks to being so completely locked down against data collection—it's basically nothing more than a 1:1 communicator: you and whatever other person you're connected to. The big data services get nothing. That's the big selling point.

I just wouldn't be interested. I actively try to multiply the amount of data I'm providing to Google and others through the way I create and configure logins and use software, because it pays dividends in productivity and convenience. If someone came up with a phone that got an order of magnitude *more* of my behavioral, locational, and conversational data crunched by big services in order to leverage it all for customization/context/workflows, *that* is something I'd be interested in. Take my data. Make my life faster/better/more convenient.

I don't need someone to make secret the fact that I like show X and buy product Y and often drive to place Z. I need someone to spread the word to as many services as possible and help them to make use of this data to make my life better.

Comment Jobs made the right hard choices. (Score 1) 366

Building anything of complexity in a profit-driven, timing-sensitive marketplace requires concessions—some really desirable stuff isn't going to make the cut, and other stuff is going to have to be simplified. This is the result of balancing what is technically possible against practical constraints on labor, cost, supply chain, and so on.

When something has to go, do you keep A or B in the product? And when C has to be simplified, do you simplify it using method C1 or C2?

These are the things that Jobs tended to get right, often with counterintuitive decisions. People often say that Apple is all about ease of use, but this can encompass a lot of different things:

- Intuitive use for those with no prior knowledge
- Use that requires the fewest steps or user-initiated actions
- Use that requires the fewest adjustments relative to existing expectations and habits
- Use that maximally shortens the absolute time until results arrive
- Use that has the highest possible correlation between inputs and desired, complete results

...and so on.

And these things are often at odds, and they're often the kinds of decisions that line up with the aforementioned A/B/C1/C2/etc. decisions in multiple, complex ways. Steve Jobs had a knack for balancing these in such a way that:

- Those with no prior knowledge were not alienated or intimidated, even if they had to learn
- The number of steps or actions was not onerous
- Existing expectations and habits were managed in a way that minimized cognitive load
- Results were accomplished reasonably quickly
- Correlation between inputs and desired results was relatively high

I say that his decisions were often counterintuitive because he often thought outside the box of mere feature delivery. For example, if it was proving tough to design for existing expectations and habits, the choice might be to change things more, rather than less—so that the new feature was taken *out* of the realm of existing expectations, even if some design alternatives could have preserved a minimal overlap. Most companies would go for "we'll meet existing expectations and habits as well as we can, and a 15% overlap is better than a 10% overlap if that's what we can bring to market effectively."

Apple in its heyday would say, "A 15% overlap is poor; let's revamp this so that it doesn't bring to mind any expectations or habits. We could design with some familiarity, sure, but if it's only a 15% match, some familiarity is actually worse than 0% familiarity, since in the second case we don't fool the user into thinking they know more than they actually do, and they understand from the start that it is something new that they will need to learn [even if it wasn't actually new at all, as people here would often point out]."

Similar counterintuitive decisions apply to the other bullet points. Maybe the right thing isn't to deliver something that produces results "as quickly as we can make it do so," but in fact not to deliver it at all, if the net result is frustration because it's still just too slow or the correlation between inputs and desired results is too low. The traditional strategy would be to make it "as good as we can make it" and release it.

Jobs' famous "knowing when to say no" thing is really a subset of this larger sphere of judgment. Not just knowing when to say "no," but also knowing when to reshape something as an entirely new feature (from the UX perspective) without reference to previous similars, even if there were many; knowing which framings of new things intimidate new users vs. excite new users (even if in both cases the net effect is that new learning is required); and so on.

This is the sort of thing where user research is often misleading. Most users will say "I prefer that one, at least it's a little bit familiar," when in fact the familiarity, combined with the ultimate variation of the totality of the product from their expectations, might ultimately lead to less use or suboptimal use—yet an "ugh, that's strange and new" might become an indispensable, optimally used product once users get over the learning hump.

During its best years, Apple's design prowess was lauded because the picture was very big—it was UX design plus human factors done at billboard scale, with an eye toward the big picture of how the products would integrate into users' entire conceptual ecosystems, across all of their devices, habits, and living contexts, and across time as well—over which learning and experience occur.

Design isn't often done that way; more often it's more mechanical than that, and more rote. That's what I see in Apple right now—a company that does design by the numbers and never does anything that is both counterintuitive and also right anymore. Like most other companies, they're now in the business of "optimizing their compromises," once again by the numbers, rather than in the business of thinking about the big picture beyond the device to decide how to make the compromises.

Too much "design principles" and "user research," and not enough in the way of the bigger social picture and ethnographic-quality research and understanding.

Basically, Jobs had an innate, natural talent for social science perspectives and thinking that is underappreciated to this day, and it is now missing from Apple, which can only see from aesthetic/technological/UI/UX perspectives at this point.

TL;DR version—

The single word "design" is often used to reference aesthetics, UI, UX, and human factors.
At its best, Apple balanced these well, and Jobs had a particular innate vision that privileged human factors, then UX, then UI, then aesthetics, in that order.
Now, Apple privileges aesthetics, then UI, then UX, and human factors least of all.

So everything is prettier than it's ever been. Yet at first touch it's not quite as great to use as it once was. And over time, it gels far, far less effectively and completely than it used to.
