
Comment: Occam's Razor (Score 1) 282

by cshbell (#41227085) Attached to: Anonymous Leaks 1M Apple Device UDIDs

So is Apple going to explain why they gave all the UDID of their devices to the FBI?

I know everybody's racing to see conspiracy here -- and that may well end up being the case -- but there might be a simpler explanation for how the FBI got these: From sniffing open WiFi hotspots.

It's possible that the Bureau, perhaps in cahoots with other three-letter agencies, exploited an undisclosed bug that produced the UDID (the technical composition of which is well documented). If so, it wouldn't be any great feat of science to sniff common open-air networks at places like Starbucks, airports, hotels. That's how I'd do it.
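For what it's worth, the UDID's composition really is public knowledge: it is widely reported (unofficially) to be the SHA-1 hash of the device serial number, IMEI, Wi-Fi MAC, and Bluetooth MAC, concatenated. A minimal sketch under that assumption -- every identifier below is a made-up placeholder, not real device data:

```python
import hashlib

def udid_from_identifiers(serial, imei, wifi_mac, bt_mac):
    """Widely-reported, unofficial UDID formula: SHA-1 over the
    concatenated device identifiers. Placeholder inputs only."""
    combined = serial + imei + wifi_mac + bt_mac
    return hashlib.sha1(combined.encode()).hexdigest()

# Made-up identifiers -- not from any real device
udid = udid_from_identifiers("ABC123XYZ99", "012345678901234",
                             "00:11:22:33:44:55", "00:11:22:33:44:56")
print(len(udid))  # 40 -- same length and hex format as a real UDID
```

If the formula is right, anyone who can sniff those identifiers (or the UDID itself in cleartext HTTP traffic) has everything they need.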

Comment: Re:Easy to demand more security (Score 5, Interesting) 266

by cshbell (#40890373) Attached to: Apple Support Allowed Hackers Access To User's iCloud Account

But understand that it will cause massive unhappiness for the majority of cases where (for example) one's 75-year-old grandmother, who has forgotten her password and can't figure out how she phrased the answer to the security question, is about to permanently lose access to the last 5 years of her grandchildren's emails.

This is a problem that bites both ends. Consider this real-world scenario that happened to me last week:

I work for a senior care organization. One of our residents, a cheerful 92-year-old woman, uses her AT&T email frequently to communicate with family and friends; she's fairly savvy, actually. However, she is starting to suffer from cognitive problems, which have caused her to forget her password. When we tried to reset her password and walked through the security questions, she had trouble remembering the answers to those as well. We called AT&T and explained the situation, but they understandably (and rightfully) treated our request as a hostile attempt to access the account and would not help us.

She's the legitimate owner of her account -- how can she be helped? This may seem like an extreme situation, but these problems will only increase as we all continue our digital lives and begin to age.

Password and account verification is a difficult problem to solve. If there's a silver bullet, I haven't heard of it yet.

Comment: Re:No shit (Score 1) 395

by cshbell (#40729437) Attached to: HTML5 Splits Into Two Standards

I appreciate your passion, but you have a lot of things wrong. Chiefly this: HTML5 is not a singular entity, but rather a broad description of several emerging web technologies. I launched one of the first commercial HTML5 sites on the Web, and in the years since, the meaning of "HTML5" has already changed significantly, and that's okay.

The semantic HTML tags are less than useless, because they're based on a now obsolete statistical analysis of common ids/classes.

Nearly every site on the Internet has a header and footer, though; sides/sidebars are pretty common too.

Effectively you can't infer any meaning from the semantic tags anymore because it's such a fucking hash up as to what they're even going to mean due to different people going different ways on them.

I'd like to introduce you to the complexity of human language. This isn't a problem specific to HTML5, or HTML, or heavens, even markup languages. People use words and descriptions in different ways. There is not now, never was, and never will be a singular and coherent semantic structure. What's an article to one person is a section to another.

It's important to note that the specs you rail against go to great lengths to avoid enforcing a rigid semantic context. The new HTML5 elements aren't complete, and nobody ever claimed they were. Their role is purposefully vague, giving web authors new markup tags that imply some semantic structure without enforcing precision. I mean, how many hundreds of roles does the good old heading tag play? Semantically it indicates a heading of some sort, but what that means, precisely, is intentionally undefined -- and it works great.
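To make the loose-semantics point concrete, here's a hypothetical page outline using the new elements. Every element choice is a judgment call; another author could defensibly swap `article` and `section`, and neither would be wrong:

```html
<body>
  <header>
    <h1>Site title</h1>
    <nav><a href="/">Home</a></nav>
  </header>
  <article>
    <!-- Is each post an <article> or a <section>? The spec
         deliberately leaves room for either reading. -->
    <h1>Post title</h1>
    <p>Post content...</p>
  </article>
  <aside>Sidebar: related links, etc.</aside>
  <footer>Copyright notice</footer>
</body>
```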

The solution to your problem would be XML, wherein authors can mark up every document with precisely semantic tags. We tried that with XHTML, and it turned out to be a solution in search of a problem. HTML, with its imprecision, is fine for markup because it's good enough and broad enough that we can do what we want. And when it isn't -- for example, when it fails to define a drawing canvas for complex visual manipulation -- then a new branch (HTML5) is developed. Repeat ad infinitum.

Also, note that I'm talking about just the HTML5 spec here, I have little problem with for example, CSS3.

This is where your argument runs off the rails. You're incensed by HTML5, which has a pretty widely-accepted core implementation, but you have "little problem" with CSS3, which is also incomplete and an ever-changing amalgam of over 50(!) modules, of which only four are in a somewhat-settled state of definition? Come on.

Imagine my horror when I encountered the term "Responsive web design" the other day, apparently it's about creating device/resolution independent web designs. Yeah, great one kiddies who coined that term, we already figured this out some years back.

The "kiddie" who developed responsive web design was Ethan Marcotte, a highly-respected designer and one of the principal authors of modern web design techniques. Responsive design was first presented in A List Apart, the leading journal for web design techniques. Responsive web design chiefly involves the use of CSS3 (remember? that thing you have "little problem" with?) media queries in its implementation. And it has nothing to do with resolution independence; rather, it's about dynamically adapting both content and layout to a variety of different usage contexts (desktop, mobile, etc.).
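For anyone unfamiliar, a CSS3 media query is the core mechanism here. A minimal sketch -- the class names and the 600px breakpoint are illustrative, not from any particular site:

```css
/* Narrow viewports (phones): single column, no sidebar */
.content { width: 100%; }
.sidebar { display: none; }

/* Wider viewports: restore a two-column layout.
   600px is an arbitrary example breakpoint. */
@media screen and (min-width: 600px) {
  .content { width: 70%; float: left; }
  .sidebar { display: block; width: 30%; float: right; }
}
```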

Again, I appreciate your passion, but it's heavily misinformed. Sure, HTML5 has problems, and they're no secret -- they've been debated ad nauseam since WHATWG first convened. But the fact is that there is a generally-agreed-upon HTML5 baseline, it's in wide use across the Web, and it's enabled some pretty terrific new sites and apps. This will continue, regardless of hand-wringing about standards and forks. The Web adapts.

Comment: Herman Miller Embody (Score 1) 522

by cshbell (#37722104) Attached to: The best computer upgrade I've ever done was:

I've used an Aeron for a while, and it's great. But last weekend I bought an Embody chair by Herman Miller, the same company that makes the Aeron. Night and day difference.

The Embody was designed for people who sit at computers all day. It forces you to sit with good posture, which may sound painful, but it's damn comfortable. And healthy. There's basically no way you can comfortably slouch in it.

It's spendy at $1600, but this is your long-term health you're talking about. Bite the bullet.

Comment: Not new, but still good (Score 1) 96

by cshbell (#36490708) Attached to: High Tech Elder Care May Be Mixed Blessing

This technology isn't particularly new -- telehealth has been around in some forms for a few decades, but it's definitely taking off now.

One of the benefits not mentioned is that, in addition to 'remote monitoring,' telehealth services generate a huge amount of useful data for clinicians. Our telehealth program collects weight, blood pressure, pulse, and (for diabetic patients) blood sugar readings daily. It takes less than five minutes for the patient to do these tests.

In turn, their physician and care team have daily medical data, and can detect subtle changes much more quickly than a weekly nurse visit or occasional office visits. The only other setting where you have daily vitals monitoring is an in-patient hospital or care facility; now we can have it in people's houses. And the patients generally love it -- they feel a sense of security knowing that their health is being monitored.

Comment: Clearly Explain the Remote Wipe Policy to the User (Score 1) 446

by cshbell (#34326056) Attached to: When Your Company Remote-Wipes Your Personal Phone

I suspect many of the misgivings about remote-wipe policies have to do with the clarity of explanation. Explain to users clearly what ‘remote wipe’ means, and what they can do to protect their data.

Just today, I wrote a new document for our users about our remote wipe policy and how, with iOS 4.2, they can wipe their own phones too, thanks to Find My iPhone. Here’s what I wrote, under the heading ‘A brief but important note about your privacy and data:’

“It’s important you know that the locating feature of Find My iPhone is tied to your own, personal Apple ID. This feature is not accessible by anyone else in the company, including the IT department, and cannot be used to track or determine your location. We respect your privacy.

“On the other hand, the IT department can immediately erase your iPhone’s data should it be lost or stolen. This would cause everything on your iPhone to be erased, including pictures, music, and apps — not just company data. Therefore, we recommend you connect and sync your iPhone to iTunes regularly to ensure that any personal data on your iPhone is backed up.”

Companies have a right to secure their smartphones — there’s a lot of data on them. End users have a right to protect their personal, non-company data. These are not mutually exclusive. Can we agree?

Comment: Re:History repeats (Score 4, Insightful) 497

by cshbell (#33234862) Attached to: The Coming Onslaught of iPad Competitors

The OLPC XO was very easy to use, yet somehow Sugar/Linux doesn't get the same sort of attention Mac OS X or iOS do.

What real-world questions did the OLPC XO answer? I've never used one, so I honestly have no idea.

For the many people who bought the various Apple products: Macs in the XP and Vista era answered the question, "Would you like your computer to not be a malware-infested heap of frustration?"; the iPod answered the question, "Would you like to carry all your CDs in a fun little pocket-sized box?"; and the iPhone answered the question, "Would you like to carry the Internet in your pocket?" Not really novel stuff, but it was packaged thoughtfully and it made sense to a lot of people without requiring a great amount of explanation. The iPad is arguably the first major Apple product in a while that doesn't immediately scratch an obvious itch. Its selling point is more along the lines of, "A lot of what you do with your computer, in a smaller, sleeker package."

Again, I don't think it's rocket science. Apple built what their own people thought would be great, and lo and behold, a couple million other people thought it was great too. Sure, Apple is a slick marketer (although Apple's marketing budget is in line with other tech companies its size: http://tech.fortune.cnn.com/2009/10/28/apples-2009-ad-budget-half-a-billion/) and gets a lot of free love from pop culture, but it would be myopic to suggest that this is much more than sugar-coating on an already solid and aggressive business model.

Comment: History repeats (Score 5, Insightful) 497

by cshbell (#33234694) Attached to: The Coming Onslaught of iPad Competitors

Go back about five years in the archives of most tech publications and you can find similar stories about "The coming onslaught of iPod competitors." Look how that worked out.

For some reason, the tech community believes that the commoditize-and-cannibalize cycle that typified the 1980s and 1990s is a perpetual law. It isn't, and Apple's success this decade is a resounding rejoinder to that view. Apple's products aren't, in all respects, better than their competitors'; what they are is more polished, more refined, and an order of magnitude easier to pick up and figure out on your own.

The typical screeds about how Apple's success is due to marketing prowess, reality distortion fields, media sycophancy, etc. are all a bunch of red herrings. Apple makes great products, and it's a real shame that more companies haven't picked up on how they do it and why. It's not rocket science to diligently refine your products while planning their long-term placement and growth; it's just more involved than most companies want to be.

So sure, I'm sure there will be an onslaught of cheaper, different tablets that mindless consumers (Who, I might add, the tech community still believes to be largely ignorant about technology. You know, in 2010.) will buy up, and then the iPad will be dead. It's impossible that, say, every single one of the competitor tablets will be inferior in one or more significant ways, failing to make an appreciable dent in the iPad's adoption rate. Equally impossible that Apple would refine the iPad beyond its current iteration to entice new customers. I mean, really.

I'm not giving Apple the keys to the kingdom carte blanche, as heaven knows they've made their share of mistakes, but on the whole, I think they've been too successful, too visionary, and too aggressive to continue this endless narrative about how, just when they're about to succeed, the commodity tech market comes up aces and wins the hand.

Comment: Video for Everybody, and an Android caveat (Score 1) 177

by cshbell (#33023736) Attached to: Encoding Video For Mobile Devices?

I'm assuming you're talking about web video; if not, this info won't be applicable. The 'Video for Everybody' project at Camen Design has put a lot of work into cross-platform HTML5 video, and the test page has an extensive compatibility matrix for both desktop and mobile platforms.
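In rough outline, the Video for Everybody pattern nests fallbacks inside a single video element, so each browser takes the first branch it understands. A simplified sketch -- the file names are placeholders, and the real pattern includes further fallbacks such as Flash:

```html
<video controls width="640" height="360" poster="poster.jpg">
  <!-- H.264 for Safari, iOS, and (with caveats) Android -->
  <source src="movie.mp4" type="video/mp4">
  <!-- WebM for Firefox, Chrome, Opera -->
  <source src="movie.webm" type="video/webm">
  <!-- Final fallback: a plain download link -->
  <p><a href="movie.mp4">Download the video</a></p>
</video>
```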

Be aware that if you're targeting Android, its implementation of HTML5 video is lackluster (for now; I'd expect this to get better soon). Details of the problems, and a few solutions, can be found here: http://www.broken-links.com/2010/07/08/making-html5-video-work-on-android-phones/.

Comment: Lacking, or just changing? (Score 2, Interesting) 571

by cshbell (#32866838) Attached to: The Creativity Crisis

I'm speaking purely for the United States here. The first thing that came to mind was that great scene from the movie Apollo 13 where engineers are told they have to fit a square filter into a round hole using only the items on the table. First reaction? They dive right in.

So to use this as an example, it sometimes feels like we've lost the raw brilliance and creativity that allowed us to put human beings on the moon. And we did it years before the development of advanced composites, sophisticated integrated circuits, and computer modeling. The moon missions were calculated with slide rules; even the astronauts had to be skilled mathematicians.

In that sense, it feels like our creativity is on the wane. On the other hand, perhaps it's just changing form.

True, we haven't put anybody on the moon in a while, but we've instead built a giant worldwide interconnected computer network. We've built a search engine that aggregates and indexes it all. We've built touchscreen devices that can make phone calls, access websites, pinpoint your location to within 30 feet using satellites hundreds of miles in the sky, and put it all into a tiny package that slips into your pocket and runs all day on a battery charge.

That's pretty creative, and it's showing no signs of slowing down. So I don't necessarily know that we're less creative today; I see the emotional and anecdotal evidence for it, but the contrary evidence suggests that we're still exceptionally creative and eclectic with our skills.

Comment: Re:Recurring lesson about Apple (Score 2, Informative) 327

by cshbell (#31673738) Attached to: Next iPhone — Front-Facing Camera, A4 Processor

If there's one thing history teaches about rumors regarding upcoming Apple products, it's that nobody talking knows anything.

That's not always the case; sometimes, far from it. The source for this information is John Gruber over at Daring Fireball. It's well known, and John says as much, that he has sources inside Apple. He's a reliable critic of Apple ('critic' in the Ebert sense, not a hostile one), and it seems his sources are either known to the company and the leaks are green-lighted, or else Apple simply doesn't care enough about smaller announcements to ferret out the mole. I'd bet heavily on the former.

John doesn't always make predictions of a declarative nature, but when he does, you can more or less take them as stated fact; for example, his "predictions" for last year's WWDC.

Comment: Re:The Times has its reasons for doing this... (Score 1) 488

by cshbell (#30808042) Attached to: NY Times To Charge For Online Content

Name an important piece of investigative journalism done by the Times in the last ten years. I can't. And I'm a regular reader.

So am I. Here's a recent example: Toxic Waters, a series that ran late last year. The Times put a lot of time - and thus, money - into researching the series, combing through the data (and filing the FOIAs, etc.), as well as the interactive features, which were well done.

Comment: Before Murdoch There was Hearst and Pulitzer (Score 2, Interesting) 388

by cshbell (#30324888) Attached to: The Noisy and Prolonged Death of Journalism

Murdock has ushered in the era of factless journalism and pure opinion as news.

"Reinstated," not "ushered in." Before Rupert Murdoch, there were Hearst and Pulitzer, whose yellow journalism more or less defined the conditions to which Murdoch is now returning his media empire.

The good news is that these things seem to be cyclical. The bad news is that, if Hearst and Pulitzer are any indication, it takes a somewhat cataclysmic event (such as the Spanish-American War) to shake people back to their senses and get them demanding across-the-board accountability.
