Comment You're wrong. (Score 1) 225

You're looking at the stagnating iOS years on, rather than at what Apple did during Jobs' tenure.

I was a Palm user when the iPhone was released, and I thought I was totally satisfied with my Palm devices (which I'd been using for years) and that the premium for an iPhone was pointless. I pooh-poohed the iPhone until the 3GS was released and I finally tried one. I was blown away. Full web browser, lots of useful apps that installed *over the network*, fast and complete WiFi support to enable this, large capacity to hold lots of songs and images, a camera capable of producing large images, the list went on and on. It was a HUGE step up from other things in the market at that point. Apple had taken half-measures scattered throughout the phone ecosystem and brought them all together as full "best of breed" measures in a single device. This is what the Jobs Apple excelled at.

NOW iOS is stale in comparison to Android (see my post above), and that's the problem with Apple and why they are rudderless without Jobs, but early on this was simply not the case—the iPhone was remarkable when it was introduced.

I'm a technology early adopter (not necessarily an Apple one) and this happened several times with Apple products under Jobs:

- MP3 players. I'd had several MP3 players prior to the introduction of the iPod, but the classic iPod blew them all out of the water. Far faster, a large screen enabling actual navigation of your music library, capacity to hold thousands of songs (rather than just a couple dozen), and it played just about any MP3 file you could throw at it rather than requiring you to use their own encoder (or, in the case of Linux users like myself at the time, carefully curate and tweak the command line for LAME to create files that the device's bandwidth could handle). The iPod was simply far more functional than other MP3 players at the time.

- iPad. I'd used other tablets for years: Vadem Clio, Hitachi eSlate, Fujitsu Stylistic, etc. They had compromised battery life, resistive touchscreens, operating systems that were difficult to work with, dog-slow processors and little memory; they could not run a full web browser (in the case of the CE devices), required desktop sync or a desktop environment, and were heavy and difficult to hold for long periods of time and/or to carry around. The iPad was hand-holdable, had massive battery life, did not require desktop sync or a desktop environment that suffered on a tablet, and was generally the device I'd been hoping for all those years as I struggled to make previous tablets work. Again, the iPad was a tablet done *right*, rather than making me buy the "promise" but suffer through the compromises.

- OS X. I switched from Linux. Why? Because OS X gave me a *nix command line environment and infrastructure, robust stability, support for high-end hardware, *and* off-the-shelf retail purchases of software and devices without having to recompile code or worry about compatibility. It's still the only OS that does this.

Jobs had a talent for spotting technologies that were essentially at the "proof of concept" stage but were making headway in a few tiny niches, and were already being sold to (dissatisfied) consumers and riddled with compromises, and getting his team and company to engineer their way around and through those compromises to realize the technology in consumer-ready, appliance form. Other companies released Ford Model T cars (hand-crank start, too many levers to micromanage mechanical functionality, counterintuitive and dangerous gearbox, rotten ride for grandma) and Jobs could look at what was there, spot the potential, and then put his team to work on a car that could be started from the passenger compartment, that managed the obvious parts of its own mechanical operation, that had a safer gearbox matching the way people think and expect machines to work, and that let grandma work on her knitting in the back seat without poking herself.

He was masterful at (1) identifying potential in new tech that was either failing in the marketplace or had already been dismissed, (2) seeing why this new tech was flagging, (3) managing his team toward solutions to the obvious problems, so that previously taken-for-granted limitations and complications were removed, and (4) doing all of this within the realm of consumer budgets (even if at the high end of these). He was also very adept at (5) bringing lots of different technologies of this sort together in a single device or system, with all of them significantly improved, i.e. using lots of disparate tech in combination to solve the problems with each and multiply their effect.

This is the "vision" that people talk about. He spotted this stuff, recognized which limitations weren't as obviously necessary as people imagined, and could find a path to release with much upgraded and/or improved design specs, when everyone else thought it was impossible, and maintain the determination and optimism to keep the business afloat and the team working toward the goal in the meantime. These are not small things.

To me, that is innovative; it's just innovation at the process end, rather than at the "invention" end of things. Jobs was a process innovator and a UXD innovator, not an inventor.

What Apple lost with Jobs was this vision to see where (a) potential is hidden and (b) the real UX problems lie with high-potential tech.

They are back to being in the business of "accept what already exists and the taken-for-granted limitations, then iterate with evolutionary improvements over the release cycle." They are consciously trying to innovate at the other end, but they are back to releasing half-baked new tech at essentially the proof-of-concept stage, which really only appeals to niches willing to nurse it along. In short, they're just like all the other tech companies again. They are no longer the company that plucks tech that previously only geeks were capable of using or saw the purpose of, then perfects it beyond all expectation and gets mom to buy it for grandma for Christmas, as was the case with the iMac, iPod, iPhone, iPad, etc.

The Apple Watch is their only post-Jobs attempt, but Cook called it done long before Jobs would have, and the result is that Apple released a product like the Vadem Clio or Fujitsu Stylistic of old that I mentioned above—appealing to a few geeks, but niche, limited, hard to use, and with a small (and often frustrated and product-abandoning) audience in the end.

In short, Apple has become another HP or Compaq once again, just like they were before Jobs came back. They take existing product categories and tech limitations and parameters for granted, build "one of those" to have it in their product lineup, release, and hope to compete on build quality alone. Just like they did in the late '80s and early '90s. History says this won't work for them. They have more cash this time, but they're still in a losing position right now.

To maintain the brand, they need to find another person who adopts relatively immature tech that the public doesn't know about, and that those who do know take for granted as niche and limited, and then organizes Apple's huge resources and brain trust to realize it as consumer devices that work better, with fewer limits, more conveniently, and more user-centrically than was previously imagined to be possible.

Until they find such a person, I'd be short Apple.

Comment This is too bad. (Score 5, Informative) 201

I live in a Google Fiber area and love it. There are three tiers: 5 Mbps for $0 (yes, free broadband), 100 Mbps for $70, and 1 Gbps for $90. They have been absolutely bulletproof, the speeds are for real when tested, and the online system and the way it integrates with their WiFi router are awesome.

I have had multiple providers over the years, including Comcast and Verizon, and Google Fiber's product and service are easily better than the others.

If Google can't make this work, there may be no hope for anything better for a long time to come. I just hope I don't lose it here!

Comment Yup. Apple products used to be focused around (Score 4, Interesting) 225

enabling the user to do things they otherwise wouldn't know how to do or be able to do. Since Jobs left, they've steadily slid into the old game from the '90s and '00s that the tech majors (HP, Compaq, and so on) used to play—"innovation" becomes another word for "throw gadgety gimmicks at the wall and see what sticks," but without well-thought-out reasons why users might want the device, or an understanding of the ways in which UX friction impacts the device's usability.

Compared to the rest of the marketplace and competing products at the time, the original iPhone, the original iPod, the original Intel Power Macs, the original LaserWriter, the original Macbook Pro models, the original iPad, etc. were all towering improvements that enabled users far more than competing products did.

Now, the trend is the opposite.

- On the consumer end, iOS phones and tablets feel arbitrarily constrained next to Android.
- Current Mac OS machines are generally limited in serious software and upgradeability, again relative to Windows machines.
- On the pro end, Apple's application ecosystem is weak once again compared to pro-level Windows applications.

...and so on.

It used to be that you paid a premium for Apple products but got much more or at the very least something highly differentiated for your money (esp. in the cases of early iPods vs. other MP3 players, iPhone 1 vs. other smartphones, iPad vs. other contemporary tablets, etc.).

Now you pay a premium either for less or for something that is largely undifferentiated (and often negatively so in the minor differences that do exist).

It hasn't always been the case that you're simply paying double for brushed metal and a glowing Apple logo, but it certainly feels that way now. People still want to pay for quality (hey, the aluminum case and better QA are nice), but now they have to consider the tradeoff—I can pay a lot more and get a nice metal Apple device, or I can pay a lot less and get a phone that's more configurable and flexible.

That's my own feeling, anyway. I'd love to have the nice finish of an iOS device, but even if there were price parity I couldn't give up the flexibility of Android. I don't want to be tied down to Apple's visuals, Apple's icon positioning, Apple's version of KHTML, Apple's take on the (non-)filesystem, and so on. I love Mac OS as well, or at least I have since OS X, but the new MacBook Pros are limiting and I'm seriously considering getting a Windows laptop for my next purchase, just so that I can access the hard drive, memory, and so on.

Apple has begun to fetishize itself, rather than fetishize overall UX.

Comment I feel the same way. (Score 1) 269

And the question wasn't addressed in the video.

Can this function like a normal tablet? Will I be able to remove the controller modules and carry it around and read email, use Chrome and Google Now and Microsoft Office apps and snap photos? Or is this a dedicated gaming machine that's just modular?

If the latter, I wouldn't buy it. If the former, I'd buy it to replace my current 8" tablet, as a tablet PLUS gaming experience. But I need a tablet, and I don't want to have to have TWO tablets just to get slightly better gameplay on one of them.

If it's a one tablet concept (would have to be Android, I assume, to have the ecosystem) then great. If it's just a game console with fancy industrial design? Pass. I have good enough gaming on my current tablet.

Comment Don't confuse the worst and the best. (Score 1) 97

"[F]raud, cheating, plagiarism, etc." in *low-end* research, which we also have in spades in the U.S. and in the West more generally (it's really bad in a lot of the also-ran European countries). At the top end, Chinese research is every bit competitive with other players in serious global research, and they have more resources available to them, which they can apply to problems without nearly so much systemic overhead thanks to their particular governing system.

Comment *Because* they put it in an Otterbox. (Score 1) 121

I don't use protectors of any kind, but I've known more than just a couple of middle-America, middle-class folks who ALWAYS get the hardest, most solid-looking case they can find (irrespective of whether these actually help or which cases perform best). Why? Because their phone is one of their largest investments and a critical piece of everyday tech that they want to protect.

They appreciate the thinnest phone possible precisely because *after* they put it in an Otterbox it will still be manageable, whereas when they had an iPhone 4 or whatever, the Otterbox made it significantly thicker than an old Nokia candybar.

Comment Re: The rush to produce easy code. (Score 2) 531

I think this is a bigger problem than is being recognized here. Most coders that I work with don't get to decide on ship dates. They may in a few cases have a claimed "veto power" if the code isn't ready, but they won't use it, because they'll be let go if they don't ship on time.

The management that I see is too often of the "Give me a demo. What are you talking about, that works fine! Ship it! Let's move the press date up by two months!" variety. Some of the better ones are of the "What's our risk exposure? Hmm... Versus the revenue model... Hmm... It's a close call, but I think we have to go with the risk to hit our targets" variety. At least they *get* that there is risk.

But the fact is that management and investors don't care if software is buggy and insecure as long as those are "edge cases." They're fully onboard with the Fight Club model. "How many clients will get screwed vs. how much money will we make. Sounds like a good tradeoff."

I think most coders are capable of producing good code in a world in which good code is valued. The problem is that it isn't. Shipping products early and often is what's valued, and management tends to think that if we can ship code early, do write-offs for the bug and vulnerability cases, and then release the next version before having to patch the one that's about to be shipped, then the entire expense of refining and auditing code can just be eliminated.

At least that's been my experience: the idea is that it's a good way to reduce cost. Release a lot. Be "agile" (hate that word these days), which means: just keep releasing completely new code at an alarming pace. That way, you never have to create good code. You produce a pile of rapidly chucked-out, 50% entirely new dogshit every three months, with your programmers just barely managing to keep up, and you release major versions as fast as you can. Consumers and clients don't get time to be exposed to major bugs and vulnerabilities, or to request that they be fixed, because you release fast enough that your answer can be "That product was released six months ago and is now EOL; no fixes are planned. We recommend that you upgrade to the new version." (The new version also happens to include another revenue item of some kind, such as an upgrade fee, which is better for the bottom line than providing bug fixes for free.)

I think what we see in software is the same thing we see across the rest of the consumer landscape. Managers and investors have realized that disposable, non-repairable junk is better for the bottom line and for themselves, because it means that consumers have to keep paying over and over again, and often. All of the other employees (e.g. coders) are left to come along for the ride by the seat of their pants, or get fired and replaced by someone who will.

Comment Nothing to see here. Move along. (Score -1, Troll) 424

Woman does malicious thing X.
Woman regrets malicious thing X.
Woman can't take it back.
Woman kills self.

Welps, that about sums it up. Seems like the right outcome.

And for anyone screaming "misogyny," I realize that it is now considered sexist to allow women to experience the consequences of their very own behavior, but I think it's about time we started doing just that. After all, men have to do it. Let's have some equality.

Comment Sugars and starches are seriously bad in my case. (Score 1) 527

Sample size of one, and it may just be my biology, but over the last twenty years I have done this three times:

- Gain 50-70 lbs. over time, see skyrocketing blood pressure, worsening cholesterol, high fatigue, and fuzzy thinking
- Get tired of it and cut all sugar and starch (i.e. no breads, sweets, soft drinks) out of my diet
- Lose 50-70 lbs. in the space of about 3 months, see blood pressure and cholesterol return to perfect, lose fatigue and fuzzy thinking problems

The first time I rationalized that it was more likely due to inadvertently reduced calorie count (after all, natural carbs are supposed to be good for you, and the foundation of your diet, while fats are supposed to be bad for you, and protein in moderation—that was the federal wisdom at the time). So I added sweet foods and starches back to my diet but kept to a lower calorie count. Within five years, I had put on tons of weight again.

The second time I sort of thought "worked once, probably will work again," so I cut out all sweeteners, natural or artificial, as well as all grains and grain flours. Three months down the line, I was skinny and healthy. "This time," I thought, "I'll adopt a lower calorie count when I return to a 'normal' diet." Well, another six or so years down the road, back up by 75+ pounds, even with calorie restriction and a conscious replacement of "refined" sugars with "natural" alternatives like honey and sticking to "whole grain, high fiber" starches and flours. I just plain got fat, even on the "natural" and "high fiber" stuff.

The third time I cut out sugars and starches just happened; it started in about June of this year. I cut out all sweeteners and all grains, but consciously increased my caloric intake of protein and fat considerably as a kind of experiment. No limits. We're talking a full-pound 70/30 beef patty sandwiched between two fried eggs for dinner territory. What many people at Whole Foods would call "heart-clogging food." Well... Dropped 75+ pounds in ~3 months. No calorie control at all, and not even thinking about moderating fat, protein, or salt intake. Same result, and again, blood pressure returned to excellent, as did cholesterol, despite likely significantly higher cholesterol and salt intake. Energy levels are much higher. Alertness significantly improved.

Though some people worry about sustained ketosis as the result of diet, I have experienced no problems. This time, I'm not going back to a "normal diet." I feel like I have enough first-hand data for my own biology. I'm just gonna keep eating as much red meat, eggs, and butter as I want, along with low-sugar vegetables (esp. leafy greens like spinach and chard, etc.)

But sweet anything and grains are seriously off-limits.

I am still having trouble convincing relatives that this is a good idea; everyone is terribly worried about me. The fat will clog my arteries, the whole grains are good for me and I'll get colon cancer without them, etc. But I feel about 1,000% better without sugars and grains in my diet, and I can buy regular clothes as well.

Comment Because they don't work. Period. (Score 2) 206

Totally would do this, but:

1) Apps refuse to start on rooted/jailbroken phones.
2) There are about umpteen dozen payment systems that do not support each other.
3) Invariably retailers only support at most one or two (which your particular phone does not have).
4) Only a tiny fraction of retailers even support that one or two.

So the result is that you spend all that time setting it up on your device, and then walk around for months never seeing a place where you can use it. When you finally, finally do see a terminal that claims to support the network your app uses, and you try to start it, you get a pop-up saying, "For security reasons you cannot make payments from a rooted and/or jailbroken phone."

In short, people are willing to use it but the corporate world is fucking it up (again).

Comment Apps solved the monetization problem. (Score 3, Insightful) 154

For years, companies wanted, but struggled, to generate revenue on the web. They couldn't. There was just too much friction for the average user in pulling out a credit card, typing in details, then remembering logins and logging in over and over again, not to mention tracking all of their subscriptions to various services.

Apps and in-app purchases are the "micropayments" that were talked about for so long. User provides billing information once, then is able to conveniently pay for content (whether the app or in-app purchases) with a tap or two. All payments and subscription information are centralized and run through a trusted (to the user) provider.
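To make that flow concrete, here's a minimal, hypothetical Python sketch of the "enter billing details once, then pay with a tap" model. PlatformStore, register_payment_method, and Receipt are made-up names for illustration only, not any real store SDK.

```python
# Hypothetical sketch of a centralized, platform-level billing flow.
# None of these names correspond to a real store SDK.

from dataclasses import dataclass


@dataclass
class Receipt:
    user_id: str
    product_id: str
    transaction_id: str


class PlatformStore:
    """Stands in for the platform's centralized billing/subscription service."""

    def __init__(self) -> None:
        self._payment_methods = {}  # user_id -> stored card token
        self._tx_counter = 0

    def register_payment_method(self, user_id: str, card_token: str) -> None:
        # Billing details are entered once, at the platform level,
        # not per app and not per website.
        self._payment_methods[user_id] = card_token

    def purchase(self, user_id: str, product_id: str, price_cents: int) -> Receipt:
        # "A tap or two": no re-typing card details, no per-site login.
        if user_id not in self._payment_methods:
            raise RuntimeError("no payment method on file")
        # A real store would charge price_cents against the stored method here.
        self._tx_counter += 1
        return Receipt(user_id, product_id, f"tx-{self._tx_counter}")


# One-time setup, then frictionless purchases from any app on the device.
store = PlatformStore()
store.register_payment_method("alice", "tok_example_visa")
print(store.purchase("alice", "premium_upgrade", price_cents=299))
```

The point is that card entry and login friction happen exactly once, at the platform level, and every purchase after that is a single call against a provider the user already trusts.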

This is why companies have gone there. Because it's where they were finally able to generate sufficient user acquisitions to sustain an online purchase/subscription model, for the most part. Companies go where the money is, and it wasn't on the web.

Comment For even more fun, put a "Try Again" button (Score 2) 156

beneath the "access denied" and watch a few of them try for 10 minutes straight to load it by clicking again and again, then leave it open and tap it once or twice a day for two weeks before giving up.

I know a couple people like this. You ask, "But what if the link is malware?" and they respond with "But what if it's something great?"

On a similar note, I once sent a bad link by accident to a person who was in college at the time. I then sent a follow-up email saying, "Sorry, bad link. Try this one."

They then called me an hour later to say that they kept trying the first link I'd sent, but couldn't get it to load, and asked if there was anything I could do to help. I said, "But I thought I mentioned—that was a broken link, it doesn't work. I sent the right one!" And they responded with a variation on the above—"I know, but you never know, maybe I'd like it! I'd at least like to see it!"

Comment I shoot events as a sideline and have done since (Score 1) 366

the late '90s in digital.

I have a library of about 180k photos. You retain originals in case someone goes back to a contact sheet and wants a reprint or an enlargement a decade later or something. At a typical event I will shoot between 100 and 1,000 images. Sometimes, depending on conditions, I will shoot RAW.

My current gear is a 24 MP SLR, and the generated files are on the order of 12-15 MB each for JPEGs. I can easily lay down 12 GB a shoot or 50 GB in a week.

I keep an online 12TB RAID-1 library and then have 3 backup sets on LTO, rotated, with one set always offsite.
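For anyone who wants the back-of-the-envelope version, here's a quick Python sketch of the arithmetic using the numbers above; the RAW figure (MB_PER_RAW) is my own rough assumption for a 24 MP sensor, not something taken from my actual library stats.

```python
# Rough storage arithmetic behind the figures above (12-15 MB JPEGs,
# 100-1,000 frames per event, roughly 180k images in the library).

MB_PER_JPEG = 13   # midpoint of the 12-15 MB range quoted above
MB_PER_RAW = 28    # assumed typical 24 MP RAW size -- my estimate, not measured

for frames in (100, 1_000):
    print(f"{frames:>5} JPEG frames per event: ~{frames * MB_PER_JPEG / 1_000:.1f} GB")
    # 1,000 frames -> ~13 GB, which lines up with the "12 GB a shoot" figure

library_jpeg_tb = 180_000 * MB_PER_JPEG / 1_000_000
library_raw_tb = 180_000 * MB_PER_RAW / 1_000_000
print(f"180k frames: ~{library_jpeg_tb:.1f} TB as JPEG, ~{library_raw_tb:.1f} TB if all RAW")
# The real library sits somewhere between those, depending on how often
# conditions call for RAW, which is why a 12 TB RAID-1 array plus rotated
# LTO backup sets isn't overkill.
```

Numbers like that add up quickly even for a sideline, which is the point of the rest of this comment.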

I know a person who does video editing and production as a sideline for corporate clients, mostly working on online ad videos and 30-second spots. They keep archives as well, because it's not uncommon for a client to come back several times over a period of several years wanting minor tweaks to something that's already run (for versioning or feature changes, slightly different voice track, color edits, text overlay edits, etc.). They have even larger data needs.

Point being: even many individuals and small businesses *do* have legitimate, productive needs, and your condescending view is just a tad narrow.
