The final threat for Google's Android may be the most pernicious: What if a significant number of the people who adopted Android as their first smartphone move on to something else as they become power users? In Apple's last two earnings calls, Tim Cook reported that the "majority" of those who switched to iPhone had owned a smartphone running Android. Apple has not specified the rate of switching, but a survey found that 16 percent of people who bought the latest iPhones previously owned Android devices; in China, that rate was 29 percent. For Google, this may not be terrible news in the short run. If Google already makes more from ads on iOS than Android, growth in iOS might actually be good for Google's bottom line. Still, in the long run, the rise of Android switching sets up a terrible path for Google — losing the high end of the smartphone market to the iPhone, while the low end is under greater threat from noncooperative Android players like Cyanogen, which has a chance to snag as many as 1 billion handsets. Android has always been a tricky strategy, concludes Manjoo; now, after finding huge success, it seems only to be getting even trickier.
I think the 47% you're thinking of is sales last quarter or the North American breakdown. I remember seeing 47% vs. 46% cited, but only recently, and it was not the overall figure. Worldwide, Android is sitting at something like 76.6% (it dropped 2% after the iPhone 6, which translated into a 2% jump for Apple, to 19.7%). The mobile profit numbers are inverted, and the gap is far wider, though.
Beyond that, I agree with the rest of your post. I think one of the points the article was trying to make, though, was that standing out is difficult. Even if you make a quality app, one that most people would be willing to pay a reasonable amount for, it gets lost in the sea of crap. Which goes back in part to your point about the knock-offs -- they're getting as much prominence as you, and they're cheaper, so why wouldn't someone try them first?
It seems clear that everyone would benefit from a system that pushed quality to the top of the search list, but so far no one has figured out a way to make that happen reliably.
At least that's true of the fourth article, the one posted here. I read the first three and found them largely unconvincing. I think you can like the flat look or not, like Material Design (barely mentioned, but brought up a few times) or not, and that's cool. But one of the main thrusts of his argument in the first three articles was that the defense of these designs was riddled with 'artspeak', a nonsense language used to dissuade criticism. I don't dispute it; I like Material Design (Android user here), but having watched the Material Design sessions from I/O 2014, I definitely got annoyed at all the 'artspeak' coming from the lead guy at Google (Duarte, I think his name is). What's funny is that what rubbed me the wrong way about him was how 'Apple-ish' he sounded, so go figure.
But back to the first three articles -- they seemed riddled with a different kind of 'artspeak'. Churlishly comparing the simplistic people imagery from Google with children's books, and comparing Apple's design to the child who can paint like Pollock, didn't feel particularly high-brow.
Still, the overarching point that I felt was useful was that criticism is not well-received at Apple (or Google, from the sounds of it). That's a point worth dwelling on, especially since Apple in particular has the reputation of having the 'zealots' come out in force whenever anyone says anything ill of Apple. It was quite interesting to hear in the fourth article that -- unless I misunderstood it? -- there's someone at Apple whose job is to rile up the crazies when they get wind of that kind of thing on the interwebz.
But ultimately, the discussion about the problems of the App Store is more interesting. The 'race to the bottom' is something anyone with half a brain can see, and anyone who's a developer looks at that and must feel some gnawing fear. Maybe I'm wrong, but I feel like we're all pushed to mobile (if you're not on mobile, you're out of touch!) and when I look at the market, it gives me the willies. I don't think the Google Play Store is doing any better in that regard either. Worse, I don't have the foggiest idea of how to correct the problem, not even one that would take Herculean effort from either company to employ.
If I'm reading the article correctly, the information that says that ads in the Facebook style are far more effective than Google's comes from...a study by Facebook. Gee, that seems totally unbiased and could in no way be slanted by them to help them convince potential advertisers to sign up. All of this seems very bizarre after reading -- for years -- about how the Facebook ad model is so deeply flawed.
I'll admit that I don't use any of those apps, so I can't say -- I would have assumed that they would open the system's default browser -- but maybe they do it in-app.
That said, I'd expect the big guys like Twitter or Facebook to upgrade to the newer component for that very reason -- if someone gets hacked, the user will fault Twitter or Facebook (and in this case, with some good cause). Still, I hadn't thought of those cases, so maybe that does make this more dangerous than I thought!
Also, a point that gets largely glossed over is that this only affects apps that use WebView as a widget -- browser apps like Chrome or Opera aren't affected because they bundle their own engine (Chromium or something else). This may affect 60% of Android users, but what percentage of those are using the browser inside an app to visit random sketchy websites? I'm guessing the actual user base at risk is quite small.
The way this is reported, it sounds like if you use Chrome on anything south of 4.4, you're IN GRAVE MORTAL DANGER OF TEH HACKZ.
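To make the distinction above concrete, here's a toy check in plain Java. The helper name, and the framing of "API level 19 (Android 4.4) as the cutoff for the unpatched system WebView," are my own illustration for this thread, not anything from Google's advisory -- the point is just that both conditions have to hold: the device is pre-4.4 AND the app actually embeds the system WebView.

```java
public class WebViewRiskCheck {
    // Android 4.4 (KitKat) corresponds to API level 19; earlier system
    // WebViews were the ones Google said it would no longer patch.
    static final int KITKAT = 19;

    // Hypothetical helper: an app is only exposed via this bug if it embeds
    // the system WebView widget on a pre-KitKat device. Standalone browsers
    // that ship their own engine (Chrome, Opera) never touch that component.
    static boolean usesVulnerableWebView(int apiLevel, boolean appEmbedsWebView) {
        return appEmbedsWebView && apiLevel < KITKAT;
    }

    public static void main(String[] args) {
        System.out.println(usesVulnerableWebView(18, true));  // true: Jelly Bean app with embedded WebView
        System.out.println(usesVulnerableWebView(21, true));  // false: Lollipop's WebView is updatable via Play
        System.out.println(usesVulnerableWebView(18, false)); // false: browser bundling its own engine
    }
}
```

So "60% of Android users" is the outer bound; the exposed set is the intersection of old devices and in-app browsing of hostile pages, which is the poster's point.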
I actually heard some good things about it. Not 'This is the End' good, but not far off. I think the problem is that it would entirely disrupt the narrative the poster or writer is trying to convey if The Interview is anything but awful tripe. I doubt it will be winning any Oscars, but I've heard nothing from people who've actually seen it that suggests it's worse than decent, and it might even be pretty good.
I was at a conference, GeoWeb I think it was, in 2008. It was for web-based GIS (Geographic Information Systems), basically cartography & the web. This was maybe a year or two after Google bought out Keyhole, and Michael Jones (I think it was him) from Google was there. Also, Google had just released Chrome, so there was a lot of discussion about it. I wanted to pick Jones' brain about some KML eccentricities because I had just written a KML reader & writer. I had to wait behind about five other people who just wanted to talk to him because he was from Google.
One conversation, though, sticks out. Some guy (who seemed somewhat sycophantic for some reason) was going on & on about how Chrome was going to change the world because it was from Google, and they'd make sure it was awesome, and because they could use their influence to make sure everyone used it. I remember that Jones cut him off there (sounding more than a little annoyed) and told the guy (paraphrasing): "Google can't make anyone use anything we write. The search engine lets us put anything we create in front of people's eyes at least once -- that's it. If they try it, it has to live or die on its own merits; we can't force people to try it or use it."
I'm going to have to disagree with you as well.
My brother started taking photography seriously when he was living in Japan for a year. Within a year he went from being generally capable (I can take a picture and that's it) to being fairly expert. Enough that he briefly considered making a living doing photography. He credits a lot of that rapid growth to getting instant feedback. Yes, people just taking pictures willy-nilly and looking at the results does not, by itself, make for fast skill building. But I would suggest that for someone who is interested in the craft, seeing that immediate feedback -- even if you still need to fire the picture up in a power editor to be 100% sure -- versus shooting into a black hole and not seeing the results for hours at a minimum, makes for an incomparably faster iterative cycle.
If the same people who grew up fascinated with cameras & photography thirty years ago had the digital cameras of today when they started out, there is no doubt at all that they would have become the experts they eventually became much, much, MUCH faster.
But obviously it has to be something a person cares about and invests the time to learn. Composition, aperture, exposure & white balance are all important things to learn, but you can learn them a hell of a lot faster when you can work in the field and see theory put into practice before your very eyes.