I live in Southern California. Fall starts December 1st, three months before Summer.
While some people are thrilled with that, I'm not in that camp.
It seems to be getting Slashdotted; the site isn't consistently responding for me. Oh, and while paging through the finger results on my first connect I got this (for realsies):
"operator: Slashdotted..367 users, holy shit"
Well, it's hard to define "average user", but I will say that at work we have several popular, commercial web apps that we use for various internal things (bug tracking, timesheets, etc.) that are staggeringly faster on Chrome 9, and especially 10, than on FF3, Safari 4, or IE below 9 (I can't speak for IE 9 itself). So much so that it's immediately obvious to your average non-technical person who has to interact with these apps. So much so that these average non-technical people are jumping ship to Chrome after trying it out just once, because they're so impressed with how much faster things are.
Not that the features and plugins you're talking about aren't super nice to have, but to answer your question: from my experience, yes, the average user notices.
This is typical of post-iPhone Apple, unfortunately. If you look at pre-iPhone Apple, they had their hands in a number of places and were making some cool stuff. This is one example, but look at their various other pro and/or creative tools. They had some small but interesting ones such as Motion and Aperture. They also had tools like Final Cut Pro, which swept the NLE world, and Shake, which when they bought Nothing Real (creators of Shake) was taking over the high-end compositing world and was used in many of the big movies that needed heavy visual effects. They also bought Silicon Grail, makers of Chalice and RAYZ, niche high-end compositing apps that were moving up in the world.
And then they realized they could be FAR more profitable selling phones, and without fanfare have slowly but surely left all of their little niche markets behind. They convinced companies to switch their infrastructure over to Macs to use their amazing tools, and then just left them high and dry. I get that it makes business sense, but it leaves a bad taste in my mouth, as I'm sure it does for many of the companies that dumped huge amounts of money into their products.
Apparently the author who wrote about multitasking hasn't actually tried it out yet, because he's off-base. While the app tray does quickly get cluttered, as he mentions, the lack of true multitasking is exactly why this doesn't matter - you can have as many apps down there as you want but they're not actively consuming resources. Where he's really off is in his implication that it now becomes difficult to find your apps to switch back to them. Look, if I'm playing Peggle and then use 4, or worst case 8, apps after switching out of Peggle - mentally I just won't even think to look in the task tray for it anymore. I just can't keep track of every app I've used in my brain. The tray will quickly let me switch back to my most recently used apps, which is really handy - but when I want to switch back to the middle of my Peggle game a week and 20 other app uses later I... and this will sound crazy... click the Peggle icon wherever it's located on my main screens. The author seems to think that the only way to resume an app is from the task tray, and that's simply not true.
Granted, I had some uncertainty about how this would work, too. But I grabbed a new iPhone and tried it out to see exactly how it works, rather than hopping on the interwebs and writing up an article with uninformed assumptions which then ended up on the front page of
Additionally, he goes on to say that developers have to explicitly add multitasking. While that's true for using the background services, my understanding (and correct me if I'm wrong folks, as I have this on good authority but haven't actually tried it) is that for the base level of background freezing, which for a majority of apps is all that's really needed, all you have to do is recompile the app against iOS 4. It's not automagic, but it's really not as bad as the author implies. The worst bit about it is submitting to the app store, but it should be pretty painless to get to that point.
Granted, it's not true multitasking. Everyone knows that by now. But frankly, I'd rather the phone always be responsive and maintain its battery life than have true multitasking for the vast majority of the things that I do and have no desire to have to actively manage my apps (which contrary to the author's claims, I don't have to do). Maybe some day I'll change my mind on that. Maybe right now this level of multitasking isn't good enough for many people out there. And that's cool, we have options now - get one of the many excellent Android phones. But please don't write a blog post of inaccuracies.
There's a distinct difference between Flash Builder and Flex/Flex SDK. One is an open source application framework (Flex), the other is a standalone version of Eclipse running closed source plugins (Flash Builder, formerly Flex Builder, also available non-standalone).
I take issue with the fact that he's singling out Flash Builder when his complaints actually seem to be with Flex (not even the SDK per se, but the framework/API). That alone makes me question his credibility. I really regret clicking the link as I fear he's just a whiny traffic whore who wants people off of his (relatively new) lawn.
My own casual observation (and one that my friends seem to agree with) is that since Los Angeles introduced a similar law last year, it has in fact curbed such behavior. Prior to that it seemed to be a much bigger problem (as it was in previous cities I lived in). This isn't to say you don't still see it most times you drive, but more frequently it's that one idiot on the cell phone during your trip rather than a whole road full of idiots on their cell phones.
Everyone I know has also made it a point to get a Bluetooth headset to use while driving. Your Los Angeles Mileage May Vary.
Instead of spending the next 10 years trying to find a Flash implementation for Linux or OS X that doesn't drain CPU cycles like there's no tomorrow
I just did a purely unscientific comparison of CPU usage between the native YouTube page and NeoSmart's HTML5 viewer. I tested a couple of different videos and did each one multiple times. I'm running Safari 4.0.3 and Flash 10.0.32.18 on OS X 10.4.11. I was consistently seeing 15-20% more CPU usage with the HTML5 viewer than with YouTube's Flash viewer.
Of course, when I downloaded the MP4 and played it in Quicktime it was much nicer to my CPU (but obviously not nearly as convenient).
What's even more problematic for those who want to see Flash die in a fire: Flash Player 10.1 should see performance improvements for playing video, if I'm not mistaken (as well as in a number of other areas regarding performance and resource usage). Meaning the HTML5 implementations will have that much more catching up to do from a performance perspective. (This has nothing to do with other concerns about Flash, like openness, security, etc., but the summary specifically called out performance.)
Your mileage certainly may vary (and please feel free to chime in with your results), but being that my laptop is my main machine - for my battery's sake, I'll stick with watching videos via Flash for now.
including the quality of the game's moderation system, programmed restrictions on chat and known player demographics.
As someone who works on a large website targeted at children, with both chat and UGC plus various systems governing who you can communicate with (whitelists, moderation, etc.), this seems very unlikely to prove useful. Our weekly lists of banned phrases show just how creative people can be with regular, everyday words, using them in ways which, while involving no established slang, still come across very clearly as harassing/derogatory/sexual, etc. - and as noted, the demographics here are young children (hence I don't think there's much value in "known player demographics"). I think the only way they could truly rate a game with real-time interaction with other players is based on what types of interactions you can have (which could still be tricky).
For instance - an online game of chess with no communication system, just the ability to make moves... probably pretty safe (though I'm sure someone will find a way to get creative with a horse and a queen). Whereas a game where you can run around and have the ability to duck - well, someone's gonna get tea bagged. But it all seems of limited usefulness, because very quickly you get to the point with your interactions where all bets are off - you'll end up with a very small segment of "safe" games with everything else being "at your own risk." Parents et al. are probably better off considering any game with online play "at your own risk."
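To make the blacklist-vs-whitelist distinction concrete, here's a minimal sketch of the two filtering approaches I'm describing. All the phrase lists and function names here are made up for illustration - they're not from our actual system:

```python
import re

# Illustrative entries only - real lists run to thousands of phrases.
BANNED_PHRASES = {"meet me", "how old are you"}
WHITELIST = {"hi", "good", "game", "nice", "move", "gg"}

def blacklist_ok(message: str) -> bool:
    """Reject only messages containing a known banned phrase."""
    lowered = message.lower()
    return not any(phrase in lowered for phrase in BANNED_PHRASES)

def whitelist_ok(message: str) -> bool:
    """Accept only messages made entirely of pre-approved words."""
    words = re.findall(r"[a-z']+", message.lower())
    return all(word in WHITELIST for word in words)

# The blacklist happily passes "creative" phrasings built from ordinary
# words - exactly the weakness our weekly banned-phrase lists expose:
print(blacklist_ok("wanna hang out at the park alone"))  # True -> slips through
print(blacklist_ok("meet me later"))                     # False -> caught

# The whitelist is much safer, but brutally restrictive:
print(whitelist_ok("gg nice move"))    # True
print(whitelist_ok("wanna hang out"))  # False
```

The point being: the blacklist can never keep up with creative misuse of ordinary words, and the whitelist only "works" by throwing out nearly all expressiveness - which is why rating real-time interaction is so hard.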
To the systems programmer, users and applications serve only to provide a test load.