Yeah, I was using Doom as an example. Replace Doom with just about any other application that isn't a total resource hog. The specific application I'm using for my example isn't the point.
I am a developer, and I know what developers want. I'm not saying I don't see an advantage to using all cores some of the time, but what I took from the description (no, I didn't RTFA, this is /. after all) is that Apple is trying to make it easier for developers to do something that is typically considered difficult to get right: multithreading. The point I'm trying to make is that I don't want everything to be multithreaded. I see why this is useful for some applications, but I don't want it to become a widespread practice.
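Just to illustrate why multithreading is "hard to get right": the classic failure is a lost update, where two threads interleave a read-modify-write on shared state. A minimal sketch in Python (the names and counts here are made up for illustration):

```python
import threading

counter = 0
lock = threading.Lock()

def worker(n):
    global counter
    for _ in range(n):
        # Without the lock, "read, add, write" can interleave between
        # threads and some increments get silently lost. The lock makes
        # the update atomic, at the cost of extra complexity to reason about.
        with lock:
            counter += 1

threads = [threading.Thread(target=worker, args=(100_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)  # 400000 with the lock; possibly less without it
```

And that's the trivial case. Once you have multiple locks, you're reasoning about ordering and deadlock too, and so is everyone who maintains your code after you.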
Let me come at this from two angles. First, as a developer: in my specific job, while I can write multithreaded apps, I typically don't, for two reasons. One, multithreaded code is more complex to write and to understand, not just for me but for anyone else maintaining my code. Two, we tend to write many small components that each do a little bit of work and run them on the same machine, so we make good use of all of our processors/cores anyway. I'm not talking about GUIs (these apps are non-interactive services/daemons), and my apps tend to lend themselves to a single-threaded frame of mind anyway, but the point is that this is a case where I get the most out of our hardware without unnecessarily complicating things.

That leads me to my second angle, which is that of a user. I already covered this above... I like multitasking, and I generally prefer lots of tasks that each use a single thread to having one process run a little bit faster.
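To make the "many small single-threaded components" point concrete, here's a rough sketch in Python (the function and workload are hypothetical, not from any real system): each worker is plain single-threaded code with no shared state to guard, and the OS schedules the separate processes across cores for you.

```python
from multiprocessing import Pool

def crunch(n):
    # A plain single-threaded unit of work: no locks, no shared state,
    # trivially easy to write, test, and maintain.
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    # Four independent single-threaded workers; the OS spreads the
    # processes across available cores without any threading on our part.
    with Pool(4) as pool:
        results = pool.map(crunch, [10_000] * 4)
    print(results)
```

Swap processes-from-a-pool for a handful of small daemons on one box and you get the same effect: full core utilization without a single lock in the application code.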
So yes, sometimes developers want the ability to use all cores for a long-running, non-interactive task, and that's fine; it does suit some situations. But I don't know that I want it to become the standard, which is perhaps what Apple is trying to push towards.