Not quite, but we're getting there. This is part of the reason why lots of people are moving to smartphones and tablets as their primary computing platforms: something with the computing power and memory of a laptop from 5 years ago is ample for their needs. If it can browse the web, play back music and video, send and receive emails, and edit basic office documents, then that's enough for a massive chunk of the population. It's not enough for everyone, and some of the people that it's not enough for have very deep pockets.
I was recently talking to someone at ARM about Moore's law and how it applies to different market segments. Moore's law says that the number of transistors you can get on an IC for a fixed cost doubles roughly every two years. In desktop processors, that has meant the price stays roughly constant while the transistor count doubles. In the microcontroller world, vendors have been spending about half of the Moore's law dividend on increasing transistor count and half on reducing cost. A lot of customers would rather have cheaper microcontrollers than faster ones, and getting parts that are a bit faster and a bit cheaper every generation is a clear win: faster reduces development costs, cheaper reduces production costs.

I just got a Cortex-M3 prototyping board. It has 64KB of SRAM, 512KB of Flash, and a 100MHz 3-stage pipeline. That's an insane amount of processing power and storage compared to the microcontrollers of 20 years ago, but it's nowhere near as big a jump as mainstream CPUs have made. It used to be that a microcontroller was a CPU from 10 years earlier (that's roughly how long it took the Z80, for example, to go from being the CPU in home computers to being an embedded microcontroller), but the M3 isn't even as powerful as the MIPS chips of 1993, by quite a long way. The M0 has about the same transistor count as the very first ARM chip back in the early '80s.
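The "split the dividend" arithmetic compounds in an interesting way, and a tiny sketch makes it concrete. This is purely illustrative with made-up starting values, assuming transistors-per-dollar doubles each generation and the vendor puts half the gain into transistor count and half into price: each generation multiplies the transistor count by √2 and divides the unit cost by √2.

```python
import math

def project(generations, transistors=1.0, cost=1.0):
    """Project relative transistor count and unit cost after n generations,
    assuming the 2x-per-generation transistors-per-dollar gain is split
    evenly between more transistors and a lower price."""
    factor = math.sqrt(2)          # sqrt(2) * sqrt(2) = the full 2x dividend
    for _ in range(generations):
        transistors *= factor      # a bit bigger/faster each generation
        cost /= factor             # and a bit cheaper
    return transistors, cost

t, c = project(10)
print(f"transistors: {t:.1f}x, cost: {c:.4f}x, per dollar: {t / c:.0f}x")
```

After ten generations the part has about 32x the transistors at roughly 1/32 the cost, even though transistors-per-dollar improved the full 2^10 = 1024x over the same span. That's why microcontrollers feel like they crawl forward compared to desktop CPUs: most of the progress is going into the price column.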