There are two fundamental dichotomies that hide under this argument, and they've been going on for years, if not decades.
First, there's the disconnect between large business and small business. Second, there's the disconnect between what people have previously been paid (or their peers have), and what they are actually worth. This is coming from a guy who has hired 5 software developers so far this year, and has 2 slots still available...
A lot of developers look at what happens at Google and Microsoft (aside from the layoffs...) and try to use that as a standard when they apply for a position at a 50-person shop in the Midwest. This creates an expectation disconnect where someone gets an offer for $65k but won't take it, because they've been convinced by the Internet, their Career Planning & Placement department, or the job postings on career boards that their skills are worth $90k.
This is an "expectation shortage": it results when there are not enough candidates willing to take the positions that ACTUALLY EXIST. It's all well and good to say that employers are under-paying developers and looking for cheap labor. But the market does set rates, and the fact is that most software projects away from the coasts just don't support paying developers $120k/year - at least not sustainably.
The second disconnect occurs when people misconstrue what it takes to be hired and promoted in the majority of companies, other than the mega-corporations that can have 200 people doing the same job. The sad fact is that you pretty much have to be a specialist to GET a job, and then you have to be a generalist to KEEP it. The specialists who stay in their pigeonhole are always the first against the wall when the next re-org comes. The generalists with 75% competency across an array of skill sets, on the other hand, rarely make the cut during interviews, but they have enormous job security in their current positions -- though they often feel "stuck" in roles where they aren't advancing quickly enough.
This is a failure of cultivation and an expectation problem on the part of employers. It creates a market distortion where people are encouraged to specialize, and then dumped back onto the market with inflated expectations of their overall worth when that very specialization becomes a liability. (Ruby, anyone...?)
From the inside, I think it's undeniable that there is a shortage of quality, trained developers with the attitudes and ethics that lead to long-term advancement and quality employment. That doesn't mean there is a shortage of bodies with the raw skills necessary to do the job. But, in the end, that hardly matters... companies aren't hiring automata, even if some of them want to pay as though they were.
There are ample failures on both sides of the equation, and large companies are exacerbating those problems with their treatment of many H-1Bs and "mass hiring" of fresh graduates (at insanely inflated salaries) who then get culled 9 months later.
But candidates are also making the problem worse by viewing software development as a single, unified market, clinging to the belief that because Company X in Boston could afford to pay $x for a given product or project, their skills must still be worth $x when they move to Company Y in Pittsburgh, building software for a completely different industry.
The end result is a shortage of jobs that don't require specialists to get through the door, and a shortage of employees able to adjust their expectations to the realities of the market we are in. When you meet in the middle, it's a real shortage, regardless of how it came to pass.