The simple fact of the matter is that a four-year university's computer science program is not meant to provide job training; as far as career skills go, you could pick up the job-relevant equivalent of a CS degree in under a year.
I wrote about this the other day on the Ask Slashdot topic "Modern Web Development Applied Associates Degree," and I'm sticking to my guns on it. You don't need any math more complex than simple algebra. You don't need any theory classes.
Some of those theory classes may provide deeper insight, and lacking them may limit you if you're trying to enter a highly specialized, complex field with no demonstrable experience in it (which, by the by, doesn't really happen). But for 98% of your day job, knowing how to parse and sanitize input will matter far more than knowing how to write a compiler or a raytracer, decompose a function into mathematical terms, perform a Big-O analysis, or design a memory manager for an OS, and you'll probably never touch matrices or differential equations.
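To make that concrete, here's a minimal sketch of the unglamorous parse-and-sanitize work I mean, using a hypothetical signup form (the field names, regex, and ranges are my own illustration, not anyone's spec):

```python
import re

# Hypothetical signup-form validation: whitelist a username field
# and range-check a numeric field. This is the everyday input
# handling most CS programs never make you practice.

USERNAME_RE = re.compile(r"[A-Za-z0-9_]{3,20}")

def sanitize_username(raw: str) -> str:
    """Trim whitespace and reject anything outside a strict whitelist."""
    candidate = raw.strip()
    if not USERNAME_RE.fullmatch(candidate):
        raise ValueError(f"invalid username: {candidate!r}")
    return candidate

def parse_age(raw: str) -> int:
    """Parse a numeric field and enforce a sane range."""
    try:
        age = int(raw.strip())
    except ValueError:
        raise ValueError(f"age is not a number: {raw!r}")
    if not 0 < age < 150:
        raise ValueError(f"age out of range: {age}")
    return age
```

Nothing here is clever, and that's the point: it's whitelisting, trimming, and range checks, repeated across every field your app accepts.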
Hell, the grads I see nowadays have no concept of efficient design; most lack basic database skills, awareness of common libraries and development tools, and any experience with team-based issue tracking or source control. Unless they've struck out on their own, they're almost completely unsuitable as candidates. Many self-taught devs seem to have a better grasp of things, if only because they've attempted to write usable software from design through implementation instead of homework assignments demonstrating polymorphism and recursion.
On the other hand, for many HR departments a degree is a go/no-go filter: you'll never get to an interview without one, and there's no free online equivalent for that. You'll just have to make do with superior technical skills and apply at companies that value those more than a sheet of paper.