It all depends on what they learnt and how they apply it. But when I can't judge their work in advance, I'll take any of my former fellow students over any self-taught programmer.
The difference between someone who understands invariants and pre/post conditions for formal correctness verification, even if they never formally apply them, and someone who has never even heard of the concepts, is huge. There are order-of-magnitude differences between algorithms for certain tasks, and if you don't even know that you can determine that sort of thing (and how), you're a lost cause. Data modelling is another area. Every time I see programmers abusing the logical model, I cringe. Code-first is a bad idea, and with formal training you learn to avoid things like that.
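To make the first point concrete, here's a minimal Python sketch (my own illustration, the names and data are made up): the same lookup written as a naive scan and written with the precondition, loop invariant and postcondition spelled out. The invariant is what lets you argue the O(log n) version is actually correct, and the gap between the two is exactly the order-of-magnitude difference I mean.

    def find_linear(items, target):
        # O(n) scan: needs no assumptions about the input, but is an order
        # of magnitude slower on large sorted data than the O(log n) version.
        for i, x in enumerate(items):
            if x == target:
                return i
        return -1

    def find_binary(items, target):
        # Precondition: items is sorted ascending (checked here only for
        # illustration; in real code you'd document and rely on it).
        assert all(items[i] <= items[i + 1] for i in range(len(items) - 1))
        lo, hi = 0, len(items)
        while lo < hi:
            # Loop invariant: if target is present, its index lies in [lo, hi).
            mid = (lo + hi) // 2
            if items[mid] < target:
                lo = mid + 1
            else:
                hi = mid
        # Postcondition: lo is the leftmost position where target could sit.
        return lo if lo < len(items) and items[lo] == target else -1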
And I mean, the halting problem. Turing machines. If you don't know Turing machines, you won't understand the implication that at a fundamental level all general-purpose programming languages are equivalent. If you don't know lambda calculus, understanding what LINQ does is much harder.
Etc. etc.
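On the LINQ point, a rough sketch (in Python, since the idea is language-agnostic; the data is invented): Where and Select are just higher-order functions applied to anonymous functions, which is exactly the lambda-calculus way of looking at computation.

    # LINQ in C#:  orders.Where(o => o.Total > 50).Select(o => o.Customer)
    # The same thing as plain higher-order functions over lambdas:
    orders = [
        {"customer": "A", "total": 120},
        {"customer": "B", "total": 80},
        {"customer": "A", "total": 40},
    ]

    big_spenders = map(lambda o: o["customer"],
                       filter(lambda o: o["total"] > 50, orders))

    print(list(big_spenders))  # ['A', 'B']

Once you see the query operators as function composition over lambdas, deferred execution and chaining stop looking like magic.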
Of course, you can have brilliant self-taught people in the field, as in any field. It's just very rare to come across competent ones.