The average quality hovers somewhere between execrable and toe-curlingly awful, and most submissions get dismissed after a glance at the first page.
And yet 99.999% of the remainder still gets rejected.
Why don't all publishers move to purely electronic submissions, with simple algorithms to spell- and grammar-check each incoming MS? There are even well-researched, validated readability-scoring algorithms that could be used for further filtering.
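A minimal sketch of what such a readability filter might look like, using the classic Flesch reading-ease formula with a crude vowel-group syllable counter; the score threshold here is an arbitrary illustration, not anything a real publisher uses:

```python
import re

def count_syllables(word: str) -> int:
    """Crude syllable estimate: count vowel groups, drop a trailing silent 'e'."""
    groups = re.findall(r"[aeiouy]+", word.lower())
    n = len(groups)
    if word.lower().endswith("e") and n > 1:
        n -= 1
    return max(n, 1)

def flesch_reading_ease(text: str) -> float:
    """Flesch reading-ease:
    206.835 - 1.015 * (words / sentences) - 84.6 * (syllables / words)."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    if not sentences or not words:
        return 0.0
    syllables = sum(count_syllables(w) for w in words)
    return (206.835
            - 1.015 * (len(words) / len(sentences))
            - 84.6 * (syllables / len(words)))

def passes_filter(first_page: str, min_score: float = 30.0) -> bool:
    """Auto-reject any manuscript whose first page falls below a
    readability floor (threshold is a made-up example value)."""
    return flesch_reading_ease(first_page) >= min_score
```

Chaining a spell checker, a grammar checker, and a scorer like this would let a publisher bounce the worst of the pile in milliseconds, which is exactly the point of the question above.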
According to the definitive statements of virtually every publisher and editor who has ever written about submission quality, this would instantly reduce the role of human readers to almost nothing.
That is, if slush is so obviously, screamingly, overwhelmingly bad, why aren't publishers streamlining their filtering of it, in the best case rejecting everything caught by the filter instantly and thereby reducing their turnaround time on everything else?
One suspects that either the quality of slush isn't so bad, or the publishers are just massively incompetent.