> or have a very peculiar definition of "qualified."
I'd argue that half the problem is that some employers get the idea that they need to interview developers the way Google or Amazon do, for roles that may not need that level of rigor. In other words, if you're just updating business rules in a CRUD-based web app using standard libraries, I don't know that asking candidates to code a binary tree on a whiteboard is the best test of whether someone would be able to do the job effectively. Even Google admits that they have a high "false negative" rate (people who fail their interview but would have been successful in the job anyway), but when you receive millions of applications a year, I suppose you can afford to be picky.
I would also point out that when many companies got their first computers in the 1950s, '60s, or '70s, there was no such thing as a "computer science" degree. Programming courses were generally offered through math, science, or engineering departments. Companies hired people without the formal computer science education we expect today, and things didn't melt down, even though those programmers lacked the benefit of all the frameworks, languages, and tooling we have now.