Whoops, that was cut off.
My employer has hundreds of developers. We're growing and always looking for more. HR is challenging.
Let's say you want to hire 300 people next year, and 100 of them will be software developers. You use technology that will require training; even if a new hire knows the languages you use, they won't know your current tools. Hiring and training are both quite expensive.
You've got some options available to you.
1. You could give every interested applicant a test to see how good a programmer they are.
2. You could require that applicants have relevant experience.
3. You could require that applicants have relevant education.
If you're hiring a small number of people, #1 sounds great, but it becomes less workable as the applicant pool grows. #2 works well if there are plenty of people out there with experience doing what you want; but by requiring experience, you lose anyone who could quickly learn what you need. #3 at least lets you find people who were able to graduate from college.
What it really comes down to is which is worse for you: false negatives or false positives. If you don't have many applicants, you don't want too harsh a filter early in the application process because you can't afford to lose good candidates from your pool. False negatives are your enemy. If, on the other hand, you get 60,000 applicants a year, you really need a way to get rid of the false positives. You probably don't like the fact that you're not even considering some people who'd do an excellent job. It's frustrating and can feel unfair. On the other hand, if you're getting enough good people to do what you need, the fact that you're losing out on some other good people is acceptable.
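The tradeoff above can be sketched numerically. The base rate and pass rates below are made-up assumptions for illustration, not real hiring data:

```python
def filter_outcomes(applicants, good_rate, pass_rate_good, pass_rate_bad):
    """Return (false_negatives, false_positives) for a screening filter.

    good_rate: fraction of applicants who would actually do the job well.
    pass_rate_good / pass_rate_bad: chance the filter passes a good /
    bad-fit applicant. All values here are hypothetical.
    """
    good = applicants * good_rate
    bad = applicants - good
    false_negatives = good * (1 - pass_rate_good)  # good people the filter rejects
    false_positives = bad * pass_rate_bad          # bad fits the filter lets through
    return false_negatives, false_positives

# Compare a lenient filter and a strict one at two pool sizes.
for pool in (200, 60_000):
    fn_len, fp_len = filter_outcomes(pool, 0.10, 0.95, 0.50)
    fn_str, fp_str = filter_outcomes(pool, 0.10, 0.60, 0.05)
    print(f"{pool:>6} applicants | lenient: {fn_len:6.0f} FN, {fp_len:6.0f} FP"
          f" | strict: {fn_str:6.0f} FN, {fp_str:6.0f} FP")
```

With 200 applicants, the strict filter's false negatives are a meaningful share of your few good candidates; with 60,000, the lenient filter buries you in false positives while the strict one still passes thousands of good people.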