Business isn't around to "hire," as every right winger I've ever met is so quick to point out, yet every time I hear one, they're spouting tax cuts for business (so they'll hire) and less regulation (so they'll hire). It seems to me you right-wing freaks should take your own advice: business is there to make money.
You see, this is where we in America in particular got lost. We lost sight of the fact that we are a society, and that capitalism was originally intended as a better system for encouraging everyone to collaborate for the common good of all citizens. Those who offer jobs and those who fill them are in a symbiotic relationship. The executive boards of corporations are small in number, unable to do all the actual work their companies need to bring in revenue, and everyone needs a job to feed and clothe their families and to afford the things they need to be able to go to work. The idea is that there is supposed to be a balance between these two parties that produces the ideal amount of economic output, making the lives of everyone in the country as good as possible. To not acknowledge this, to not do your best to facilitate it, or, even worse, to circumvent or abuse it for your own gain, is being a bad citizen. What good are corporations that take so much money out of the system that there is none left for everyone else? They wouldn't survive very long, because eventually they wouldn't have any employees, and that would result in a total economic collapse.
This reminds me of the Alice Cooper song "Lost in America":
"I can't get a girl
cuz I ain't got a car
I can't get a car
cuz I ain't got a job
I can't get a job
cuz I ain't got a car
So I'm looking for a girl with a job and a car
Don't you know where you are
Lost in America"