Wait ... no, it's not. Never in my life has college been anything other than a money grab. If you don't already know this, you're living in the dark. The idea that it's about education is perpetuated by the schools to stay in business — typical marketing.
Look at salaries versus time spent teaching and tell me how it's about education.
Look at what's spent on administrative staff and compare those salaries to equivalent roles in other industries.
Nothing about college even resembles a viable business. These schools only continue to exist because people think it's a good idea to indoctrinate their children into believing college is the path to a better life.
School is now about getting you to take on as much debt as possible in the time you're there.