It is well-known that a college degree is now required for jobs that really don't require one. Why do employers do this? "Well," goes the argument, "getting a degree shows a willingness to work - it's a signal for abilities other than education."
Now, let's think a minute here.
Studies have shown that college grads walk out without learning ANYTHING. They score EXACTLY the same on their exit exams as they did on their entrance exams. This is true of both internal and external studies.
For instance, I teach college developmental math. The studies there are QUITE clear: it does absolutely no good. You would think that adding more and more remedial courses would eventually push scores and graduation rates up. What we've found is that more remedial courses actually push scores and graduation rates DOWN. Students leave a long sequence of remedial math courses STUPIDER than when they went in. Which is why some colleges are dumping their long list of "progressive" remedial math courses and shoving everybody into a single remedial class, or getting rid of remedial classes entirely. Ironically, because they face tighter oversight, the "for-profit" colleges are leaders in this movement.
So, if all of that is true, does the credential prove the graduate has a "marginally better" education? Or does it prove the graduate is actually STUPIDER than the non-graduate, since the non-graduate was at least smart enough not to waste the money, while the graduate is in DEBT on top of knowing no more now than s/he did before?
And the graduate spent between 4 and 6 years both going into debt AND not realizing that the "education" wasn't helping?
Isn't the college degree "credential" therefore as often a sign of stupidity and inability to learn as it is of any increased ability to learn?