thoughts on politics

January 5, 2009

George Will: A pro-civil rights ruling that may have hurt more than it helped

A column by George Will featured today on RCP is worth posting in its entirety:

The Toll of a Rights ‘Victory’

By George Will

WASHINGTON — Like pebbles tossed into ponds, important Supreme Court rulings radiate ripples of consequences. Consider a 1971 Supreme Court decision that supposedly applied but actually altered the 1964 Civil Rights Act.

During debate on the act, prescient critics worried that it might be construed to forbid giving prospective employees tests that might produce what was later called, in the 1971 case, a “disparate impact” on certain preferred minorities. To assuage these critics, the final act stipulated that employers could use “professionally developed ability tests” that were not “designed, intended or used to discriminate.”

Furthermore, two Senate sponsors of the act insisted that it did not require “that employers abandon bona fide qualification tests where, because of differences in background and education, members of some groups are able to perform better on these tests than members of other groups.” What subsequently happened is recounted in “Griggs v. Duke Power: Implications for College Credentialing,” a paper written by Bryan O’Keefe, a law student, and Richard Vedder, a professor of economics at Ohio University.

In 1964, there were more than 2,000 personnel tests available to employers. But already an Illinois state official had ruled that a standard ability test, used by Motorola, was illegal because it was unfair to “disadvantaged groups.”

Before 1964, Duke Power had discriminated against blacks in hiring and promotion. After the 1964 act, the company changed its policies, establishing a high school equivalence requirement for all workers, and allowing them to meet that requirement by achieving minimum scores on two widely used aptitude tests, including one that is used today by almost every NFL team to measure players’ learning potentials.

Plaintiffs in the Griggs case argued that the high school and testing requirements discriminated against blacks. A unanimous Supreme Court, disregarding the relevant legislative history, held that Congress intended the 1964 act to proscribe not only overt discrimination but also “practices that are fair in form, but discriminatory in operation.” The court added:

“The touchstone is business necessity. If an employment practice which operates to exclude Negroes cannot be shown to be related to job performance, the practice is prohibited.”

Thus a heavy burden of proof was placed on employers, including that of proving that any test that produced a “disparate impact” detrimental to certain minorities was a “business necessity” for various particular jobs. In 1972, Congress codified the Griggs misinterpretation of what Congress had done in 1964. And after a 1989 Supreme Court ruling partially undid Griggs, Congress in 1991 repudiated that 1989 ruling and essentially reimposed the burden of proof on employers.

Small wonder, then, that many employers, fearing endless litigation about multiple uncertainties, threw up their hands and, to avoid legal liability, threw out intelligence and aptitude tests for potential employees. Instead, they began requiring college degrees as indices of applicants’ satisfactory intelligence and diligence.

This is, of course, just one reason why college attendance increased from 5.8 million in 1970 to 17.5 million in 2005. But it probably had a, well, disparate impact by making employment more difficult for minorities. O’Keefe and Vedder write:

“Qualified minorities who performed well on an intelligence or aptitude test and would have been offered a job directly 30 or 40 years ago are now compelled to attend a college or university for four years and incur significant costs. For some young people from poorer families, those costs are out of reach.”

Indeed, by turning college degrees into indispensable credentials for many of society’s better jobs, this series of events increased demand for degrees and, O’Keefe and Vedder say, contributed to “an environment of aggressive tuition increases.” Furthermore they reasonably wonder whether this supposed civil rights victory, which erected barriers between high school graduates and high-paying jobs, has exacerbated the widening income disparities between high school and college graduates.

Griggs and its consequences are timely reminders of the Law of Unintended Consequences, which is increasingly pertinent as America’s regulatory state becomes increasingly determined to fine-tune our complex society. That law holds that the consequences of government actions often are different than, and even contrary to, the intended consequences.

Soon the Obama administration will arrive, bristling like a very progressive porcupine with sharp plans — plans for restoring economic health by “demand management,” for altering the distribution of income by using tax changes and supporting more muscular labor unions, for cooling the planet by such measures as burning more food as fuel and for many additional improvements. At least, those will be the administration’s intended consequences.

I’ve come to see the university education system in America today as misguided and dysfunctional.  As Will says, the hiring practices of big companies (as well as governments at all levels) seem predicated on the idea that someone who went to college is automatically more valuable than someone who didn’t, and automatically meets a certain standard of intelligence.  I suspect this system has many causes (viewing education as a “right” and trying to use higher education to bring about “social change,” to name two) and many negative effects.

Will points out what everyone already knows: there is essentially a tier of high-paying jobs that you cannot reach without a college degree.  However, I had not fully considered the effect this actually has on poorer people.  Under the current system, a person must take classes full-time for three to four years to have a shot at a higher-paying job.  It is difficult, though not impossible, to support oneself by working full time while doing this.  The fact is that this commitment of time and money becomes less and less workable the farther down the economic ladder you go.

Of course, the government could attempt to make up for this by providing grants and loans to low-income students, as it does now.  But with grants, the taxes required to fund them would destroy jobs in the private sector, making competition for those jobs even greater.  Loans don’t have this effect as much, but I still think the private sector could and ought to be responsible for them as well.

Another problem that comes out of this college-for-all paradigm is an overall decline in the quality of education.  Universities of all kinds, but especially big state schools, try to teach much more than they did a century ago.  For example, the University of Georgia, where I go, offers the whole spectrum of liberal arts, business, journalism, agriculture, pharmacy, social work, forestry, and “family and consumer sciences.”  This non-exhaustive list doesn’t even count its professional programs like law, vet, and soon-to-be-added medicine.

I see no reason it should be this way, with one institution trying to teach so many disparate disciplines and trades at the same time.  Colleges began by teaching merely the liberal arts, along with professions like law, medicine, and theology.  As all these other schools and colleges were added, the focus on the liberal arts diminished, and I suspect liberal arts programs are less effective as a result.  In UGA’s case, when a whole bunch of students who don’t care about literature, science, and art are required to take introductory courses in those subjects, it creates waste all around.  It wastes departments’ resources, which must go toward providing those courses.  It wastes the students’ time and money, forcing them to take classes they’ll largely forget once they are done.  Who benefits from that?  Meanwhile, students who do major in the liberal arts must spend time on courses unrelated to their concentration, while studying in a department weakened by all this.  Liberal arts majors are the students who ultimately go into academia in these subjects, so we are potentially hurting the next generation of professors and researchers through this paradigm.

Alongside all this, I think there now exists a profusion of master’s degrees and programs.  Since bachelor’s degrees are now so common, job seekers often find it necessary to earn advanced degrees to increase their competitiveness and earn higher pay.  Thus, many schools now offer master’s programs in areas like “public policy.”  Such programs last for perhaps three semesters and put students through classes that seem to add little to their working skills.  (I make these claims based on the experience of several friends enrolled in such programs.)  All of this serves to edge out students with mere bachelor’s degrees who are looking for the same jobs, but what will we do when the master’s degree becomes the new standard of hiring for this tier of jobs?  A master’s degree used to be a rigorous credential to earn, requiring one to defend a thesis or produce original research.  Who is being served now that it is slowly replacing the bachelor’s degree as the standard of hiring, and being greatly diluted in the process?

The New York Times had an op-ed last week dealing with this same issue.  The author, a scholar at the American Enterprise Institute, concludes:

Discrediting the bachelor’s degree is within reach because so many employers already sense that it has become education’s Wizard of Oz. All we need is someone willing to yank the curtain aside. Barack Obama is ideally positioned to do it. He just needs to say it over and over: “It’s what you can do that should count when you apply for a job, not where you learned to do it.”
