Toward a Culture of Evidence

By Alberto Acereda

The United States has a reputation for being home to the top universities in the world, yet it cannot answer essential questions about the quality of students’ higher-education experience:

  • Do students who graduate from college know more on commencement day than they did on orientation weekend?
  • Beyond grades and transcripts, how do we measure a student’s academic growth?
  • Are graduates ready to apply this knowledge once they leave campus?

Ideological foes like President Obama and Florida Governor Rick Scott both believe that we should be able to answer these questions. Yet colleges and universities are perhaps the only large sector of American education not driven by hard evidence of its effectiveness. Without that evidence, skill gaps go undetected, and many graduates lack the skills necessary to succeed in their careers. This status quo dampens economic growth as each class of students enters the workforce without the knowledge and skills employers need.

The hard evidence we do have about student learning is limited. Most schools either track student progress with their own internal assessments, which usually cannot be compared with those used by other schools, or use no assessments at all. And even when assessment happens at the institutional level, faculty members and students themselves are often not fully engaged in the process.

We are forced to make assumptions about the quality of graduates from abstract proxies, such as the reputation of the school where they earned their degrees or the rankings assigned to those institutions by independent arbiters. And the limited data we do have is not aggregated, properly used, or collected in sufficient quantity to drive any real improvement over the status quo, even though the primary function of colleges and universities is to teach general and domain-specific knowledge and skills.

You can read the full op-ed here, in RealClearEducation.