Toward a Culture of Evidence


FILE - In this May 20, 2013 file photo, graduates pose for photographs during commencement at Yale University in New Haven, Conn.

RCEd Commentary

The United States has a reputation for being home to the top universities in the world, yet it cannot answer essential questions about the quality of students’ higher education experience:

- Do students who graduate from college know more on commencement day than they did on orientation weekend?

- Beyond grades and transcripts, how do we measure a student’s academic growth?

- Are graduates ready to apply this knowledge once they leave campus?

Ideological foes like President Obama and Florida Governor Rick Scott both believe that we should be able to answer these questions. Yet colleges and universities make up perhaps the only large sector of American education not driven by hard evidence of its effectiveness. The result is that many graduates lack the skills necessary to succeed in their careers, and this status quo dampens economic growth as students continue to enter the workforce without the knowledge and skills it demands.

The hard evidence we do have about student learning is limited. Most schools either track student progress using their own internal assessments, which usually cannot be compared with those used by other schools, or they don’t use assessments at all. And when assessment is done at the institutional level, faculty members and the students themselves are often not fully involved or engaged in the process.

We are forced to make assumptions about the quality of graduates based on abstract notions, such as the reputation of the school where they earned their degrees or the rankings of those institutions by independent arbiters. And the limited data we do have is not aggregated, properly used, or available in sufficient quantities to amount to any real improvement over the status quo, even though the primary function of colleges and universities is to teach general and domain-specific knowledge and skills.

In fact, there is currently no commonly used metric to determine whether students are actually learning while enrolled in higher education. That is why, in 2000, the National Center for Public Policy and Higher Education gave the higher education system in all 50 states a grade of “incomplete” for measuring student learning. Fifteen years later, the landscape hasn’t changed much, despite the obvious need for system-wide clarity and the consistently used, easily understood language that stakeholders, including students, parents and accreditation agencies, have demanded.

The most urgent need — and the first step in creating such a system — is to develop a way to assess the knowledge, skills and abilities that correspond to career readiness. An ETS report released earlier this year showed that American millennials — despite record levels of college attainment — score below the vast majority of their international peers in literacy, numeracy and problem-solving skills. Even our highest-performing and most educated young adults score below their international peers.

A recent survey of 400 employers by the American Association of Colleges and Universities found that employers, by and large, most value critical thinking, written and oral communication, and the ability to apply knowledge to real-world settings, all skills that cut across students’ majors and academic disciplines. Nine in 10 of the employers surveyed view recent college graduates as poorly prepared for the workplace.

Some of the assessments already in use at colleges and universities have found much the same thing as the ETS research team and American employers: American college students are leaving school without the skills the workplace requires. To be sure, one study found that some students do make progress over the course of their college careers. But so many start with such a deficit of knowledge that they will not be ready for work at graduation even if they do improve while in school.

As outlined in ETS’s “A Culture of Evidence: Postsecondary Assessment and Learning Outcomes,” the six regional postsecondary accrediting agencies should be charged with integrating a national system of assessing student learning into their ongoing reviews of institutions of higher education. These agencies already develop and enforce standards for higher education institutions and programs, and they conduct reviews on a regular basis, which would make accountability for student outcomes an integral part of how schools operate. Properly aggregated data on learning outcomes would also give faculty high-quality information about cohorts at similar schools and help them improve their own programs. Without such assessment systems, systemic, sector-wide improvement will remain a challenge.

Successful development of an assessment system will require coordinated efforts across the sector and significant involvement from each institution’s faculty. We have already waited too long, as the skills gap between American millennials and their peers abroad makes clear. The comparatively low skill levels of American millennials are likely to test our international competitiveness over the coming decades. If our future rests in part on the skills of this cohort, the people who will be tomorrow’s workforce, parents, educators and body politic, then that future looks bleak.
