On weekend mornings all this winter, anxious high school juniors and seniors will be filing into school cafeterias to sweat through the SAT, ACT, and similar college entrance examinations as stern-looking proctors hover over them.
Such tests are among the long-established requirements for getting into college. But something new is afoot: Increasingly, students have to take a test to get out.
The advent of the college exit test is being driven largely by parents, legislators, and others intent on making sure they’re getting their money’s worth from universities and colleges — and by employers who complain that graduates arrive surprisingly ill-prepared.
“There is a groundswell from the public about whether a college degree is worth what people are paying for it,” said Stephanie Davidson, vice chancellor for academic affairs at the University System of Ohio. “People are asking for tangible demonstrations of what students know.”
Ohio this year started testing candidates for education degrees before they graduate. The Wisconsin Technical College System requires its graduating students to take tests, or to submit portfolios or research papers or other proof of what they know. And all undergraduates at the University of Central Missouri have to pass a test called the College Basic Academic Subjects Examination before they are allowed to graduate.
The trend unmasks a flabbergasting reality: that those expensive university degrees may not actually prove a graduate is sufficiently educated to compete in the workforce. And it advances the seemingly obvious proposition that students should be made to prove they are before receiving one.
“Isn’t it amazing that the newest and most brilliant idea out there is that students should achieve particular skills and prove it?” Marsha Watson, president of the Association for the Assessment of Learning in Higher Education, asked wryly. “Wow.”
Faculty grades fail to do this, advocates for testing say.
Forty-three percent of grades given out by college faculty are As, according to research published by Teachers College, Columbia University. Yet one-half of students about to graduate from four-year colleges and 75 percent at two-year schools fall below the “proficient” level of literacy, according to a survey by the American Institutes for Research. That means they’re unable to complete such real-world tasks as comparing credit-card offers with different interest rates or summarizing the two sides of an argument.
“It’s really bad news, and it’s gotten worse,” said Margaret Miller, a professor at the University of Virginia’s Center for the Study of Higher Education and an expert on assessing learning.
A separate survey of employers by an association of universities found that more than 40 percent don’t think colleges are teaching students what they need to know to succeed. One-third say graduates aren’t qualified for even entry-level work.
“I can speak to this from my own years as a faculty person: I had a syllabus, and I had what the outcomes of the course would be, and those included critical thinking,” said Julie Carnahan, who taught public policy and organizational management before she took her current job working on assessment at the State Higher Education Executive Officers Association, or SHEEO. “In retrospect, now that I know so much more, I didn’t ever test students to determine if they could actually demonstrate critical thinking.”
So most students pass their courses, said Watson, the former director of assessment at the University of Kentucky, and “are given degrees because they accumulate credits. Each credit hour is assumed to be a metric of learning, and it’s not. The only thing it’s a metric of is that your butt was in a seat.”
Many of the exit tests now being tried are similar to the kinds of practical licensing exams that candidates for nursing degrees have to take, which require them to prove in the real world that they can apply what they’ve learned in a classroom. “Nobody wants a nurse who’s only taken written tests,” Watson said. “What you want is a nurse who has some experience before they jab a needle into your arm.”
In Ohio, for example, candidates for education degrees have to write a lesson plan and make videos of themselves teaching, among other things.
But introducing new ways of measuring what students learn is time-consuming, complicated, and expensive, and it is resisted by universities fearful that the results will be used to compare them with competing schools. And the ones with the most at stake are the universities and colleges already assumed to be among the best.
“They hate it. They hate it. They already have the reputation for educating students well, so they can only lose,” Miller said.
“The prestige institutions are not going to allow a test to be a standard by which they’re measured,” said Watson. “They have everything to lose and nothing to gain. They look at inputs” such as the high school grade-point averages and SAT scores of their incoming students, which are what put them at the top of the existing college rankings. “When it comes to measuring what their students are learning, they’re not really invested in that.”
That’s one reason why the move to establish exit tests is starting slowly, and the standards remain comparatively low. The cutoff score in the University of Central Missouri exit exam is below the lowest level of proficiency, for instance, and exemptions are made for students with learning disabilities or whose native language is something other than English. No one there in the last three years, or ever in Wisconsin, has been blocked from graduating because of a poor exit-test score.
In other places, students are being tested not to determine whether or not they should be allowed to graduate, but to check for strengths and weaknesses within specific majors or campuses. Some colleges and states, including Ohio, let students and their families see the results. Others don’t, or make them hard to find.
More and more states, including Missouri, Pennsylvania, and South Carolina, have approved using student exit-test results to determine how institutions are doing — though in most cases not yet to judge individual students or decide whether or not they should be allowed to get degrees — as one of the measures on which they base continued public university funding.
Nearly 50 colleges and universities in nine more states — Connecticut, Indiana, Kentucky, Massachusetts, Minnesota, Missouri, Oregon, Rhode Island, and Utah — are trying to develop a way to test students, before they graduate, in written communication and quantitative literacy, though so far this is also solely for the purpose of evaluating their own programs.
They’re treading delicately.
“We want to be very careful,” said Carnahan, who is coordinating the project. “We don’t want this process to end up where states are being ranked. What we hope to do in the short term is to only look at the data by sector across states and not identify institutions. That’s really critical until we can be sure that this paradigm we’re looking at is valid.”
Rather than wait for that to happen, some students and employers are taking things into their own hands.
“We can see that in the portfolios that are coming, where it’s not just, ‘Here’s my GPA,’ but, ‘Here’s my work as well, and what I’ve learned from my internships and classes,’” said Angela Taylor, director of quality enhancement at Texas Christian University. And some employers are now testing job applicants themselves to see if they know what their college degrees say they do.
All of the 42,000 students at Western Governors University, an online university founded by the governors of 19 states, have to prove what they know by getting at least a B on an assessment test — not only before receiving a degree, but before completing any individual course.
“We want to be sure that they leave with more than they started with,” said Joan Mitchell, the university’s spokeswoman.
“At some point in our world,” she said, “we’re going to have to look at, ‘Do you know it? Have you mastered it?’ It’s a whole cultural shift.”