More than 200 colleges across the country, including all California State University campuses, have been using a controversial technique to judge the effectiveness of college instruction on student performance.
For the past three years, CSU's 23 campuses have participated in a little-publicized project using the Collegiate Learning Assessment, which employs a "value-added" approach to measuring students' critical thinking and writing skills on two exams.
The use of "value-added" methodology has been thrust into the public spotlight by a recent controversial series of reports from the Los Angeles Times, which sought to measure the effectiveness of elementary school teachers in boosting, or depressing, student test scores in Los Angeles.
The Times' approach and the one used by the CLA are very different. One major difference is that the Times attempted to measure the effectiveness of individual teachers, while the CLA looks at the effectiveness of an entire institution.
But the growing use of "value-added" approaches at both the K-12 and higher education levels has evoked a range of criticisms, some of them extremely vehement, highlighting multiple methodological and other issues that have yet to be fully resolved.
The faculty on at least one CSU campus – CSU Chico – has twice rejected the use of the CLA, even though the campus received excellent ratings on the test. In a resolution last year [PDF], the Academic Senate declared the test to be "an invalid means of determining the quality of a university education, and is therefore of no use in improving the quality of education."
The so-called "value-added" technique on the CLA is designed to assess the effectiveness of individual campuses by testing freshmen and then seniors to chart any progress.
According to LeighAnn Rodd, a program associate at the New York-based Council for Aid to Education, the nonprofit organization which administers the CLA, the goal is “to encourage a continuous cycle of improvement at higher education institutions."
It also offers faculty training "to engage faculty members in this cycle of improvement," Rodd told me in an e-mail.
The way it works: A hundred freshmen are given the tests in the fall, and 100 seniors are given the tests in the spring of each academic year.
This year, students will be tested on what the CLA describes as analytic reasoning and evaluation, problem solving, writing effectiveness and writing mechanics.
Students take an essay exam in which they are asked to critique an assertion or make an argument. Another exam, called a "performance task," requires students to marshal material from a range of sources in a similar but more expansive essay format.
Whether seniors score above, below or at the same level as the freshmen provides a measure of the "value" an institution has added, from where a student started out as a freshman to where he or she might end up as a senior.
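The cross-sectional comparison described above can be sketched in a few lines of code. This is an illustrative simplification with hypothetical scores, not the CLA's actual methodology, which also adjusts results for students' entering academic ability:

```python
# Simplified sketch of a cross-sectional "value-added" comparison.
# Scores below are hypothetical; the real CLA adjusts for entering
# academic ability rather than comparing raw means.

def value_added(freshman_scores, senior_scores):
    """Difference between mean senior and mean freshman scores."""
    freshman_mean = sum(freshman_scores) / len(freshman_scores)
    senior_mean = sum(senior_scores) / len(senior_scores)
    return senior_mean - freshman_mean

# Hypothetical scale scores: freshmen tested in the fall,
# seniors tested in the spring.
freshmen = [1050, 1100, 980, 1120, 1000]
seniors = [1180, 1230, 1150, 1090, 1200]

gain = value_added(freshmen, seniors)
print(gain)  # → 120.0; a positive gain is read as "value added"
```

Note that because the freshmen and seniors are different students, the comparison only makes sense if the two groups are genuinely comparable, which is exactly the concern educators raise later in this story.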
The argument for measuring teacher effectiveness is partially based on the view that tax dollars are paying teachers' salaries, so taxpayers have a right to know what they are getting in return. The same argument could apply to college professors at public universities.
It is an argument made most vigorously during the past decade by Margaret Spellings, former Secretary of Education under George W. Bush. She advocated making colleges and universities "accountable" in much the same way K-12 public schools are held accountable under the No Child Left Behind legislation. Her controversial Commission on the Future of Higher Education also roiled the academic world by proposing more systematic measurement of student outcomes, including using "value-added" approaches like the CLA.
But Roger Benjamin, the president of the Council for Aid to Education, which developed the CLA, completely rejects Spellings' approach. He told the Chronicle of Higher Education in a front page article (only available with a subscription) last month:
Everything that No Child Left Behind signified during the Bush administration, we operate 180 degrees from that. We don't want this to be a high-stakes test. These core abilities, these higher-order skills, are very important, and they're even more important in a knowledge economy.
Multiple choice tests cannot measure those kinds of skills, he said; only essay tests like the CLA can.
But educators at some CSU campuses are struggling with implementation of the test and the results it yields. During the coming week, Sally Murphy, a professor of communication at CSU East Bay and director of general education at the campus, will administer the test to 100 freshmen, all of whom are enrolled in one of the campus’ freshman seminars.
Murphy says she has no problems with the test itself. “It is a very intriguing measure of students’ ability to work with information from a variety of sources, to make critical assessments about the strength and quality of the data they are presented with," she said. “It is a cool test.”
She also has no problem with "being open about what value students are getting from their public education."
But she worries that the test is not integrated into the students’ coursework, and that students may not be motivated to do well on it. Some CSU students race through the test in 20 minutes, rather than taking the full 90 minutes allotted. "This test has not worked out problems regarding motivation of students who take it," she said.
Bill Loker, the dean of undergraduate education at Chico State, also worries about the problem of recruiting and motivating students to participate. "We get them any way we can, because the test has zero relevance for students," he told me yesterday. Unlike CSU East Bay, Chico State this year will award cash prizes to students who do best on the test, and will also raffle an iPad as a further incentive.
Murphy also believes the test is better suited to smaller private universities where students arrive as freshmen and graduate as seniors four years later.
By contrast, she said, the student body at CSU East Bay consists largely of students who have transferred from community colleges and whose average age is 26. She questions whether Cal State freshmen who take the test are representative of the student body as a whole, and whether they can reasonably be matched with the 28-to-30-year-old seniors who take the test in the spring and with whose results they are compared.
"Students come to us from a variety of backgrounds and transfer from a variety of schools," Murphy said. "So it is problematical to say whatever growth is demonstrated is a result of their experience at Cal State East Bay."
Chico's Loker has other concerns, including that the test is skewed to measuring skills learned by students in the social sciences, rather than those in the sciences or the humanities.
Loker says the CSU chancellor's office now allows each campus to decide how often to administer the test – annually, every other year, or every three years. But campuses cannot back out from participating altogether.
CSU is one of more than 500 institutions that participate in the Voluntary System of Accountability, which posts a wealth of data from each school in its "College Portraits," including its results on the CLA.
I pulled out the results on the CLA for each CSU campus.
Students at four out of 19 CSU campuses – San Francisco State University, CSU Chico, CSU Long Beach, and CSU Los Angeles – scored "well above what is expected" on both parts of the CLA test. Another seven campuses scored at a "what is expected" level. Others had mixed results depending on the test. Three campuses (CSU Bakersfield, CSU Fresno, and CSU Monterey Bay) scored "below" or "well below" what is expected on one of the two portions of the test.
Here's the roundup:
Cal Poly Pomona – student scores are what would be expected on performance tasks and analytic writing tests.
CSU Bakersfield – student scores are below what would be expected on performance tasks; what would be expected on analytic writing tests.
CSU Chico – student scores are well above what would be expected on performance and analytic writing tests.
CSU Dominguez Hills – student scores are above what would be expected on performance tasks; well above what would be expected on analytic writing tests.
CSU Fresno – student scores are what would be expected on performance tasks; well below what would be expected on analytic writing tests.
CSU Fullerton – student scores are what would be expected on both performance tasks and analytic writing tests.
CSU Long Beach – student scores are well above what would be expected on both performance tasks and analytic writing tests.
CSU Los Angeles – student scores are well above what would be expected on both performance tasks and analytic writing tests.
CSU Monterey Bay – student scores are below what would be expected on performance tasks; what would be expected on analytic writing tests.
CSU Northridge – student scores are what would be expected on both performance tasks and analytic writing tests.
CSU Sacramento – student scores are what would be expected on performance tasks, and above what would be expected on analytic writing tests.
CSU San Bernardino – student scores are above what would be expected on performance tasks, and well above what would be expected on analytic writing tests.
CSU San Marcos – student scores are what would be expected on performance tasks and analytic writing tests.
CSU Stanislaus – student scores are what would be expected on performance tasks, and above what would be expected on analytic writing tests.