If California’s graduation and dropout rates were a pair of jeans, they’d have an “irregular” sticker on them. Sure, social science research has a reputation for squishy results, but it’s still a bit jarring when a renowned researcher describes certain data as “bogus.” Although he said it with an ironic laugh, that was the first word that popped into Russell Rumberger’s mind when I asked him about the California high school graduation rates in “Diplomas Count,” an annual report from Education Week.
Rumberger is an education professor in the Gevirtz Graduate School of Education at UC Santa Barbara, where he founded the California Dropout Research Project and has been conducting research on school dropouts for 25 years. His new book on the subject, Dropping Out: Why Students Drop Out of High School and What Can Be Done About It, is due out later this year.
A sharp dip followed by a sharper rise
The statistics in question are in the California supplement of the report (available for purchase from EdWeek here), showing changes in graduation rates between 1998 and 2008. For much of that period the rates are fairly stable, shifting by anywhere from a few tenths of a point to a little over one percentage point from year to year. Over the course of the decade, California’s graduation rate increased from 67.5 percent to 73 percent, staying close to the national average. But in between there’s a dramatic shakeup and recovery.
It started in 2005, when the state’s graduation rate was 70.1 percent. Within two years, by 2007, it had fallen to 62.7 percent, the sharpest decline in the ten-year period. A year later, in 2008, it surged back up by more than ten points to 73 percent. “Pretty wild,” said Rumberger. “The bottom line is all these numbers are estimates, and estimates have errors.”
California runs its own numbers, and they’re quite different. Actually, California runs two sets of numbers,
but more on that in a moment. The official statistics that the state sends to the U.S. Department of Education for No Child Left Behind reporting show a downward trend between 1998 and 2008, falling from a high of 87 percent to about 80 percent. But even at their lowest point, those graduation rates are still higher than EdWeek’s figures for every year in the period.
And just to complicate things a little more, the second set of numbers that California prepares has graduation rates heading up after a 2006 downturn, but coming in lower than EdWeek’s. So we have three sets of calculations, all using data from the same source, yielding different results and different trends.
It’s all in the formula
It’s no better nationally. Rumberger recently served on a committee of the National Research Council and National Academy of Education that examined the various measurements that states use to determine their graduation and dropout rates. The committee found “widespread disagreement” about the best measurements and their uses. Looking at the year 2005, the committee found three different school completion rates published by the U.S. Department of Education, Editorial Projects in Education (a project of EdWeek), and the Annie E. Casey Foundation. Some researchers suspect the problem is with the way the rates are calculated and not with the numbers used to make those calculations.
In its final report, the panel found most formulas to be flawed and occasionally skewed by politics. “At a time when policy makers are vitally interested in tracking the incidence of dropping out of school, they are faced with choosing among substantially discrepant estimates that would lead them to different conclusions regarding both the size of the drop out problem and how it has changed in recent years,” they wrote.
The most common calculations are aggregate counts, the cumulative promotion index (CPI), and cohorts. Aggregate is the simplest and bluntest, comparing the number of graduates to the number of ninth graders four years earlier. The CPI is based on how many students progress from grade 9 to 10, 10 to 11, 11 to 12, and then ultimately graduate. Karl Scheff, with the California Department of Education, says CPI is better than aggregate but still doesn’t get at what happens to individual students. That’s the true cohort measure, and it’s the brass ring of data.
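To see why the formulas matter, here’s a rough sketch of the aggregate and CPI calculations described above, using invented enrollment figures (not California’s actual data, and a simplified reading of each formula):

```python
# Hypothetical enrollment counts -- illustrative only, not real state data.

# Aggregate rate: graduates this year divided by ninth graders four years earlier.
ninth_graders_4_years_ago = 1000
graduates_this_year = 700
aggregate_rate = graduates_this_year / ninth_graders_4_years_ago
print(f"aggregate: {aggregate_rate:.1%}")  # 70.0%

# CPI: the product of year-over-year promotion ratios, estimating the chance
# a ninth grader clears every hurdle through graduation.
this_year = {9: 1000, 10: 950, 11: 900, 12: 850, "grads": 800}
last_year = {9: 1050, 10: 1000, 11: 940, 12: 880}

cpi = (this_year[10] / last_year[9]           # grade 9 -> 10 promotion
       * this_year[11] / last_year[10]        # grade 10 -> 11
       * this_year[12] / last_year[11]        # grade 11 -> 12
       * this_year["grads"] / this_year[12])  # grade 12 -> diploma
print(f"CPI: {cpi:.1%}")  # 69.3%
```

Even with the same enrollment tables, the two formulas land on different rates, because neither one follows any actual student; they only compare grade-level headcounts.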
Calling on CALPADS
California has been collecting this data for five years, ever since assigning individual student identifiers as part of the CALPADS student data system. Up until now, even though districts have been sending in student-level data, the state has still been aggregating it. “We just added them up and plugged them into the traditional calculation,” said Scheff. This year should be the first time they have a full cohort of students from grade 9 through graduation, but now funding for CALPADS is up in the air.
Gov. Brown proposed cutting the budget for CALPADS in his May revise. Both the Assembly and Senate have restored the money, but Scheff says it’s not clear whether the Governor will veto it. With CALPADS, state officials will have a robust data source that can track students who move out of state, transfer to a private school, or spend a fifth year in high school, giving a much more precise look at graduation and dropout rates.
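A true cohort measure of the kind CALPADS enables follows each student by ID, so that transfers out of the system shrink the denominator instead of being miscounted as dropouts. A minimal sketch, with invented student records (the field names and status labels are assumptions, not the CALPADS schema):

```python
# Invented four-year outcomes for one ninth-grade cohort, keyed by student ID.
cohort = {
    "S001": "graduated",
    "S002": "graduated",
    "S003": "dropped_out",
    "S004": "transferred_out",  # e.g., moved out of state or to a private school
    "S005": "fifth_year",       # still enrolled; not a graduate, not a dropout
    "S006": "graduated",
}

grads = sum(1 for status in cohort.values() if status == "graduated")
transfers = sum(1 for status in cohort.values() if status == "transferred_out")

# Verified transfers leave the cohort; everyone else stays in the denominator.
denominator = len(cohort) - transfers
cohort_grad_rate = grads / denominator
print(f"cohort graduation rate: {cohort_grad_rate:.1%}")  # 60.0%
```

The precision comes entirely from the individual identifiers: without them, the fifth-year student and the out-of-state transfer are indistinguishable from dropouts in the aggregate counts.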
Until then, we’ll have to sort through two, three, or four reports, each with its own interpretation of the data. Rumberger tries to be as precise as possible when giving presentations. “I’ll show the two state rates,” he said, “and what I tell people is the actual rate is somewhere in the middle.”
A More Accurate Measure of California’s Dropout Rate, June 2010, California Dropout Research Project.
Independent Evaluation of the California High School Exit Examination: 2010 Evaluation Report, Volume 1, Oct. 2010, Human Resources Research Organization.
Public School Graduates and Dropouts From the Common Core of Data: School Year 2008–2009, National Center for Education Statistics.