New dropout rate in question

CALPADS delays complicate districts' work

The state Department of Education reported a sizable uptick in the dropout rate this week, leading to speculation that school budget cuts, the high school exit exam, or – pick your culprit – was to blame. But there was enough noise in the data to call into question the validity of the increase, let alone the cause behind it. And Russell Rumberger, an education professor at UC Santa Barbara and the state’s foremost authority on the dropout rate, says he doesn’t get too excited about one-year fluctuations anyway.

Already disturbingly high, especially for minorities, the latest four-year adjusted high school dropout rate, for 2008-09, is 21.7 percent, a full 2.8 percentage points above the 18.9 percent in 2007-08. The rates for Hispanics and African Americans also rose, to 26.9 percent and 36.9 percent respectively.

This is the third year that districts have used individual student identifier numbers, an integral part of the state’s longitudinal data system, to more accurately track where students go when they leave school. No longer willing to accept assumptions that students had transferred to other districts, or to take relatives’ word on what students were up to, the state required districts to verify whether students had enrolled in adult education or a continuation school; otherwise, the default assumption was that they had dropped out.
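To make that rule concrete, here is a minimal sketch of the default-to-dropout logic. The exit codes and function name are hypothetical, invented for illustration; CALPADS’ actual exit codes are not spelled out in this article.

```python
# Hypothetical illustration of the verification rule: a student who
# leaves is counted as a dropout unless the district can verify an
# approved destination. All codes below are invented for this sketch.
VERIFIED_EXITS = {"transfer_confirmed", "adult_education", "continuation_school"}

def classify_leaver(exit_code: str) -> str:
    """Default to 'dropout' when the exit cannot be verified."""
    return "verified_exit" if exit_code in VERIFIED_EXITS else "dropout"

print(classify_leaver("continuation_school"))  # verified_exit
print(classify_leaver("relative_says_moved"))  # dropout: word of mouth no longer counts
```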

This change, which held districts and schools accountable for the data, led to a bump in the dropout rate two years ago, from 16.8 percent to 21.1 percent.

But this was the first year that districts reported the data through CALPADS, the troubled database system that has given district personnel fits. Because of software problems, the state delayed CALPADS’ deployment by nearly a year, pushing back the point at which districts could upload the 2008-09 data needed to determine the dropout rate. According to Keric Ashley, director of the Department of Education’s Data Management Division, about a quarter of the state’s districts missed the initial deadline for submitting the data. They then had a week – instead of months – to correct the dropout report that the state sent back to them. That left districts like San Diego Unified, the state’s second largest, no time to track down 1,300 “lost transfers” that the state had classified as dropouts.

San Diego’s dropout rate rose from 9.2 percent to 23.5 percent. The Contra Costa Times reported that the state assigned Dublin Unified a 99 percent dropout rate; clearly, something was screwy. There may have been enough instances like these to account for the variation from last year, Ashley acknowledged.

Next year’s dropout rate, for 2009-10, will be the one to watch – and should be out in May, assuming CALPADS is back up and running. It will use the methodology required by the federal government, tracking a cohort of students across four years through their student identifiers, and it will significantly change how the rates are calculated.

Under the current system, the state bases enrollment on a specific date in October. That tends to inflate the dropout rate, according to Rumberger, because it misses students who come in and out of school throughout the year. The new cumulative rate will pick up students who enroll throughout the year at continuation schools, community day schools for at-risk students, juvenile halls, and county programs, and it should significantly lower the dropout rates at those institutions (see a brief that Rumberger wrote on this earlier this year).
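A toy calculation can show the mechanism Rumberger describes. This is only an illustration, not the state’s actual formula: the enrollment records, census date, and rate definitions below are all made up.

```python
from datetime import date

# Hypothetical enrollment spells at one continuation school in 2008-09:
# (first_day, last_day_or_None, dropped_out)
records = [
    (date(2008, 9, 2), None, False),              # enrolled all year, stayed
    (date(2008, 9, 2), date(2009, 1, 15), True),  # dropped out mid-year
    (date(2009, 2, 1), date(2009, 4, 30), True),  # arrived after the census, then dropped out
]

CENSUS_DAY = date(2008, 10, 1)

def point_in_time_rate(records):
    """Old method: dropouts from the whole year, divided by the
    number of students enrolled on the October census day."""
    on_census_day = [r for r in records
                     if r[0] <= CENSUS_DAY and (r[1] is None or r[1] >= CENSUS_DAY)]
    return sum(r[2] for r in records) / len(on_census_day)

def cumulative_rate(records):
    """New method: same numerator, but the denominator counts every
    student the school served at any point in the year."""
    return sum(r[2] for r in records) / len(records)

print(point_in_time_rate(records))  # 1.0   (2 dropouts / 2 enrolled in October)
print(cumulative_rate(records))     # ~0.67 (2 dropouts / 3 students served)
```

The student who arrives after the census day adds a dropout to the numerator but nothing to the old denominator, which is how a point-in-time count can overstate rates at schools with heavy mid-year turnover.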

Meanwhile, some district officials are scratching their heads over the latest numbers. Last week, I wrote about Stockton’s nationally acclaimed effort to dramatically cut its four-year dropout rate from 54 percent to 17.7 percent in 2007-08, below the state average. Well, the dropout rate shot back up to 34.8 percent in 2008-09, according to the state.

District officials insist that the latest report is inaccurate and told the Stockton Daily Record they would track down the students whom the state labeled as dropouts. They assumed the rate went up some, but not nearly that much.

11 Comments

  1. A review of the data for Alameda and Contra Costa counties shows something changed in reporting for 2008-09. Both counties reported a fivefold increase in dropouts identified in 2008-09. http://www.scribd.com/full/44920123?access_key=key-s93j46wes4hb8yqxgf4


  2. That Dublin Unified! What a failure.
     
    Some years ago those state peeps issued an alarmist press release, picked up by some of the press*, when supposedly 0% of San Francisco’s 7th-graders passed the 6-part physical fitness test (reported for just certain grades). I investigated a bit. Various non-alarming numbers of students didn’t pass 5 of the parts, but supposedly, 0 7th-graders passed the flexibility part. Since there was a professional contortionist in my daughter’s 7th-grade class that year, plus numerous highly flexible girl dancers if nothing else, I investigated further. An SFUSD functionary had mistyped some reporting form involving 1s and 0s. Actually, many SFUSD 7th-graders had passed the fitness tests, including the flexibility part. The people at the state don’t seem to have what it takes to go “gosh, could something be wrong here?” They just dutifully report what the numbers show.
    *The Chronicle education reporter said she could see there was an error, so she just didn’t report on it.


  3. Perhaps there’s something in the McKinsey report that can help the state resolve the problems with CALPADS.


  4. Hats off to Taylor and Rumberger for their succinct analysis. This is yet another facet of California’s embarrassing education data mess.
    Perhaps it’s time for a state law that prohibits the state department from publishing or posting data unless it has reason to believe the data are sound. Even when (if?) the state fixes the cumulative graduation data count SNAFU explained by Taylor and Rumberger, other fundamental data problems will remain, including a student identifier system that fails to track thousands of students every year and confounds high school registrars.
    Even if these underlying data collection problems are addressed, serious issues of interpretation remain. For example, if a school that enrolls high proportions of dropouts has a 30 percent dropout rate and the statewide average is 15 percent, does that mean the school is a “bad” one, or should the school be given credit for “recovering” students who previously dropped out (i.e., a “drop-in” rate to offset the dropout count)?
    This is yet another example of California building complex and costly data systems that don’t work. Worse yet, the state seemingly does so before thinking through whether the resulting data, if sound, would be of analytical value.
    Arguably, it’s the student data equivalent of the Gravina Island Bridge (the “bridge to nowhere”). Perhaps it’s time for California to simply sunset and halt all of its K-12 assessment and data collection efforts for a year or two and focus the freed-up resources on research to create a new and more robust system, or set of systems, purpose-built to serve real analytical needs and provide schools and teachers with useful data.


  5. Let’s be sure we all understand the difference between problems associated with the data system and problems associated with poor data quality or with calculating rates.

    Dublin reported 1,000 fewer students than were actually enrolled. An error in reporting accurate data is not a data system problem. CALPADS depends on 10,000 schools in the state submitting accurate data (and with no additional resources to do it). Accurate data are not free. It takes time, effort, and personnel to submit accurate data, and this is the first year of learning a new system. Over the last five years, the Superintendent has called upon the Legislature and the Governor for additional resources so school districts can do this more difficult student-level reporting.

    There is no SNAFU with the rates that are calculated – these aggregate rates are still just best estimates until we have the four years of longitudinal data required to calculate a cohort rate, which will be produced this spring.

    CALPADS has had enough of its own challenges in implementation without adding blame for things not under its control.


  6. If the CDE peeps had been taught critical thinking skills, they would have questioned that (and, previously, the fact that zero SFUSD 7th-graders could pass the flexibility test). It’s an on-the-ground view of what’s lacking in our school system!


  7. In case I was unclear, Keric Ashley is correct. Except for the turnaround time issue (a big issue) that San Diego and other districts faced, some of the problems — data entry or whatever — were at the district level.


  8. But common sense would dictate that obviously erroneous information not be sent out. If there isn’t enough flexibility to act upon common sense, maybe the value of the entire system should be rethought.


  9. The problem is with the “obviously erroneous” concept. What is obviously erroneous to anyone who knows the facts at the local level is not obvious to someone at a central location that has to deal with 10,000 schools. In other words, the easiest and fastest way to weed out errors is to have someone at the local level who knows the ballpark numbers off the cuff do the sanity check immediately after entry and before submission.
     
    Doing it at the central level is a never-ending process: much more laborious, expensive, and lengthy. That doesn’t mean it should not be done too, but it is not a replacement for a local sanity check. At the state level, if a number looks strange, one has to contact the local school to figure out whether it is a true mistake (and what the new number is, and find someone to *certify* the new number) or, perhaps, a strange-looking but still true number — schools close, schools are reconfigured, whatever. That takes time and effort. Multiply it by 10,000 times the error rate …


  10. Sure, there are many errors that would be hard to detect, but there’s a tipping point. A sudden surge or a sudden drop in a rate might not be obvious, but it is impossible to miss that a 99% dropout rate is a red flag.

    With the fitness test in which supposedly zero San Francisco 7th-graders passed the flexibility part, the CDE sent that information out in a press release, so it’s not like they failed to notice. I know, it’s a crazy idea — use common sense.
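For what it’s worth, the local “sanity check” that comments 9 and 10 call for can be as simple as a few threshold rules run before submission. The sketch below is hypothetical: the thresholds, field names, and Dublin-like figures are invented to mirror the discussion, not drawn from CALPADS.

```python
# Hypothetical pre-submission checks a district might run locally,
# where staff know the ballpark numbers. Thresholds are arbitrary.
def sanity_check(district, enrollment, dropouts, prior_enrollment):
    warnings = []
    if enrollment and dropouts / enrollment > 0.5:
        warnings.append(f"{district}: dropout rate above 50% -- verify before submitting")
    if prior_enrollment and abs(enrollment - prior_enrollment) / prior_enrollment > 0.25:
        warnings.append(f"{district}: enrollment shifted more than 25% year over year")
    return warnings

# A Dublin-like case: roughly 1,000 students missing from the count.
for w in sanity_check("Dublin Unified", enrollment=180, dropouts=178,
                      prior_enrollment=1200):
    print(w)
```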



"Darn, I wish I had read that over again before I hit send.” Don’t let this be your lament. To promote a civil dialogue, please be considerate, respectful and mindful of your tone. We encourage you to use your real name, but if you must use a nom de plume, stick with it. Anonymous postings will be removed.
