State’s hardly ‘persistently lowest list’

Analysis: Most schools aren't among worst

Most of the 96 schools that the state has designated among the 5 percent “persistently lowest performing” shouldn’t be on the list. Yet they could qualify for the $69 million that the Department of Education has allotted California for schools to turn themselves around.

That’s the conclusion of Doug McRae, a retired test publisher, occasional TOP-Ed contributor and regular attendee of State School Board meetings. He and others urged the State Board to revise the list of schools, but the Board, feeling pressed by a federal deadline, approved it last month. Now, McRae has run the data.

Assuming they’ll apply for the money, the schools will be eligible for School Improvement Grants if they agree to one of four turnaround strategies dictated by the feds. There will likely be enough money for about 30 schools to get as much as $2 million each next year.

The 96 are the schools not funded last year from the 188 schools on the persistently lowest performing list. The original methodology for choosing schools had serious flaws, which the Ed Department hasn’t corrected.

McRae updated the list to include this year’s API scores. He found that 42 of the 96 aren’t in Decile 1, the lowest performing 10 percent of schools (about 1,000 in the state). Thirty are Decile 2, eight are Decile 3, three are Decile 4, and one, a high school in Hacienda La Puente Unified in Southern California with an API score of nearly 700, is a Decile 5 school. As McRae noted, it’s hard to take seriously a list of the lowest 5 percent of schools when more than 40 percent of them fall in Deciles 2 to 5.
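For readers unfamiliar with the decile labels above: a decile ranking just sorts all schools by API score and splits them into ten equal-sized groups, with Decile 1 the lowest-scoring tenth. The sketch below illustrates only that basic idea; the school names and scores are invented, and the CDE’s actual decile calculation is more involved than a simple sort.

```python
def assign_deciles(schools):
    """schools: list of (name, api_score) pairs.
    Returns {name: decile}, where Decile 1 is the lowest-scoring tenth."""
    ranked = sorted(schools, key=lambda s: s[1])
    n = len(ranked)
    return {name: (rank * 10) // n + 1
            for rank, (name, _) in enumerate(ranked)}

# Ten hypothetical schools with made-up API scores.
sample = [("A", 420), ("B", 455), ("C", 480), ("D", 510), ("E", 540),
          ("F", 600), ("G", 650), ("H", 698), ("I", 750), ("J", 820)]
deciles = assign_deciles(sample)
# With ten schools, each lands in its own decile: A -> 1, J -> 10.
```

The point of McRae’s critique follows directly: a credible “lowest 5 percent” list should draw almost entirely from Decile 1 of a ranking like this, not from Deciles 2 through 5.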

On top of that, a bunch of schools made significant gains in their API scores last year to meet the state’s target for a five-year improvement. “The bottom line is that 60 to 70 percent of the schools on the 2010 SIG eligibility list should not be on a credible PLAS (Persistently Lowest Achieving Schools) list if we use the most recent current status and recent progress information,” McRae wrote. As a result, many of these schools could end up getting money they don’t deserve.

Those Decile 1 schools not on the list include the poster child of troubled schools in Los Angeles Unified: Fremont High. (It was exempted because its API rose 50 points in 5 years – something even barely breathing schools can do when their score is so low to start with.)

Last year, the State Board allotted $415 million in three-year grants of up to $6 million each to 92 schools. Districts don’t have to apply for the money, and many may not this year, because so far Congress has appropriated money for one year. Districts may not want to commit to wrenching changes without more money guaranteed.

The Department of Education has said schools would compete for the money, though it hasn’t disclosed the selection criteria. If enough deserving schools don’t apply, other schools will suck up the money.

Here’s another wrinkle: If Congress does come up with funds for future years, the state will feel obligated to continue funding this year’s winners. So it probably won’t get around to revising the list for two more years.

Approving the SIG list that it inherited was the first item facing a State Board with seven new members last month. Deb Sigman, the state deputy superintendent in charge of curriculum, learning, and accountability, told the board that the state faced a Jan. 31 deadline to submit the list or risk losing the money. She also said the Department was obligated to accept criteria for the list that the Legislature set into law.

McRae argues that the law was clearly worded to allow the State Board and Superintendent of Public Instruction to amend the list and that the Department could have updated the list within a week, then easily negotiated a deadline extension with the feds. My sense is he’s probably right.

I passed on McRae’s analysis to the Department two days ago for comment. I haven’t heard back.

Go here for McRae’s spreadsheet with the eligible schools. Go down to the bottom of the list for a description of each column.


18 Comments

  1. John, why does it bother you so much that federal money may go to some schools that may not be academically low enough to qualify, while you’re happy as a clam to throw away an estimated $1.6 billion of state money on testing companies, publishers and the PD circuit to replace the CA standards, one of the few state standards that does not need replacing?  It seems a double standard: Save all money possible when it comes to schools; be a spendthrift when it comes to education service industries.  Is someone lobbying you?

  2. John: What would we do without Doug? When will this insanity stop? Only 2.8% of all schools now score below 500 on the API (compared to 21% 10 years ago), and the average API score of 1st-decile schools today is above the average API score of 5th-decile schools 10 years ago; we need another measure. There are some really bad schools out there, but let’s identify them, not with this silly political list put forth by the so-called “reform” school board. John

  3. Further evidence that the school ranking schemes are not a good idea.  An EdSector report in the past year or two (Aldeman) showed a high school with a “D” on the state report card had its graduates outperforming the graduates of an “A” high school once they go to college.  Now we see that the lowest aren’t all the lowest in California.  Then, we mandate rigid and unproven “reforms” as school turnaround models, and a few years down the line wonder why nothing’s getting better.  Why are so many persistently underperforming schools in Oakland the same ones that were reconstituted years ago when they were persistently underperforming?  Could there be some underlying causes here that really aren’t about the principal, or the teachers, or whether or not the school is a charter school?
    And if these rankings are so sketchy, isn’t the Parent Trigger really a bit loosely constructed?  Of course, the people who really like the Parent Trigger aren’t interested in school quality, or even parent empowerment.  If they were, they’d be pushing for reforms that support school quality rather than finding “triggers” that threaten schools, often for failures beyond their control and failures that shouldn’t even be labeled failures.  Why not push for reforms that actually give parents more voice and participation in schools?  Why not study existing models and publicize the best ideas?  Will parents be more empowered by “triggering” a takeover by a company they didn’t select, to be run by a board they didn’t elect and which seats none of them, and very well might hold its meetings in private?
    The whole thing reeks.  Rankings, triggers, turnaround models, value-added measures, racing to the top with no child left behind… all diversions from the much, much larger systemic needs that we aren’t willing to address or fund.

  4. Point of clarification – Aldeman’s study looked at high schools in Florida. Please forgive the vague reference to “the state report card.”

  5. David,
     
    You say this is further evidence that “the school ranking schemes are not a good idea.” That may or may not be true, but the only evidence we see here says only that badly done rankings — of anything — are not a good idea.
     
    California’s API model is not inherently a bad one, and ranking schools by deciles, along with the availability of a “comparable schools ranking,” seems to me, in fact, like a very good idea. Together they give parents an accessible tool to get a rough sense of where their school stands.
     
    But you immediately carry the CDE’s incompetence in properly using the data into the larger non sequitur that school rankings in general are not a good idea. Why am I not surprised?
     
    But I am surprised by the way you jump from CDE incompetence to the Parent Trigger. I am even more surprised at how you claim to know what its supporters really(?) think.
    “Of course, the people who really like the Parent Trigger aren’t interested in school quality, or even parent empowerment.” Of course? “If they were, they’d be pushing for reforms that support school quality rather than finding ‘triggers’ that threaten schools.” Really? You mean if they were to think the way you think, that would make them – finally – “interested in school quality”? Didn’t they teach you any better in ed school about “assuming” what other people think?
     
    Oh, and by the way — how exactly did Compton Unified get to be like it is over many decades? Because it was run by “uninterested” parents, or by “interested” educators? Can you remind us, please?

  6. Compton Unified got to be the way it is largely because it’s a very high-poverty community. The challenges that poses for schools become crushing, and that does foster dysfunction in the organization. That’s why we see school districts like Vallejo and Richmond in crisis, but not Lafayette or Palo Alto.

  7. I know the Compton school district’s history is a mess. But you don’t see that in districts that aren’t high-poverty. It would be extreme to compare it to the city of Bell, but same general principle — that wouldn’t have happened in a high-wealth community with an empowered, educated populace.

  8. Oh come on, Ze’ev,
    You want to pick at each other’s writing styles in blog comments?  You and I have gone back and forth enough in public and private that I’d hope you could refrain from the “didn’t they teach you any better…” type of snarkiness.  Did my strong feelings get in the way of qualifying statements?  Perhaps so.  I should have said, “Based on their actions, it seems to me that the authors and supporters of the Parent Trigger have misplaced priorities.”  I hope that satisfies you.
    I suppose that we could have a debate about the merits of rankings – I am not inclined to support the idea when the measures used are so narrow.  Reading and math scores are a small measure of what schools are supposed to do, and really, what does the API tell us that we don’t already know?  I’m not entirely against testing, but it’s the use of the tests that I question, and here’s a case where we end up proving – once again – that wealth matters, and we punish those who have the least.  That was the lesson I took away from the EdSector report.  If you change the measures – which no one is talking about doing – you totally upend the results.  If you think ranking is really important, propose a way to avoid this inevitable pitfall.
    Even when we get to the comparable schools deciles, well, anything can be divided into ten parts, and ten percent will always be the bottom ten percent.  You could have ten world class sprinters in a race, and someone will finish tenth.  You could put ten average runners in a race, and someone will finish first.  How meaningful is that?  In terms of API, what really is the variability among the comparable schools?  What do you really know from the numbers?  Can you look at the numbers and tell me where you’d want your children in school?  Should we go for the 4/10, 5/5, or 6/1?  We measure so little of what we should care about, and then make too many conclusions based on poorly understood statistical distinctions, without really knowing or communicating what they mean.  I used to teach at a 10/10 school that I thought was inferior to another school in the same district that was around 7 or 8 on the overall API ranking (not sure what the comparable schools API rank was).  Is such a thing possible?
     

  9. Caroline — Are you implying that it takes a cohesive, motivated community of college-educated, upper middle class professionals to keep a district’s administrators and union from ruining the district?  I have to agree that it certainly helps to have people like that keeping an eye on things, volunteering in the classrooms, and raising money for the schools.  But it doesn’t always work.  I know.  I live in a district where even those forces have not been able to divert our sad educational monopoly from its steady march toward failure.

  10. That’s not an ethical way to argue, Jim Mills — willfully misrepresenting my words is dishonest.
     
    No, as you are well aware, that’s not what I’m saying. I’m saying that a district struggling with a critical mass of high-poverty, at-risk students, and the problems they so often bring to school with them, is vulnerable to ills and evils from management lapses to structural flaws to incompetence to corruption. A more privileged district that doesn’t face the same challenges is far less vulnerable.
     
     


  11. The problems with the persistently lowest performing list quickly illustrate why California needs to overhaul its accountability system. I disagree with Doug that we should focus only on the achievement level of the school. We all know that there is a strong correlation between achievement and demographics, so too often lower-decile schools tend to blame the kids for their low performance. The state needs to begin to incorporate measures of individual student year-to-year growth into the calculation of school performance. The data are there; it is time to start using them. Other states used their accountability growth models to determine their persistently lowest performing schools; it is time that we did too.
    Measuring growth is much more accurate than the progress measure used to build the current list. A school can make progress or decline simply through changes in attendance boundaries (this is happening a lot right now in LA as all of the schools that Romer built during his tenure come on line) or other adjustments to the students in a cohort. Growth measures cannot be gamed in the same way. As the new State Board starts to address the renegotiation of the STAR contract, measuring growth should be central to that discussion.
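    The distinction the commenter is drawing can be made concrete with a toy sketch. The student IDs and scores below are invented, and real API and growth-model calculations are far more involved; the point is only that a school-level “progress” number compares two different groups of students, while a growth measure follows the same students across years.

```python
def school_progress(last_year, this_year):
    """School-level 'progress': change in the school-wide average score,
    computed over whoever happens to be enrolled each year."""
    avg = lambda scores: sum(scores.values()) / len(scores)
    return avg(this_year) - avg(last_year)

def student_growth(last_year, this_year):
    """Individual growth: average score change for students tested in
    BOTH years, so enrollment churn doesn't move the number."""
    matched = set(last_year) & set(this_year)
    return sum(this_year[s] - last_year[s] for s in matched) / len(matched)

# Same three returning students, but a boundary change adds two
# higher-scoring students in year two.
y1 = {"s1": 300, "s2": 320, "s3": 340}
y2 = {"s1": 305, "s2": 325, "s3": 345, "s4": 700, "s5": 720}

# school_progress(y1, y2) jumps to 159.0 points, driven entirely by the
# new arrivals; student_growth(y1, y2) is only 5.0, the real per-student gain.
```

    This is exactly the gaming-by-boundary-change problem the comment describes: the school average leaps even though no returning student improved by more than five points.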
    As for John Mockler’s comment that we should celebrate the dramatic change in the number of schools below API 500: yes, schools have improved over the last decade, and some of that improvement is real, not just the result of the rebenching that has happened when the API or the test has changed. But let’s put in perspective what it actually means to score an API of 500. An API of 500 is equivalent to the average student at the school scoring below basic. To get this score, a student must get 21 questions correct out of 50.
    A quick glance shows Henry Clay Middle as the lowest performing school in LA Unified on Doug’s list, with an API of 538 — 58 percent of its 7th grade students score below or far below basic in English, and 67 percent in math. We should not celebrate this type of performance even if it is higher than the school’s API a decade ago (440). When only a third of kids can score basic or above in math, the school clearly needs substantial change.

  12. Rob:  For the record, I do not advocate that decisions be based on achievement status only, regardless of metric [API or AYP].  I would advocate decisions be based on both current achievement status and recent achievement gain.  If we can operationalize “gain” by true longitudinal growth measures (like I presume you advocate), great.  But, if the obstacles get in the way of near term valid and reliable true longitudinal growth measures, then operationalizing “gain” via improvement systems like the current API system is the way to go until true longitudinal growth systems can be developed.

Trackbacks

  1. Tweets that mention State’s hardly ‘persistently lowest list’ | Thoughts on Public Education -- Topsy.com
  2. k12reboot » Does CA Know Where Its Lowest-Performing Schools Are?
  3. Seeking Something Better Than the Trigger « InterACT
  4. SIGnificant improvementS | Thoughts on Public Education

"Darn, I wish I had read that over again before I hit send.” Don’t let this be your lament. To promote a civil dialogue, please be considerate, respectful and mindful of your tone. We encourage you to use your real name, but if you must use a nom de plume, stick with it. Anonymous postings will be removed.

10.1Assessments(37)
2010 elections(16)
2012 election(16)
A to G Curriculum(27)
Achievement Gap(38)
Adequacy suit(19)
Adult education(1)
Advocacy organizations(20)
Blog info(5)
CALPADS(32)
Career academies(20)
CELDT(2)
Character education(2)
Charters(82)
Common Core standards(71)
Community Colleges(66)
Data(25)
Did You Know(16)
Disabilities education(3)
Dropout prevention(11)
© Thoughts on Public Education 2014 | Home | Terms of Use | Site Map | Contact Us