California switches test consortiums

Kirst sees upside with SMARTER Balanced

In a decision with long-term policy implications, California has switched membership in the state-led consortiums that will create the standardized tests for the new nationwide Common Core math and English Language Arts standards.

California will become one of 18 governing states in the 30-state SMARTER Balanced Assessment Consortium. It is most closely identified with one of its chief advisers, Stanford University School of Education Professor and author Linda Darling-Hammond, an advocate for teachers and a sharp critic of the current generation of standardized tests.

Gov. Jerry Brown, Supt. of Public Instruction Tom Torlakson, and State Board of Education President Michael Kirst signed a memorandum of understanding committing California to SMARTER Balanced. Last year, their predecessors in office had signed up California as a participating member in the other consortium, the Partnership for Assessment of Readiness for College and Careers (PARCC), which has two dozen states.

Choice of observing or shaping

California had the option of choosing one consortium in a decision-making capacity or joining both as observers. A number of education leaders had recommended the latter option for now, until it becomes clearer which consortium could better deliver on its promises. A half dozen states are members of both.

Others argued California should be in the driver’s seat. “I think we should be in a leadership position,” Torlakson said, “so that we can better shape the outcome.” As a governing member, California can vote on decisions and have representatives on various technical and policy committees.

The federal government has awarded the two consortiums $360 million to develop the assessments by 2014-15, a daunting schedule that leaps past the customary process of spending years fleshing out standards through curriculum frameworks and developing instructional materials and teacher training before tests are developed. Both consortiums will develop annual tests for grades 3 through 8 and grade 11. Both are committed to creating tests that measure whether students are on a successful path toward college and/or a career. Both will use multiple-choice questions for part of the tests. Both will have a common scale of measurement and cut points for proficiency that will allow cross-state comparisons – not readily possible now under independently developed state assessments.

But there are distinct geographical and philosophical differences between the two consortiums, and it’s significant that Brown, who has expressed skepticism over California’s testing system, has allied the state with SMARTER Balanced.

Computer-adaptive testing

As I’ve noted before, both consortiums say they will be creating the next generation of assessments using computers and testing higher-level thinking. But SMARTER Balanced emphasizes performance measures – more in-depth exercises and demonstrations of higher skills. These are more complex, and potentially harder to grade and to make individual class and school comparisons for high-stakes accountability purposes.

SMARTER Balanced also will use computer-adaptive testing, which, by choosing questions based on students’ previous answers, can better measure individual students’ knowledge and skills. Computer-adaptive testing has been used extensively in higher education but not in K-12 at this scale. It will require a much larger bank of questions than standard tests do, as well as well-equipped computer labs in every school.
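For readers curious how adaptive selection works mechanically, here is a toy sketch. Everything in it is invented for illustration – the item names, the difficulty values, the nearest-difficulty selection rule, and the halving-step ability update are all hypothetical simplifications; operational programs like the consortium’s would use full item-response-theory models drawing on large, secure item banks.

```python
# Toy sketch of computer-adaptive item selection (illustrative only).
# Item difficulties and the ability-update rule below are hypothetical
# simplifications, not how SMARTER Balanced's tests actually work.

def next_item(items, ability, asked):
    """Pick the unasked item whose difficulty is closest to the current ability estimate."""
    candidates = [name for name in items if name not in asked]
    return min(candidates, key=lambda name: abs(items[name] - ability))

def run_adaptive_test(items, answers, ability=0.0, step=1.0):
    """Administer every item adaptively: raise the ability estimate after a
    correct answer, lower it after a wrong one, halving the step each time
    so the estimate converges."""
    asked = []
    while len(asked) < len(items):
        item = next_item(items, ability, asked)
        asked.append(item)
        ability += step if answers[item] else -step
        step /= 2
    return ability, asked

# A hypothetical three-item bank with difficulties on an arbitrary scale
items = {"easy": -1.0, "medium": 0.0, "hard": 1.0}
answers = {"easy": True, "medium": True, "hard": False}
ability, asked = run_adaptive_test(items, answers)
# The test starts at the middle item, moves harder after a right answer,
# then easier after a wrong one: asked == ["medium", "hard", "easy"]
```

The point of the sketch is the feedback loop: each answer changes which question comes next, which is why the approach needs many more items in the bank than a fixed-form test.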

Torlakson acknowledged that computer-adaptive testing may be a challenge in California, which he said is ranked 47th in the nation in its use of technology. But he said he plans a technology initiative that will call on businesses like Comcast to assist schools and will include technology components in the next state school bond issue.

PARCC, which will be managed by Washington, D.C.-based Achieve, is taking a more traditional approach, with a series of periodic tests, called through-course assessments, leading to an end-of-year test. This approach has drawn considerable criticism lately from those who fear that through-course tests will regiment state curriculums, and PARCC is said to be rethinking it. PARCC plans to base its college readiness measures on California’s Early Assessment Program, an 11th grade test used to gauge readiness for CSU. That’s one reason the seven superintendents who led the state’s second-round Race to the Top application endorsed PARCC. “PARCC is designing a system that will emphasize college and career preparation, a necessary raising of the bar in today’s competitive global economy,” said Los Angeles Unified Superintendent John Deasy in an unpublished opinion piece.

SMARTER Balanced is also more teacher-oriented, which is why the California Teachers Association endorsed it and Torlakson, a CTA ally, liked it as well. Darling-Hammond said yesterday that the consortium is committed to working with teachers in all phases of test development and that SMARTER Balanced will provide instructional supports for teachers and formative assessments – diagnostic exams that let teachers know how students are progressing.

“SMARTER Balanced will refocus us on learning and not just testing,” Darling-Hammond said.

But some see the linking of formative and end-of-year or summative assessments as a weakness, not a strength, and as a factor in undermining the accountability value of testing.

Doug McRae, a retired test publisher and occasional contributor to TOP-Ed, called the selection of SMARTER Balanced “a major turning point for California” and a move “toward instructionally-based assessment and away from accountability-based assessment” – a “softer approach.”

McRae was among those who called for participating in both consortiums, as a way to learn from both and as a safeguard; he thinks both will have difficulty meeting their commitments in time. And McRae said he was disappointed that there was no analysis or vetting of the decision in public. Other states had advisory committees that were involved in the choice of consortiums, he said.

The State Board held a lengthy hearing in March, at which representatives of both consortiums made presentations and the Board heard public testimony. But the choice was not formally brought to the Board. Only the signature of Kirst, as president, was required. (Kirst, traveling to New England Thursday, could not be reached for comment.)

UPDATE: Kirst, reached today (the poor guy was waylaid at Dulles yesterday and never made it to his 50th college reunion), characterized SMARTER Balanced as “the best fit for California at this time.” The computer-adaptive technology “is a gamble but has a big upside” because California has a range of student body backgrounds with large numbers of students without strong English language skills. “This will tell us what students know as opposed to what they don’t know.”  SMARTER Balanced also showed in its presentation more understanding of issues for English language learners, he said.

“SMARTER Balanced seemed more adventuresome in trying to prove deep learning in designing the assessment,” Kirst said. He said he wasn’t sure that the consortium is more teacher-centric but agreed that teachers certainly perceive it that way. And he said that he, Brown and Torlakson did get a lot of feedback from the public, compared with last year, when the governor, state superintendent and state board president signed the MOU without any public participation.

PARCC’s membership tilts toward the East Coast – Florida, Massachusetts, New York, and New Jersey – and the large Midwestern states, while California joins a distinctly Western and New England (minus Massachusetts) membership in SMARTER Balanced. The consortium will be managed by San Francisco-based WestEd and has familiar ed policy academicians serving on advisory committees. Besides Darling-Hammond, they include Jamal Abedi, UC Davis; Ed Haertel, Stanford University; Joan Herman, National Center for Research on Evaluation, Standards, and Student Testing; and James Popham, UCLA.


15 Comments

  1. It really doesn’t matter how “good” the new tests are. As long as we maintain the same policies of high-stakes testing founded under NCLB, we’ll have a corrupted education system. I’m sure districts, schools, and teachers will figure out how to game this test as they have with other standardized tests – I know of some who are already working with the Common Core standards for just that purpose. Therefore, as long as these tests are high stakes for teachers, schools, districts, and states, the focus will remain on testing and not on learning. It’s about the policies, not the tests.


  2. I see that the state of Washington is hosting the SMARTER Balanced Assessment Consortium. If I remember correctly, Washington is considered the lead state for this consortium. However, Washington has not yet adopted the Common Core standards. I’d love to hear someone from SMARTER Balanced say that fact is irrelevant.


  3. The $360 million is not “federal” only – it is substantially bolstered by the Gates Foundation to help Obama keep two wars going. The money is a nice jobs program for test developers, but we’ll need about 50 times that much to implement and maintain the tests in the schools and to have schools in the first place, with paid staff, etc. In this light it’s not clear why John tags Darling-Hammond an “advocate for teachers,” since teachers need a paycheck before they need assessments.


  4. Here we go again, bickering about what kind of testing is preferable when any testing is part of facilitating the nationalization of schools, a bad idea in and of itself. Not much different from spending money, time, and energy deciding which kind of prescribed “elixir” is best to nourish a debilitating agenda to take any remaining “local” out of local schools for the transformation to nationalized/internationalized schools. Once that is accomplished, just how much influence does anyone in the local trenches, including teachers, think they will have over anything having to do with how they exercise their profession? It may sound good to have such input into the process of choosing what kind of national testing to approve, but for teachers it may mean participating in choosing the colors of their own future straitjackets.


  5. Interesting point, Paul. This from Chris Barron, spokesman for SMARTER Balanced:

    “States aren’t required to adopt common core until the end of 2011 to be a part of Smarter Balanced. Washington, which provisionally adopted common core in July 2010, will likely adopt common core this July.
    Washington is the fiscal agent for Smarter Balanced, which is why people perceive it to be the lead state. However, the 18 governing states all have equal voting rights.”

     


  6. A bold, positive decision for California.  Committing the state to Smarter Balanced positions it to influence the transition of what began as an accountability regime into a learning utility: tools for teachers and kids.  This should be good news for those who think that the current tests don’t accurately measure what students know or what we want them to know.
    The adaptive technology being built into Smarter Balanced is also part of the design of the most sophisticated on-line and blended learning developments.
    While I am sympathetic with those who ask why we would be spending a dime developing assessments when classrooms are starving, I know that building the systemic capacity to learn is the way forward.
    Educators in California, and particularly the California Teachers Association, have been handed a significant leadership opportunity.


  7. Both consortiums have too much money to spend in too little time with too vague a plan.
     
    I read through the proposals for both plans, and they have great ideas, but their timeline is very aggressive for the amount of development that remains – and there are no resources out of that devoted to getting enough computers and bandwidth in schools to make either deployable on a nationwide basis.


  8. @el has it right. Both consortia barely know what they are doing, and the probability of either one coming up with anything close to what it promises is low. Which, obviously, doesn’t stop them in the meantime from burning the $350M of federal gravy – sorry, “stimulus” money – on paying their staff, their consultants, and their test contractors.
     
    It is the test administration that will incur the heavy-duty costs, and those will come from states’ kitties. Today, at about $20/student, California pays over $100M for testing every year. With the cost of the performance-based assessment Smarter Balanced stresses, these costs will escalate – whether “only” 5x, or the historically based closer to 50x, I’ll leave for posterity to decide – and that will have the real impact on our budget down the line.
     
    Today’s decision simply reflects the new reality in Sacramento after last year’s elections: the newly acquired clout of Linda Darling-Hammond and her contra-accountability fans, and the increased clout of the CTA. Both prefer the “squishier” assessment Smarter Balanced offers, which pushes subjective grading done by the – extra-paid – teachers.
     
    I had bigger hopes for Mike Kirst and for the supposedly grown-up Jerry Brown. Disappointing.


  9. All of this is interesting to consider in light of the latest position paper and study by the National Research Council on test-driven reform efforts. In short, from the NRC: Don’t!


  10. Thanx, John, for this post — it is the first exposure for any rationale underlying the decision of the SPI and the SBE Pres to sign up CA as a governing member of SBAC. I’d like to dwell a bit on the rationale expressed by the SPI and the SBE Pres.
     
    The SPI was less substantive than the SBE Pres in providing his rationale. First, there was the “leadership” argument, which seemed to me to reflect more self-anointed political hubris than anything really substantive. Second, there was his response to the adaptive-testing challenge, which seemed to be to look to business and a school bond issue to throw dollars at the problem. Absent was any acknowledgement that the computer-adaptive testing challenge is much broader than the simple availability of appropriate hardware and/or bandwidth.
     
    The SBE Pres was far more forthcoming, for which I was grateful. His detail deserves more detailed responses: (1) his “best fit for CA” comment was his conclusion, not really a rationale; (2) I would totally agree with his computer-adaptive “is a gamble” comment – not only are there major hardware deficiencies to be addressed, but a computer-adaptive format for a secure standardized testing program poses item bank development challenges far beyond any existing computer-adaptive testing program in the US for K-12 or higher education; (3) for his “computer-adaptive has a big upside” because CA has a range of student backgrounds and large numbers of students without strong English language skills, I don’t see the connection – rather, it is easy to say that kids with extensive computer experience and computers at home will have a major built-in advantage with a computer-adaptive format, as contrasted with kids served by less wealthy schools with fewer computer resources and less exposure to computers at home – if so, computer-adaptive tests will artificially increase the SES and EL achievement gaps rather than provide more accurate estimates than more traditional formats; (4) for his rationale that computer-adaptive will “tell us what students know as opposed to what they don’t know,” I don’t get it – computer-adaptive tests yield right and wrong answers exactly the same as computer-administered and traditional paper-and-pencil formats and thus really don’t affect anything on a “what students know vs. what students don’t know” dimension, though I would say that computer-adaptive formats do have a potential advantage in reducing testing time; (5) re his indication that the SBAC presentation in March showed a greater understanding of assessment issues for ELs, I would agree – but whether that will translate into better tests for ELs at the end of the day, for SBAC vs. PARCC or for either consortium, is still highly speculative; (6) his comment that “SBAC seemed more adventuresome in trying to prove deep learning” is a two-edged sword – I would agree with his comment, as I think would many other experienced educational measurement folks, but whether that adventure will turn out to be “too far out on the limb” is problematic – CA spent a major bundle on the adventuresome CLAS program trying to measure a fuzzy “deep learning” concept in the early 1990s, and that effort collapsed when it could not generate reliable individual student scores.
     
    I’m a bit disturbed by the SBE Pres’s claim that he and the SPI and the Gov did receive a lot of feedback from the public compared to the PARCC participating decision made last year. I don’t doubt that they received input from the public, but that input was never shared or vetted in a public forum. The decision this year, like the decision last year, was made with zero transparency on substantive rationale between the March 9 presentation by consortia representatives and the June 9 announcement. I would give Pres Kirst credit for agendizing the consortia presentations in March, but I was very critical of the SBE Pres and SPI and Gov’s Ofc last year for the lack of transparency at that time, and this year’s decision process (after March 9) was no better than last year’s. I think CA deserves better from its education leaders when directional decisions with long-term policy implications are made. I think our leaders have a responsibility to share their thinking and rationale and to vet their proposed decisions in public rather than simply decide behind closed doors and generate press release announcements. Perhaps, as a taxpayer, I am too presumptive . . . .
     
    Lest folks think I may be sore because “my choice of an assessment consortium wasn’t selected,” let me reiterate that I’m not for hopping foursquare on either the SBAC or the PARCC bandwagon at this time. When serving as a business executive in the private sector, one of the most difficult questions to answer when I took a recommendation to the boss was “Why do we have to make this decision at this time?” If I had good substantive reasons for the timing, I generally got approval fairly readily. When I had soft answers, I generally was told to go work on my recommendation some more. I look at the rationale supplied by the SPI and the SBE Pres above, and I see soft answers on rationale – and the “Why decide now?” question didn’t even seem to be on the radar scope. CA has a history of jumping on educational bandwagons with wobbly wheels, and not to denigrate the efforts of either the SBAC or PARCC consortia [I actually think some good things may well come from their efforts, though at an exorbitant cost to the taxpayers], from my perch as an experienced K-12 assessment system architect I’m deeply skeptical that either consortium will be able to deliver on its promises on schedule. The entire assessment consortia initiative as funded by the feds has the look and feel of a bandwagon with wobbly wheels, and an early commitment by CA has considerable downsides in terms of negative effects on other critical efforts to implement the California Common Core standards adopted by the SBE last August.


  11. If I controlled that money, I would allocate $1M – that’s 10 very smart people for a year – to write a plan that provides costs, requirements, and a framework describing what kind of bandwidth and hardware each school needs beyond what it has to support a generalized software solution, before I let any money slide for code development. The whole project is useless if we’re not prepared to commit to the infrastructure, and right now there’s no money for either the startup cost or the ongoing annual costs of IT staff plus replacing machines every 3 years or so.
     
    And if the answer is that we’re prepared to commit to the infrastructure if only the test is there, I say, why wait? We know we need it for all kinds of learning opportunities; we should be running broadband connections to every school in the United States right now. Even if the funding were available today, it’s going to take more than 3 years to get it done; we need to start this part of the project now if there’s any credibility to the idea of testing all US students online before this decade is over.

  12. el, thank you for your thoughtful posts. When the consortia made their presentations to the SBE, they confirmed an estimate that an average district would require at least five full-time staff members versed in IT to support administration of the new tests.
     
    While the legislature continues to dismantle public education, how on earth are we to ramp up the physical infrastructure for these assessments and the associated human resources?
    Aren’t we also looking at new curricula, textbooks and professional development?
     
    Is there another planet on which this is feasible?  ‘Cause it’s not California!


  13. Ze’ev,
    I agree with you about the possibility that the consortia will disappoint. But aren’t you concerned about the incredible waste already dedicated to consultants and contractors too far removed from teaching and learning? And is your “contra-accountability” comment intended as any kind of endorsement of the accountability model in place right now?
    As for your concerns about the “squishy” tests, ELA tests are already that way, but the multiple choice format better masks the subjectivity and bias in the form of the test and the creation of the test items.  The results are also used in subjective ways, grounded in false assumptions about how students arrive at the answers.  You should recognize that most ELA standards cannot be assessed very well through standardized multiple-choice tests, and the results are nearly useless at the classroom and student level.  I’ve seen you comment elsewhere that our ELA tests are crucial to help us identify poor readers in early grades.  If so, shame on the schools that need these lousy tests to figure out whether or not students are reading well, and/or shame on the state for not providing schools with class sizes, teacher training, and resources sufficient to eliminate their dependence on state ELA tests.  I do not reject outright the need for standardized tests as one useful measure of a school or district, but with the current tests, I would absolutely prefer sampling rather than testing every student in every subject every year.  That approach would save us time and money, and decrease the testing burden on high school students especially.
    I would like to offer some sympathy, however.  It must be tiresome for you to be the grown-up and watch immature people come up with conclusions that differ from yours.  Your paternal concern is probably underappreciated.


  14. As a classroom teacher, I cannot tell you how pleased I am to be able to know on a timely basis what the children do know as well as what they do not. To the above poster who charged that some are already working with the Common Core standards: we are, and that is a good thing. We’re not doing it to “game” the test but to avoid being overwhelmed by sudden changes. Standards and assessments are how we make sure the U.S. remains the great country it is. This adaptive testing using technology is brilliant.


Trackbacks

  1. The Best Resources For Learning About The “Next Generation” Of State Testing | Larry Ferlazzo's Websites of the Day...


© Thoughts on Public Education 2014