Common Core groups should be asked plenty of questions

State Board should ask about cost, security, timeline
By Doug McRae

On Wednesday, the California State Board of Education will hear presentations from representatives of two assessment consortia that have been awarded federal grants to develop “next generation” assessments measuring the Common Core content standards adopted by states last year. The assessments these consortia develop will be candidates to replace California’s Standardized Testing and Reporting (STAR) statewide assessment system. The presentations, arranged by Board President Michael Kirst, are intended to inform a decision about which consortium, if either, California will collaborate with on assessment system design and test development for STAR’s replacement.

The two assessment consortia are the Partnership for the Assessment of Readiness for College and Careers (PARCC), which has roughly 25 states as current members, and the SMARTER Balanced Assessment Consortium (SMARTER Balanced), which has roughly 30 states as current members. States may join either or both consortia as “participating” members, but to be a “governing” member, with greater leadership obligations and formal voting status, a state may belong to only one consortium. About 15 states are currently governing members of each consortium, while other states belong to both consortia or to only one as “participating” members; a few states belong to neither. To belong to one of the assessment consortia, states must have formally adopted the Common Core content standards developed last year by the National Governors Association and the Council of Chief State School Officers. Adopting the Common Core standards was necessary to be competitive for the Race to the Top competition. California adopted the content standards on August 2 last year, the deadline under Race to the Top.

PARCC and SMARTER Balanced were awarded funds last fall, and have been organizing their efforts since then. The grants are between $150 million and $200 million for each consortium and must be spent by the 2014-15 school year, when the states implement the assessments. The consortia’s applications have a lot in common, but a close reading of the design and development details reveals distinct philosophical and operational differences.

In considering what to do about consortium membership, State Board members should ask questions that flesh out how details in each consortium’s plan may affect California’s plans to replace the STAR assessment system. Here are questions I would ask and the context behind them.

  • California’s Role in Consortia Efforts: California policymakers need to decide whether they wish to be a “governing” member of only one consortium, or a “participating” member of both consortia, or join neither at this time. California is currently a participating member of only the PARCC consortium, a decision made by the previous governor, State Board, and Superintendent of Public Instruction. Questions: What are the advantages/disadvantages of being a governing vs a participating member of PARCC and/or SMARTER Balanced? Will a participating state be able to influence PARCC/SMARTER Balanced strategic directions despite its non-voting status?
  • Impact on Curriculum and Instruction: Both consortia plan to include Through Course assessments in their programs. Through Course assessments were required by the federal request for proposals and were defined as “components that are administered periodically during the academic year” yet contribute to a student’s final score. Question: What will PARCC/SMARTER Balanced do to address the concern that Through Course assessments will lead to a uniform “pacing guide” program for curriculum and instruction for all California schools, based on the timing of Through Course test administration schedules?
  • Use for Teacher Evaluation: Much attention has been paid to potential use of statewide test results for evaluation of individual teachers in recent months, as well as possible use for layoff and assignment and compensation decisions. Question: Will PARCC/SMARTER Balanced assessments be developed to support this degree of high-stakes accountability?
  • Computerized Testing: Both consortia include computerized testing in their plans, with SMARTER Balanced anticipating computer-adaptive tests (i.e., test question sequence depends on student response pattern, thus reducing the number of test questions that need to be administered) while PARCC anticipates computer-administered tests (i.e., fixed-length forms of tests administered via computer). These plans will depend on availability of computer hardware for test administration at all schools in the state. Question: Do PARCC/SMARTER Balanced have specifications for the computer hardware needed to administer their anticipated assessments so that California may estimate the cost for computer infrastructure necessary for administration of PARCC/SMARTER Balanced tests at each school site?
  • Open-Ended Item Formats: In response to federal specifications, both PARCC and SMARTER Balanced anticipate extended use of open-ended test item formats rather than multiple-choice formats. Such formats carry increased risk of breaches of test security. Question: How will PARCC/SMARTER Balanced address the concern that extended use of open-ended item formats (i.e., constructed response, essays, projects) will increase opportunities to compromise test security and promote teaching-to-the-test behavior, thus compromising use of test results for accountability purposes?
  • Augmentation: California added to the Common Core content standards when adopting them last August, in particular Algebra I standards in grade 8 and additions in earlier grades to prepare students for Algebra I in 8th grade. California will have to add test questions to the base Common Core assessments in order to measure the full range of its adopted content standards. Question: Will PARCC/SMARTER Balanced assist California with these augmentation requirements, in terms of both test development projects and costs?
  • Ongoing Test Development: For test security reasons, each state will need ongoing test development efforts to generate replacement test questions required to keep annual test forms “fresh” and mitigate efforts to narrowly “teach to the test.” Question: How will PARCC/SMARTER Balanced be able to assist California with these ongoing test development efforts in terms of (a) coordination with base PARCC/SMARTER Balanced tests, and (b) minimizing ongoing test development costs?
  • Costs: We understand PARCC/SMARTER Balanced will cover at a minimum the costs of developing assessments for the base Common Core content standards, but not the costs of administering PARCC/SMARTER Balanced tests annually. Questions: Do PARCC/SMARTER Balanced have information for estimating annual per-pupil costs of administering their base assessments? How will PARCC/SMARTER Balanced operational test administration costs compare to California’s current annual cost of roughly $13 per student (i.e., $60 million for 4.8 million students) for the STAR assessment program?
  • Unrealistic Design Features and Timelines: Questions: What are the chances that PARCC/SMARTER Balanced have bitten off more than they can chew by the 2014-15 target date? What are the chances that political changes in Congress may kill or significantly alter the PARCC/SMARTER Balanced initiatives before completion?
  • History: In the early 1990s, California undertook an ambitious “beyond-the-bubble” constructed-response statewide assessment program, the California Learning Assessment System (CLAS). After roughly four years of development and initial implementation, it crashed and burned in 1994 via a veto from then-Gov. Pete Wilson, after a review led by the distinguished educational measurement expert Lee Cronbach of Stanford University found that CLAS did not generate reliable individual-student scores as required by the authorizing legislation. CLAS cost California taxpayers more than $100 million in early-1990s dollars. Question: How can California be confident that the PARCC/SMARTER Balanced assessment systems will not suffer a similar fate?
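To make the fixed-form vs. computer-adaptive distinction above concrete, here is a deliberately toy Python sketch. It is not either consortium's actual algorithm (operational adaptive tests use item response theory and large item banks); it only illustrates the core idea that each next question's difficulty depends on the student's response pattern, which is why adaptive tests can get by with fewer items.

```python
# Toy illustration of computer-adaptive item selection: step difficulty up
# after a correct answer, down after an incorrect one. All names and numbers
# here are invented for illustration.

def adaptive_test(answer_fn, item_bank, num_items=5):
    """Administer num_items questions from a bank of difficulty levels,
    raising difficulty after a correct answer and lowering it after a miss."""
    bank = sorted(item_bank)
    idx = len(bank) // 2  # start with a mid-difficulty item
    results = []
    for _ in range(num_items):
        correct = answer_fn(bank[idx])
        results.append((bank[idx], correct))
        # move to a harder item if correct, an easier one if not
        idx = min(idx + 1, len(bank) - 1) if correct else max(idx - 1, 0)
    return results

# A simulated student who answers correctly up to difficulty 3: the test
# quickly oscillates around that level instead of covering the whole bank.
student = lambda difficulty: difficulty <= 3
print(adaptive_test(student, item_bank=[1, 2, 3, 4, 5]))
# → [(3, True), (4, False), (3, True), (4, False), (3, True)]
```

A fixed-form (PARCC-style) administration, by contrast, would present every student the same predetermined sequence of items regardless of their answers.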
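The per-pupil cost figure in the Costs item above is simple division, and any consortium quote can be checked against it the same way. A back-of-the-envelope sketch using the article's $60 million / 4.8 million-student numbers; the $20-per-student consortium figure below is purely a placeholder assumption, not a quoted cost:

```python
# Per-pupil cost check using the figures cited in the article.
star_annual_cost = 60_000_000  # STAR program, dollars per year
students_tested = 4_800_000    # California students tested annually

per_pupil = star_annual_cost / students_tested
print(f"STAR per-pupil cost: ${per_pupil:.2f}")  # $12.50, i.e. "roughly $13/student"

# Hypothetical comparison: if a consortium test cost $20/student to
# administer (a placeholder, not a published figure), the added annual
# statewide burden would be:
consortium_per_pupil = 20.00
added_cost = (consortium_per_pupil - per_pupil) * students_tested
print(f"Added annual cost at $20/student: ${added_cost:,.0f}")
```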

The questions do not cover all pertinent topics for the PARCC and SMARTER Balanced representatives. I encourage readers to enter their own comments and questions in the space below.

Doug McRae is a retired educational measurement specialist living in Monterey.  In his 40 years in the K-12 testing business, he has served as an educational testing company executive in charge of design and development of K-12 tests widely used across the US, as well as an adviser on the initial design and development of California’s STAR assessment system.  He has a Ph.D. in Quantitative Psychology from the University of North Carolina, Chapel Hill.


9 Comments

  1. Which consortium values broad curricula and is able to test for knowledge, skills and competencies beyond just a narrow bandwidth of English/Math?

    Curriculum is narrowing, particularly for middle and high school students but also in the elementary grades: social studies, science, and art/music in the lower grades, and vocational and technical education in the upper grades, are being squeezed out of the instructional day for most students.  Is it any wonder nearly a third of high schoolers are “voting with their feet” and simply dropping out?

    We have to keep school relevant to the REAL world awaiting adolescents beyond school, whether those students go on to 4-year degrees or not.  So, which consortium is better prepared to integrate the Common Core in a creative way to place value on disciplines and programs beyond fill-in-the-bubble English/Math questions?


  2. These are good questions.  I think in addition we should ask what advantage California will gain by moving from the STAR system to one of the consortia, and whether the transition will be covered by federal dollars, or paid for by state money.  Regarding the first question, it is assumed by many who are involved in RTTT and CCSSI adoption that CA is short on student data, and that the STAR tests are deficient in important ways.  There is, in fact, no such consensus in the field.  Regarding the second question, if the move to CCSSI and the consortia will be covered by federal dollars, perhaps it’s six of one, half a dozen of the other, but if CA will be required to add to its $25 billion deficit to fund CCSSI and its assessments, there needs to be some solid evidence that the transition is necessary.


  3. Doug did us a great service by providing an outstanding list of questions to ask. Let’s hope that Mike Kirst will cause the board to think hard about what they will hear in response.
     
    And here are a couple of more questions:
     
    1. The consortia committed to develop the *main* annual assessment tool. ESEA, however, requires us also to provide (a) a Modified Assessment (currently CMA, for up to 2% of students with special needs), (b) an Alternate Assessment (currently CAPA, for up to 1% of students with the most severe special needs), and (c) an ELL assessment (currently STS, in Spanish). These assessments also need to be based on the content standards. There are some noises about folding the modified assessment into the main assessment through Universal Design principles (which I’ll believe when I see it), but we have heard nothing about the other two assessments. Question: What are the consortia’s plans regarding the Modified, Alternate, and non-English assessments? How much will the states be expected to pay to develop them?
     
    2. Since NCLB, states have received almost $4B from the federal government to support the development of reliable assessments. Leftover funds can be used for assessment administration. California developed its STAR program before NCLB and has used much of its annual $30-33M from the feds to pay for test administration. With the need to fund new CMA/CAPA/STS development, this money may not be available to support test administration. Worse, with the assessment consortia already funded for the next 3-4 years and the federal budget being cut, it is likely these funds will be reduced, as there is no longer any need for states to spend that much on test development. Question: How is CDE prepared to adjust its budget to accommodate either of these likely eventualities?
     
     


  4. Doug:
     
    You are a gem. I am a great admirer of your work.
    As to questions regarding educational assessments, here is mine: Where Are The Parents?
    Early last year, in preparation for the so-called Race to the Top program, the U.S. Department of Education began holding a series of meetings in Washington, D.C. and around the country with the intention of hearing from districts, states and the other so-called “stakeholders” in public schools as to what are the most important considerations in addressing the criteria set by the Department. I attended one of these meetings on January 20, 2010, a so-called “public and expert input meeting” to address the assessment component of Race to the Top. The purpose of the meeting was to hear from technical experts, academics and practitioners on the nuances of student assessments and systems.

    The meeting opened with remarks by Joanne Weiss, then-Director of the Race to the Top Program (and former Partner and Chief Operating Officer of NewSchools Venture Fund, which itself is overseen by CEO Ted Mitchell, then-President of California’s State Board of Education). Presentations were delivered by Scott Marion (National Center for the Improvement in Educational Assessment), Randy Bennett (Educational Testing Service—ETS, R&D Division), Jeff Nellhaus (Massachusetts Department of Elementary & Secondary Education), Lizanne DeStefano (University of Illinois, College of Education), Jamal Abedi (University of California Davis School of Education) and Laurie Wise (Human Resources Research Organization).

    There were opportunities for questions to be submitted after each presentation and there were several technical discussions that followed, capped by a conversation among the presenters and Department staff and finally, a chance for members of the public, such as myself, who had pre-registered for the opportunity to speak before the group.

    Much of the day’s discussion related to the need for a universal assessment design and modeling. Also addressed was the need for a well-defined theory of action along with suggestions as to how to best utilize the resources being brought to bear for improvement in general and technical assessments under the Race To The Top funding competition. These are not the kinds of topics that are typically shared with parents of public school students yet they should be. That is why I felt it was important to attend this meeting, to bring the voice of parents.

    At one point during the discussion, I submitted a question to the group: Is there a role for parents in any of the piloting of the proposed school-based assessment models? Here’s how that went:

    Randy Bennett, ETS – “Yeah, I think there certainly is because parents are consumers of assessment results and ideally you would want to design an assessment program that would be capable of giving parents assessment results that they could, number one, understand and number two, have some possibility of doing something with. So I didn’t include parents – and I gave a list of the types of actors that I thought a consortium should make sure to work with in doing innovation. But parents should be among them.”

    Lizanne DeStefano, University of Illinois – “ And I think the use of theory of action provides a great role for parents because when you involve parents you have to make sure that you’re putting them in a role where they can be successful and they can be equal players. And I think the use of theory of action is a good niche for parents. And also, the usability testing of reporting and performance descriptors and other information that surrounds testing, certainly involving parents in the testing of that.”

    So had I not at least asked the question as to whether parents should be included, an explicit acknowledgement of the importance of parents to the process would likely have been omitted from the day’s discussion and from the record, as well. Not only that, but the responses to my question clearly indicated that no one had been thinking seriously about the role of parents and the contribution they can and should make. Parents are just not part of the equation in these conversations.

    It is clear that if parents are going to have a place at the table in creating assessment mechanisms to be used by states and school districts but also to be of value to us, we’re going to need to carve out that place and make sure that the system is responsive to our needs and the needs of our students.


  5. TransParent:
     
    Thank you for your story. Fascinating, but not surprising. When there was still a question whether the Common Core standards would pick up steam, or how many states would compete in the Race to the Top (which peddled them), there was an effort to engage with a variety of stakeholders. Once that deal was sealed, there was little chance anyone would see what the consortia are actually doing. The consortia are non-public entities with no requirement (or interest) to engage the broader public. And why should they? Their customers are the state bureaucracies, not the citizenry!
     
    With state tests, there was at least a process in most states by which citizens could see the blueprints, see regularly released items, and even, if they signed a non-disclosure agreement, often review the actual test forms. With the private consortia, no more. This is particularly dangerous because the Common Core language is quite opaque and includes few examples to elucidate its meaning. As I already said, citizens are not the customers of these consortia, and the consortia have no public accountability. And state officials sign confidentiality agreements with them and can’t say much even if they would like to.
     
    Sandra Stotsky and I warned about it last year in a report we wrote:
     
    That is why a standard’s meaning needs to be anchored with a clear example of teachable content. Standards documents are the only documents that are extensively reviewed in public meetings around the country. Once standards are approved, a curtain falls and the general public is not privy to the sausage-making of assessment, trusting professionals to execute faithfully what the public has blessed. But if standards are opaque or have no examples to pin interpretations down, then the public effectively forfeits its right to direct the public school curriculum and will not know if and when its academic or civic goals have been corrupted.
     
    That report was written before we knew how truncated even the review of the standards would be. We are living in a brave new world.
     



© Thoughts on Public Education 2014