Author Archives: The Pioneer Institute

  1. RIP PARCC


    Economies of scale arise when a large operation spreads its fixed and overhead costs over a greater number of units sold, thereby reducing the per-unit price. It is a simple concept, and one that proponents of national K-12 testing consortia invoked repeatedly.
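
    In arithmetic terms, the claim is simply that per-unit cost equals the fixed cost divided by the number of units, plus any per-unit (variable) cost, so the price should fall as volume grows. The sketch below illustrates the shape of that relationship with purely hypothetical numbers; it is not PARCC's actual cost structure.

```python
# Illustration of economies of scale with hypothetical numbers (not PARCC's actual costs).
def per_unit_cost(fixed_cost: float, variable_cost: float, units: int) -> float:
    """Per-unit cost = (fixed cost spread over all units) + variable cost per unit."""
    return fixed_cost / units + variable_cost

# A hypothetical test with $50 million in fixed development and overhead costs
# and $10 of variable (administration and scoring) cost per student:
for students in (5_000_000, 10_000_000, 25_000_000):
    print(f"{students:>11,} students -> ${per_unit_cost(50e6, 10.0, students):.2f} per student")
# Prints $20.00, $15.00, and $12.00 per student -- more students, lower per-student cost.
```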

    One of Pioneer’s favorite expressions of the economies-of-scale euphoria that broke out in Washington, DC, came from the Thomas B. Fordham Institute in 2012, in response to a serious cost estimate Pioneer had published on the Common Core (the first such cost estimate in the country).  In that paper, the term “economies of scale” appears 13 times.  Here is a quick sampling:

    The Common Core also offers the possibility of new economies of scale, with promising products, services, and practices now part of a more-uniform national market…

    Vendors may no longer focus disproportionately on a few large states, and niche providers can emerge as the market for their products broadens and new economies of scale become possible…

    Such collaboration can both improve implementation and save money as states and districts achieve greater economies of scale…

    From a cost perspective, the CCSS also afford states the opportunity to work together to develop assessment materials and procedures, taking advantage of economies of scale. The federal government has encouraged this collaboration by means of the two assessment consortia, SBAC and PARCC

    The US Department of Education believed in this economic principle when it awarded a $186 million grant to the Partnership for Assessment of Readiness for College and Careers (PARCC).  To drive the point home, the USED required that PARCC and its one national competitor maintain a minimum of 15 member states (see PARCC’s Federal Application).

    Just this week, with the departure of Ohio, the ever-shrinking PARCC test hit a new milestone.  Originally boasting 26 participating states and DC, it is now in place (or in play) in at most 7 states and DC.

    Some of the remaining states are not solidly in the PARCC camp (e.g., Arkansas), and those that remain enroll relatively few students.  Where PARCC once was slated to serve around 25 million (of the 31 million enrolled) public school students in the 26 original states, it now will be given to an unimpressive 5 million public school students (of the 7.5 million enrolled in the remaining states).

    That’s fewer students than in the state of California.

    PARCC’s market is drying up, and Questor now lists Pearson as a “sell.”  So PARCC’s price point must have been affected, right?  After all, we were talking about economies of scale, right?

    Let’s go back to PARCC’s preliminary cost projections, included in its grant application in 2010.  PARCC then forecast a price of $32.68 per student each year, just above the average state spending of $29.50, according to a 2012 Brookings Institution report. For a test that claims to drastically improve the quality of assessment, a slight increase is palatable and makes sense.

    The original cost estimate was vague, ranging from $17 per student on the low end to $50 on the high end. PARCC’s $32.68 estimate is conservative: it doesn’t allow for teachers to score tests, includes no human read-behind for the year-end assessment, and applies only to the electronic version of the exam. The cost for a state wishing to implement higher-quality features would be closer to $50 per student.

    The projected cost of the test was lowered to $29.50 per student on July 22, 2013, just as the number of member states had fallen to 19 plus Washington, DC. This estimate flies in the face of the principle of economies of scale.

    On May 2, 2014, PARCC had even fewer states in its membership, and yet it released its most recent cost estimate of $23.97 per student.

    With PARCC’s membership in July 2015 at a mere 7 states and DC, perhaps we will magically see it on sale at the local Dollar Store with the next estimate.

    In reality, creating a test is an expensive business. According to Scott Marion, Associate Director of the National Center for the Improvement of Educational Assessment, a single multiple-choice question can cost over $1,000 to develop. Each open-ended question can cost as much as $3,000 to $5,000. Every year, thousands of questions are required to administer multiple tests, at several times of the year, to nine different grade levels.

    Since every state would be using the same tests, the more states that participate, the more thinly the cost of developing the tests is spread. So how can PARCC’s cost estimates fall by almost $10 per student while its membership is in free fall?
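
    As a back-of-the-envelope sketch, treating the $186 million federal grant (for illustration only) as if it were the entire fixed development cost, the per-student share of that fixed cost should move in the opposite direction from PARCC’s estimates as membership shrinks:

```python
# Back-of-the-envelope illustration only: treats the $186M federal grant as if it were
# the entire fixed development cost, which it is not; real budgets include other costs.
FIXED_COST = 186_000_000  # federal grant to PARCC (figure cited above)

for label, students in (("26 original states + DC", 25_000_000),
                        ("remaining states + DC", 5_000_000)):
    print(f"{label}: ${FIXED_COST / students:.2f} of fixed cost per student")
# 26 original states + DC: $7.44 of fixed cost per student
# remaining states + DC:   $37.20 of fixed cost per student
# One-fifth as many students means five times the per-student share of any fixed cost,
# the opposite direction from PARCC's falling estimates.
```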

    When PARCC discusses its cost, it often focuses on a state’s flexibility in choosing a scoring method as the main variable. Administrative and operational costs are also cited as important factors subject to change.

    Only once does PARCC appear to acknowledge that the number of students is an important factor in this estimate, and that acknowledgment comes in the final line of the press release announcing the most recent figure. If PARCC truly understands this concept, then how can it in good conscience release a significantly lower cost estimate?

    The fact that PARCC is still funded despite no longer meeting the first requirement for eligibility reflects poorly on the USED. The fact that Massachusetts continues to trust a consortium that has mismanaged $186 million and lost over three-quarters of its members to test our schoolchildren, when we have a higher-quality alternative already in place, is downright ludicrous.

    It’s time to pull the burial shroud over this ill-considered effort called PARCC.  Given the gobs of federal and state money poured into it, it really is time for the Education and the Workforce Committee to hold all those associated with PARCC accountable.  $186 million in federal funding, an untenable test, and silence from national leadership?

    And then there is the fact that Massachusetts is still holding hearings over a dead national test.  Why are we wasting our time?  The state’s resources would be far better used to create an MCAS 2.0.  How much time and money has the Commonwealth wasted on PARCC, and how much more will be lost before state leaders come to their senses?

  2. Why Massachusetts Should Abandon the PARCC Tests & Common Core


    It is difficult to find any public analysis or comment by teachers, parents, researchers, or literary scholars on PARCC’s “practice” and “sample” test items for English language arts for grades 3-11. Taking the bull by the horns, that is what I set out to provide in my invited testimony before the Massachusetts Board of Elementary and Secondary Education at the public hearing at Bridgewater State University on June 10, 2015, on whether the Board should abandon the MCAS tests and adopt the PARCC tests.

    In my written testimony, I first describe my qualifications, as well as the lack of relevant qualifications in Common Core’s standards writers and in most of the members of Common Core’s Validation Committee, on which I served in 2009-2010.  I then detail some of the many problems in the 2011 Massachusetts English language arts standards, written by David Coleman, Susan Pimentel, James Patterson, and Susan Wheltle (so the document indicates), in the tests based on Common Core’s standards (PARCC), and in the two external reports—one issued in February 2015, the other yet to be completed—comparing the PARCC tests with MCAS tests. I conclude by offering several recommendations for parents and teachers who want civically sound and academically rigorous standards and tests written and reviewed by the state’s English teachers and who want a form of accountability that doesn’t penalize their children’s teachers for results of tests based on either the Coleman et al standards or Common Core’s standards.

    Among the many problems with PARCC in 2014-15, based on the examples of test items given, I note the following:

    * The overall reading level of PARCC sample test items in most grades seems to be lower than the overall reading level of test items on MCAS ELA tests based on the pre-Coleman et al standards, sometimes by more than one reading grade level. For example, an excerpt from The Red Badge of Courage appears in the 2015 grade 10 and grade 11 PARCC samples, yet an excerpt from the same novel was assessed on a pre-2011 grade 8 MCAS; similarly, an excerpt from Joseph Conrad’s Heart of Darkness appears in the 2015 grade 11 PARCC but also appeared on a 2010 grade 10 MCAS.

    * PARCC doesn’t tell us who determines the cut (pass/fail) score, where it will be set, who can change it, or when. Cut scores on MCAS tests are set by Massachusetts citizens.

    * PARCC offers too many tests at each grade and across grades.

    * PARCC requires extensive keyboarding skills and too much time for test preparation.

    * PARCC plans to provide only a few released test items for teachers to use, it seems.

    * The change to a grade 11 PARCC for fulfilling the requirement for a high school diploma hurts low-achieving students, who often need two years for remediation and retests before graduation.

    * The PARCC tests are very long, even though they have been recently shortened.

    * The writing prompts in PARCC in 2015 do not elicit “deeper thinking” because students are not given a provocative question about a reading assignment and encouraged to make and justify their own interpretation of an author’s ideas based on a range of sources, some self-chosen. They are almost always given the sources to use, beginning in grade 3.

    * The two-part multiple-choice format in PARCC (and in SBAC) often (though not always) requires students to engage in a textual scavenger hunt for the specific words, phrases, or sentences that led to their own thinking when answering the previous question. This two-part multiple-choice format is especially taxing and problematic in the early grades. E.g., in grade 3: “Part B: Which sentence from the story supports the answer to Part A?” “Which detail supports the answer to Part A?” “Which detail from X shows another example of the answer to Part A?”   “What phrase from paragraph 14 helps the reader to understand the meaning of thriving?” “Which section in X introduces how the scientists made wolves feel comfortable in the park?”  In sum, the questions are poorly worded, confusing, unfriendly to children, and cumbersome.

    Watch my oral testimony:


    In my testimony, I offer 7 recommendations for Massachusetts:

    1. Fewer grades tested (just 4, 8, and 10), as in the 1993 MERA and 1994 authorization of ESEA
    2. Paper and pencil tests; no computer-based tests
    3. All or most test items released every year, as MERA requires
    4. Retention of grade 10 competency determination for a high school diploma, required in MERA, for the benefit of low-achieving students
    5. Tests requiring less time for preparing for and teaching to the tests
    6. Test passages and questions chosen and reviewed by Massachusetts English teachers
    7. A Massachusetts-determined cut score
  3. How Massachusetts Promoted Achievement Before Common Core & PARCC


    Before the Massachusetts Board of Elementary and Secondary Education (BESE) officially decides to adopt PARCC’s testing system in place of the testing system that the Department of Elementary and Secondary Education (DESE) developed in the 1990s and early 2000s, local school committees, state legislators, and parents should be able to peruse the test items used in the tests given to all public school students in the Bay State as part of the Massachusetts Comprehensive Assessment System (MCAS).  The major purpose of this blog is to give them access to the test items used in MCAS tests at all grade levels and for all subjects tested, from 1998 on.

    These test items are public information because the Massachusetts Education Reform Act of 1993 (MERA) required all used test items to be released annually. DESE used to compile a report every spring containing all used state test items, distribute this report to the schools in September, and post these annual reports on its website. It no longer does so and will not be able to do so if BESE adopts PARCC. It is not clear how parents, state legislators, and local school boards will be able to understand changes in the K-12 curriculum without access to the full array of test items that will hold teachers accountable for student scores on PARCC.  But, at the least, they can see the test items that promoted achievement in all student groups before Common Core and PARCC. For a view of all “practice test” items PARCC made available for public inspection in mathematics and English language arts for the 2015 PARCC tests, see http://parcc.pearson.com/practice-tests/english/.

    The first MCAS tests were given in 1998.  They were based on the standards for science and mathematics approved by the Martin Kaplan-chaired board in December 1995, on the English language arts standards approved in December 1996 by the John Silber-chaired board, and on the history/social science standards approved by the John Silber-chaired board in mid-1997.  Tests had not yet been developed for the other subjects mandated for assessment by MERA (health, foreign/world languages, and the performing arts), and in 2015 still haven’t.

    Every spring from 1998 on, DESE prepared a document showing all test items used in the previous year.  No direct links to these documents can be located on the DESE website anymore.  The following links, located on other websites, will open these documents, which DESE prepared to address the statute in MERA requiring annual release of all used MCAS test items. The first one is titled The Massachusetts Comprehensive Assessment System: Release of May 1998 Test Items.  It shows all common test items (and answers) for the three subjects tested in 1998.

    The second is titled The Massachusetts Comprehensive Assessment System: Release of Spring 1999 Test Items.  It shows all common test items (and answers) for the four subjects tested in 1999.

    The third is titled The Massachusetts Comprehensive Assessment System: Release of Spring 2000 Test Items.  It shows all common test items (and answers) for the four subjects tested in 2000.

    For each grade level test, these documents indicate the date of the standards on which the test items for that year were based.  After 2000, MCAS tests were based on the revised standards for all four subjects as soon as the revised curriculum framework for the subject was approved. Revised standards in mathematics, science and technology, English language arts, and history/social science were approved under James Peyser, now Secretary of Education under Governor Charles Baker, while he served as BESE chair (from 1999 to 2005).  Governor Charles Baker was a member of BESE from 2000 to 2007. It should also be noted that an annual reading test in grade 3 began in 2001, and that by 2006, DESE was providing a summary page showing “reporting categories, standards, and correct answers” (e.g., 2006, grade 8, p. 345) after showing all test items used at a grade level.

    The links for the documents annually released by DESE from 2001 to 2007 are as follows:

    2001

    2002

    2003

    2004

    2005

    2006

    2007 or this version

    After 2007, DESE announced it would no longer release all used test items annually.[1]  It would release only about half of them, because of cost (it explained) and to shorten the testing periods themselves.  The document containing used test items continued to be prepared by DESE each spring and released to the schools after 2007 (to address MERA). But test items for only some grades in some subjects can be located via Google, and only because grade-level groups of test items were downloaded, scanned, and posted on the Internet by individuals, schools, or entrepreneurs, as at the URLs below. More can be located; it just takes time.

    Grade 10 mathematics

    Various

    http://www2.newton.k12.ma.us/~Javier_Mendez/S05386103.4/Spring%202009%20test.pdf

    Grade 8 mathematics, 2009, 17 items here

    Grade 10 ELA

    Grade 10 mathematics, 2011, 42 test items here

    Grade 10, 2014 MCAS

    Grade 10, 2013 MCAS, ELA

    Grade 10, 2014 MCAS, ELA

    [1] http://blog.socrato.com/partial-release-of-mcas-test-items-is-a-step-backwards/.  The 2015 MBAE report also indicated that one feature of MCAS was: “Releasing all items on all tests each year for the first nine years of the program, and continuing to release half of items in subsequent years” (p. 4).

  4. Don’t Waste the Crisis over Common Core


    The entire Common Core project is rapidly going south, and within two years may be no more than a dim memory of a nightmare in the minds of a growing army of angry parents and teachers from coast to coast. Before this dystopian scheme for upgrading the academic status of low-income children emerges in a more deadly form in a newly re-authorized Elementary and Secondary Education Act (ESEA), we could try to salvage one of the reasonable arguments for a “common core.” We could benefit from some research-based and internationally benchmarked common standards in elementary school reading, writing, and arithmetic across states.  But not up to grade 12. As educators in other countries and most parents everywhere know, many young adolescents don’t want to go to college or can’t do college coursework and would prefer other options. But common standards up to grade 8, with flexibility in the curriculum and in school organization for that educational wasteland known as middle or junior high school, make sense.

    The first task is to relabel the currently toxic package as high school-ready standards and give the forthcoming “college readiness” tests not in grade 11 but in grade 8, which is where they better fit with respect to content and cut score. There, with additions by academic experts in each state to ensure adequate content knowledge is also assessed, they can serve as nationally recognized indications of whether students are capable of authentic high school-level work in grade 9.

    It won’t take long for college faculty to realize that Common Core’s tests are a better indication of whether students can do authentic high school-level work in grade 9 than of college-level work. Few post-secondary institutions will survive the pretense that grade 6/7 reading and mathematical skills denote “college readiness.” No doctors or engineers can be developed from students who are at that academic level in grade 11 or 12, even when they are fraudulently deemed “college-ready.” How many American communities can survive without a few doctors and engineers of their own?

    Once common ELA and math standards serve to guide a curriculum that makes most students ready for real high school work by the end of grade 8, we can work out alternative high school curricula—the upper secondary options that appeal to a broad range of students even today—and give young adolescents a choice of the kind of curriculum they are willing to commit themselves to—with change always possible. This is what most developed countries do, including Finland.  Our aim would be to try to make sure that all students complete a basic education through grade 8, before compulsory schooling ends and before they choose their upper secondary curriculum.

    Do not think I exaggerate our predicament.  At present, we are spending billions of dollars trying to send students to college and maintain them there when on average our high school students read at about the grade 6 or 7 level and their mathematical knowledge is not much higher—in comparison to their peers in high-achieving countries.  Two independent sources converge on that reading level: Renaissance Learning’s latest report on the average reading level of what our students in grades 9-12 read (whether assigned or chosen), and the average reading level of the books that colleges assign to incoming freshmen for summer reading (the titles can be found in the latest Beach Book report). As for mathematics, most high school graduates do not do much in mathematics beyond what students in high-achieving countries complete in algebra and geometry by the end of grade 8.  Common Core asks for little more than that by the end of grade 11.

    A crisis is a terrible thing to waste, as we have been regularly told.  And here’s one we should take advantage of in order to salvage a battered public school system. If we don’t come to grips with Common Core’s notion of “college readiness,” we face dissolution of our entire education system. And there are other English-speaking post-secondary institutions outside of this country eager for students who can do high school-level reading and mathematics.

  5. Steps for States to Follow to Replace Common Core


    To help out governors and state legislatures that really want state-tailored standards close in quality to the pre-2010 Massachusetts and California standards, or the Indiana 2006 standards, I have provided below an outline of the steps a state legislature could follow.

    The outcomes remain open-ended.  But these procedures, based on my experiences in Massachusetts over 10 years ago and in other states in recent years, ensure that no special interest group, including a state’s board, commissioner, or department of education, can take control of the “process,” deceive the parents of the state, and feed back a warmed-over version of Common Core, as is now happening in South Carolina and Oklahoma, and as has happened in Indiana and Florida.

    These procedures, among other things, are designed to ensure that those in charge of revising mathematics, science, or English language arts standards for a state actually know the content of these disciplines and teach in a state’s own higher education institutions.  If anyone knows what college readiness should mean for a particular state, they should.

    Procedures for state legislatures to use to establish standards development committees to replace Common Core’s Standards

    STEERING OR EXECUTIVE COMMITTEE to be chosen by the State Legislature

    Membership

    • Chancellor of State System of Higher Education
    • State Legislators (3)
    • Others?

    PLUS (when chosen):
    • ELA Standards Development Chair
    • Mathematics Standards Development Chair

     

    Responsibilities

    • Oversees the entire project
    • Final approval of all content and work of the committees
    • Final approval of all committee membership
    • Supported by Attorney General’s office

     

    EXECUTIVE DIRECTOR (paid)

    • Selected by and reports to Steering Committee
    • Oversees entire standards process
    • Assures accurate communications to the Legislature, the Governor, and the public at large
    • Works directly with ELA and Math Chairs, and directs work of the ELA and Math Scribes

     

    Document SCRIBES (paid) selected by the Steering Committee

    • ELA Scribe
    • Math Scribe

     

    Qualifications

    • Expertise in subject matter
    • Adept at digital technology
    • Report to ELA or Math Chair on Steering Committee

     

    Responsibilities

    Organization

    • Prepares database of all nominees
    • Processes nominations to various committees

    Coordination

    • Schedules and secures sites for all in-person and virtual meetings
    • Arranges for the services of meeting facilitators where needed

     

    Communication

    • Takes and transcribes detailed notes of all meetings
    • Creates minutes from Steering Committee meetings
    • Provides ongoing updates of standards-writing process to all committees and the public
    • Posts appropriate information on webpage
    • Maintains all drafts and revisions throughout the writing process
    • Compiles public comments
    • Coordinates responses to public comment

     

    MATH AND ELA CHAIRS (2 members)

    Qualifications

    • Undergraduate teaching faculty in the arts and sciences
    • Math Chair must be a faculty member in a science, mathematics or engineering department
    • ELA Chair must be a faculty member in an English literature/language department
    • Nominated by the president of the candidate’s four-year accredited university or college, which must maintain graduate-level programs (the candidate may not be a faculty member in a college of education)
    • At least 2 candidates for each committee will be nominated to the Steering Committee for final selection

    Responsibilities

    • Maintains schedules and timelines
    • Leads and coordinates the work of the Standards Development Committees
    • Assures that standards are written with an emphasis on disciplinary content and accuracy
    • Works with Executive Director

     

    STANDARDS DEVELOPMENT COMMITTEES (15 members each)

    District superintendents may nominate up to a total of 6 teachers for the subgroups in ELA and Math (i.e., no more than 1 per subgroup).

     

    Membership of ELA Standards Development Committee

    • ELA Chair
    • ELA Vice Chair — Teaching faculty member in an undergraduate English Literature Department at a four-year university (selected by the ELA Chair)

    • 4 Pre-K to Fifth Grade school teachers
    • 4 Middle school teachers (Sixth Grade-Eighth Grade)
    • 4 High school teachers, one at each grade level (Ninth Grade-12th Grade)
    • 1 Librarian (nominated by State’s Library Association)

     

    Qualifications for teachers (District Superintendents are to use one-page nomination forms):

    • Minimum of 7 years’ experience at the educational level of the subgroup for which they are applying; current teaching assignment at one of the grade levels in that subgroup
    • At least a minor in English, and/or a list of courses completed in literature, composition, or rhetoric, for those seeking a middle or high school subgroup
    • Reading methods coursework for those in PreK-grade 8
    • The Steering Committee, in conjunction with the ELA Chair and Vice Chair, makes the final selection of those to serve on the subgroups

     

    Membership of Math Standards Development Committee

    • Math Chair
    • Math Vice Chair — teaching faculty member in an undergraduate Mathematics Department at a four-year university (selected by the Math Chair)
    • 4 Pre-K to Grade 5 teachers
    • 4 Middle school teachers (Sixth Grade-Eighth Grade)
    • 4 High school teachers (Ninth Grade-12th Grade), to include:
    • Algebra I teacher
    • Geometry teacher
    • Algebra II teacher
    • Precalculus or Trigonometry teacher
    • 1 Engineer (nominated by a state engineering professional organization or university faculty)

     

    Qualifications for classroom teachers (District Superintendents use a one-page nomination form):

    • Minimum of 7 years’ experience at the educational level of the subgroup for which they are applying; current teaching assignment at one of the grade levels in the subgroup
    • At least a minor in mathematics, science, or engineering
    • The Steering Committee, in conjunction with the Math Chair and Vice Chair, makes the final selection of those to serve on the subgroups

     

    Responsibilities

    • Each Standards Development Committee as a whole selects for use a highly rated pre-2009 set of state standards as the foundational blueprint (for ELA: California, Indiana 2006, Massachusetts 2001 or condensed/revised 2013; for math: California, Indiana 2006, Massachusetts 2000, or Minnesota).
    • Each subgroup addresses each relevant grade-level set of standards by adoption, modification, or rewrite.
    • Standards Development Committee as a whole examines entire set of standards and revises when necessary
    • Submits the documents to the Steering Committee for review and approval

     

    FIRST-DRAFT REVIEW COMMITTEE

    Membership

    • Remaining nominees not selected to be on Standards Development Committee

    Responsibilities

    • Teachers review all standards at their own educational level for appropriateness and wording
    • High school teachers review all documents
    • Comments are recorded by the Scribes and sent to the Standards Development Committee for review and possible action
    • First draft presented to the Steering Committee for approval

     

    SECOND DRAFT REVIEW by higher education and special interest groups

    Membership

    • State Chamber of Commerce
    • State business and industry professional organizations
    • State engineering organizations
    • Early childhood advocacy organizations
    • Special education advocacy organizations
    • English Language Learners advocacy organizations
    • School counselor professional organizations
    • Speech pathology professional organizations
    • Undergraduate teaching faculty in science, engineering, mathematics and English literature/language

    Responsibilities

    • Review content standards and advise on:
    a) Classroom application
    b) Vertical alignment
    c) How the standards affect the population they represent
    • All responses must be signed and submitted electronically
    • Comments on any recommended changes to the Second Draft are recorded by the Scribes and sent to the Standards Development Committee for review and possible action
    • Second draft presented to the Steering Committee for approval. Steering Committee reviews recommendations and provides direction to the ELA and Math Chairs and the Standards Writing Teams as they edit the second draft.

     

    EXTERNAL EVALUATION

    The presidents, provosts, and faculty of the four-year colleges nominate two well-known or well-published experts in each subject area.

     

    Membership: Selection is by the Steering Committee

    • Individuals do not teach at a state college or university
    • Qualifications determined by Steering Committee

    Responsibilities

    • The External Reviewer will report on the quality of the standards
    • The External Reviewer will report to the Steering Committee and the State Legislature

     

    FINAL DRAFT

    After the second draft review, the Standards Development Committees will review, revise, and submit a final draft to the Steering Committee. The final draft will be submitted for a 45-day public comment period and a public hearing at the State Capitol. Public comment will be incorporated as deemed appropriate by the Standards Development Committees.

    Final Draft submitted to the State Legislature for Legislative Hearings and final approval.