Mitchell Chester, PARCC Consortium Chair, Turns His Back on PARCC
Massachusetts Commissioner of Education Mitchell Chester has been pushing for PARCC assessments in Massachusetts. Chester is the chair of the PARCC consortium. As a member of PARCC’s governing board, he was supposed to have his state signed on for PARCC assessments for 2015, but it didn’t quite work out that way. Massachusetts districts had a choice in 2015 between PARCC and the state’s longtime assessment, the Massachusetts Comprehensive Assessment System (MCAS).
So, even though the PARCC MOU noted that PARCC governing board members needed to have their states signed on for the PARCC assessment in 2015, Chester was allowed to slide for a year.
Meanwhile, the PARCC consortium has steadily been losing states. For 2016, PARCC was down to DC and seven states (Colorado, Illinois, Maryland, Massachusetts, New Jersey, New Mexico, Rhode Island). (This 2016 PARCC state commitment is reflected in its dwindling governing board.)
But it looks like that is about to become six states and DC, with the chair of the PARCC governing board making the next exit.
As Michael Jonas of Commonwealth Magazine notes:
Chester said today (October 20) that he still views PARCC as a superior assessment to MCAS. He said MCAS has “outlived its usefulness” and that “there’s no question that PARCC has set a higher standard for student performance” in requiring more reasoning and critical thinking from students.
Chester said the issue is not the test content, but governance, since signing on to PARCC would lock the state into a testing system in which decisions about the content of the exams would be made by the PARCC consortium, not state officials.
“We need to preserve our autonomy in terms of making decisions about our testing program going forward,” he said.
That marks a sharp change of view for Chester, who had not previously voiced such concerns. His new posture puts him in line with Gov. Charlie Baker, who has expressed reservations from the start of the PARCC process about ceding control over the test administered to Massachusetts students.
Only one day prior, on October 19, 2015, Chester suggested an MCAS-PARCC hybrid as a possible option for Massachusetts as its state ed board prepares to vote on the course of action for the state assessment on November 17, 2015.
And so, the PARCC consortium is poised to take another hit.
A primary problem with the idea of a consortium governing anything in public education is that the consortium is not an official, democratic entity. Thus, in order for the consortium to operate, its members, who represent officially recognized, democratic entities such as states, must relinquish to the unofficial consortium the authority they hold as public servants of the states that elected or appointed them.
Why these politicians ever believed they would remain faithful to an unofficial entity over the years is beyond me.
In the end, this unofficial groupthink is crumbling, and it should, because state officials should place as their first priority the welfare of the states they represent, not some Club Consortium.
When the PARCC consortium first publicized its consortium-approved cut scores on September 10, 2015, the writing was on the wall that not all PARCC states would simply fall in line and abide solely by the consortium-decided PARCC cut scores. Not one week later, Ohio released two sets of cut scores: those from the PARCC consortium, and those in line with Ohio state law.
As for Chester’s faith in “next generation assessments,” he should actually take the time to commission a genuine “predictive validity study” of PARCC-styled questions, one that tests the degree to which they live up to the all-too-marketed message of measuring “college and career readiness.” What he should not do is rely on a thrown-together, there’s-no-time-to-do-it-right “predictive” study by Mathematica, in which “prediction” amounted to current college freshmen taking only one part of a high-school-level PARCC or MCAS test, and in which the entire marketed guarantee of “career readiness” was ignored completely because Common Core and consortium test pushers have absolutely no idea how to distill “career readiness” into supposedly “next generation” test questions.
By the way, that Mathematica study declared MCAS and PARCC equals in “predicting” so-called “college readiness”; however, in its narrow comparison of MCAS and PARCC, the Mathematica study sidestepped the greater issue of just how poorly both tests fared in accounting for differences in the college GPAs of their study participants.
I am currently working on a post about the full Mathematica MCAS-PARCC study, but let me note here that the results are not impressive at all. Here is an excerpt from my post-in-process:
Mathematica offers a correlation of .23 as the outcome measure of PARCC ELA score to college GPA. That same correlation is the outcome of MCAS ELA score to college GPA. However, Mathematica notes that determining college success “was not the original aim of MCAS.” Yet, for all of its “next generation assessment” hype, PARCC ELA yields the same correlation to college GPA as does MCAS.
A correlation of .23 means that the variance in PARCC ELA scores that is shared by college GPA is .23 x .23 = .0529, or just over 5 percent.
Thus, PARCC ELA cannot account for roughly 95 percent of the variance (the differences) in college GPA among the participants in the Mathematica study.
The same is true of MCAS ELA scores and college GPA.
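The arithmetic behind those percentages can be checked in a few lines. This sketch uses only the .23 correlation reported in the Mathematica study; the variable names are mine:

```python
# Converting a correlation (r) to shared variance (r squared).
# r = .23 is the PARCC ELA-to-college-GPA correlation reported by Mathematica;
# the same figure was reported for MCAS ELA.

r = 0.23

# The proportion of variance in college GPA that the test score shares:
shared_variance = r ** 2

# The proportion of variance in college GPA left unaccounted for:
unexplained = 1 - shared_variance

print(f"shared variance: {shared_variance:.4f}")  # 0.0529, just over 5 percent
print(f"unexplained:     {unexplained:.4f}")      # 0.9471, roughly 95 percent
```

The same calculation applies to any reported correlation: squaring it gives the proportion of shared variance, which is why a seemingly respectable r of .23 translates into so little explanatory power.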
To the Massachusetts Board of Education: On November 17, 2015, vote for MCAS, and take the time to commission genuine predictive studies based upon piloting any “next generation assessment” changes of interest. I would also suggest comparing the use of standardized tests to assess “college readiness” with the use of teacher-assigned grades for the same purpose. (Let’s stop pretending that any appreciable definition of “career readiness” can be readily captured in a standardized test.)
Do not rush toward haphazard.