
In May 2016, EdWeek Features PARCC Study Originally Published 7 Months Ago

May 17, 2016

On May 17, 2016, Ed Week’s Catherine Gewertz published an article entitled, “PARCC College-Ready Score Reflects Rigor of College Work, Study Finds.”

The featured study is one conducted by Mathematica and published in October 2015. The study, misnamed “Predictive Validity of MCAS and PARCC: Comparing 10th Grade MCAS Tests to PARCC Integrated Math II, Algebra II, and 10th Grade English Language Arts Test,” intended to compare PARCC and Massachusetts’s MCAS using current college students as its sample. (The participants took parts of PARCC while in college. Thus, the study is not predictive.)

It seems that the big to-do on May 17, 2016, is that “peer reviewed” Education Next finally published the Mathematica study.

On May 17, 2016, PARCC CEO Laura Slover tweeted about the 7-month-old Mathematica study that had just made it into Ed Next.

I wrote a brief post about the Mathematica study in the days following its original publication in October 2015. The study has a number of limitations, one of the most notable being that 66 percent of study participants who did not score proficient on PARCC still did not need to take a remedial course in college.

That does not support Slover’s assertion that PARCC “accurately defines what it means to be ‘college ready.'”

What it does support is the notion that PARCC needlessly flunks a lot of kids.

Note also that moving from “a strong signal” to “accurately defines what it means to be ‘college ready'” is a logic leap on Slover’s part.

Moreover, even though there exists no study concerning the predictive validity of PARCC, some states have bypassed this astounding fact to make passing PARCC a graduation requirement. (There is a lawsuit over PARCC as a graduation requirement in New Jersey, where SAT and ACT are currently acceptable options. Maryland also uses PARCC as a graduation requirement “for students enrolled in PARCC-aligned courses.” Rhode Island is facing using PARCC as a 2017 graduation requirement, though the commissioner of education does not seem to want to do so.)

According to Gewertz’s Ed Week article,

PARCC spokeswoman Heather Reams said the consortium plans to conduct a longitudinal study over the next two years that will examine “associations between students’ performance on PARCC and outcomes in entry-level college courses.”

High-stakes sale first, then validation research in the years to follow.

A PARCC bullseye.

(image: missing the target)


Coming June 24, 2016, from TC Press:

School Choice: The End of Public Education? 

(image: book cover)

Stay tuned.



Schneider is a southern Louisiana native, career teacher, trained researcher, and author of the ed reform whistleblower, A Chronicle of Echoes: Who’s Who In the Implosion of American Public Education.

She also has a second book, Common Core Dilemma: Who Owns Our Schools?

(image: both book covers)

Don’t care to buy from Amazon? Purchase my books from Powell’s City of Books instead.

  1. alainjehlen

    This study found that PARCC scores accounted for only five to 18 percent of the variation in first-year college grades — they reflect a thin sliver of the abilities that lead to college success. No standardized test measures a student’s “readiness” for success in the complex, many-faceted world of college and beyond.

  2. ira shor

    How thoughtful of Slover and PARCC-ites to consider studies to test the grandiose global claims they have been thoughtlessly throwing around for over 5 years, that PARCC actually tests college and career readiness. Test first, collect vendor fees second, propagandize your validity third, and if enough scholars expose your illegitimate claims and enough parents protest by opting out, then finally get around to seeing if all of this is more than a costly boondoggle.

  3. First, the report attempts to calculate only general predictive validity. The type of predictive validity that matters is “incremental predictive validity”—the amount of predictive power left over when other predictive factors are controlled. If a readiness test is highly correlated with high school grades or class rank, it provides the college admission counselor no additional information. It adds no value. The real value of the SAT or ACT is in the information it provides admission counselors above and beyond what they already know from other measures available to them.

    Second, the study administered grade 10 MCAS and PARCC tests to college students at the end of their freshman year in college, and compared those scores to their first-year grades in college. Thus, the study measures what students learned in one year of college and in their last two years of high school more than it measures what they knew as of grade 10. The study does not actually compute predictive validity; it computes “concurrent” validity.

    Third, student test-takers were not representative of Massachusetts tenth graders. All were volunteers, and we do not know how they learned about the study or why they chose to participate. Students not going to college, not going to college in Massachusetts, or not going to these colleges in Massachusetts could not have participated. The top colleges, where the SAT would have been most predictive, were not included in the study (e.g., U. Mass-Amherst, any private college, or elite colleges outside the state). Students not going to college, or attending occupational certificate training programs or apprenticeships, for whom one would suspect the MCAS would be most predictive, were not included in the study.

    Finally, that EdNext endorses this sloppy, mercenary research just shows how untrustworthy EdNext is as a source. Could it be because so many of those associated with the publication receive so much money from Common Core’s funders?
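    [Editor's illustration] The "incremental predictive validity" point in the comment above can be made concrete with a toy regression. The sketch below uses entirely synthetic, invented data (not figures from the Mathematica study) to show the distinction: a test score can correlate well with college grades on its own yet add almost nothing to the variance explained (R-squared) once high school GPA is already in the model.

```python
# Illustrative sketch with synthetic data: "incremental predictive validity"
# is the gain in R^2 when a test score is added to a model that already
# includes predictors admissions offices have, such as high school GPA.
import numpy as np

def r_squared(X, y):
    """R^2 of an ordinary least-squares fit of y on X (intercept included)."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1.0 - resid.dot(resid) / ((y - y.mean()) ** 2).sum()

rng = np.random.default_rng(0)
n = 1000
hs_gpa = rng.normal(3.0, 0.5, n)
# A hypothetical test score that mostly tracks GPA, adding little new signal:
test_score = 0.9 * (hs_gpa - 3.0) + rng.normal(0.0, 0.2, n)
college_gpa = 0.8 * hs_gpa + 0.05 * test_score + rng.normal(0.0, 0.4, n)

r2_gpa = r_squared(hs_gpa, college_gpa)
r2_both = r_squared(np.column_stack([hs_gpa, test_score]), college_gpa)
incremental = r2_both - r2_gpa

print(f"R^2, GPA only:        {r2_gpa:.3f}")
print(f"R^2, GPA + test:      {r2_both:.3f}")
print(f"Incremental R^2 from the test: {incremental:.3f}")
```

    In this toy setup the test's standalone correlation with college GPA is substantial, but its incremental R^2 over high school GPA is tiny, which is exactly the "adds no value" scenario the comment describes.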

  4. Lauda H. Chapman

    (linked PDF: CORE-Index-Technical-Guide-SY-2014-15-updated-2.1.16.pdf)

    There are other ways to estimate whether students are moving along a path that favors high school graduation and entry into postsecondary education. Not that I am endorsing them, but the metrics developed for the California CORE districts include GPA, absence from school, suspensions or expulsions, attainment of proficiency by ELL students, and other indicators of relatively smooth sailing through school before and continuing through high school. Soon to be added are student surveys assessing engagement with learning, dubious measures from promoters of mindset theories and grit, and attendance at a school that students, parents, teachers, and non-instructional staff regard as safe, attractive, attentive to individual students, and so on: the so-called school climate and social-emotional wellbeing measures. Having looked at some of those surveys, it is clear that the advantage goes back to parental/caregiver attentiveness and resources, and community wealth and investments in schools. In the CORE districts, SBAC test scores are included, but are not the whole ball of wax.

  5. In regard to RI’s pending use of the PARCC as a graduation requirement, there is quite a backstory. The statement that “the commissioner of education does not seem to want to do so” is misleading. (In my opinion, Linda Borg is not to be relied upon for fully accurate reporting on the RI Department of Education.)

    The RI Department of Education was set to use a certain score on the previous state assessment, the NECAP (New England Common Assessment Program), in ELA and math as a graduation requirement starting in 2014. There was much controversy over this, with RIDE insisting it was the way to go. Finally, at the last moment, the RI General Assembly passed a law saying that the state assessment could not be a bar to graduation until at least the year 2017. Then the NECAP was replaced by the PARCC in 2015. RIDE itself encouraged districts to hold off on using the PARCC to impact graduation until 2020, to give the system time to adjust to the Common Core State (sic) Standards. However, it granted districts leeway to start using the PARCC as soon as 2017, and some districts are taking advantage of this.

    Commissioner Wagner and RIDE have just unveiled their proposed revised high school graduation requirements. They are saying that there is no state-wide mandate to use a standardized test for graduation, BUT that the 39 school districts are free to use one. There are more complex recommendations regarding additional “designations” and “endorsements” that students will be able to earn and have stamped on their diplomas. This is in the works, but will have to be put out for public comment. The Commissioner still insists that the Common Core standards are appropriate and are the challenging expectations against which all student achievement should be measured.

  6. Linda

    “Under the leadership of Michael Cohen…Achieve formed PARCC.” “Mike Cohen (is) former co-chair ( of the Aspen Education and Society Program); is now president of Achieve Inc.”

    Is the “Inc.” a Freudian slip?

  7. The razzle dazzle of standardized tests ignores a simple fact: every single admissions counselor I’ve ever talked to has said that the top predictors of college success are good grades and a transcript reflecting a wide array of interests. The latter is the best predictor of whether a student can do more than just pull off good grades in the short term, but has the social and emotional skills to actually complete college.

    That being said, any attempt to quantify academic success with a single score is bogus. Even the SAT and ACT can only predict if a student will complete their freshman year with a B minus – but only with a lousy 65% accuracy. That means that even the most popular standardized tests deserve a D plus in predictive value.

    But also, a review of SBAC manuals reveals zero completed predictive studies, and I presume the same is true for PARCC, since the framework for both was created by the same company, Pearson. If either test had established predictive power, they’d be crowing about it from the rooftops.

Trackbacks & Pingbacks

  1. Mercedes Schneider: EdWeek Dredges Up 7-Month-Old Study of PARCC to Prove Its “Validity” | Diane Ravitch's blog
