PARCC Technical Advisor Says PARCC Is an “Evolving Enterprise”
On May 5, 2015, the Colorado Board of Education (CBE) held a special meeting about the Partnership for Assessment of Readiness for College and Careers (PARCC) test. The meeting included the following panel as noted on the CBE website:
Dr. Lorrie Shepard, Dean, College of Education, University of Colorado Boulder
Dr. Kevin Welner, Director, National Education Policy Center (NEPC), Professor, University of Colorado Boulder
Dr. Derek Briggs, Professor and Program Chair, Research & Evaluation Methodology, University of Colorado Boulder
Dr. Sandra Bankes, Vice Chair, El Paso County Republican Party
The focus of this post is the discussion by one of the panelists noted above: Derek Briggs of the University of Colorado Boulder, a member of the PARCC Technical Advisory Committee (TAC).
Here is a brief description of the PARCC TAC:
The Technical Advisory Committee (TAC) advises PARCC as it develops a next-generation assessment system to ensure the assessments will provide reliable results to inform valid instructional and accountability decisions. The TAC is responsible for providing guidance on assessment design and development, and the research agenda of the consortium. The TAC meets three times a year.
Some unidentified individual/organization produced this 21-page, partial transcript of the meeting; the excerpt appears focused on Briggs’ commentary about PARCC but includes other speakers present at the meeting. (The audio archive can be found here.) The transcript comes with the following PARCC-sympathetic disclaimer:
This document has been transcribed in part, and is intended to be word-for-word accurate. Like PARCC, it should be considered an evolving enterprise, not a finished product. Items in red are where language was unintelligible. We encourage you to listen to the audio archive posted by CDE here:
In this post, I focus on excerpts from Briggs’ commentary.
Let’s jump right in.
In his discussion, Briggs notes that he is also on the technical advisory committee for the Smarter Balanced Assessment Consortium (SBAC) as well as those of some other states (New York, Tennessee, Michigan).
Briggs then offers a disclaimer of his own, to the effect that even though he is on the PARCC TAC, he “does not speak for” TAC or for the PARCC developers.
He then notes how personal PARCC is to him:
And, and, um, as a member of the Technical Advisory Committee, it’s worth noting that my view, views on PARCC are sort of like the views that I have with my own child. Which is that, uh, I, I’ve gotten to, to, to know PARCC very well, and, and, and I have some affection for it….
On the other hand, I’m very hard, uh, on PARCC, just the way that I’m sometimes hard on my own child, ’cause I have high expectations for what my child, and for what PARCC, uh, could, could ostensibly accomplish.
And then, this:
So I want to make three points. Um, and I think the points build on much of what has been said. Um, and, and the first thing I think I should note is that I, I want to be sensitive to what my marching orders were, or at least my, my, the request was. And I don’t think any of us necessarily have followed them very well.
Uh, and I want to be clear as to why I think that’s the case. So what we were asked to do was to give general comments on the impact of the PARCC test on students and on student performance, uh, and any comments on the effectiveness of PARCC, with the PARCC test, vis a vis other testing regimes. And that, the reason it’s almost impossible to, uh, to, to comply with that, is that we just don’t know yet. Uh, it’s too soon. Um, and this really connects in one of the three points I want to make, which is I think it’s really, really important to see PARCC for what it is: an evolving enterprise.
Um, and not something that, uh, as it comes out of the box is a finished product, and done, and that’s how it will be from, from time, from here on out. Um, but is something that has seen a lot of work, um, and will see more work.
Compare Briggs’ “evolving enterprise” words with these words from official PARCC vendor, Pearson:
The Partnership for Assessment of Readiness for College and Careers (PARCC) is a group of states working together to develop a set of assessments that measure whether students are on track to be successful in college and their careers.
These high quality K–12 assessments in Mathematics and English Language Arts/Literacy give teachers, schools, students, and parents better information whether students are on track in their learning and for success after high school, and tools to help teachers customize learning to meet student needs. [Emphasis added.]
Apparently Pearson missed the “be sure to word as though PARCC is evolving” memo. Sounds like the PARCC vendor, Pearson, knows that PARCC has arrived.
Though Briggs reiterates that PARCC is “evolving,” he also notes that PARCC is better than traditional standardized tests despite PARCC’s “growing pains”:
…From my perspective on the, uh, Technical Advisory Committee, um, I feel comfortable saying that, um, having also seen a lot of, um, other, uh, tests designed and, and put out, um, I feel very confident saying that PARCC was very thoughtfully and conscientiously designed.
Um, it was submitted to a lot of scrutiny, both, uh, public scrutiny and professional scrutiny. Uh, it continues to be submitted to a lot of public and professional scrutiny. Um, a second point I want to make is that PARCC does in fact have a lot of very novel features, uh, relative to testing we’ve seen before.
Um, some, uh, there is, uh, this sense in which many of these novel features being done all at once has been quite a burden. Uh, and some of the things that, like Dr. Banks has referred to, in terms of technological glitches, are the sorts of things that you can imagine happening when you’re trying to innovate in the sense of both the kinds of item formats that you’re creating on, on any test, and the integration of technology, um, all at once.
Uh, so there are clearly growing pains here.
To insinuate any great “public scrutiny” prior to PARCC’s being unleashed upon students and schools is misleading. The public resistance to PARCC, coupled with the so-called “growing pains” of “technological glitches,” has prompted the PARCC consortium to cut PARCC testing time in half and then some, in an arguable effort to save PARCC.
Briggs is clearly proud of the complexity of PARCC items above those of traditional multiple choice items:
So as a designed principle, um, a lot of effort went in at the front end to conceptualize, and how to create items, to get at things in a way that we haven’t gotten at them before. The other piece that, uh, is important for me to point out, in terms of novel features, uh, one, uh, novel feature of the PARCC test that’s most evident, is this integration of technology, the computer-based format.
Um, but in, in going to this computer-based format, it’s actually, I think to some extent opened the doors to different ways for students to interact with items. Even if you look at the practice test that has been made available for PARCC, um, items that we might characterize as traditional multiple-choice items, really don’t look that traditional anymore.
That is typically, I think when we think of a multiple-choice item, we think of an item that has an A, B, C, D, and you choose, A, B, C, D. If you’ll look at the, the actual items that exist, uh, for, in the practice test for PARCC, um, what you’ll see in many cases is that there isn’t an A, B, C, D.
There are, um, entry points for selecting choices, but there might be as many as eight different choices that one has to choose from, to drag into that entry field. So it would be collected, it would be correct to characterize the tests, in many cases, as having formats that look like selected response.
PARCC items are designed to be more sophisticated than traditional multiple-choice items. However, someone has to decide that certain ways of maneuvering the answering of PARCC items are worthy of a higher test grade than other ways. In short, someone must judge that one approach to answering is clearly “better” (or the “best”).
Are we really so sure of ourselves that we presume that we can grade one means of sophisticated thought over another– and one means of problem solving as “better” evidence of “college and career readiness”? And are we certain that the complexity of the test itself is not interfering with student ability to comprehend the task in the first place?
Given Briggs’ own disclaimer that PARCC is “evolving,” the answer to the above questions should be “no.”
What further complicates the issue is that the Common Core State Standards (CCSS) are assumed to “work”– and to be both automatically worthy of assessment and actually able to be assessed– which brings us to yet another issue:
There is no direct connection between CCSS and PARCC. Someone has to create that connection– to interpret CCSS and fashion that interpretation into the PARCC assessment. Briggs mentions as much:
One of the things that’s very notable as well, is that one might think that by writing a test to the Common Core State Standards, it’s just a matter of looking at the standards, and then the items become self-evident from the standards. But that’s not the case at all.
If you actually read carefully the Common Core State Standards, particularly, uh, in mathematics, one of the things that’s very novel about the Common Core State Standards is the attempt to, uh, place equal weight on both what students know about mathematics, and how they apply their knowledge, in terms of how they reason with their knowledge, and how they problem-solve with their knowledge.
But how you weave those things together, things that were more along the lines of recall and knowledge of fractions or decimals, and proportional reasoning, how you demonstrate that in terms of practices, the Common Core doesn’t really lay that out at all.
And one of the things that the designers for PARCC had to do is actually very explicitly say, how you weave together knowledge and reasoning, and they actually had to go beyond what the Common Core lays out, and actually establish a framework for doing this. [Emphasis added.]
In order to solve the problem of being sure that CCSS connects to PARCC, one can just forget CCSS and “teach to PARCC”:
To really be able to argue that the PARCC test really covers the breadth and depth, fully, of the Common Core State Standards, such that if te, if teachers are really teaching to PARCC, they’re teaching to the full range of the Common Core, and hence that will limit distortions given the, the high-stakes nature of the test. [Emphasis added.]
There is no reason to be concerned about the connection between CCSS and PARCC if PARCC becomes the focus of teaching. The trustworthiness of the CCSS-PARCC connection can be conveniently presumed as teachers nationwide just settle in to “teaching to PARCC.”
I shake my head.
The partial transcript of this CBE meeting continues with input from the other three PARCC panelists as well as more discussion from Briggs. I leave it to readers who are so inclined to pore over the remainder of the 21-page transcript.
Let me close with this statement from Briggs. It’s his thoughts on CCSS, both definition and process. Illuminating, to say the least:
It’s just worth saying that the common core is first of all not the Bible and second of all, it was just a hypothesis, about for example mathematics, how you would see knowledge and skills building over time. And they had to slap grade level markers on this thing but they were a guess and they were also maybe an ambition and an aspiration and so, I wish that there would have been more flexibility on the assessments side to say: “That was hypothesis. We’re going to actually find out what kids can do with different instruction and opportunity to learn and we’re going to try and measure them where they are”. um. That would have been terrific.
In other words, Briggs believes that CCSS– the “hypothesis”– should have been tested.
Instead, it was sold as The Answer, and here is Briggs, serving on a committee to test “the hypothesis” in real time, on real lives, and with real, high-stakes consequences.
Schneider is a southern Louisiana native, career teacher, trained researcher, and author of the ed reform whistleblower, A Chronicle of Echoes: Who’s Who In the Implosion of American Public Education.
She also has her second book available on pre-order, Common Core Dilemma: Who Owns Our Schools?, due for publication June 12, 2015.