Smarter Balanced: Lacking Smarts; Precariously Balanced
In this time of “public-education-targeted boldness,” the Common Core State Standards (CCSS) has made the American public one whopper of a “bold” promise:
The standards were created to ensure that all students graduate from high school with the skills and knowledge necessary to succeed in college, career, and life, regardless of where they live. [Emphasis added.]
There is not now, nor has there ever been, any empirical investigation to substantiate this “bold” claim.
Indeed, CCSS has not been around long enough to have been thoroughly tested. Instead, the above statement– which amounts to little more than oft-repeated advertising– serves as its own evidence.
However, if it’s on the *official* CCSS website, and if CCSS proponents repeat it constantly, that must make it true… right?
Keep clicking your heels, Dorothy.
Now, it is one thing to declare that CCSS works. It is quite another to attempt to anchor CCSS assessments to the above cotton candy of a guarantee. Nevertheless, that is what our two beloved, federally-funded assessment consortia are attempting to do.
Let us consider recent proclamations by one of these CCSS assessment consortia, the Smarter Balanced Assessment Consortium (SBAC).
On November 14, 2014, SBAC published its lean-to efforts at creating a set of SBAC assessment cut scores for levels of achievement connected to an unproven CCSS. (Whew.) In a smooth dig at SBAC lunacy, Washington Post education blogger Valerie Strauss offers the actual SBAC text revealing and shakily explaining its cut score decisions (see this November 20, 2014 Answer Sheet post).
SBAC has purportedly anchored its assessment to empirically unanchored CCSS. How doing so is supposed to serve public education is an elephant in the high-stakes assessment room.
Regarding its assessment scoring, SBAC decided upon cut scores that divide individual student scores into four “achievement levels.” SBAC knows it is peddling nonsense but does so anyway, apparently disclaiming, “Hey, we know that these achievement levels and their cut scores are arbitrary, but we have to do this because No Child Left Behind (NCLB) is making us. But we want to warn about using the achievement-level results of this high-stakes test for any high-stakes decisions”:
Defining these levels of achievement (“Achievement Levels”) is a reporting feature that is federally required under the No Child Left Behind Act, and one that has become familiar to many educators. However, characterizing a student’s achievement solely in terms of falling in one of four categories is an oversimplification. … They must continuously be validated….
Furthermore, there is not a critical shift in student knowledge or understanding that occurs at a single cut score point.
[Footnote] Additional research will be needed to validate the achievement level descriptors in relation to the actual success rates of students when they enter college and careers.
The SBAC explanation above (see Strauss’ post) continues for several sentences about how these achievement levels “should serve only as a starting point for discussion” and “should not be interpreted as infallible predictors of students’ futures.”
Not going to happen.
The reality is that the media will publish percentages of students falling into the four categories as though the SBAC-created classification is infallible, and once again, schools, teachers, and students will be stigmatized.
Forget about any cautions or disclaimers. Offer a simplistic graphic, and the media will run with it.
SBAC itself offered several graphics explaining its cut score decisions. These can be found in Strauss’ post. Two are line graphs showing the actual raw scores arbitrarily chosen as cut scores, and two are bar graphs, complete with the percentages of students whose scores fall into each level, drawn from the SBAC pilot study (grades 3 through 11).
Based upon SBAC cut scores, most students “scored into” the bottom two levels. Imagine that.
Recall the SBAC disclaimer, published as a footnote:
Additional research will be needed to validate the achievement level descriptors in relation to the actual success rates of students when they enter college and careers.
SBAC cut scores are not tied to “actual success rates.” Nothing about CCSS has been validated using “actual success rates.”
SBAC was tasked with figuring out how in the world to operationalize both “college ready” and “career ready.” It decided that “college ready” means CCSS content ready. In other words, the SBAC test assumes that CCSS will “ensure” college readiness simply because CCSS promoters say it will.
What we have here is a test that leads to no end other than CCSS for CCSS’s sake. American education on the hamster’s wheel.
SBAC states that its achievement levels “must continuously be validated” in the utter and complete absence of the most important validation evidence– pilot testing CCSS to see if it delivers on its “college ready” claim in the first place.
The reality is that before state adoption, CCSS should have been studied on at least one cohort of students from Kindergarten through grade 12. SBAC (and other supposed CCSS assessments) should have gone along for that ride for the purposes of testing CCSS itself, not students.
I realize that the above statement is not news to those with common sense. CCSS lacks empirical evidence to support its “college and career ready” claim, and the “college ready” evidence alone would take at least 14 years to gather.
Collect empirical evidence that CCSS actually delivers on a college and career ready promise before adopting CCSS and chasing it with costly, high-stakes assessments?? No, no, say the CCSS pushers. Not time enough for that. American education is “failing”; so, we need to be “urgent.”
Urgent? Not really. Sloppy and irresponsible? Absolutely.
(As an aside on “college ready”: Even higher education is expected to center on CCSS. Thus, CCSS continues as its own authority. Colleges and universities are expected to “get ready” for CCSS, which makes “college ready” whatever CCSS says it should be. Watch out, America. CCSS is being positioned as the authoritative, infallible center of education for the masses and is even being promoted as the center of state accountability systems.)
As it stands, SBAC’s flimsy definition of “college ready” as “CCSS-content ready” is the high point of operationalizing the CCSS sales pitch. When it comes to trying to define “career ready,” SBAC admits being at a complete loss. As SBAC notes in its publicized Achievement Level Recommendations:
Smarter Balanced does not yet have a parallel operational definition and framework for career readiness.
SBAC tests are supposed to measure CCSS, which purports to “ensure” both “college and career readiness,” yet the multi-million-dollar SBAC effort can’t seem to get a handle on what “career readiness” actually is.
In Louisiana, the economy is so depressed that Louisiana Workforce Commission job projections for 2020 estimate that the majority of available entry-level jobs will require a high school diploma or less.
CCSS: So effective, it even makes Louisiana dropouts “career ready.” It’s just that good.
So, see, there is no way for CCSS to fail in the Bayou State. Even high school dropouts can be “career ready” in Louisiana. However, having to face state employment projections opens a real box of confusion for those trying to blanket-define “career readiness.” Career readiness must be defined in relation to careers, and careers are dependent upon state and local economies– none of which can be standardized to suit the likes of CCSS and its assessments.
Operationalizing “career ready” is not so easy to do (understatement), but it should have been done (or the white flag of surrender should have been raised) years ago.
Instead, CCSS and its assessments are propelled forward, awkwardly propped up by lots of Gates money, some of which has even been given to self-appointed standards arbiter, the Thomas B. Fordham Institute, to evaluate SBAC and other CCSS assessments and publish a “report” come spring 2015.
Fordham Institute will not be connecting CCSS assessments with any measurable “college and career ready” outcomes. To Fordham Institute, CCSS works, and it needs to have tests:
“The promise of the Common Core State Standards, implemented faithfully, is improved education and life outcomes for millions of American children,” noted Amber Northern, vice president of research [at Fordham Institute]. “We need tests that fairly reflect and honor the hard work that we are asking teachers and students to do under the Common Core.”
In 2010, Fordham Institute sold America an untested CCSS with the oiliness of promoting a preferred product, and rest assured, it will do the same for some or all of the CCSS assessments. All Fordham Institute has to offer is glossy-brochured sales manure.
Use it to fertilize your spring flowers.
For now, know that SBAC hasn’t a clue about what it is really offering the American public by way of its CCSS assessments. But don’t think that a crucial lack of an empirical foundation will hold SBAC back.
After all, testing an untested CCSS is urgent.
__________________________________________________________
Schneider is also author of the ed reform whistleblower, A Chronicle of Echoes: Who’s Who In the Implosion of American Public Education
Reblogged this on Exceptional Delaware and commented:
Mercedes Schneider dissected the Smarter Balanced cut scores. This “consortium” actually managed to make their scoring system more confusing than the actual test. Which just proves my theory all along: they know the test is crap, but they will push it through anyway so they can use data from it to push their own agenda. Parents: You don’t need the Delaware PTA to advise you what to do. Opt your child out now from this farce of an educational assessment. If our children have suffered from Common Core for this, then we all need to get together in every state and demand our politicians and state DOEs abolish this ridiculous idea.
Farce from the get go. This post of yours is a real winner in exposing the fraud, and the images are great.
I have noted, somewhat differently, the absurdity of trying to map career readiness via U.S. Bureau of Labor Statistics projections, especially since these projections are never made for more than ten years out and are modified every other year.
The CCSS “research” to map career readiness is a total fraud. It dates from 2001 work on the American Diploma Project, including interviews with a convenience sample of business managers in five states. Back then, the economy was tanking terribly, but not nearly as badly as in the big crash of 2008.
The CCSS are based on amateurish leaps through thin air and arrogant disregard for the right of young people to determine, or delay determining, whether they wish to be filmmakers, poets, engineers, or whatever. The two-track system of college prep and vocational education is alive and well in the so-called college and career standards. Nothing in or about the standards is trustworthy.
Reblogged this on As the Adjunctiverse Turns and commented:
more on CCSS and “college readiness” claims…
Thank you!!!! You are so goooood at saying, and backing up, what I am thinking!!!! It is with your help I try to spread the word (the truth) about Common Core and High Stakes Testing.
Dr. Schneider, I am writing to thank you for sharing a link to the Prince sermon sent on Christmas Eve. It’s inspired me to incorporate sermons into my week. It has helped “frame” CC and gives me strength & fortitude to continue advocacy.
Thank you for that post and for your consistent presence with all CC “warriors”.
God Bless,
Cheryl Hill
Cheryl, you are welcome.
Jesus Christ is greater than CC.
Reblogged this on Saving school math and commented:
It’s a bit long, but it sure takes the lid off the CCSS. Read it now.