John White’s Illegitimate 2013-to-2014 LEAP/iLEAP Comparisons
Whenever Louisiana State Superintendent John White gets his hands on test scores, there will be problems.
He withheld release of the 2014 Louisiana Educational Assessment Program (LEAP) and Integrated Louisiana Educational Assessment Program (iLEAP) scores on Friday, May 16, 2014, without explanation.
Today, May 27, 2014, he defended his decision before district superintendents by telling them that compared to previous years, this year’s scores were not late.
Districts across the state had come to expect to deliver LEAP and iLEAP scores to parents on Friday, May 16, 2014, because that was the Friday in May on which LEAP and iLEAP scores are usually released. White even acknowledged in a letter that the scores would be released later than in previous years:
In recent years, the Department has released these results May 17 or 18.
Consider those two dates more closely:
Friday, May 17, 2013.
Friday, May 18, 2012.
Thus, the commonly-held expectation that LEAP and iLEAP scores would be released on Friday, May 16, 2014, was logical, and White knew it.
In that letter, White makes another statement, one that solidifies the reality that any comparison of 2014 LEAP and iLEAP scores with those of prior years is useless:
An important step [toward connecting Common Core and its PARCC assessment] has been the one-time LEAP and iLEAP tests aligned to those new expectations (Common Core). [Emphasis added.]
I won’t pretend for a minute that White has actually carried out any sophisticated psychometric procedures to develop tests that are “aligned” with the Common Core he refuses to call by name (instead resorting to the vanilla term, “new expectations”). However, he has acknowledged that the LEAP and iLEAP tests used in 2014 are different tests than those used in previous years.
Thus, for White to declare in this press release that scores “remain steady” is grossly misleading.
The tests might share the same name, but from 2013 to 2014, the tests are distinctly different and therefore, results from 2013 to 2014 are not open to useful comparison.
In order to compare two different tests, one must calibrate the newer test against the older, and that takes data in the form of piloting the test– which White has not done– and it takes time– certainly more than the Friday-to-Tuesday that White scraped up in the form of a score release delay from May 16 to May 20.
As it stands, a student’s 2013 LEAP or iLEAP score– though it be exactly the same number– does not automatically hold the same meaning as the very same number on the 2014 LEAP or iLEAP test.
There is more that complicates the issue:
There is no evidence that the scoring categories from 2013 to 2014 align, even if White did use the same numeric cutoff scores for each category.
White needs to publicly release the cutoff scores that the LDOE under his direction actually used for 2013 and 2014 LEAP and iLEAP tests.
How about some transparency??
Altering cutoff criteria also “muddies” (yes, I wrote it) comparison of one set of scores with another. However, altering category cutoff scores enables the one setting the cutoffs to shape score results according to his own purposes.
As for students and schools, they bear the public brunt of it, even if the consequences are supposedly “relaxed” by the cutoff score manipulator.
Based upon the brouhaha (see here and here and here) allegedly coming from the Louisiana Department of Education (LDOE) in the days between LDOE’s having the 2014 LEAP and iLEAP scores and publicly releasing said scores, I’m thinking that there were numerous scenarios toyed with regarding those LEAP and iLEAP category cutoffs.
So, to parents and administrators who are chagrined at comparisons between 2013 and 2014 LEAP and iLEAP results: Know that the comparisons are fiction.
You might as well attribute the results to pixie dust and fairies.
Do not be disturbed by movement up or down in percentile rankings within the 2014 LEAP and iLEAP scores, either. Movement in such rankings is not clearly connected to set criteria for passing or failing. Theoretically, all students in all Louisiana schools could pass LEAP and iLEAP, and some districts would still have to be ranked lower than others. When relative rankings are used, someone must rank at the bottom, no matter how high the scores.
Of course, the reverse is also true: All students could theoretically fail a test based upon some set criteria, and still, some score would be highest and therefore able to be termed “the 99th percentile.” All that this means is that the score is higher than that of 99 percent of other test takers– even if deemed a non-passing score by some set criteria.
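The point above can be sketched in a few lines of code. This is a hypothetical illustration– the cutoff and the scores are invented, not actual LEAP data– showing that percentile rank is purely relative: even when every single score falls below a passing cutoff, the top score still lands at the 99th percentile.

```python
# Hypothetical sketch: percentile rank measures standing relative to other
# test takers, not performance against a passing criterion.

def percentile_rank(score, all_scores):
    """Percent of scores strictly below the given score."""
    below = sum(1 for s in all_scores if s < score)
    return 100 * below / len(all_scores)

PASSING = 250                    # invented cutoff; no student reaches it
scores = list(range(150, 250))   # 100 distinct hypothetical scores, all failing
assert all(s < PASSING for s in scores)

print(percentile_rank(max(scores), scores))  # → 99.0
```

The top scorer “fails” by the set criterion, yet sits at the 99th percentile all the same.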
In the current situation, for stakeholders to get stuck on, “We were a higher percentile ranking last year” is to fall into the trap of comparing a 2014 PARCC-like LEAP or iLEAP to a 2013 non-PARCC-like LEAP or iLEAP.
The May 27, 2014, LDOE press release includes percentile rank “comparisons” that are nothing more than illusion. First of all, the comparisons from 2013 to 2014 are useless. However, not only is White attempting to promote them as useful; he is attempting to draw attention away from what looks like small percentage “gains” to more “dramatic” “changes” in percentile rank. He needs to showcase some “sensational” numbers. Whether or not such numbers are fact-based is irrelevant to White. Consider this press release offering:
Notice that Catahoula Parish “jumped” 22 percentile ranks by “moving” from “17 percent mastery and above in 2013” to “20 percent in 2014.” The move is not a legitimate comparison, as noted previously in this post; however, if it were, it would be only three percentage points. Somehow (a mystery to those outside of LDOE) those three points “produced” the “sensational” 22-percentile-rank gain.
Notice also that East Carroll “gained” three percentage points, from 12 percent to 15 percent (keep in mind, the comparison is a fraud), yet East Carroll only “moves” 9 percentile rankings “from 2013 to 2014.”
So, what is the value of moving “up” three percentage points??
Well, it just depends. Pixie dust and fairies.
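Here is a hypothetical sketch of why “it just depends.” The district “percent mastery” figures below are invented, not LDOE data; the point is that the same three-percentage-point gain can yield wildly different percentile-rank “jumps,” depending entirely on how many other districts happen to sit between the old score and the new one.

```python
# Invented district "percent mastery" figures (not LDOE data) illustrating
# that identical raw gains can produce very different percentile-rank moves.

def percentile_rank(value, values):
    """Percent of values strictly below the given value, rounded."""
    return round(100 * sum(1 for v in values if v < value) / len(values))

# Scenario A: many districts bunched between 17 and 20 percent mastery
crowded = [17.2, 17.5, 17.8, 18.1, 18.4, 18.7, 19.0, 19.3, 19.6, 19.9,
           25, 30, 35, 40, 45, 50, 55, 60, 65, 70]
# Scenario B: almost no districts in that same range
sparse = [10, 11, 12, 13, 18,
          25, 30, 35, 40, 45, 50, 55, 60, 65, 70]

for name, field in [("crowded", crowded), ("sparse", sparse)]:
    before = percentile_rank(17, field + [17])  # district at 17 percent mastery
    after = percentile_rank(20, field + [20])   # same district, now at 20 percent
    print(f"{name}: jump of {after - before} percentile ranks")
```

With the invented numbers above, the same three-point gain produces a jump of 48 percentile ranks in the crowded scenario and only 6 in the sparse one– which is why a percentile-rank “jump,” by itself, tells stakeholders almost nothing.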
Let us now turn our attention toward PARCC:
In his May 27, 2014, press release, White is careful not to call the test of the “new expectations” (i.e., Common Core) what it is: the PARCC test, developed by “a group of states.” (Near the end of the release, White/LDOE does allude to “resources released” by “Louisiana and PARCC.”)
If Louisiana can reshape its 2014 LEAP and iLEAP into a “one-time LEAP and iLEAP aligned to those expectations,” then why are we purchasing PARCC at the still-advertised, estimated price of $29.50 per student?
If PARCC is used next year, which appears to be the direction in which White is taking Louisiana education, then those 2014-15 scores will not be comparable to either the 2013 LEAP/iLEAP or the 2014 “PARCC-like” LEAP/iLEAP.
Steady, up, down, no one will really know.
One final point to this post:
On May 27, 2014, reporter Danielle Dreilinger of nola.com cited me in such a manner as to make it sound as though I bought White’s story of “steady” scores from 2013 to 2014.
Dreilinger, report my position clearly this time:
White’s reported “steady” scores are an illusion.
And I agree with former LDOE employee Jason France’s admonition that those LDOE individuals being pressured into unethical practices in order to “shape” test results need to come forward before the ax falls.