BESE President Chas Roemer Knows Nothing.
It amazes me how many high-profile reformers know nothing regarding the deceptions occurring during their well-fed tenures. Atlanta Superintendent Beverly Hall supposedly knew nothing of the incredible cheating for which she has recently been indicted. Former DC Chancellor Michelle Rhee is innocent of any involvement in systematic test score erasures in DC’s schools, an issue that simply will not go away. And, though a lesser player than Hall and Rhee, Louisiana BESE President Chas Roemer has assumed the reformer vogue of “hearing nothing” about LDOE data manipulation.
Professed innocence from a high-ranking Louisiana official. Imagine that.
Except that I know firsthand Roemer is lying. I have the email to prove it.
However, let me not get ahead of myself. This story is connected to the recent legislative session and the ongoing battle between the Jindalites and those who exercise independent thought.
During the Louisiana Senate Education Committee meeting on Wednesday, May 1, 2013, my colleague Herb Bassett, a band teacher in Grayson who also happens to have a minor in math, presented his findings regarding issues with the school performance scores. During his testimony, Bassett charged Superintendent John White, LDOE, and BESE with both suppressing data and manipulating outcomes. Bassett noted that he sent a report to all 144 legislators and all district superintendents (I helped him compile the email listings for these individuals) in which he documents issues of high school score inflation and of the intentional mislabeling of a column of data in order to conceal the importance of that column to those who would readily recognize it.
Bassett’s testimony was in support of SB 41, a bill proposing the election rather than appointment of the likes of John White, a man now known for lying, deceit, and information suppression.
In his testimony, Bassett stated:
“The data, the Transition Baselines, showed that the GEE (Graduation Exit Exam)—which was being phased out—and the new EOC (End of Course) tests were mis-calibrated by 7.5 points. That’s half a letter grade,” Bassett said. “Had it been correctly labeled, the inflation would have been obvious—at least to me.”
Meanwhile, he said, BESE was given a different version of the scores with the Transition Baselines correctly labeled. “This shows intent to deceive,” he said. [Emphasis added.]
I have written a number of posts featuring White’s lack of ethics in his dealings with both the public and real educational professionals (those who actually hold four-year undergraduate degrees in education and who have careers in the classroom notably longer than White’s token two years via Teach for America).
Unfortunately, SB 41 did not make it out of committee. In response to this, the Baton Rouge Advocate notes,
Chas Roemer, president of the state Board of Elementary and Secondary Education, which appointed White, said the state’s biggest public school gains in the past decade took place under appointed superintendents.
“The evidence seems to be that not electing is working,” Roemer said. [Emphasis added.]
The question is, Working for whom? It certainly is “working” for Jindal and White, neither of whom “work” for the public good. Indeed, White is in office today because BESE had been bought with the help of Jeb Bush and his reform organizations, the Foundation for Excellence in Education and Chiefs for Change. One can read here about Bush’s push to finance a primarily-Jindal-compliant BESE in order to “vote” in as state superintendent White, who at the time had been RSD superintendent for only six months. And look at the mess that is RSD, a district chiefly resulting from the 2005 legislative blindsiding of New Orleans in the aftermath of Hurricane Katrina. Both “appointed” Paul Vallas (who is involved in litigation over his credentials, or lack thereof) and John White, the subject of numerous controversies (read my posts), demonstrate the nonsensical, nationally-embarrassing leadership that results from gubernatorial appointment of a Louisiana superintendent.
Roemer had just heard evidence that “not electing” is indeed not working in the form of Bassett’s testimony, yet he insists that allowing the likes of Jindal to appoint the likes of White is working just fine.
Regarding Roemer’s own testimony, the Louisiana Voice notes the following:
BESE president Chas Roemer (R-Baton Rouge) was called to the witness table and asked about Bassett’s charges. Roemer said he had heard nothing about Bassett’s claims, but that he would “look into it.” [Emphasis added.]
There’s the lie, my friends.
I wrote the following email on December 1, 2012, and sent it to White, BESE, and Jennifer Baird, a now-former LDOE employee. (There seems to be a growing list of “former” LDOE employees since John White was “appointed” to wreck that ship.) I included Baird’s email response because it was a reply to the first letter I sent to BESE—part of a continuing story—in which I discuss bias in the 2012 school performance scores. As an attachment to the email below, I sent my second letter to White and BESE.
It is in the second letter that I address both the hiding of the transitional baseline column of scores and White’s lowering of the percentage of graduates needed for a school to earn “bonus points.”
Chas Roemer had known about this issue for several months, as had White and all of BESE.
Here is the email to which my second letter was attached. I have reproduced the email in its entirety below. Following this email, I have reproduced what was my second letter, the one which shows that Roemer did indeed know about “Bassett’s claims” because I presented the same concerns about data manipulation and arbitrary scoring criteria in my letter. Finally, I have included in this post my original letter concerning scoring bias.
From: Mercedes Schneider <firstname.lastname@example.org>
To: Superintendent <Superintendent@LA.GOV>; James.Garvey <James.Garvey@LA.GOV>; Kira.OrangeJones <Kira.OrangeJones@LA.GOV>; Lottie.Beebe <Lottie.Beebe@LA.GOV>; Walter.Lee <Walter.Lee@LA.GOV>; Jay.Guillot <Jay.Guillot@LA.GOV>; Chas.Roemer <Chas.Roemer@LA.GOV>; Holly.Boffy <Holly.Boffy@la.gov>; Carolyn.Hill2 <Carolyn.Hill2@LA.GOV>; Penny.Dastugue <Penny.Dastugue@LA.GOV>; John.Bennett <John.Bennett@LA.GOV>; Connie.Bradford <Connie.Bradford@LA.GOV>; Heather.Cope <Heather.Cope@LA.GOV>
Cc: jennifer.baird <email@example.com>
Sent: Sat, Dec 1, 2012 10:25 pm
Subject: Scoring Bias in 2012 SPS, Part Two
Subject: Re: Reply to Mercedes Schneider regarding bias
Subject: Reply to Mercedes Schneider regarding bias
Good Evening Dr. Schneider,

Superintendent White recently shared your letter regarding high school v. elementary school performance scores with our accountability team and asked that we reach out to address your concerns.

In response to Calculations One and Two: As demonstrated in your letter, high schools and elementary/middle schools are not measured in the same way. Elementary and middle school performance scores focus on academic achievement based on state assessment performance. These data tell us about the proficiency rates and academic progression of students as they prepare for later grades (i.e., high school). By comparison, high school performance scores focus on academic achievement, but also graduation rates. This is because the role of the high school is, in some significant ways, different from that of an elementary school. Successful high schools do not just have students who are proficient on exams; they are also responsible for guiding students successfully to graduation. Thus, as recommended by the Accountability Commission and as approved by BESE in 2007, graduation rates were added to the formula. This is a purposeful distinction used by Louisiana, as well as many other states across the country. In 2011-2012, NCLB policy required that states begin reporting graduation rates by subgroups because graduation rate is a meaningful indicator of high school success. In practice, this results in high schools measuring and progressing in different ways than elementary schools. However, as recommended by state experts and as approved by the State Board responsible for accountability policy, this difference in strategy (by school type) was determined to be the best path forward. The Board has a long history of constantly refining and improving our system so that we are consistently raising expectations for schools and districts.
Over the past decade, raised expectations have served our students well as demonstrated by continual increases in proficiency rates of all students, and particularly historically underserved students.

In response to Calculation Three: Transition baselines have been necessary several times since Louisiana designed an accountability system. Changes to testing require that we create a system that uses the same data sources to evaluate annual progress. There are two major reasons that transition baselines are used.

1. Substantial change in testing policy

Transition baselines have been used in Louisiana accountability whenever there is a substantial change in testing. The transition baseline is used only to calculate the actual amount of growth that is made from one year to the next. It is not used to assign letter grades. As an example, in 2010-11, the iLEAP for grade 9 was discontinued so a transition score was used to remove the 9th grade scores from the previous year. Similarly, in 2011-12, a transition was used to replace previously used GEE scores with EOC test scores so that the same test data source was used for determining growth using a fair comparison. Utilizing a transition baseline allows us to continually improve and upgrade the system (e.g., incorporating new, improved assessments) while also providing a more accurate representation of a school’s growth from year to year.

2. Reconfiguration of grades

Each year, there are schools that change grade configurations. When reconfigurations affect the type of data that are used for the score, a reconfiguration SPS is created to determine growth using a fair comparison. As with transition scores, the reconfiguration baseline is only used to determine growth. It is not used to assign performance labels or letter grades.

If it would be helpful to talk through any of this in greater detail, we can establish some time to do so.
Please feel free to contact me any time you would like to discuss accountability issues. Respectfully, Jennifer Baird, Ph. D. 225.342.3514
And now, what was my second letter to BESE, the letter attached to the above email:
December 1, 2012
Mr. White and BESE Board Members:
In response to my letter dated November 21 concerning scoring bias in the 2012 school performance scores, Mr. White had Dr. Jennifer Baird send to me the email I forwarded to you with this document. I am not sure if Mr. White blind-copied it to you, so I have attached it here to be sure you have read it.
Dr. Baird’s email amounts to a flimsy attempt by DOE to justify the bias rather than confront it. It is as though Mr. White said, “I know: I’ll have someone with a Ph.D. write to Dr. Schneider and tell her that we’re right.” In my letter dated November 21, I present thorough and undeniable evidence of scoring bias. Mr. White’s response via Dr. Baird is really no response at all.
I have been reading archived Bulletins 111. One source of bias in favor of high schools/combination schools involves the graduation index. In October 2010 (pre-letter grade), the graduation index was set at 65 (pg. 2242). However, in August 2011 (pre-letter grade), the index was raised to 80. An explanation on pg. 3200 (November 2011 reprint of Aug 2011 bulletin) cites, “Changes in Bulletin 111, Chapter 6, provide detail for the change in the calculation of the graduation rate adjustment factor to eliminate a negative effect on schools with a graduation rate above the state goal or current grade target.” Two issues here: 1) Moving the graduation index will either inflate or deflate scores; thus, the scores have less to do with true “performance” and more to do with unstable measurement criteria; and 2), the term “negative effect” is another way of saying “scoring bias”; therefore, BESE/DOE recognized that issues of scoring bias could potentially pose problems in school performance scores prior to the application of school letter grades.
In July 2012, under John White and the current BESE, the graduation index threshold is once again set at 65, yet no evidence is offered for bias checks. No examination is done to see the effects of this change on outcome scores—the evidence of such negligence is in the inflated 2012 outcome when one examines elementary/middle (no benefit from a graduation index at all) vs. high/combination (benefit from a lowered graduation index) schools. Furthermore, as noted in the Bulletin 111, Chapter 6 comment above, is there indeed “a negative effect on a subgroup of high schools, namely, those with a graduation rate above the state goal or current target”? It is poor measurement procedure not to have examined and addressed such issues prior to letter grade release.
Common sense tells me that if I set an A as 94-100 one semester then lower the threshold to 90-100 the next, I will have more A’s that second semester, and that the increase cannot be attributed to student performance so much as to my lowered criteria. I can congratulate my students on their great performance for yielding such an increased number of A’s in the second semester as I write in my Schneider EdConnect publication or as I interview for an Advocate article, and I can also brag that this increase in A’s happened on my watch (and therefore must be evidence of my superior performance) as I face an upcoming, annual evaluation, but it is a lie. All that I have shown is that grades, and all associated rewards and penalties, are at the mercy of my capriciousness.
[The preceding paragraph directly confronts Roemer’s assertion that appointed superintendents produce “the biggest public school gains.” Such “gains” are a contrived, manipulated lie.]
The 2012 school performance scores and corresponding letter grades are inflated, and this 65-no-80-no-wait-65-again threshold bouncing of the graduation index is an undeniable contributor to the inflation.
There is another issue in the lack of calibration of the EOC scores to the GEE scores. Mr. White, via Dr. Baird, may write, “We’ve used transition scores before,” but that does not erase the evidence of the bias in the 2012 school performance scores. Proper calibration was not completed.
How do I know this? The answer: I know how to conduct a proper calibration. I know that if I am measured in inches, I am 64. If some value judgment is attached to my height in inches, that judgment is calibrated to inches. So, if the value judgment notes that 60 to 80 is “good” and 80+ is “excellent,” I can inflate this result by being measured not in inches, but in centimeters, since centimeters will “look” larger if one looks at the numeric value alone. In centimeters, I am 163. Without proper calibration, I can “advance” from “good” (64) to “excellent” (163) without having grown at all. In order to avoid the inflated value judgment, I can either 1) convert the centimeters (EOC) into inches (GEE), or 2) convert the value scale into centimeters (EOC). I cannot convert the inches (GEE) into centimeters (EOC) and leave the value scale in inches (GEE). This is what DOE/BESE has done with its “transition baseline.”
There is an additional layer to this “transition baseline” issue. I postulate that DOE knows they did not correctly calibrate the scores because they tried to hide the “transitional baseline” in plain sight in the 2012 school performance score spreadsheet on the DOE website. First, the “transitional baseline” is hidden on the second page of the spreadsheet. “Surely,” one could argue, “this is not intentional; after all, it’s so much data. It was better divided into two spreadsheets.” That may stand as a valid argument. However, on that second page, the “transitional baseline” is hidden under a false name: 2011 Baseline School Performance Score. Now, there IS a real column with this name on the first page of the spreadsheet, and it really is the 2011 baseline scores. Who would think to compare the data from the 2011 baseline as listed on the first page to the 2011 “baseline” on the second page? Who would figure out that the mislabeled column is really the “transitional baseline”? Who would realize that the high/combination school scores are inflated from looking at a single, mislabeled column on the second page, if even one thought to check for a second page to the spreadsheet?
Not most people.
One could certainly argue that this false labeling of the “transitional baseline” is intent to deceive.
[The passage above directly confronts Roemer’s lie that he “heard nothing about Bassett’s claims.” Roemer did know because I presented the same charges of data manipulation.]
The score inflation is most obvious when one compares the 2011 baseline to the transition baseline. That is where I noticed it first, in the clearly labeled spreadsheet sent to BESE prior to 2012 school performance score release.
Once again, I write and ask DOE/BESE to face the issue of bias in the 2012 school performance scores. Mr. White, please don’t send me any more lame attempts via messenger to justify your position. I will continue to refute them and go public with my responses.
Mercedes K. Schneider, Ph.D.
Applied Statistics and Research Methods
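The calibration point in the letter above can be illustrated with a short sketch. The heights, thresholds, and labels below come straight from the letter's inches-versus-centimeters analogy; the function name is mine, added only for illustration:

```python
# Sketch of the calibration problem: a value scale defined in one unit
# (inches, standing in for GEE) cannot be applied to a measurement taken
# in another unit (centimeters, standing in for EOC) without conversion.

def judge(value):
    """Apply the letter's value scale, defined in inches: 60-80 'good', 80+ 'excellent'."""
    if value >= 80:
        return "excellent"
    if value >= 60:
        return "good"
    return "below good"

height_in = 64                    # height measured in inches (old metric)
height_cm = height_in * 2.54      # the SAME height in centimeters (new metric), approx. 163

print(judge(height_in))           # "good" -- scale and measurement agree in units
print(judge(height_cm))           # "excellent" -- inflation: the units changed, the scale did not

# Proper calibration: convert the new metric back before judging.
print(judge(height_cm / 2.54))    # "good" again -- no phantom growth
```

Nothing grew between the second and third lines of output; only the unit changed, which is the letter's point about applying a GEE-based scale to unconverted EOC scores.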
And finally, my first letter to White and BESE, the letter to which White had Baird respond in the email reproduced above:
November 21, 2012
Mr. White and BESE Board Members:
I have been reading Bulletin 111, Louisiana School, District, and State Accountability System.
It is quite the Frankenstein of measurement—so many changes in instrumentation, weights, and grading scales—all impressively chaotic. In the end, one can use it to produce numbers, as you have done, and attach value labels to those numbers, as you have also done. However, just because you have created a bulletin with the force of law and declared that such is useful in determining school/district performance and growth does not mean that the resulting figures accomplish the purported goal.
The 2012 school performance scores do reveal that you have consistently captured a measurement concept: scoring bias. Using two of the DOE data spreadsheets on the 2012 school performance data, I will demonstrate in three calculations how scoring is biased toward high/combination schools and against elementary/middle schools.
For the first calculation, I used the 2012 school performance score data available to the public on the LDOE website. I examined column K, Point Change 2011 to 2012 (this is the point change in baseline scores). Twelve elem/middle and 13 high/combination schools (approx. 2% of all schools) had no data available in column K.
I counted the number of elem/middle schools vs. the number of high/combination schools that showed a point increase of 10+. Twenty-two elem/middle schools showed such an increase; 168 high/combination schools showed the same increase. However, across the state in general, the elem/middle schools outnumber the high/combination schools approx 3 to 1. Therefore, if 168 high/combination schools showed a 10+ point increase, then in the absence of scoring bias, one would expect approx. 168 x 3 = 504 elem/middle schools to yield the same increase. What is evident in the actual 22 vs. the expected 504 is incredible scoring bias toward high schools.
When the criterion is narrowed to the number of schools increasing the SPS 15+ points from 2011 to 2012, only 2 elem/middle schools meet the criterion, and 86 high/combination schools do. If there were no bias, approximately 86 x 3 = 258 elem/middle schools would have met the 15+ criterion. Again, blatant bias.
Increasing the criterion to 20+, the same 2 elem/middle schools meet the gain; 42 high schools do. Without bias, approx. 42 x 3 = 126 elem/middle schools should have also met the criterion. Finally, schools with an increase of 25+ points in SPS from 2011 to 2012: one elem/middle school; 13 high/combination schools. In the absence of bias, 13 x 3 = 39 elem/middle schools should have met the 25+ criterion.
In the presence of scoring bias, you are not measuring performance or growth. You are measuring the flawed Bulletin 111.
For the second calculation, I also used the 2012 school performance score data available to the public on the LDOE website. Once again I examined column K, Point Change 2011 to 2012 (this is the point change in baseline scores) in order to calculate the number of elem/middle vs. high/combination schools with a growth score of lower than –2.5. In Bulletin 111, page 13, a “school in decline” is defined as one “having a declining SPS (more than –2.5 points).” If there were no bias against elem/middle schools, then one could expect approx. three times as many elem/middle schools as high/combination schools fitting the criteria set for “a school in decline” since the number of elem/middle schools exceeds the number of high/combination schools by a ratio of 3 to 1. Instead, the ratio of elem/middle to high/combination schools with scores decreasing by more than 2.5 SPS points is more than 5.5 to 1 (78 elem/middle schools to 14 high/combination schools). Thus, again, here is evidence that the 2012 SPS favor high/combination schools at the expense of elem/middle schools.
For the third calculation, I used the data sent to BESE members prior to the publicizing of the 2012 school performance scores, the data set including the column, Transitional Baseline Scores 2010-11. The idea of “transitioning” the score column directly related to the school letter grade scale is a psychometric blunder. The proper way to adjust tests in differing metrics is to calibrate them one to another, with newly-introduced tests calibrated to previously-established ones; if such calibration cannot yield unbiased results, then the proper course of action is to establish separate grading scales, one for elem/middle schools and one for high/combination schools, testing them prior to use to verify that they yield unbiased results. One cannot “transition” high school scores and not do the same for elem/middle scores without introducing score inflation or deflation. Even though columns F, G, and H are “transitioned” to match one another, column I, Growth Target, is not (the values in column I continue to range from 2 to 10, without adjustment, just as before any “transitioning”), and the resulting bias is evident in column J, Meet Growth Target.
I used columns I and J in my calculations for this bias check. Specifically, I focused on the schools with growth targets of 7 to 10 for 2011-12. According to page 15 of Bulletin 111, such schools have “entered into Academic Assistance.”
In this BESE-issued data set, the total number of schools with growth targets ranging from 7 to 10 was 725: 574 elem/middle schools and 151 high/combination schools, for a ratio of 3.8 to 1. If there were no bias toward the elem/middle vs. high/combination schools, then the ratio of approx. 3.8 to 1 would be reflected in the proportion of elem/middle to high/combination schools that met the respective growth goals. Instead, the ratio of elem/middle to high/combination schools having met the growth target was 1.3 to 1 (actual numbers: 105 elem/middle schools vs. 79 high/combination schools). Over 50% of the high schools met the growth goal (79/151 = .52); however, fewer than 20% of the elementary schools met the growth goal (105/574 = .18). Another way to view this situation is that if there were no bias and 79 high/combination schools met the growth goal, then one would see approx. 79 x 3.8 = 300 elem/middle schools also meeting the growth goal.
The 2012 school performance score calculations as set forth in Bulletin 111 are biased against elem/middle schools and biased toward high/combination schools.
Furthermore, in general, the 2012 bias favors school districts with more high/combination schools than those with fewer high/combination schools.
Given that the tenets of Bulletin 111 have not been examined and tested for the impacts upon future score calculations, there is no solid evidence to support the shaky idea that the 2012 school performance scoring bias is a “one-time occurrence.”
Based upon the above examination of the 2012 SPS data, it is not logical to say that the elem/middle schools just didn’t perform as well as the high/combination schools, that the teachers at the elem/middle schools didn’t “try as hard” or “achieve as much” as did the high/combination schools. And it is not decent, ethical, and likely not legal to say, “Oh, well. We’ll fix it next year,” or “We had to calculate scores this way because it’s in the bulletin.” There are jobs, reputations and money tied to these faulty numbers. As the BESE and LDOE, yours is the power to not only correct the gross bias in the 2012 school performance scores, but also to suspend, investigate and reconfigure Bulletin 111 in anticipation of future scoring bias.
Mercedes K. Schneider, Ph.D.
Applied Statistics and Research Methods
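The expected-count arithmetic in the first letter above can be sketched in a few lines. The counts and the approximate 3-to-1 statewide ratio are the letter's own figures; the variable names are mine:

```python
# Bias check from the first calculation in the letter: with roughly 3
# elem/middle schools for every high/combination school statewide, the
# number of elem/middle schools clearing a gain threshold should be
# about three times the number of high/combination schools clearing it.

RATIO = 3  # approximate statewide elem/middle : high/combination ratio

# threshold -> (observed elem/middle count, observed high/combination count)
gains = {
    "10+ points": (22, 168),
    "15+ points": (2, 86),
    "20+ points": (2, 42),
    "25+ points": (1, 13),
}

for threshold, (elem_observed, high_observed) in gains.items():
    elem_expected = high_observed * RATIO  # expected count absent bias
    print(f"{threshold}: expected ~{elem_expected} elem/middle, observed {elem_observed}")
```

At every threshold the observed elem/middle count falls far below the expectation implied by the statewide ratio, which is the letter's evidence of scoring bias toward high/combination schools.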
White, Roemer, and the rest of BESE are without excuse concerning issues of manipulation and slant in the 2012 school performance scores. Furthermore, they are without excuse regarding knowledge of the arbitrary nature of scoring schools. Indeed, LDOE and BESE are the parties who created the rules and who can amend them at will. Perhaps they find it convenient to pretend that Jindal simply wants it this way.
Yet not all of BESE is in Jindal’s pocket.
BESE member Lottie Beebe has also challenged LDOE/BESE in this request to White to provide documentation related to Bassett’s allegations of data tampering/hiding. In her email, Beebe asks for answers regarding LDOE delinquency in responding to public information requests. She also copies the email to the media:
From: “Lottie P. Beebe” <firstname.lastname@example.org>
Date: May 2, 2013, 10:40:37 AM CDT
To: “email@example.com” <firstname.lastname@example.org>, “email@example.com” <firstname.lastname@example.org>, “email@example.com” <firstname.lastname@example.org>, “email@example.com” <firstname.lastname@example.org>, “email@example.com” <firstname.lastname@example.org>, “email@example.com” <firstname.lastname@example.org>, “email@example.com” <firstname.lastname@example.org>, “email@example.com” <firstname.lastname@example.org>, “email@example.com” <firstname.lastname@example.org>, “email@example.com” <firstname.lastname@example.org>, John White <John.White@la.gov>
Cc: “email@example.com” <firstname.lastname@example.org>, “email@example.com” <firstname.lastname@example.org>, “email@example.com” <firstname.lastname@example.org>
Subject: Allegations raised by Herb Basset’s Report on School Performance; Lay-Offs (State Department)
Superintendent White,

I am seeking school performance data and (other relevant data) and a detailed report or response that will confirm or deny the allegations raised by Herb Basset, a math teacher, during the Senate Education Committee on May 1, 2013. I understand lawsuits have been filed by groups and/or individuals for the LDOE’s failure (Superintendent John White) to address public information requests within the prescribed 72 hours. I am asking that these requests be addressed immediately. I want an explanation as to why these requests have not yet been addressed.

Additionally, with the recent news that approximately 40-60 employees have been laid off at the LDOE, I am requesting the names, addresses, salaries, full or part time employment status, and the years of experience of each. Thank you in advance for your prompt assistance with my requests.

Lottie P. Beebe, Director of Human Resources
111 Courville Street
P. O. Box 100
Breaux Bridge, Louisiana 70517
Phone: 337.332.2105; ext. 3012
Thank you, Dr. Beebe.
Chas Roemer is without excuse. Either he 1) knows nothing because he is so intellectually limited that he cannot follow BESE’s affairs closely enough, or he 2) really does know yet feigns ignorance in order to simultaneously promote his well-connected, politically-ambitious public persona while excusing himself from answering for the responsibilities of his publicly-elected post.
Even though I have both heard and read of Roemer’s tendency toward vacuous public commentary, I choose Door Two.