VAM-promoting Doug Harris’ Vision for La.’s School Performance Scores
Economics professor Douglas Harris is, mysteriously, an “Endowed Chair of Public Education” at Tulane University in Louisiana. He has written a book on value-added modeling (VAM), which he believes works in public education. He believes that “a year’s worth” of student learning can be predicted using linear, statistical algorithms and that such a prediction can be certainly and specifically connected to the actions of a single teacher. If a student scores at or above the statistically predicted value, then (let’s go feminine with pronouns here for ease of writing) that teacher is to be rewarded by keeping her job, because she and she alone must be the reason the student’s score rose to the statistically-predicted level. All other variables have been statistically controlled for, and if some variable is too messy to measure and statistically control– like student free will, for example– that variable is simply excluded. If it cannot be measured, it does not fit into VAM, and if it doesn’t fit into VAM, then it does not exist.
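To make the logic concrete, here is a minimal sketch of the VAM idea– entirely synthetic numbers of my own invention, not Harris’ actual model or any real Louisiana data. A pooled regression predicts current scores from prior scores, and each teacher’s “value added” is simply her students’ average residual above or below that prediction:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data (my assumption, not any real VAM specification):
# three hypothetical teachers, 50 students each, with "teacher
# effects" baked in so we know the answer in advance.
teacher_effects = {"A": 5.0, "B": 0.0, "C": -5.0}
prior, current, labels = [], [], []
for teacher, effect in teacher_effects.items():
    p = rng.normal(300, 25, size=50)                 # prior-year scores
    c = 0.8 * p + effect + rng.normal(0, 15, size=50)  # current scores
    prior.extend(p)
    current.extend(c)
    labels += [teacher] * 50

prior = np.array(prior)
current = np.array(current)

# Fit the linear prediction: current ≈ a + b * prior.
b, a = np.polyfit(prior, current, 1)
residuals = current - (a + b * prior)

# VAM-style score: each teacher's mean residual, i.e., how far her
# students landed above or below the statistical prediction.
va = {t: residuals[np.array(labels) == t].mean() for t in teacher_effects}
for t, score in va.items():
    print(t, round(score, 2))
```

Everything the model cannot measure– free will included– lives inside the noise term, and the residual gets attributed to the teacher anyway.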
VAM is an appealing weapon for those on the hunt to purge the American public school classroom of the “bad teachers” who are hampering America’s ability to become narrowly defined as “globally competitive” via international, standardized test score results.
Nevertheless, not all who are able to skillfully wield VAM are willing to contribute their talents toward the “bad teacher” hunt.
Here is an excerpt from the American Statistical Association’s (ASA) April 2014 Position Statement on using VAM in education:
Most VAM studies find that teachers account for about 1% to 14% of the variability in test scores, and that the majority of opportunities for quality improvement are found in the system-level conditions. …
Ideally, tests should fully measure student achievement with respect to the curriculum objectives and content standards adopted by the state, in both breadth and depth. In practice, no test meets this stringent standard, and it needs to be recognized that, at best, most VAMs predict only performance on the test and not necessarily long-range learning outcomes. Other student outcomes are predicted only to the extent that they are correlated with test scores. A teacher’s efforts to encourage students’ creativity or
help colleagues improve their instruction, for example, are not explicitly recognized in VAMs. …
Attaching too much importance to a single item of quantitative information is counterproductive—in fact, it can be detrimental to the goal of improving quality. In particular, making changes in response to aspects of quantitative information that are actually random variation can increase the overall variability of the system. …
A decision to use VAMs for teacher evaluations might change the way the tests are viewed and lead to changes in the school environment. For example, more classroom time might be spent on test preparation and on specific content from the test at the exclusion of content that may lead to better long-term learning gains or motivation for students. Certain schools may be hard to staff if there is a perception that it is harder for teachers to achieve good VAM scores when working in them. Overreliance on VAM scores may foster a competitive environment, discouraging collaboration and efforts to improve the educational system as a whole. …
The majority of the variation in test scores is attributable to factors outside of the teacher’s control such as student and family background, poverty, curriculum, and unmeasured influences. …
The VAM scores themselves have large standard errors, even when calculated using several years of data. These large standard errors make rankings unstable, even under the best scenarios for modeling.
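The ASA’s point about standard errors can be simulated. In the hedged sketch below (purely synthetic numbers, not drawn from any real VAM), twenty teachers with identical true effects get re-ranked almost at random from year to year once ordinary estimation noise is added:

```python
import numpy as np

rng = np.random.default_rng(42)

n_teachers = 20
true_effect = np.zeros(n_teachers)  # every teacher identical by construction
noise_sd = 2.0                      # stand-in for the VAM standard error

# Two "years" of noisy VAM estimates for the same teachers.
year1 = true_effect + rng.normal(0, noise_sd, n_teachers)
year2 = true_effect + rng.normal(0, noise_sd, n_teachers)

# Convert estimates to rankings (0 = highest-ranked).
rank1 = np.argsort(np.argsort(-year1))
rank2 = np.argsort(np.argsort(-year2))

# Correlation between the two years' rankings: when teachers do not
# actually differ, the rankings are pure noise and barely agree.
corr = np.corrcoef(rank1, rank2)[0, 1]
print(round(corr, 2))
```

A teacher ranked near the top one year can land near the bottom the next without anything about her teaching changing– exactly the instability the ASA warns about.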
With these ASA cautions regarding VAM in mind, let us turn our attention once more to Harris. In January 2015, he released this report to the Louisiana Board of Elementary and Secondary Education (BESE) and a legislatively-established accountability commission.
In his report, Harris offers a number of suggestions that show just how out of touch he is with public school reality– and how much he “values” education through the limited visibility of the economist.
First, Harris suggests schools be “VAMed,” not just teachers:
I recommend adding average teacher value-added to the SPS and assigning 50% of the 150 SPS points to this measure. This would not only do a better job of measuring the performance of the schools, but improve alignment between teacher, school, and district accountability. It would also send a clear message to school and district leaders that they should focus on hiring, retaining, and developing the best teachers.
ASA cautions using VAM? What’s that?
No, no. Let’s up the VAM by making VAM half of a school’s performance score. And it certainly does send a “clear message”: Harris thinks that the “best” teachers are those who fulfill their VAM-determined, standardized-test-score-delineated destinies.
Several paragraphs later, Harris suggests that instead of 50 percent “growth” (i.e., VAM), why not just make SPS 100 percent “growth”?
I can tell you what would increase for sure under Harris’ VAM-love suggestion for SPS: the ways administrators would figure out to game the system– and that includes John White.
Indeed, Harris calls out White on his current, slanted awarding of points to schools for students actually not progressing:
This approach also rewards growth for all students, not, as with the current progress points, just those who are not proficient. The system could be set up to reward schools somewhat more for generating growth among low-performing students, but the present goes much too far, not rewarding growth for proficient students at all.
White’s awarding “bonus points” to schools for students who were not proficient on standardized tests actually helped the Recovery School District (RSD) and made “D and F schools… the best place for nonproficient students,” as my colleague Herb Bassett wrote in this post.
Whereas Harris does call White out for this “bonus points for non-proficiency” farce, keep in mind that Harris’ perspective on “growth” is narrowly defined as increased test scores as governed by VAM.
“Growth” is the test score. The test score is “growth.” Nothing else in the VAM mind constitutes “growth.”
I think Harris assumes that all teachers can be (and should be) VAMed. All teachers are not VAMed. I am not. Instead, I have “student learning targets” (SLTs), whereby I am told a score that my students “will” reach on the standardized test (in my case, this year, it’s the ACT’s test for tenth graders, the PLAN), and I am told what percentage of students will reach this score. However, there is a VAM-like component built in this year: If a student scored higher than the cut score prior to my teaching him/her, and the score appears to go “backward” (I am not sure how the connection between the ninth-grade test, EXPLORE, and the PLAN has been established), then I will be penalized.
So, my arrangement this year is VAM-like, but the arrangements for teachers of other subjects may not be– or may be so ridiculously misaligned– like a business teacher’s evaluation being based on her students’ scores in English. Gotta find a test for all teachers, even if it makes no sense.
Is this “growth”?
I had a 10-pound tumor removed from my abdomen in 2013. Not all “growth” is good.
Harris continues by suggesting that the school letter grading be expanded to include more categories, maybe by adding “plus” and “minus” to the letter grades. He thinks doing so will give schools “incentive to improve.”
What those expanded categories will also show is decline. I am not sure how well White would like it if he had to try to hide publicly-nuanced RSD school grade drops– C to C-minus, or D to D-minus. Also, a “minus” never “looks” like improvement. So, I think this would be a mixed bag for White. He would like to show how traditional public schools are “minus” schools, but he would not like to show this (especially nationally) for those RSD “miracle” charters.
The next Harris suggestion is my favorite for revealing Harris’ economic mindset as he advises on how to grade schools:
Add students’ college entry and first-year college persistence into the high school and district SPS calculations.
Yep. The endgame is to get students into college. That’s it. And Harris even suggests how to “value” this college entrance that I should push my students to have no matter their own preferences for their own lives:
I recommend a focus only on college entry and first-year (fall-to-spring) persistence because these are most under the control of high schools. …
As a general rule, more points should be given to credentials and other intermediate outcomes that are most closely-associated with the long-term outcomes. The evidence therefore implies four-year institutions should get somewhat more points than two-year institutions and persistence in the same institution should get more points than transfers of any sort.
He bases the above “on economic research” (of course, of course).
When I think of “college entry” being under the “control” of a high school, my mind immediately goes to Texas charter chain, YES Prep, which gamed the “100 percent college acceptance” system by informing parents and students in its handbook that if a student did not get accepted into a college in the senior year, that student would not be awarded a diploma from YES Prep.
I wonder how long it will be before Harris conducts research on the connection between high-stakes (cough, cough) “accountability” and the proliferation of creative system-gaming among the “accountable.”
Economically-minded Harris suggests assigning point values to certain college acceptance and first-year completion.
This is soo incredibly narrow.
My sister delivered pizzas for years before joining the Air Force. She then attended college and became an electrical engineer. She did not go straight to college. No points for our high school.
My brother graduated at nineteen. He did not like school, and it was a bit of a fight to get him to complete it. He is a commercial fisherman who builds his own boats. This is what he enjoys. No points for his high school.
One of my former students came to visit me last year. He was admitted into a program to train him to be an underwater welder. No points for our high school.
A locksmith recently told me how difficult it is becoming to find young people interested in becoming locksmiths. He said there is a real need for locksmiths and that it is a dying profession. The way to learn is to apprentice with a licensed locksmith. As it happens, one young man had come to him recently in order to take him up on the offer to be apprenticed to become a locksmith. No points for that young man’s high school.
No points, no points.
Harris continues by writing that he does not discount short-term job success or long-term career success. He just thinks “we [should] hold people accountable for what they control.”
YES Prep is “in control”….
Harris has other suggestions, and I will leave eager readers to read them. At the conclusion, Harris notes that his goal was to “make the state a national leader.”
Harris neglects to clarify a leader in exactly what, but I’m sure it will be much too “economical” for me to “value.”
Point systems for “grading” the teacher-student (and school-teacher-student) dynamic will always fall short because the complex nature of that dynamic defies quantifying. If test-loving reformers insist upon imposing high-stakes quantification onto schools and teachers, it will backfire, a system begging to be corrupted by those fighting to survive it.
It is not that I cannot be evaluated as a teacher. It’s just that such evaluation is rooted in a complex subjectivity that is best understood by those who are familiar with my reality. This should be true of the administrators at one’s school, and I am fortunate to state that it is true in my case.
There are no numbers that sufficiently capture my work with my students. I know this. Yes, I am caught in a system that wants to impose numeric values on my teaching. My “value” to my students cannot be quantified, nor can my school’s value to my students, no matter what the Harrises of this world might suggest in commissioned reports.