The 2013 Louisiana legislative session ended this evening (Thursday, June 6th). It was not the surprise attack of 2012, a time when both House and Senate majorities seemed bent upon unquestioningly bowing to Jindal in his efforts to blindside traditional public education. To my knowledge, there was no locking teachers out of the Capitol this year or demanding that teachers who wished to testify openly declare that they were choosing to be absent from the classroom, an effort to humiliate those wishing to participate in the democratic process.
It was a better session, one in which many legislators asked critical questions before voting. The House Education Committee, and the House as a whole, are tired of reformer nonsense. The Senate Education Committee still has a majority kissing the Jindal hindside. The full Senate, less so, though I don’t think it was tired of Jindal to the degree that the House was.
This is progress, folks.
Let me begin with COMPASS. HB 160, an effort to delay the application of COMPASS for one year, made it through the House Ed Committee and the House but died by a 4 – 3 vote in the Senate Ed Committee. HB 160 was then amended and attached to HB 129, but HB 129 was not addressed. So, for now, we still have COMPASS to deal with.
ACT 1 was declared unconstitutional in the lower courts for violating a “single object” requirement of the state constitution. Jindal et al. appealed to the La. Supreme Court, which means that COMPASS is still in effect. (I know, it doesn’t seem fair, but that is how it works– the appeal kept COMPASS in force.) Meanwhile, Act 2 (vouchers) was declared unconstitutional, not for “single object” violation but for use of MFP for expenses not tied to public schools. So, the La. Supreme Court decided on Friday, May 30th, to send Act 1 back to the lower court for reconsideration of the “single object” unconstitutionality finding given that the La. Supreme Court did not declare Act 2 unconstitutional for that reason. (Difficult to follow, I know. Just reread the paragraph a couple of times.)
Here is an official statement from LFT:
On Friday, the Louisiana Supreme Court handed down a ruling vacating a lower court decision that Act 1 of 2012 violated the single object provision of the Louisiana Constitution, and remanding the case to the 19th Judicial District Court for a rehearing.
The high court noted that when Judge Michael Caldwell made his ruling last March, he did not have the benefit of the Supreme Court’s opinion in the Act 2 of 2012 lawsuit. That opinion, while upholding a lower court ruling that vouchers cannot be funded through public education’s Minimum Foundation Program, said that Act 2 did not violate the single object provision.
“Because our (Act 2) opinion clarifies the law in this area,” the court wrote, “we conclude it would be beneficial to remand the case to the district court for reconsideration of its ruling in light of our opinion, after appropriate briefing and argument by the parties.”
After conferring with counsel, Louisiana Federation of Teachers President Steve Monaghan said that he believes Judge Caldwell made the correct ruling, and that the LFT will be prepared to argue the case again when it is scheduled.
“This ruling will allow Judge Caldwell to fully consider all of the information that is available and issue an accurate opinion,” Monaghan said.
The ruling emphasizes the importance of legislative action to quell the confusion that Act 1 has caused for professional educators, Monaghan said.
“Right now there is a bill pending that would slow down implementation of Act 1 until teachers have a clear idea of what they are supposed to do,” Monaghan said. “The rules must be clear and training must be complete before the consequences of this act are imposed.”
Monaghan said that House Bill 160, which was unanimously approved by the House of Representatives, was blocked by four members of the Senate Education Committee. The bill would suspend consequences of Act 1 for a year. It has been amended onto another bill and still has a chance to be heard before the legislature adjourns on June 6.
So, COMPASS is still in force. But chin up, colleagues: The legislature is waking up to the problems presented by uninvestigated-yet-zealous reformer ideas.
The reason that HB 129 (amended to include the COMPASS deferment) was “quietly shelved” likely has to do with the education budget negotiations, not the least of which was the restoration of the 2.75% MFP increase for districts not only in 2013-14 but guaranteed for 2014-15. Since the proposed 2013 education budget included MFP for vouchers (now unconstitutional), the budget reverted to the 2011 budget (you read it right– 2011– story below), which included the 2.75% MFP increase. Half of the MFP money is to go to teacher raises without regard to COMPASS ratings.
The budget negotiations included the apparent tradeoff of including the 2.75% MFP increase with half earmarked for teacher raises in exchange for funding for 8500 vouchers in 2013-14 from the general fund.
This is the first across-the-board state raise in five years.
The MFP increase resulted from some legislators taking issue with a budget that would fund vouchers– sending public money to private schools– while ignoring funding increases to public schools.
As a result of 2012-13 education budget foolishness (pushing the education budget through without the required majority of 53 votes in the House), the La. Supreme Court Ruling on Act 2 declared the 2012-13 MFP formula was not legally approved. To remedy this in the future, HCR 14 was passed:
[HCR 14] provides that legislative approval of the minimum foundation program (MFP) formula adopted by BESE shall be by means of passage of concurrent resolution adopted upon a favorable vote of at least a majority of the elected members of each house.
Specifies that the concurrent resolution contain the verbatim MFP formula adopted by BESE and requires such resolution to be introduced by no later than 6 p.m. of the 23rd calendar day of regular session in an even-numbered year and no later than 6 p.m. of the 10th calendar day of a regular session in an odd-numbered year. [Emphasis added.]
House Speaker and Avid Jindal Admirer Chuck Kleckley is responsible for the failure of the 2012 MFP to be legally adopted. The La. Supreme Court agreed with LFT legal counsel, who maintained
The provision in the Louisiana Constitution that states that matters “intended to have the force and effect of law” must be considered in the legislature prior to a fixed deadline. We contend that because the legislature missed the deadline, the law has no force and effect.
The provision in the Constitution that states that matters “intended to have the effect of law” must receive a majority vote of the elected members of the House (which would be 53 votes). The MFP Resolution did not receive 53 votes. Thus, it never passed.
With HCR 14, there will be no more shady voting and saying that majority vote doesn’t apply. No more passing the wrong Word file to the legislature. No more complicating the process by leaving the legislature to fall back on a two-year-old budget (2011-12) due to an illegally adopted MFP from a previous year (2012-13) because the current MFP (2013-14) also includes unconstitutional elements.
Such a waste of resources cleaning up the mess of arrogant, overpaid incompetence.
The House is more concerned about White’s/BESE’s capriciousness and incompetence than is the Senate Education Committee. HB 466, which would have required White/BESE to bring school performance score changes before the legislature for approval PLUS prevented White’s goofy idea of two school performance scores for next year PLUS prevented yet another exam change (to the ACT), was overwhelmingly supported in the House but nixed by the Senate Education Committee. White has finally admitted inflation in school performance scores but immediately excuses himself by saying he has been trying to correct the problem. I know better from my own interactions with him.
So, on the downside: John White gets to continue to play with school letter grades.
On the upside: He is under increased pressure due to the publicity about letter grade inflation. And legislators are watching him with sharper eyes in 2013 than they were in 2012.
A beautiful gain this session involves the crippling of Course Choice. As part of the Act 2 La. Supreme Court ruling, Course Choice cannot be funded using MFP. To keep their pet program alive, Jindal and White have chosen to limit Course Choice to a pilot program and to cap the number of students per provider according to whatever funding LDOE can scrape together. Jindal and White did not try to include Course Choice requests in the budget submitted to the legislature; Jindal broached only voucher funding as a general budget request and left Course Choice out of the mix.
Ever the optimist, White tried hard to frame this undesired forced hand into a positive; as nola.com reports:
White said he wasn’t happy about downsizing the program but found it beneficial to have a year “getting our toe in the water.”
In order to conduct a pilot, White must first have illegal, full funding yanked out of reach. That’s right up there with, “I’m only sorry because I was caught.”
Of course, both Jindal and White expected to draw unlimited funds for both vouchers and Course Choice from the MFP. Sadly (for them, not for public education), they are not allowed to do so.
So, having to secure funding outside of MFP has left Course Choice substantially less able to perpetrate unregulated fraud on community public schools.
RSD and “Opting Out”
Another promising bill that passed is HB 115, the “reverse parent trigger,” which allows parents of RSD school students to petition to have those schools returned to their respective districts if such schools earned a D or F from the state for five consecutive years. The powerful tacit admission in passing HB 115 is that RSD is not the Miracle it pretends to be.
Go ahead and reread that last sentence.
I wonder how John Merrow will digest this news. He is promoting RSD as a “rebirth.” Here is his own comment:
December 21, 2012
Are Charters the Reason New Orleans’ Schools Are Succeeding?
John Merrow, GOOD
“This is a documentary for anyone interested in children or our nation’s future, because other districts could emulate New Orleans, not simply by adopting charter schools but by committing to a set of familiar virtues: high standards, integrity, hard work, time, resources and more.”
I have emailed him my posts on RSD and even used my work as an annotation to his “Rebirth” promotion.
Perhaps he could call the reverse parent trigger “afterbirth.”
My last bit of news concerns SB 130. This bill died. And it needed to. The official title of the bill was Creates the Early Childhood Care and Education Network and the Tiered Kindergarten Readiness Improvement System. The shorthand name for SB 130 was “Common Core for Babies.” Not kidding. Here is an excerpt from the bill:
Not later than the beginning of the 2015-2016 school year, the state board shall establish and implement the Tiered Kindergarten Readiness Improvement System to establish common standards of kindergarten readiness, assess the quality of early child care and education programs serving children from birth to age five, provide information to parents and the public regarding the quality of early child care and education programs, services, and classes, and provide resources to support needed improvements in the provision of early child care and education programs. [Emphasis added.]
I hope this bill was nixed because enough legislators realized that children require play for healthy development.
My goal here was to update my readers regarding pertinent decisions associated with Louisiana’s newly-ended legislative session. I find the results promising. And as my friend Vicky has noted on FB, the work is not done. We need to continue to contact legislators as they watch the results of their 2013-14 decisions come to pass. We need to help them become increasingly wise to the foolishness that would call itself “reform.” And we need to thank those who stood up for traditional public education.
May their numbers increase daily.
Randi Weingarten really wants to promote the illusion that teachers have bought into the Common Core State Standards (CCSS). She hired Hart Research Associates to conduct a survey of AFT membership regarding perceptions about CCSS, and Hart did so March 27 – 30, 2013. Weingarten has used this survey as a platform to proclaim that “75% of AFT teachers surveyed support the Common Core.” But it is a lie, and I can prove it by comparing a single Power Point slide, composed by Hart for AFT and promoted by AFT, to the actual survey question.
Proper Reporting of Survey Results
The National Council on Public Polls (NCPP) was established in 1969 to “set the highest professional standards for public opinion pollsters.” Its membership includes 27 polling organizations, including ABC News, Annenberg Public Policy Center, CBS News, Gallup, Los Angeles Times, and NBC News.
Hart Research Associates is not a member. At least not currently.
NCPP members agree to adhere to certain Principles of Disclosure, which are designated as three levels, the first being what survey reports must include:
Level 1 Disclosure: All reports of survey findings issued for public release by a member organization will include the following information:
- Sponsorship of the survey
- Fieldwork provider (if applicable)
- Dates of interviewing
- Sampling method employed (for example, random-digit dialed telephone sample, list-based telephone sample, area probability sample, probability mail sample, other probability sample, opt-in internet panel, non-probability convenience sample, use of any oversampling)
- Population that was sampled (for example, general population; registered voters; likely voters; or any specific population group defined by gender, race, age, occupation or any other characteristic)
- Size of the sample that serves as the primary basis of the survey report
- Size and description of the subsample, if the survey report relies primarily on less than the total sample
- Margin of sampling error (if a probability sample)
- Survey mode (for example, telephone/interviewer, telephone/automated, mail, internet, fax, e-mail)
- Complete wording and ordering of questions mentioned in or upon which the release is based
- Percentage results of all questions reported
Member organizations reporting results will endeavor to have print and broadcast media include the above items in their news stories.
Member organizations conducting privately commissioned surveys should make clear to their clients that the client has the right to maintain the confidentiality of survey findings. However, in the event the results of a privately commissioned poll are made public by the survey organization the above items should be disclosed.
In the event the results of a privately commissioned poll are made public by the client, the survey organization (a) shall make the information outlined above available to the public upon request and (b) shall have the responsibility to release the information above and other pertinent information necessary to put the client’s release into the proper context if such a release has misrepresented the survey’s findings. [Emphasis added.]
The goal of such stipulations is to ensure unbiased reporting of survey results. I have bolded the sections in which the Hart/AFT Power Point info I will share has violated such disclosure.
Now, Hart is not currently a member of this organization and is therefore not subject to the hearing procedures outlined at the end of the Principles of Disclosure section. Even so, the guidelines set by NCPP remain a standard for assessing unbiased reporting of polling results.
Hart’s non-member status in NCPP does not excuse it from honest, straightforward reporting of survey results. In fact, Peter Hart used to sit on the NCPP board of trustees; his name is on this 1988 letter sent to a non-member polling organization with the following admonition:
As you are not a member of NCPP, obviously you are not bound by our release standards. However, had more of the pertinent information on methodology been released it would have enhanced the credibility of your efforts. … Shortcomings in the release of survey results from any source are of concern to us, as we all suffer when professional standards are breached. [Emphasis added.]
Thus, as a trustee member of NCPP, Peter Hart has been associated with admonishing non-member pollsters of the need for clear, open, above-board release of survey details.
Many of the Principles of Disclosure are common sense. If I conduct a poll and do not clearly and accurately disclose the polling questions and the full results, broken down by the response options offered to participants, I do not have the public’s best interest in mind, for I am choosing to withhold clear information. Furthermore, if I reword questions and reduce reported categories, and if I withhold numbers on graphics supposed to represent the results, then I am manipulating the public via limited release of information.
The NCPP Principles of Disclosure include two additional levels. Level Two involves what survey organizations should provide to the public upon request. It is noted in Level Two that “complete wording of questions” is a Level One requirement and becomes a Level Two requirement only if the survey has been translated into another language.
Hart released the survey instrument not as part of this original Power Point but upon the request of one of my readers: Hart AFT survey
At a later date, the same reader requested complete response counts. Those counts are yet to be received.
Level Three includes criteria that NCPP member organizations are “strongly encouraged” to meet, including public release of raw data sets (minus participant identifying information) and public release to a website of “complete wording, ordering, and percentage results of all publicly released survey questions… for a minimum of two weeks.”
The goal is… dare I write it… transparency.
The Power Point Slide In Question
Actually, a number of slides in the Hart/AFT survey evidence reporting problems. However, Weingarten’s position in support of CCSS is the hinge for the rest of the survey questions presented in this Power Point. In order to argue that teachers want more CCSS assistance, they first must be shown to support CCSS, period. Otherwise, the issue needing addressing is not more CCSS assistance but the lack of teacher investment in CCSS.
The teachers in this AFT poll are not invested in CCSS.
The slide in question is page 3 of the Power Point above. (I tried to copy and paste the slide but was unable to do so.)
The first deceptive issue with this slide is its title: Teachers Overwhelmingly Approve of Common Core State Standards. This leads readers to “see” what they are told to see in the slide– not only do “teachers” (not AFT teachers, but teachers in general) “approve,” but they approve by a landslide. The clearly-intended message: Teachers in general really are for this Common Core. Readers are then told that teachers were asked this question:
Based on what you know about the Common Core State Standards and the expectations they set for children, do you approve or disapprove of your state’s decision to adopt them?
Yet this was not the survey question. The actual survey question was as follows:
Q3 Based on what you know about these standards and the expectations they set for children, do you strongly approve, somewhat approve, somewhat disapprove, or strongly disapprove of your state’s decision to adopt the Common Core State Standards?
Strongly approve ……………………………………………………………………….. 1
Somewhat approve …………………………………………………………………….. 2
Somewhat disapprove ………………………………………………………………… 3
Strongly disapprove ……………………………………………………………………. 4
Not sure ………………………………………………………………………………….. 5
Notice that a primary difference between the Power Point question and the actual survey question involves the gradation of the responses. This is key. In the Power Point, Hart/AFT have collapsed categories in order to present the illusion of “overwhelming approval.” Thus, “strongly approve” and “somewhat approve” have been reduced to “approve,” and “strongly disapprove” and “somewhat disapprove” have been reduced to “disapprove.” “Not sure” is not represented by a bar at all.
“Somewhat approve” is not “overwhelming approval.” It is hesitant approval. It is approval with reservation. But saying that a notable number of teachers “approved with reservation” doesn’t sell the “teachers ‘overwhelmingly’ approve” point-of-view. So let’s manipulate the categories in reporting to support what we want to promote.
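To see how collapsing categories can manufacture an “overwhelming” headline number, consider a quick sketch. Every count below is invented for illustration only– Hart has not released the actual per-category counts:

```python
# Hypothetical response counts for the five survey categories (invented numbers).
responses = {
    "strongly approve": 200,
    "somewhat approve": 400,
    "somewhat disapprove": 100,
    "strongly disapprove": 60,
    "not sure": 40,
}
total = sum(responses.values())  # 800 respondents in this sketch

# Collapsing "strongly" and "somewhat" into a single "approve" bar:
approve = responses["strongly approve"] + responses["somewhat approve"]
print(f"Collapsed 'approve': {approve / total:.0%}")  # 75%
print(f"'Strongly approve' alone: {responses['strongly approve'] / total:.0%}")  # 25%
```

In this sketch, a headline of “75% approve” conceals that only a quarter of respondents approved without reservation.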
Notice also on the page 3 Power Point that only two bars are shown on a three-dimensional drawing of a bar graph. Three-dimensional bar graphs are misleading; they can be manipulated because there is no straight, obvious “zero” line for readers to measure the graph against. And taking what should be a five-bar graph (one for each of the possible responses to the original survey question) and reducing it to two bars gives the illusion that only two options matter. Such presentation can easily disguise the subtleties of the multiple responses.
Why ask people to respond using multiple categories, then collapse categories in reporting? Simple: The results didn’t turn out the way the researchers had hoped and required some doctoring to say “what we need them to.”
Is this too far-fetched?
Back to bar-graph deception:
Hart chooses a two-bar graph and for each bar uses two different colors, supposedly to represent the multiple possible response categories. But again, if each category is not clearly presented as its own two-dimensional bar against a clear zero line, then the reader must work to tease out the subtleties in response numbers/percentages.
Notice that the first bar on page 3 is tall and has “75%” at the top. But it is two-toned. These are the response categories, unlabeled, with actual counts/percentages per category also absent.
This graph is the only one missing any kind of response category labeling. Based upon subsequent graphs in this presentation, I see that the bottom color in each bar is equivalent to the “strongly approve” (in this slide, dark blue) or “strongly disapprove” group (dark red), and the color on top is the “somewhat approve” (light blue) and “somewhat disapprove” (light red) group.
Notice that the largest area in this graph is light blue– “somewhat approve.”
Most of the respondents were not “sold out” on CCSS. They had reservations. They “somewhat approve” of CCSS.
This is not “overwhelming approval” for CCSS.
I am sorry that I am unable to report actual numbers at this time. If Hart sends the requested actual counts to my blog reader who requested them I will post the actual survey questions (all of them) with the actual response counts.
As for what we have before us now, on page 3 of this Hart/AFT Power Point, the title is deceptive, the question is misreported, the graph is confusing, and the percentages reported are misleading.
There is also an “aside” on this slide: “79% are very/fairly familiar with CCSS.” Again with the limited reporting and collapsing of categories.
Notice how much more detail is devoted to the reporting on subsequent slides. They are still weak for their two-bar graphs, but there is less effort to conceal information. The sale has been made: “75% of AFT teachers surveyed ‘overwhelmingly approve’ of CCSS.” These are now the “bring it home” slides to promote a moratorium on an issue that AFT teachers aren’t sold on but that Hart and Weingarten have shaped reporting to “prove” that they are.
Other Survey Questions
Once I receive the actual results for the survey, I will know how teachers in this survey responded to these two questions:
Q12a Have you read the Common Core standards for mathematics and English Language Arts, or have you not had an opportunity to read the standards?
Have read the standards …………………………………………………………….. 1
Have not had an opportunity to read the standards ………………………… 2
Not sure ………………………………………………………………………………….. 3
Q12b Have you received a hard copy of the Common Core standards, or have you not received a hard copy?
Have received a hard copy ………………………………………………………….. 1
Have not received a hard copy ……………………………………………………. 2
Not sure ………………………………………………………………………………….. 3
It seems that these would be questions asked at the outset of the survey. It does make sense that one would want to check to see if those who say they are or are not familiar with these standards have seen or read them. Instead, Hart/AFT ask and report on this perception question:
Q2 How familiar are you with the Common Core State Standards–very familiar, fairly familiar, just somewhat familiar, or not familiar?
Very familiar ………………………………………………………………………………. 1
Fairly familiar …………………………………………………………………………….. 2
Just somewhat familiar ……………………………………………………………….. 3
Not familiar ……………………………………………………………………………….. 4
Not sure ………………………………………………………………………………….. 5
And even Q3, the focus of the infamous page 3 of the Power Point, is based upon perception (Based on what you know about these standards…).
But that is for another post.
Hart is a major research company. Weingarten leads a major teachers union.
To them I write:
You have deceived the public with your information manipulation.
You have done what NCPP tries to guard against. You have contributed tarnish to the image of the polling and survey research fields.
Rebut this article, folks. I can’t wait to read it.
In March/April 2013, Hart Research Associates conducted a poll of American Federation of Teachers (AFT) members regarding opinions about the Common Core State Standards (CCSS), which have been declared “the solution” and “what kids need to learn.” A finding publicized by AFT President Randi Weingarten is that “75% of AFT teachers polled support CCSS.” I took issue with both the finding itself and the manner in which the finding was reported. One of my criticisms regards the small sample size: only 800 AFT members were polled.
Sample size aside, if Hart’s and AFT’s goal was truly to discern whether or not teachers support CCSS, their sampling misses the mark. I would like to detail my position in this post.
Opinion polling is a tricky business. Care needs to be taken in obtaining a representative sample. Gallup is an established name in opinion polls, and one of the standards used by Gallup is the stratified random sample. A stratum is a subgroup; the random sampling happens within the defined subgroup.
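As a minimal sketch of what stratified random sampling looks like in practice– with states as the strata, and with a hypothetical roster whose names and counts are all invented for illustration:

```python
import random

# Hypothetical AFT teacher roster grouped by state (the stratum).
roster = {
    "NY": [f"NY-teacher-{i}" for i in range(500)],
    "LA": [f"LA-teacher-{i}" for i in range(200)],
    "IL": [f"IL-teacher-{i}" for i in range(300)],
}

per_state = 25  # equal allocation: randomly sample within each stratum
sample = {state: random.sample(teachers, per_state)
          for state, teachers in roster.items()}
print({state: len(picked) for state, picked in sample.items()})
```

The point of stratifying is that every state is guaranteed representation in the sample, rather than letting one large state dominate by chance or by membership size.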
CCSS is a nationally-promoted education agenda. However, it is adopted on the state level. There is no uniform, national protocol for adopting CCSS– a critical issue in the years of transition as states decide how to approach ultimate implementation. Thus, using the state as the unit of adoption of CCSS, some number of the 45 states (and DC) adopting CCSS should have comprised the strata used in the AFT poll. As it stands, Hart Research Associates used general random sampling from AFT membership then ruled out members who did not identify themselves as “teachers.” (This information I gleaned from reading the actual survey instrument: Hart AFT survey.) In their rebuttal to my post, Hart admits that 36% of the sample respondents were from a single state: New York. That’s 288 out of 800 individuals surveyed.
Now here is another tricky part. It is possible that 36% of all AFT members live in New York. (There is no public information available on the web to verify or refute this, so I must take Hart’s word for it.) So, Weingarten’s statement that the survey result “represents” AFT members might be correct. Yet when the public hears Weingarten state that “75% of AFT teachers surveyed support CCSS,” they do not get to hear, “over one third of survey respondents live in New York State.” This survey result is biased towards the opinions of New York teachers. In their rebuttal, Hart Research Associates comments that the New York teachers did not favor CCSS as much as other teachers did; Hart reports that 82% of other teachers favored CCSS. Given an overall sample of 800 in which 36% hails from NY, that means that 63% of New York teachers favored CCSS. The public does not get to hear that in New York, CCSS has been implemented sooner than required, in 2013. (This information I have from reading the actual Hart survey instrument.) Thus, New York teachers are likely more familiar with CCSS than are teachers in many other states, more familiar with exactly what Common Core is, and also less likely to favor CCSS.
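The implied New York figure can be checked by back-calculating from Hart’s own reported numbers (overall approval, non-NY approval, and New York’s share of the sample):

```python
# Back-calculating the New York approval rate from Hart's reported figures.
ny_share = 0.36   # NY share of the 800-person sample (288 respondents)
overall = 0.75    # overall approval reported by Hart/AFT
non_ny = 0.82     # approval among non-NY respondents, per Hart's rebuttal

# overall = ny_share * ny_rate + (1 - ny_share) * non_ny
ny_rate = (overall - (1 - ny_share) * non_ny) / ny_share
print(f"Implied NY approval rate: {ny_rate:.1%}")  # about 63%
```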
The fact that CCSS is differentially implemented in New York– and that Hart knew as much– lends support for the argument for survey stratification by state.
And here is another important sampling query: I wonder how many of those New York teachers are English language arts (ELA) and math teachers.
The above question leads to a third tricky issue: In their survey, Hart defines CCSS as “a set of academic standards in English language arts (ELA) and math for students in grades K through 12 that have been adopted in most states.” In the survey protocol, teachers who do not teach grades K through 12 are excluded from the survey. (See the survey instrument for this rule-out.) However, teachers of subjects other than ELA and math are not excluded.
If any teachers are likely to be familiar with CCSS, especially in these years of transition, it would be the ELA and math teachers.
Why not focus the survey on those whose positions are used in defining CCSS: K through 12 ELA and math teachers? This is a critical sampling issue, one sure to affect survey results. (I realize that other teachers– and administrators– and entire schools– will be affected by such heavy emphasis on standards in only two subjects. Yet the ELA and math teachers remain those immediately and directly affected by definition.)
In my original post, I used AFT teacher membership data by state as provided by a group called Union Facts. I was criticized for using this site on the grounds that Union Facts is an extremist group. However, the information provided was only demographic; it was not used to promote any extremist view. In addition, I searched for an official AFT accounting of such information on the web and found none.
I also realized that using the demographics provided by Union Facts allowed me to write a post that AFT could counter and offer corrected demographic information. As it was, AFT did not offer corrected stats, and Hart Associates only offered limited correction, such as the 36% NY teacher AFT membership stat.
I have no problem with having AFT provide me with correct demographics by state regarding its teacher membership. I will accordingly correct any inaccuracies in my post.
According to Union Facts, AFT has a teacher membership presence in 31 states (though some states have memberships recorded as zero; last updated October 2012). As for CCSS, it has been adopted by 45 states, plus DC, four territories, and the Department of Defense Education Activity. (I omit the territories and Dept. of Defense from continued discourse.)
If a stratified random sample were used for 31 states, and Hart surveyed 800 teachers, then Hart could survey only 25 or 26 teachers per state for a phenomenon that was adopted on the state level, not the national level.
Since Hart Research Associates admits that a proportional 36% (288) of AFT teacher members live in New York, that leaves 800 – 288 = 512 teachers to divide among what might be 30 AFT states. Keep in mind that more equivalent state representation matters, since CCSS is not “more important” or “less important” in any adopting state, and since assuming uniformity of both publicizing and implementing CCSS across states (i.e., “nationally”) is unfounded.
512 / 30 = 17 or 18 teachers surveyed per each of 30 states (given that NY state had 288 teachers surveyed out of 800).
Hart Research Associates took me to task for suggesting that it should have had a larger sample for its survey. I even suggested 10% of AFT members. Hart noted that it is common for polls to have 800 respondents.
According to Gallup, its national polls use a sample of 1,000 to 1,500 respondents. Yet the CCSS situation is not a national situation in the sense of a presidential election or a general opinion poll about television habits. For CCSS, the unit of adoption is the state. Thus, several hundred teachers per state should have been randomly surveyed. According to Gallup,
Using common sense and sampling theory, a sample of 1,000 people is most likely going to be more accurate than a sample of 20.
Yet for a phenomenon whose transitioning implementation differs at the state level, Hart has a potential average of only 17 or 18 teachers surveyed per state outside of New York.
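To put those per-state slices in perspective, here is a minimal sketch (my own illustration, not Hart’s methodology) of the standard 95% margin-of-error approximation for a proportion from a simple random sample:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a proportion from a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

# A national-scale sample looks respectable:
print(f"n = 800: +/- {margin_of_error(800):.1%}")  # about +/- 3.5%

# But a per-state slice of 17 or 18 teachers does not:
print(f"n = 18:  +/- {margin_of_error(18):.1%}")   # about +/- 23%
```

In other words, any state-level conclusion drawn from 17 or 18 respondents carries a margin of error larger than 20 percentage points.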
If AFT has a presence in all 45 states adopting CCSS plus DC, the average number of teachers polled per state except New York declines:
512 / 45 = 11 or 12 teachers per state plus DC (New York already counted).
Gallup suggests a sample size of at least 500 to begin to approach diminishing returns, yet settles on 1,000 to 1,500 “because they provide a solid balance of accuracy against increased economic cost.” But let’s say that Hart Associates settled on surveying 500 teachers per state (a low number) for a possible 31 AFT states. That would mean 500 x 31 = 15,500 teachers surveyed.
Even with 500 teachers surveyed per each of 31 states, proper reporting would still need to emphasize that only two-thirds of CCSS states were included in the survey. This is a more straightforward means of conveying the limitations of the survey than merely saying “AFT teachers surveyed.” Details are important for enabling consumers of survey research to critically weigh the results.
In its rebuttal, Hart Research Associates comments that “not many surveys would be conducted” if “20 million voters” needed to be surveyed.
But I am not referring to some issue on a national ballot. I am referring to a situation pushed at the national level but differentially addressed on the state level.
AFT is not broke. Its total assets for years 2007 through 2011 exceeded $100 million.
I think if it had wanted to, AFT could have surveyed 500 teachers– 500 K-12 ELA and math teachers– in each of the 45 CCSS states plus DC:
500 x 46 = 23,000 teachers.
What a robust sample for capturing an overall result for a state-adopted and differentially-addressed educational issue.
This would have enabled AFT to truly understand on a “national” level a standard with a unit of adoption that is the state/DC.
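The sample arithmetic above can be collected into a short script (a sketch of my own, using the post’s figures; I divide by 45 in the second line, since New York is already counted among the 45 CCSS states plus DC):

```python
surveyed, ny_share = 800, 288      # Hart's total sample; NY's proportional 36%
remaining = surveyed - ny_share    # 512 teachers left for the other states

# Per-state averages under Hart's actual design:
print(remaining // 30)             # 31 AFT states minus NY -> about 17 per state
print(remaining // 45)             # 45 CCSS states + DC, minus NY -> about 11 per state

# Totals under a 500-teachers-per-state design:
print(500 * 31)                    # 15,500 for the 31 AFT states
print(500 * 46)                    # 23,000 for all 45 CCSS states plus DC
```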
Here is an example of better poll conducting and reporting than the AFT poll: http://www.wpri.com/dpp/news/local_news/mcgowan/union-poll-finds-little-support-for-education-commissioner-deborah-gist
This is a poll conducted by the Rhode Island Federation of Teachers (RIFT) and National Education Association Rhode Island (NEARI) regarding RI Education Commissioner Deborah Gist. Notice the article includes not only percentages but exact numbers, and the polling issue is uniform across the entire state. The sample is a little small (402 members), but it does represent almost 4% of all RI public school teachers (approx. 10,500). I would have liked to know if respondents were randomly selected and if some strata were used, such as school district, to ensure that respondents did not hail from a concentrated geographical area within RI. I also would have liked to know if respondents were from the population of all RI teachers or only union members.
In October 2012, I sent the following email to Louisiana Legislative Auditor Daryl Purpera requesting a performance audit of Louisiana’s charter schools. I did so in response to having read the US Department of Education’s audit of charters in Arizona, California, and Florida:
request for La. charter schools audit
From: Mercedes Schneider <firstname.lastname@example.org>
To: dpurpera <email@example.com>
Date: Fri, Oct 26, 2012 11:51 p.m.
Attachment: US Dept of Ed Charter Audit
Mr. Purpera, attached is the US inspector general’s audit of the US Dept of Ed’s oversight of charter schools in California, Florida, and Arizona. As you will note from reading, the US Dept of Ed is seriously lacking in its rigor in its management of both charter school educational quality and fiscal responsibility. The lack of rigor evident in management of California, Florida, and Arizona charter schools is likely problematic in Louisiana, as well. First, there is notable turnover in the charter schools in Louisiana, especially those associated with the state-run RSD.
Second, according to the recently-released 2012 school performance scores, RSD-LA has a district score of F, and RSD-NO, a district score of D. Third, as of this date, LaDOE/BESE have outlined no clear accountability measures for charter school operation. Finally, given LaDOE/BESE’s continued practice of hiring unqualified-yet-highly-paid individuals to serve in key administrative positions, it seems that those hired to supervise/evaluate the charter schools in Louisiana likely haven’t the expertise to duly fulfill the duties of their jobs.
Given that similar lack of rigor found in the attached auditor’s report is already evident in the management of Louisiana charter schools, I ask that your office formally audit the charter schools in Louisiana.
Thank you for your time.
–Mercedes K. Schneider, Ph.D.
St Tammany Parish Public Schools
I received a stock email response noting that my request had been received and that, should an audit be conducted, the auditor’s office could not discuss it with me. So I was very glad to see that approximately six months later, Purpera produced a report: a performance audit of LDOE’s monitoring of charter schools.
Given LDOE’s propensity toward hiding information from the public, I was happy to read the first line of the audit document, “Under the provisions of state law, this report is a public document.”
A copy of the report was sent on May 15, 2013, to the president of the Louisiana senate and to the speaker of the house. In Louisiana, both the senate and the house are apparently finished honeymooning with John White.
More power to this audit.
Here is some introductory information regarding the audit:
Authorization and Oversight.
During fiscal year 2012 (2011-2012 school year), 99 charter schools serving 45,684 students operated in Louisiana.
The six types of charter schools are as follows:
Type 1 – Charter creates a new school authorized by a local school board (LSB)
Type 1B – Charter authorized by Local Charter Authorizer
Type 2 – Charter authorized by BESE
Type 3 – Charter converts a pre-existing school authorized by an LSB
Type 4 – Charter between an LSB and BESE
Type 5 – Pre-existing public school transferred to the Recovery School District (RSD) and operated as a BESE-authorized charter school
Within LDOE, the Office of School Choice monitors Types 2 and 4 charter schools and RSD’s Office of School Performance monitors Type 5 charter schools. Types 1 and 3 charter schools are monitored directly by LSBs. Exhibit 1 shows the authorization and oversight structure for each type of charter school.
This audit focuses on the 78 Types 2, 4, and 5 charter schools operating during fiscal year 2012 for which LDOE was responsible for monitoring. [Emphasis added.]
There are 58 Type 5 charters, all belonging to RSD. The remaining 20 charters under consideration in the audit include 16 Type 2 and 4 Type 4. As of June 30, 2012, no Type 1B charters existed.
According to BESE Bulletin 126, Louisiana Charter School Law, as cited in the audit, charter schools are supposed to be offered “greater flexibility and autonomy in exchange for heightened accountability through regular monitoring.”
What immediately comes to mind is BESE President Chas Roemer’s inane comment that charters are held to “strict accountability standards” since the state can revoke the charter after three years.
Not so strict. I wonder why Roemer didn’t mention the “regular monitoring” portion of Bulletin 126. Maybe he forgot this part (taken from the audit):
One of BESE’s responsibilities as the authorizer of Types 2, 4, and 5 charter schools is to direct LDOE to review and evaluate these schools’ academic, financial, and legal/contractual performance annually. LDOE then recommends to BESE whether to renew or extend a charter’s contract….
Those dastardly school performance scores are supposed to figure into an annual charter review– but even the poor-performing charters get their three years. Charter financial risk is also supposed to be assessed annually– but nothing requires that it be acted upon annually.
Chas, you could revoke charter agreements following an annual review– if you wanted to.
Finally, student enrollment (including numbers of disabled students), student discipline, health and safety, percent of certified teachers, parental complaints, ethics, and timely submission of required reports are all supposed to play a role in this annual review.
Bulletin 126 allows for early revocation of the charter agreement– but it does not require such. BESE can easily ignore charter problems– and it does.
Chas and friends could always rewrite Bulletin 126 to clarify nebulous wording and add rigor.
Even with three consecutive poor annual academic reviews, the charter might still be placed under a memorandum of understanding (MOU) for improvement. This is the loophole for continuing a contract with a charter that has what community public schools are told they cannot have: a C, D, or F school performance score.
All according to BESE Bulletin 126.
So, what did the auditor find? LDOE did monitor the finances of the charters under its jurisdiction for FY 2012. But those school performance scores– well, there are some problems. For the 10 charters needing a pre-assessment index (PAI) in the fall of 2011, LDOE dropped the ball and provided none of the schools with a PAI until the “uh-oh!” time of April 8, 2013 (one month before the audit result was published).
The purpose of the PAI was to calculate charter performance from fall to the time of spring testing. Keep in mind that the audit was for FY2012, which begins in fall 2011. So, LDOE did not provide pre-assessment information until the spring of the second year of charter school operation.
Of course, LDOE excused itself: It needed to wait until after 2012 spring testing “so that the index (no longer a ‘pre-assessment’ as intended, mind you) would only include the testing histories of students who remained and tested with the school.”
This is against Bulletin 126, which carries the weight of law. But LDOE, with its high-priced former TFAers, couldn’t manage to pre-assess 10 charters.
What else did LDOE do?
It altered Bulletin 126 after it failed to meet the original standard. (The “update” did not happen until FY2013.)
The audit also reveals that LDOE conducted no verification that school-level data used to calculate school performance scores was indeed correct:
Student performance is the primary measure of school quality and is the main component for LDOE’s renewal and extension decisions. Therefore, it is critical that the data LDOE uses to calculate the SPSs of schools is reliable. Bulletin 741 requires that charter schools maintain supporting documentation for the data used to calculate these scores. In addition, the National Association of Charter School Authorizers recommends that authorizers not rely on self-reported data from schools unless it has been verified. However, LDOE accepts self-reported data from the schools without verifying the reliability of that data. According to LDOE, it stopped conducting on-site audits in 2008 because of a lack of resources. [Emphasis added.]
The auditor acknowledges that most of the school performance score is calculated using test scores (90% for K-8 and 70% for 9-12) and that such information is reported directly to LDOE from the testing vendor. However, regarding attendance, dropout, and graduation data, the auditor recommends LDOE conduct site visits to verify accurate data entry. This is not an unfounded request:
Attendance Data. Bulletin 741 requires that schools maintain a record of each student’s attendance. According to LDOE, records may be electronic attendance records from the school’s own data system or both paper and electronic records. We reviewed attendance records for a sample of 325 students at nine schools and found that 84 (25.8%) student records had attendance data that differed from LDOE’s SIS data for fiscal year 2011.
Dropout Data. LDOE policy requires that schools maintain supporting documentation for dropouts, such as withdrawal forms and requests for records from other schools. We reviewed dropout records for 130 students and found that 15 (11.5%) did not have sufficient documentation to support the withdrawal.
Graduation Data. LDOE regulations and policies do not specify what documentation schools should maintain to support what they enter into STS. Therefore, we were unable to assess the accuracy of transcript information. [Emphasis added.]
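As a quick check, the auditor’s percentages match the raw counts (my own recomputation, not part of the audit):

```python
# Attendance: 84 of 325 sampled student records differed from LDOE's SIS data.
print(f"{84 / 325:.1%}")   # 25.8%

# Dropouts: 15 of 130 sampled withdrawals lacked supporting documentation.
print(f"{15 / 130:.1%}")   # 11.5%
```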
LDOE said no. After all, the rules for school performance score calculation will change, and anyway, the dropout data doesn’t really count that much.
No kidding. See below:
LDOE does not agree with this recommendation. According to LDOE management, attendance data will be used for a final transition year SPS in the fall of 2013 but will not be calculated into the SPS beyond that date. Dropout data will count as 5% of the SPS only for schools that serve grades 7 & 8. LDOE further states that while dropout data is important, it has a relatively low impact on the overall SPS.[Emphasis added.]
The auditor counters LDOE’s laissez-faire with this:
While the dropout data is 5% of the SPS calculation for schools that serve grades 7 & 8, this data is also used to calculate the graduation cohort component of the high school SPS. Beginning in fiscal year 2013, the graduation cohort component constitutes 25% of the high school SPS calculation. [Emphasis added.]
A note regarding the ever-changing nature of school performance scores: The Louisiana legislature is tired of White and BESE changing the school performance score rules. In May 2013, the Louisiana House of Representatives passed HB 466. Notice White’s proposed, confusing “solution” to the school performance score transition:
…The House voted 70-28 to block state plans that would make ACT scores — a measure of college readiness — a key part of how public high schools are graded.
House Bill 466 would retain the grading system the way it operated for the 2011-12 school year, which would exclude ACT results.
White has said the ACT scores are vital for all students, not just those who plan to attend college.
He said the state later this year would issue two letter grades per school, with and without the ACT scores, as a way to smooth the transition.
But critics said two grades would confuse parents and that even the state’s top-scoring public high schools would drop a grade letter or two under the new system.
State Rep. J. Rogers Pope, R-Denham Springs, backed HB466 as a solid product of talks among principals, superintendents and others. “If we are going to label schools, let’s do it in a way that is fair and equitable,” said Pope, former superintendent of the Livingston Parish school system.
The measure would also require Louisiana’s top school board to win the permission of the state House and Senate committees before it makes future changes in the school grading system. [Emphasis added.]
This bill would block White from making “retro” changes to the school performance score formula similar to the Bulletin 126 change he made after failing to pre-assess the charters.
If HB 466 passes the senate, then the auditor’s comment above is only slightly modified; the graduation cohort continues to be 30%.
As for LDOE’s monitoring of legal/contractual charter performance, it was “hit and miss.” In short, LDOE submitted incomplete information regarding comprehensive monitoring of six indicators: special education/ELL, enrollment, discipline, health and safety, governance, and facilities. The auditor concluded:
LDOE could not provide evidence that it comprehensively monitored all of the six legal/contractual indicators at any of the 78 Types 2, 4, and 5 charter schools as required by Bulletin 126.
In our September 2011 report on the Recovery School District, we found that the Recovery School District did not comprehensively monitor all Type 5 charter schools for legal/contractual compliance as required by Bulletin 126. LDOE agreed with our recommendation that it should develop a comprehensive process to annually coordinate the collection of data on all Type 5 charter schools to ensure they are meeting their legal/contractual obligations. LDOE has made some progress on this recommendation. For example, LDOE included in its January 2012 Charter School Annual Report to BESE a column labeled “Legal/Contractual Performance.” However, LDOE could not provide evidence that it had addressed all issues in the report. [Emphasis added.]
The auditor’s official recommendation:
LDOE should implement a more comprehensive process to annually assess charter schools’ legal/contractual performance that includes the review of the six legal and contractual indicators as required by Bulletin 126.
Believe it or not, LDOE does not agree that it needs to develop a more comprehensive process; it “is confident that its current practice… is sufficiently comprehensive.”
This is the same LDOE that, according to White, “just sent the wrong Word file” to the Senate Committee on Education as it was considering the 2013-14 minimum foundation program (MFP) funding. The senate rejected the current formula because it included funding for vouchers and Course Choice, both of which were recently declared unconstitutional by the Louisiana Supreme Court.
Roemer, who spoke of the “strict accountability standards” to which Louisiana charters were to be held, is also responsible for the wrong MFP formula sent to the senate. These things ought not to be.
Unfortunately, Roemer’s charge of charter accountability is also “not to be.” In FY 2010, LDOE placed eight charters on probation because of financial issues. In FY 2012, BESE changed the review rules to cover for LDOE’s negligence in reviewing the charters on probation.
In FY 2011, LDOE did not review the charters on probation. As noted in the auditor’s report:
According to Bulletin 126, LDOE should have determined if all eight schools met required standards during fiscal year 2011 to continue operating. Based on these determinations, if a school had not achieved Bulletin 126 standards, LDOE was required to recommend to BESE revocation of the school’s charter, potentially not allowing the school to operate during fiscal year 2012. However, LDOE did not determine if any of these schools met required standards to continue operating. While one of the schools closed, the remaining seven were allowed to operate during fiscal year 2012 without LDOE ensuring that they were in compliance with Bulletin 126 standards. [Emphasis added.]
The charters on probation should have been reviewed in FY 2011– that is, the 2010-11 school year. Late in the game– near the end of that school year– BESE changed the rules to accommodate no follow-up review from LDOE:
In April 2011, BESE repealed the requirement for LDOE to determine if schools on probation met required standards to continue operating. However, Bulletin 126 was not updated until August 2011. According to LDOE staff, it decided to grandfather in the eight schools discussed above to avoid the schools receiving two reviews in a six-month timeframe. However, no such provisions were included in the Bulletin 126 revisions. [Emphasis added.]
Let that sink in. Let the pattern sink in: LDOE fails to follow through on its duties; BESE changes the rules in the eleventh hour to cover the negligence.
A pressing question: Why would BESE vote to repeal fiscal monitoring of charters on probation? Could it be to continue to promote an illusory “model to the nation”?
Regarding the capricious alteration of Bulletin 126, the auditor’s recommendation:
LDOE should ensure that Bulletin 126 is updated in a timely manner when changes are made to the criteria for monitoring charter schools so that its staff can hold schools accountable for meeting the required standards to operate. [Emphasis added.]
Things that ought not need saying: Don’t change the rules at the last minute.
With what I envision as a sheepish grin and nod of the head, LDOE agrees with this one.
That doesn’t mean such a signature move will not happen in the future. LDOE and BESE are certainly not known for stability and consistency.
School letter grades have undergone capricious changes. White has even tried to change the rules without involving BESE. And now the legislature desires to monitor LDOE/BESE changes to school performance.
I hope they decide soon to do the same with charter regulation. This audit, complete with White’s/LDOE’s/BESE’s foolishness, is in legislative hands.
So there you have it, folks: The Louisiana charter school audit of LDOE’s management (or rather, the lack thereof).
Another untrue Louisiana model for the nation.
I do not know whether my email to the auditor requesting a performance audit had any bearing on the conducting of this audit, but I will add that in a separate email, I also asked for a performance audit of LDOE and BESE. A performance audit of these two entities would consider what money is being spent and how it is being spent.
Maybe such audits are already in the works.
What a fine thought.
As a classroom teacher, a real teacher– real in the sense of both my credentials and my experience– I am tired of those outside of the classroom catapulting themselves into positions of quick fame in order to comment on the state of the classroom– a place either completely unfamiliar to them firsthand or of only token familiarity. Such is the motivation for my writing many of my posts, and it is my motivation here as I write about Steve Perry.
Steve Perry is not a teacher. Yet he is one of many would-be reformers who have stepped up to claim a moment in the spotlight as education experts– and one whose contempt for teachers is obvious.
Though not a teacher, Perry was a CNN education commentator, and he opened and leads his own school in Connecticut, Capitol Preparatory Magnet School, a year-round school that advertises sending “100% of its graduates to 4 year colleges.” The big question is one of student attrition prior to that senior year.
Perry quotes the rapper JZ* and says that “Men lie and women lie but numbers don’t.”
(*I have been reprimanded by an insulted reader that I did not correctly transcribe to “Jay-Z.” There is also a rapper JZ and a rapper JC. She accused me of lacking “scholarship.” I am sorry that someone could read this post and come away with only such a comment.)
I’m not a rapper. I’m a classroom teacher who is also a statistician and researcher, and I know full well that Perry has positioned his numbers to lie.
Of course, hiding the full story behind that sparkling “100%” is Perry’s lie. And it is Perry’s number. So perhaps JZ and I have tied on this one.
Beware of those Wonder Schools.
Beware of Wonder
Perry says “The achievement gap is really an educational gap in terms of the performance of our educators themselves.” [http://www.youtube.com/watch?v=qglQ2CxKhWs]
Notice his words are not “educators ourselves.” He sees himself as the solution, and in reading about and listening to Perry, both his arrogance and foolishness are unmistakable.
When it suits him, Perry does refer to himself as an “educator,” not uncommon in this current environment, where the reformer set hides its lack of teaching credentials and experience behind the word.
Like those of many reformers, Perry’s “bio” includes sketchy information regarding his credentials. He notes graduating “on a scholarship” from the University of Pennsylvania’s School of Social Work, but there is no mention of a degree level or a year. At the bottom of his self-promoting bio, Perry signs his name as “Dr. Steve Perry, MSW,” as if to showcase as many degrees in as unorthodox a manner as he can.
And how about that doctorate? What is it in? And why not feature such information in a bio about oneself?
In preparing for this post, I have been reading a number of articles written about Perry. One interesting piece is on Diane Ravitch’s blog. The comments section is particularly revealing concerning Perry’s reputation. However, it is the final comment (final as of the time I read) that held my attention:
What is Perry’s PhD in? Honorary degree from somewhere? Cannot find trail of his education, degrees or dissertation credentials. Any info?
Yesterday, it just so happened that I read Steve Perry’s dissertation, written for an Ed.D. in educational leadership from the University of Hartford:
It ought to embarrass the sole faculty member who signed off on it and the school that issued it.
The dissertation document does include details of Perry’s education: a BA in political science from the University of Rhode Island in 1992; an MSW focused upon social and economic development from the University of Pennsylvania’s School of Social Work in 1995; and this Ed.D. in educational leadership from the University of Hartford in 2008.
Political science. Social and economic development.
Not a teacher.
And after reading his dissertation:
Not a scholar.
A dissertation is meant to be a scholarly contribution to one’s field of study. It is meant to be a unique contribution, one that adds substantially to research in the selected field as determined by a faculty committee chiefly composed of those who possess the expertise to critically appraise the candidate’s work. And it is meant to demonstrate the author’s suitability to be recognized as an expert scholar in a clear and defined academic specialty.
Steve Perry’s so-called “dissertation” accomplishes none of these goals and makes a mockery of academic rigor.
Perry has earned a “cereal box doctorate”: Buy the cereal; pull out the prize. That’s it.
Why am I being so hard on this former CNN “expert education commenter”? This “education expert” who on one hand says the problem is that professional educators “are not connecting” with students, but on the other, says, “If you don’t want to go to college, don’t go to Capitol Prep (Perry’s school). Go somewhere else”? This arrogant self-promoter who introduces himself on his website as “America’s Most Trusted Educator”?
His dissertation “contribution” amounts to nothing more than a self-reported six hours on the phone asking Upward Bound staff what they believe works, transcribing these interviews, and organizing the responses into sets. Period.
So much for “manning up” academically.
(Upward Bound is a federal program aimed at promoting college attendance among students from families with either low incomes or no previous members who have attended college.)
Here is what he chatted about with six people for one hour:
1. What are the staff’s reports of how they implemented the project components and services designed to prepare eligible high school students for attendance at four-year colleges?
2. What are the staff’s reports of the project practices that are most effective in preparing eligible high school students for attendance at four-year colleges?
3. What are the staff’s reports of the project practices that they perceive to be the most easily implemented in an urban high school in their service area?
I am hard pressed to believe that discussion of these three questions required more than 15 minutes per staff member.
Far from sufficient for a rigorous dissertation.
For Perry– well– I think it has served its purpose.
Congratulations, “Doctor” Perry. Like a woman who agrees to get married for the diamond ring, you now get to refer to yourself as “doctor” for completing your University of Hartford program. Based upon perusal of your website, I know that the title is very important to you.
Even your Twitter handle has that “Dr.”
Yes, his dissertation is a flimsy, rice-paper version of the real deal. But don’t believe that a self-important man like Perry is short on words. In his pseudo-diss, he wrote 176 pages. Perry thinks he has a lot to say– only most of it pertains to the work others have accomplished. He is long-winded in a literature review that clearly overwhelms his sad, slight research “contribution.”
He calls his “dissertation” an “exploratory, qualitative case study of a single Upward Bound project” in which he “used a single, semi-structured telephone interview” of “six Upward Bound project staff.”
When I was working toward my doctorate (a bit more rigorous in its completion), doctoral students in the University of Northern Colorado College of Education had to defend choosing a qualitative dissertation against the idea that qualitative is “easier” than quantitative. I have heard fellow students comment, “I don’t like numbers, so I’ll ‘just’ do a qualitative dissertation.”
I do not advocate the view that a well-done qualitative dissertation is “easier” than a well-done quantitative one.
Whether qualitative or quantitative, a rigorous dissertation proposal adequately answers the question, “So what?” In other words, why bother conducting this study? What of substance or significance might this study contribute to the body of research in a given field? (Even though qualitative research involves emergent themes, the researcher should still be able to defend the value of the study.)
A well-done qualitative dissertation is often twice as long as a quantitative dissertation since the qualitative medium of research is the word. Thus, a rigorous qualitative dissertation can easily be 300 pages or longer. And there is much to “qualitative”; the general term encompasses numerous study designs, including but not limited to ethnography, grounded theory, phenomenology, narrative research, and case study.
In his “dissertation,” Perry notes that he has chosen the case study. But the question remains, “So what?” It is not as though there has not been a wealth of research conducted on Upward Bound. And with the modern reformer push for quantitative results, why would Perry not decide to at least conduct a mixed methods study, one that incorporates both quantitative and qualitative components? Perry even discusses the education reform movement as part of his literature review. In addition, Perry cites Yin, and Yin advises use of both quantitative and qualitative methods in case study research. And though he cites Creswell, Perry does not even follow advised case study methodology of collecting information from multiple sources, such as interview and observation. (Yin [as reported in Creswell] suggests collecting six types of information in conducting a case study: documents, archival records, interviews, direct observations, participant observations, and physical artifacts.) In rushing through data collection, Perry doesn’t even conduct multiple interviews of his six individuals.
Frankly, his research questions are too watery to warrant multiple interviews.
As for the other five types of information collected in a case study, Perry is without excuse.
In truth, the “case” in Perry’s “study” is neither unique enough nor substantial enough in its own right to stand alone as a study befitting the rigor due a doctoral dissertation. If there were only a single, six-staff Upward Bound program in the entire United States, that would be arguably unique. If research on Upward Bound did not readily lend itself to quantitative questions, that too would warrant a qualitative study in order to discern potential emergent themes associated with Upward Bound. But such is not the case. And Perry’s attempt at a “qualitative study” amounts to little more than “How can I do this thing as quickly as possible?”
No wonder Perry, who likes to feature himself, chooses not to include his dissertation (or even his exact doctoral credentials) as a part of his “Look at me! I’m Steve Perry” website.
Perry’s entire “dissertation” is more of what rigorous researchers would consider a qualitative follow-up component to an absent quantitative study. In other words, what Perry presents as the entire train of his dissertation is sadly only the caboose to a research locomotive and associated cargo that never appear.
For the reasons I have written in the several paragraphs above, I would never have approved of Perry’s dissertation even in its proposal stage. But someone did approve: Diana LaRocco, an assistant professor who also holds an Ed.D. from the same University of Hartford. No other signature was required on Perry’s dissertation.
This is not rigor, folks. But it is $650 per credit hour to the University of Hartford.
As for Perry’s doctoral program: The University of Hartford offers a doctorate in educational leadership that it advertises as a 63-semester-credit-hour program for “mid-career adult learners,” though the numbers don’t quite add up. Anywhere from 21 to 24 hours are associated with the dissertation. A student could complete this program in two years, including summers.
For the sake of comparison, let me add that my doctoral program of 120 transcripted semester-credit hours at the University of Northern Colorado took four full years for me to complete, including summers. My dissertation, a quantitative dissertation, is 180 pages and is signed as approved by five university faculty, including the dean of the graduate school.
As for Perry’s recommendations based upon his conversations with six Upward Bound staff (I have abbreviated them for the sake of space):
1. Implement group and one-on-one activities to foster supportive relationships.
2. Restructure high schools to be smaller to offer opportunities for students to establish caring relationships.
3. Other researchers should replicate this study using additional methods and a larger sample.
4. Staff need to focus on deliberate, sustained efforts to support student college-bound mindset.
5. Organize school so that daily schedule mandates that teachers have time to meet with students and communicate with families.
6. Future research should be conducted using additional methods and a larger sample.
7. Provide students with yearlong academic supports.
8. Increase the length of the high school year to include a summer component.
9. Conduct cost benefit analysis on extending the school year.
Notice that twice Perry advocates redoing the study with additional methods and a larger sample. HE could have done the study with additional methods and a larger sample. With a larger Upward Bound staff sample (which was available had he pursued it), Perry could have incorporated both qualitative and quantitative components into his study. But he also would have had to be sure that his study provided a unique and substantial contribution above and beyond that of the existing Upward Bound research.
Perry cut corners and chose not to conduct the study that he advocates “other” researchers conduct.
Perry also advocates positions popular in educational reform, including the "small schools" effort that Gates both championed and abandoned, and the extension of the school year (which he follows with a recommendation for a cost-benefit analysis, an analysis that should be done prior to implementation, not after).
Perry does not consider the cost, financial and otherwise, of his other ideas, including the cost of the "yearlong academic supports," the "mandated staff-student meeting time," and even the "group and one-on-one activities."
I am left wondering what it is about Perry’s dissertation that makes it worthwhile.
I can think of nothing.
But it did get him that “doctor” title….
There’s money to be made in education, and I am here to get mine.
I am a Louisiana Course Choice provider. Despite the initial, November 2012 state ruling of unconstitutionality regarding my receiving MFP funding for offering Course Choice, and despite the fact that Jindal and White have no other BESE-approved plan in place for paying me, it is full speed ahead and I am approved.
I have access to student data via John White’s shady “partnership” with data storage “nonprofits” such as InBloom, organized and funded by Murdoch and Gates.
Even though I am not supposed to have them, I do have access even to social security numbers.
And, as an “approved” provider, I am allowed to solicit students for enrollment in my courses. I can go into neighborhoods and sign students up on my computer. I can run ads on Craigslist for “sales reps” to sell courses. And I can offer enrollees a free computer as incentive to enroll.
And it gets better for me:
I can sign up children without school or even parental consent. And it is up to the LDOE to inform principals who is enrolled and for principals to contact me to unenroll students.
No one is monitoring me. I can enroll students for any course I like, even courses that don’t make sense, like elementary courses for high school students. I am solely responsible for reporting that the student “attended.” And on my word, I collect the money. I can do as I like for three years, without penalty. If students don’t perform well on standardized tests after “taking” my course, the penalty goes to the home school.
What money? you may ask. After all, didn’t the Louisiana Supreme Court just declare that I cannot collect MFP money, that such is meant for the public schools?
White initially proposed a softer, looser 2013-14 MFP formula for legislative approval. But now, Jindal and White have an opportunity to play with the MFP again, since the 2013-14 BESE-approved MFP must be reworked given the court's ruling. And White seems to end up with surplus money from underbidding contracts.
Besides all this, Jindal has promised to shave off some budget money to fund "choice." One way to do so is to remove voucher students from school enrollment and divert what would have been their MFP money to the general fund. Sure, I'm not a voucher program, but White and Jindal are very good at shady schemes.
Ask those in charge. I am still good to go.
If Jindal and White have their way, I will get paid. And I am in a better position than under-regulated charters, for I am virtually invisible, even more so than White's network leaders.
So, what could stop me?
Parents contacting their children’s schools to go on record before the fact as saying they do not want their children enrolled in Course Choice. Written, dated, and signed correspondence on file with the school and copied to the Course Choice program at the LDOE.
Parents contacting their legislators regarding this mass fraud in the works.
Parents contacting their legislators saying that they do not want public school funding diverted to the general fund for the purpose of funneling money to Course Choice.
Yeah. Parental action could be my undoing.
When I was very young, three or four years old, I used to ride with my father as he drove his New Orleans city bus on its morning route. This was before the age of seat belts, and my father used to allow me to stand next to him and operate the money crank. Riders would put their fare in a glass-topped container; my father would verify the amount was correct, and then I was allowed to pull the handle so that the money would drop into a metal container below.
One day, a lady gave me some change, and I put it in the glass container and cranked it. The lady said, "Oh, no! That money was for you to keep!" I remember feeling instant panic and loss at having cranked what was to be a gift to me.
My father immediately replied, "It's okay. She would rather crank the money than keep it." Suddenly all was well because I believed what my father had said about me. He said I would rather operate the crank, and his saying as much made it true.
That was over 40 years ago.
Last week, I read a press release by Randi Weingarten in which she stated that most teachers support the Common Core State Standards (CCSS). The tenor of her report was such that she assumed the issue of retaining CCSS was settled.
Weingarten wants me to believe that I support CCSS.
This is not my father’s bus.
I did not buy it.
I have not met American Federation of Teachers (AFT) President Randi Weingarten in person, but from what I have read about her, I have learned that she has chosen to "play to the middle": to appear to support both traditional public school teachers and corporate reform at the same time. And now, Weingarten has positioned herself to appear to stand against Common Core via her "moratorium" while simultaneously standing with it by reporting that "75% of teachers support the new standards." Here are the exact words:
…A recent poll of AFT members reveals that 75 percent of teachers support the Common Core.
Seventy-five percent sounds overwhelmingly impressive.
In the case of Weingarten's survey, that's approximately 600 teachers, give or take 28 teachers, or maybe more.
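For readers who want to sanity-check give-or-take figures like this: the standard 95% margin of error for a sample proportion is z·√(p(1−p)/n). The release does not state a sample size, so the sketch below uses a purely hypothetical n of 800 respondents (which would put 75% at roughly 600 teachers); it is an illustration of the formula, not an attempt to reproduce the post's exact figure.

```python
import math

def proportion_moe(p, n, z=1.96):
    """95% margin of error for a sample proportion (normal approximation)."""
    return z * math.sqrt(p * (1 - p) / n)

# Hypothetical figures: the release reports 75% support; assume ~800 respondents.
p, n = 0.75, 800
moe = proportion_moe(p, n)  # margin of error, as a proportion
supporters = round(p * n)   # roughly 600 teachers
plus_minus = round(moe * n) # roughly two dozen teachers either way
print(f"±{moe:.1%}, i.e. about {supporters} teachers, give or take {plus_minus}")
```

The smaller the (unreported) sample, the wider that plus-or-minus band grows, which is exactly why omitting n from a survey summary matters.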
Weingarten presents the results of her survey in suspiciously general terms in a 12-page PowerPoint-style PDF.
I would like to discuss some key points regarding why these results are suspect. It is important to consider what is really in this document– and what is absent– since Weingarten is using this “survey” as evidence that “most teachers” want the CCSS.
I should point out that Gates also wants the CCSS. Consider the content of this chummy article bearing the notation, "Sponsored content by the Bill & Melinda Gates Foundation and American Federation of Teachers":
The Bill & Melinda Gates Foundation launched the Measures of Effective Teaching (MET) study in 2009 to identify effective teaching using multiple measures of performance. The foundation also invested in a set of partnership sites that are redesigning how they evaluate and support teaching talent.