Former College Board Exec: New SAT Hastily Thrown Together; Students: March SAT Recycled in June
Manuel Alfaro is the former executive director of assessment design and development at the College Board.
Beginning on May 15, 2016, Alfaro has been publishing a series of posts on LinkedIn in an apparent effort to reveal the haphazard construction of the new SAT, released and first administered in March 2016 and again in June. (He is also posting information on Twitter: @SATinsider.)
Below are excerpts from Alfaro’s LinkedIn posts, all of which provide an enlightening read concerning the sham Coleman has thrown together and labeled the “new SAT.”
On May 15, 2016, Alfaro writes:
My name is Manuel Alfaro, former executive director with the College Board. I was recruited in 2013 by David Coleman, President of the College Board, to reform the SAT. The College Board will tell you that I am a disgruntled employee. This statement would not be entirely wrong, but it would not be entirely correct either. I am a disillusioned idealist, shocked by the reality I encountered at the College Board.
I just started a petition on the White House Petitions site, We the People, to ask the federal government to investigate the College Board for making false claims about the redesigned SAT: https://wh.gov/is3Sf …
My first assignment with the College Board was to review a draft of the test specifications for the redesigned SAT. The document had been created by two of David Coleman’s cronies, two authors of the Common Core. This document is now known as the “research-based, empirical backbone” of the SAT Suite of Assessments. Back then, it was a subset of standards taken straight from the high school and middle school Common Core. My instructions were to rubberstamp the selection of standards and to rewrite the standard descriptions to make them unrecognizable, so that no one could tell they were Common Core.
From May 17, 2016:
David Coleman and the College Board have made transparency a key selling point of the redesigned SAT. Their commitment to transparency is proclaimed proudly in public documents and in public speeches and presentations. However, public documents, such as the Test Specifications for the Redesigned SAT (https://collegereadiness.collegeboard.org/pdf/test-specifications-redesigned-sat-1.pdf), contain crucial statements and claims that are fabrications. Similar false claims are also included in proposals the College Board wrote in bids for state assessments—I got the proposals from states that make them public.
To corroborate my statements and allegations, I needed the College Board to administer the tests. If I had gone public before the tests were administered, the College Board could have spun this whole matter as “research” or some other nonsense. Now that the PSAT and SAT have been administered; now that the College Board has committed an insurmountable violation of trust; we the people can decide the future of the SAT.
In this LinkedIn post, dated May 19, 2016, Alfaro satirically discloses an amazing lack of sophistication in the technology behind SAT test creation:
It is not all gloom and doom at the College Board. For example, new test developers were always surprised when they got their first look at the content management system used to manage the workflow of items during item writing.
The system enforced structured review sequences; stored detailed item histories; stored all relevant item metadata; kept detailed logs of internal and external review outcomes; and we could, at the tap of a key, generate detailed reports for state clients…
Wait, I’m no longer at CB. I don’t have to make things up. The content management system consisted of folders and subfolders on the servers; for each item, staff would move a Word document from one folder to the next, like we did in the 1990s; update item metadata on an Excel doc….
I think Alfaro’s May 20, 2016, post is the most shocking; in it, he reveals that untested items have been hastily included on the actual SAT that students are taking, and that over half of the actual SAT administered to students is a thrown-together, unreviewed, unvetted item free-for-all:
Lucky for us, David Coleman made transparency a key selling point of the SAT. In support of that transparency commitment, the College Board published a 200-page document (link below) letting the world know exactly how the SAT is developed. Three paragraphs, out of those 200 pages, state how operational SAT forms are constructed. Obviously, many details were left out, intentionally.
One of those important details is the percentage of operational items that are revised/rewritten during the operational form review process. These are not the items that are included in “experimental sections” for pretesting: these are the items that are used to determine a student’s SAT score. The College Board tells the public, content advisory committee members, and clients that operational items are revised only in the RAREST of occasions. Facts, however, show that a large percentage of operational items on each form (often greater than 50%) are extensively revised/rewritten. And no, I’m not talking about adding a missing comma here, fixing a typo there, or changing the standard alignment. Sometimes the revised items are completely different than the version that was pretested.
Regarding the College Board’s use of unvetted items on an operational test, Alfaro asks some probing questions in his May 24, 2016, post:
Here are some of the questions you should be asking yourselves:
- If a large number of operational items were so bad that they required extensive revisions or rewriting, why were they pretested in those conditions?
- What kind of inferences can be made about the condition of the operational item pool, given the fact that the College Board needed to include these extensively revised/rewritten items in operational SAT forms?
- Given the large number of extensively revised/rewritten items in operational SAT forms, a large part of the form seems to be experimental, what’s up with that?
- What kind of inferences can be made about the condition of the pretest item pool?
- Who is reviewing these items? Surely, Content Advisory Committees would have expressed concerns about item quality to College Board executives.
- How does the College Board ensure that the overall difficulty of experimental sections is similar across forms?
- If the overall difficulty of the experimental sections is different across forms, wouldn’t operational forms containing more difficult experimental sections disadvantage some students and advantage others? Who are the students most likely to be disadvantaged by this?
- Experimental sections can appear anywhere on the test. If they appear early on the tests, and if some items are flawed (maybe even unsolvable), wouldn’t some students spend a lot of their time struggling with these items, which don’t contribute to their scores, and reach fewer items that actually count?
- Experimental sections take up to 1/3 of total testing time, why not get rid of them? Wouldn’t students be better served if they only worked on items that actually contribute to their score?
- The College Board needs to provide thorough documentation to state clients for peer review. Given that their content management system was manual for a large part of the test development process, how does the College Board ensure that the information it provides is accurate?
- Other testing companies openly admit that they contributed to the development of the Common Core standards and that their tests reflect those contributions, why did the College Board resort to scheming?
I will provide the answer to some of these questions in my next post. Many of the questions can only be answered with information obtained through discovery. Please sign my petition to compel the College Board to answer those questions.
In a subsequent post, Alfaro follows up:
On an earlier post I stated that a large number of items on operational SAT forms were extensively revised or rewritten during form construction and review. On a recent post, I asked:
- Who is reviewing these items? Surely, Content Advisory Committees would have expressed concerns about item quality to College Board executives.
As you might imagine, members of the Content Advisory Committee raised issues and concerns frequently and forcefully. Some members of the committee sent emails to David Coleman; others expressed their concerns during face-to-face meetings; and others sent emails to the leadership of the Assessment Design and Development group.
Of the many concerns raised by the Content Advisory Committee, here are the top three:
Item Quality: Committee members were very concerned with the quality of the items the College Board brought to committee meetings for review. Their biggest concern was the large number of items that were mathematically flawed; items that did not have correct answers; and items that did not have accurate or realistic contexts. Some members even went as far as stating that they had never seen so many seriously flawed items.
Development Schedule: Committee members felt that schedules did not allow them enough time to perform thorough reviews. Given the large number of items they had to review (and the poor quality of the items), they needed more time to provide meaningful comments and input.
Development Process: Committee members felt that the process used to develop the items was inadequate. They felt that the process lacked the rigor required to produce the high quality items necessary for item data to be useful.
On May 27, 2016, Alfaro reveals that the Content Advisory Committee did not see the items for review until they had already been included on a ready-to-use SAT:
In yesterday’s post, I wrote about the top three concerns and issues raised by the Content Advisory Committee regarding the items they reviewed during the content review meetings. In today’s post, we will use the committee’s feedback to address some of the questions from Part 1.
- If a large number of operational items were so bad that they required extensive revisions or rewriting, why were they pretested in those conditions?
- What kind of inferences can be made about the condition of the operational item pool, given the fact that the College Board needed to include these extensively revised/rewritten items in all the operational SAT forms?
- Given the large number of extensively revised/rewritten items in operational SAT forms, a large part of the form seems to be experimental, what’s up with that?
- What kind of inferences can be made about the condition of the pretest item pool?
Given the Content Advisory Committee’s critical feedback about the items they reviewed in preparation for, and during, meetings with the College Board, we can infer that the pretest item pool was of poor quality, at best. The committee and College Board staff/contractors worked hard to improve the items before they were operationally administered to students. I must give credit where credit is due: they did their best.
How, then, did so many flawed items end up in the pretest item pool? If the committee and College Board staff/contractors did their best to fix the items, why did the College Board need to include extensively revised items on operational SAT forms?
The reason was—concerned students, parents, and educators—that the Content Advisory Committee reviewed the items, for the first time, after operational SAT forms were constructed.
To clarify my last sentence: The Content Advisory Committee reviewed the items for the first time, not before they were pretested, but after the items were assembled into operational SAT forms.
Alfaro reiterates the College Board’s violation of item review practices on May 31, 2016:
Earlier, I wrote that the Content Advisory Committee first reviewed the items after they were assembled into operational forms, not prior to pretesting. The Fairness Committee reviewed some items, but not all, before they were pretested, and did not review the items that were extensively revised during operational form review. The only way the College Board can provide evidence to support the statements it made in this example is by making it up.
On June 02, 2016, Alfaro wrote this open letter to his colleagues:
Dear Colleagues:
Over the last year, I’ve explored many different options that would allow me to provide students and their families the critical information they need to make informed decisions about the SAT. At the same time, I was always seeking the option that would have minimal impact on your lives.
I gave David Coleman several opportunities to be a decent human being. Using HR and others, he built a protective barrier around himself that I was unable to penetrate. Being unable to reach him, I was left with my current option as the best choice.
For me, knowing what I know, performing most tasks at the College Board required that I take a few steps onto a slippery slope. Where my superiors stood on that slope was influenced by the culture at the College Board, but ultimately it was their personal choice. They chose to conceal, fabricate, and deceive instead of offering students, parents, and clients honest descriptions of the development processes for item specifications, items, and tests.
I feel bad for all of us and wish that there was a better solution. Like you, I owed allegiance to the College Board, but my first allegiance was, is, and always will be to the students and families that we serve. Please understand that. Millions of students around the world depend on us to protect their best interests. When we forget that, and put the financial interests of the organization first, it is easy to justify taking a shortcut here and a shortcut there in an attempt to meet unrealistic organizational goals.
You are good people. You just need better bosses.
Best wishes,
M
Even as I was writing this post on June 09, 2016, Alfaro produced another post detailing the development of the redesigned SAT and the slippery fast-tracking of the item review process.
Add to the above revealing information about the College Board’s ineptness regarding “new” SAT construction this news from the College Confidential SAT discussion thread entitled “College Board repeats March SAT as one of forms at June SAT sitting”:
[Individual A]:
Kids on the June SAT thread and a parent on the Class of 2017 thread have reported that some kids who took the June 2016 SAT got the exact same test that they had already taken in March 2016.
[Individual B]:
This has also been reported on Reddit.
The March test was leaked and widely circulated. Many, many students in the U.S.–especially, but not only, those who prepped at centers catering to the Chinese, Taiwanese, or Korean communities there–would probably have prepared for the June test using the leaked March document.
As for the Reddit comment thread alluded to above:
A: Wtf?!? It was exactly the same as march
B: Mines was completely different than the March test
A: Wow I lucked out
C: Did you take the test in March and the one today in the US? It was the same test?
A: For me they were both exactly the same
D: They were exactly the same for me too. I’m glad I’m not the only one
E: Wtf are you international or..?
D: nope,both tests taken in the US lol
E: Damn. I wish I was you, I also took the March SAT.
D: I kind of wish I got a different one, just because I’m worried I don’t have much wiggle room to improve my score
No wonder the College Board cancelled a number of March 2016 SAT registrations at the last minute as a supposed “security measure.” The College Board apparently planned to recycle the exact same March 2016 test for at least some of its June 2016 sites.
American colleges and universities are increasingly dropping the SAT and ACT from among their admissions requirements.
Both Alfaro’s insider confessions and students’ experiences with the already-recycled, “redesigned” SAT may well prompt more postsecondary institutions to ditch the Coleman wreckage of that test.
Those wishing to sign Alfaro’s White House petition asking the federal government to investigate the College Board for making false claims about the redesigned SAT may do so here: https://wh.gov/is3Sf.
***
My thanks to Erica Meltzer for her assistance with this post.
______________________________________________________________
Coming July 08, 2016, from TC Press (revised release date):
School Choice: The End of Public Education?
Stay tuned.
***
Schneider is a southern Louisiana native, career teacher, trained researcher, and author of the ed reform whistleblower, A Chronicle of Echoes: Who’s Who In the Implosion of American Public Education.
She also has a second book, Common Core Dilemma: Who Owns Our Schools?
Don’t care to buy from Amazon? Purchase my books from Powell’s City of Books instead.