Have we reached a tipping point? A skeptic weighs in
By Ron Whitehorne on Aug 30, 2010 03:50 PM
Superintendent Arlene Ackerman, flanked by the governor and the mayor, has been on a public relations offensive this summer. Citing eight years of improved test scores and a dramatic rise in the number of schools that made AYP this year, she and her allies have delivered the message that the School District has reached, in the mayor’s words, a “critical tipping point.” Ackerman added, “To all the naysayers who say an urban school system can’t be successful: Watch Philadelphia!”
I’d like to be a believer. Students, teachers, and principals have all worked hard to achieve these gains. Urban public schools do get a bum rap from the media and the political establishment. But I can’t jump onto this bandwagon.
We have had a decade of single-minded focus on boosting PSSA test scores. The curriculum has been narrowed to focus on reading and math at the expense of other subjects, intensive test prep has been implemented, particularly in low-achieving schools, and the message has been sent to teachers and principals that their effectiveness will be measured by test results.
Given all this, it would be surprising if there were not substantial improvement in test scores.
The question is what these gains mean. The rosiest scenario is that for the first time more than half of Philadelphia’s students are proficient in reading and math. The bleakest assessment is that these scores are grossly inflated and the gains are minimal at best. My own view is somewhere in between.
There are a number of problems with the rosy view. One of them is the gap between the results on NAEP and the PSSA. NAEP, also known as the nation’s report card, is a widely respected standardized test that provides a basis for comparing the progress of different states and now cities as well.
On the 2009 NAEP, 10 percent of Philadelphia 4th graders and 15 percent of 8th graders tested proficient or above in reading. In math, 17 percent of 4th graders and 16 percent of 8th graders were proficient or above. All of these scores were substantially lower than the averages for large cities, by 7 to 13 percentage points. It’s also worth noting that statewide NAEP scores have been essentially flat since 2007 while PSSA scores have gone up.
So why the big gap between NAEP and PSSA? Let’s look at a couple of possibilities.
Since each state has its own standards and develops its own assessment, there is a wide range in the quality of the tests. Simply put, some tests are much easier than others. In New York recently, where rising city test scores were widely heralded as a success story, the state decided it had set the bar too low. Many students who had scored as proficient were not really proficient at all. The city’s big gains suddenly evaporated.
Governor Rendell, at the AYP party, insisted that couldn’t happen here. Pennsylvania, the guv said, has never dumbed down the PSSA. I didn’t find any evidence that he’s wrong. The state has tweaked the PSSA’s cut scores, the thresholds for proficiency, over the years, but claims that these changes have not made the test easier.
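To make the mechanics concrete, here is a toy sketch in Python, with made-up numbers rather than actual PSSA scores or cut scores, showing how the same distribution of student scores can produce very different proficiency rates depending on where the bar is set:

    # Toy illustration, hypothetical numbers only: the same score
    # distribution yields very different "percent proficient"
    # figures depending on where the cut score sits.
    import random

    random.seed(1)
    # 1,000 made-up scaled scores, centered at 1300 with spread 100.
    scores = [random.gauss(1300, 100) for _ in range(1000)]

    def percent_proficient(scores, cut_score):
        """Share of students at or above the proficiency threshold."""
        return 100 * sum(s >= cut_score for s in scores) / len(scores)

    for cut in (1250, 1300, 1350):
        print(f"cut score {cut}: {percent_proficient(scores, cut):.0f}% proficient")

Move the bar by half a standard deviation and roughly a fifth of the students change category, with no change in what anyone actually learned. That is why New York’s recalibration made the city’s gains evaporate overnight.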
The standards on which the PSSA is based have come in for some serious criticism. Two reports, one by the AFT in 2008 and another by the Fordham Institute this year, rank Pennsylvania’s standards near the bottom of the heap, faulting them for vagueness, repetition, and lack of content.
Going forward, Pennsylvania has joined over thirty states in adopting the Common Core standards developed by the National Governors Association, a move that should silence these critics. None of these reports, it should be noted, suggested that the PSSA was an easy test.
An issue that gets relatively little attention is cheating. In Atlanta, another school district that has seen big test score gains, a cheating scandal has called those gains into question. Statistical evidence of tampering with answer sheets has rocked the administration of Beverly Hall, the 2009 Superintendent of the Year.
The administration of the PSSA is characterized by lax security and ample opportunities for cheating. Classroom teachers and administrators, who will be evaluated based on this test, administer it with minimal oversight. We simply don’t know to what extent this is a problem in Philadelphia. NAEP, by contrast, has rigorous security procedures and is administered by people who have no stake in the outcome.
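The Atlanta evidence reportedly centered on wrong-to-right erasures on answer sheets. A minimal sketch of that kind of statistical screen, with invented baseline figures, might look like this:

    # Sketch of an erasure-analysis screen in the spirit of the
    # Atlanta investigation. All numbers here are hypothetical.
    # Classrooms whose average wrong-to-right erasure counts sit far
    # above the statewide baseline get flagged for review, nothing more.
    STATE_MEAN = 1.1   # assumed statewide average erasures per sheet
    STATE_STDEV = 0.8  # assumed statewide standard deviation

    classrooms = [  # (classroom, avg wrong-to-right erasures per sheet)
        ("Room 101", 1.2),
        ("Room 102", 0.9),
        ("Room 103", 6.8),
        ("Room 104", 1.4),
    ]

    for room, rate in classrooms:
        z = (rate - STATE_MEAN) / STATE_STDEV
        if z > 3:  # a common rule of thumb for flagging outliers
            print(f"{room}: {rate} erasures per sheet (z = {z:.1f}), review")

Whether Pennsylvania runs anything like this on PSSA answer sheets, I don’t know. The point is that such checks exist and are not exotic.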
A big factor is the systematic teaching to the test that characterizes instruction, particularly in the Empowerment Schools. Students are drilled on how to answer both multiple-choice and open-ended questions. Skills that are tested are taught and reviewed again and again, year after year. A large publishing industry of materials designed to boost test scores has developed. Apparently it pays off. The question here is the quality of this learning. Are the skills taught retained and integrated into some cognitive framework that is meaningful for the student?
Real learning is transferable, so it is reasonable to assume that the skills students have gained would be reflected on the NAEP as well as the PSSA. Some explain the gap by suggesting students are much more motivated to make an effort on the PSSA. Yes, students don’t get a pizza party for taking the NAEP, but I’ve administered both tests many times and have seen little observable difference in student effort.
While the NAEP-PSSA gap should concern us, the more fundamental question is whether we should be putting all our educational eggs in the basket of a single standardized test.
In the last decade, the introduction of the core curriculum, the greater emphasis on math and literacy, all-day kindergarten, and modest reductions in class size have all had positive results. Students are more literate and more capable in math than they were ten years ago, and that’s a good thing, but the gains fall short of the hype, and we certainly haven’t seen a revolution.
The downside is also clear: a narrowing of the curriculum and a constricted definition of student learning.
Instead of continually celebrating test results, which only strengthens the tyranny of the test over urban education, the District needs to reconsider how to measure progress.