WILKES-BARRE — Lean and a bit boyish with a buzz cut, Brian Costello can’t help but get wonky when he talks about Wilkes-Barre Area’s poor results on state standardized tests.
“This is not an excuse in any way,” Costello said more than once as he waded through state data. “I want our schools to be number one. But when you compare our test results to our peers, we do better.”
The “peers” he’s referring to are schools that have similar percentages of students deemed “economically disadvantaged.” Studies have long shown that family income correlates with standardized test results. The lower the income, the lower the test scores.
The link is statistically so strong that one common yardstick of academic success is a “regression analysis”: A school’s test results are predicted based on the student poverty rate. If a school exceeds the prediction — regardless of how the score looks compared to other schools — it is deemed a success.
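The regression yardstick described above can be illustrated with a minimal sketch. The district data below is hypothetical, and the simple linear fit is an assumption for illustration; the state's actual model is not specified in this story.

```python
# A minimal sketch of the regression-analysis yardstick: predict a district's
# test results from its student poverty rate, then ask whether the district
# beat its prediction. All data here is hypothetical.
import numpy as np

# Hypothetical districts: percent economically disadvantaged, and percent of
# students scoring proficient or better. Note the downward trend.
poverty = np.array([10, 20, 35, 50, 65, 77], dtype=float)
proficient = np.array([75, 68, 55, 48, 40, 33], dtype=float)

# Fit a straight line predicting proficiency from the poverty rate.
slope, intercept = np.polyfit(poverty, proficient, 1)

def exceeds_prediction(poverty_rate, actual_score):
    """Return (beat_prediction, predicted_score) for one district."""
    predicted = slope * poverty_rate + intercept
    # A district is deemed a success if it scores above its prediction,
    # regardless of how its raw score compares to other districts.
    return bool(actual_score > predicted), float(predicted)
```

With data like this the fitted slope is negative, so a high-poverty district scoring 40 percent proficient can beat its prediction even while trailing nearly every district statewide.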
Costello raised the peer-group comparison at a Nov. 15 school board meeting. Low state test scores had dominated the public comment portion of several board meetings, including that one.
“If students aren’t learning what we’re teaching,” frequent critic Richard Holodick warned, “we better start teaching how they learn.”
“We need a conversation that’s been missing for far too long,” Gabby Richards urged. “How do we develop high-quality, high-performing schools in high-poverty and under-performing schools?”
Costello cited changes made under his tenure, admitted more needs to be done, and then spoke briefly of district research that compared test results of schools statewide that had at least 75 percent of students deemed economically disadvantaged. Wilkes-Barre Area enrollment includes 77 percent economically disadvantaged, the highest in Luzerne County. (The next highest, Greater Nanticoke Area, is 64.5 percent).
Compared to those 105 “peer” schools, Costello noted, Wilkes-Barre Area ranked in the top 15 on state tests and in the top 5 on the national college prep tests, the SAT and ACT.
Is his comparison accurate? Are the claims valid? And if so, how do other districts in Luzerne County fare when compared to their peer groups?
A Times Leader analysis of state data tried to answer those questions.
Back story on the pushback
Local laments about demographic impact are as old as the state standardized tests themselves. When the tests were introduced in the early 1990s in limited form (math and reading tests in three grades), the media began reporting results as soon as they were available, and school administrators howled that it was not fair to compare schools without considering demographics.
The state responded. In 1994, scores were released with comparisons to schools with similar socio-economic makeup. Hazleton Area School District officials urged reporters to look for significant differences in test scores compared to similar schools: 50 points above those schools was good, 50 below was bad.
The debate continued through the decade. “You have to compare districts of similar demographic make-up, apples to apples,” Wilkes-Barre Area then-superintendent Jeff Namey said in 1998.
In 1998, Temple University professor of educational administration Donald Walters told the Times Leader that his 11-year research into how family economics impact standardized test results showed high-scoring districts “tend to be from wealthy communities where kids already have an advantage and the school can capitalize on what students have.”
After the tests expanded to grades three through eight and districts had to meet annual goals under the “No Child Left Behind” law of 2001, the state acknowledged the demographic issue by including various ways to meet goals while accounting for demographic differences.
The pushback ultimately contributed to a decline in the emphasis on test scores in school evaluations when the Every Student Succeeds Act replaced NCLB in 2015.
So Costello’s argument that demographics matter is not new. The difference is the sophisticated computer tools he can bring to the analysis.
Asked for details of his claim that Wilkes-Barre Area fares better than peer schools, Costello demonstrated the district’s database search program, which has access to detailed state information. He called up a comparison with all “Local Education Agencies” (LEAs) with 75 percent or higher poverty rates, comparing PSSA scores, the newer Keystone Exams for high school students, and college prep SAT and ACT scores.
The “LEA” distinction is important because it covers not only school districts but also stand-alone charter schools, which may serve a narrow age group rather than testing thousands of students across multiple schools and grades.
Stressing that the district goal is to score among top performers compared to all schools, not just these peers, Costello said Wilkes-Barre Area compares favorably with the 105 schools reviewed:
• Wilkes-Barre Area tied with Lancaster School District for 15th place when looking at total percent of students scoring proficient in the Pennsylvania System of School Assessment tests, given in various subjects in grades three through eight. The district had 33 percent of students scoring proficient or better in all tests and grades combined.
• Wilkes-Barre Area ranked 14th among peers in results for the state Keystone Exams administered in high schools, with 43 percent proficient or better.
• Wilkes-Barre Area ranked sixth in average SAT scores among the peer group, with an average of 1,301, and it ranked first in results for the ACT, though that test is taken by fewer high school students in Pennsylvania, where the SAT is heavily favored.
• Costello also looked at the district’s 87.7 percent graduation rate compared to the peer group, a rate that put it 14th on the list of 105 schools.
Is that right?
To verify Costello’s findings, the Times Leader used state data from 2017. Because charter schools are such a different beast — public schools free from many state regulations and often serving fewer students than school districts — Wilkes-Barre Area was compared only to other school districts.
The analysis focused on the PSSA tests in English Language Arts (formerly “reading”), math and science because these are given in more schools. The Keystone exams, SATs and ACTs are given to high school students, and the latter two are optional. The total percentage of students district-wide scoring proficient or better in all three PSSA tests was calculated.
By using only school districts, the number of peers with 75 percent or higher economically disadvantaged enrollment narrows to 20. Still, in that group, Wilkes-Barre Area’s 33 percent proficiency — low enough to rank 475th out of 499 districts reviewed — starts to look much better. In that peer group, Wilkes-Barre Area ranked 8th out of 20.
But breaking districts into “quartiles” — 0 to 24 percent, 25 to 49 percent, etc. — as Costello essentially did makes for uneven numbers of districts in each group. For example, 20 districts had 75 percent or more economically disadvantaged enrollment, but 256 districts fell in the 25 to 49 percent range.
So the Times Leader “banded” each district in Luzerne County, comparing them to all other districts within 5 percentage points of that district’s student poverty rate (economically disadvantaged). The groups were still unequal in number, so districts were compared by what percentile they fell into among those banded peers.
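The banding approach can be sketched in a few lines. The district names and figures below are hypothetical, and the exact percentile convention (here, the share of banded peers a district scores at or above) is an assumption; the Times Leader does not spell out its formula.

```python
# A hedged sketch of the "banding" comparison: each district is compared to
# every district whose poverty rate is within 5 percentage points of its own,
# and ranked by percentile within that band. All data here is hypothetical.

def percentile_in_band(districts, name, band=5.0):
    """districts maps name -> (poverty_rate, proficiency_pct).

    Returns the percent of banded peers (including itself) that the named
    district scores at or above -- an assumed percentile convention.
    """
    poverty, score = districts[name]
    # Banded peers: every district within `band` points of this poverty rate.
    peers = [s for (p, s) in districts.values() if abs(p - poverty) <= band]
    at_or_below = sum(1 for s in peers if s <= score)
    return 100.0 * at_or_below / len(peers)

# Hypothetical data: (percent economically disadvantaged, percent proficient).
districts = {
    "A": (77.0, 33.0),
    "B": (75.5, 28.0),
    "C": (79.0, 30.0),
    "D": (40.0, 55.0),
}
```

Under this sketch, district A tops its high-poverty band despite a raw score that would rank near the bottom statewide, which is the pattern the analysis describes.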
The result, going beyond Costello’s preliminary math, showed two things: 1) For Luzerne County’s 11 school districts, those such as Wilkes-Barre Area that ranked poorly statewide generally fared better when compared to peers; 2) Local districts that rank high statewide ranked lower when compared to peers.
Crestwood and Dallas School Districts, for example, are traditionally the highest local performers on state tests. They also have the two lowest percentages of economically disadvantaged students in Luzerne County, under 18 percent at Dallas and just over 18 percent at Crestwood. (The next lowest rate is Lake-Lehman at 32.6 percent).
Both get test results putting them in the top 20 percent of schools statewide, yet when compared to peer schools, they slip considerably. Crestwood falls to a point where it just misses getting into the top 40 percent, and Dallas narrowly avoids falling in the bottom 20 percent.
Wilkes-Barre Area falls in the bottom 5 percent of districts statewide, but when compared to banded peers it rises to the lower end of the top 35 percent. Greater Nanticoke Area doesn’t fare as well but still improves, from the bottom 5 percent statewide to nearly climbing out of the bottom 15 percent compared to peers.
Since the demographic debate on state tests began, no local district administrator has used it as justification for accepting low test results. Like Costello, they have insisted the goal is to overcome the issue, not hide behind it. But the analysis shows that, overall, local data reflects the trend recognized nationally when standardized tests became ubiquitous two decades ago.
Test results continue to correlate with family income.