Ed Money Watch

A Blog from New America's Federal Education Budget Project

What We Can (and Can’t) Learn from the Early SIG Results

Published: November 20, 2012

[Image: Some of the Department's early SIG results.]

Release the kraken data! The U.S. Department of Education has finally revealed some of the results from its research on the effectiveness of the School Improvement Grant (SIG) program, or rather, the one-time, $3 billion infusion to the SIG program included in the 2009 American Recovery and Reinvestment Act (ARRA). The controversial program, which was re-tooled by the Obama administration, has supported intensive turnaround efforts – up to $2 million per school – in over 1,300 of the nation’s chronically low-performing schools.

The sliver of data released this week includes 2009-10 and 2010-11 test data from about 730 of the 831 highest-priority SIG schools, those categorized into Tier I or Tier II.[1] Here are the highlights (H/T to RiShawn Biddle and PoliticsK-12):

  • Two-thirds of schools showed gains in math, and two-thirds in reading in the first year of the SIG program (2010-11)
  • 25 percent of schools saw double-digit gains in math, and 15 percent in reading
  • 40 percent of schools saw single-digit gains in math, and 49 percent in reading
  • 28 percent of schools saw a single-digit decrease in math, and 29 percent in reading
  • 6 percent of schools saw a double-digit decrease in math, and 8 percent in reading
  • 26 percent of schools had posted math improvements the year prior to entering SIG, but declined once they received SIG funding; this happened for 28 percent of schools in reading
  • 28 percent of schools had posted math declines the year prior to entering SIG, but improved once they received SIG funding; this happened for 25 percent of schools in reading
  • A larger proportion of elementary schools posted gains in the first year of the SIG program, compared to middle and high schools, and they were less likely to see declines
  • Rural schools appear to fare as well as schools in suburban and urban areas

But can we say that “there’s dramatic change happening in these schools” as Secretary Duncan claimed? Not so fast. Clearly, the Department didn’t read Matt DiCarlo’s excellent run-down of when you can – and cannot – make policy claims based on test data.

First, the Department doesn’t clarify whether any of these increases or decreases in test scores are statistically significant. Given the measurement error inherent in any assessment, and given that it is unclear whether the Department is using proficiency rates (less accurate) or actual test scores (more accurate) to calculate these gains and losses, statistical significance cannot be assumed.
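To make the point concrete, here is a minimal sketch of the kind of check the data would need – a two-proportion z-test on a hypothetical school whose proficiency rate “rose” five points. The numbers are entirely made up, since ED hasn’t released school-level counts:

    import math

    def proficiency_change_z(prof_before, n_before, prof_after, n_after):
        """Two-proportion z-test for a year-over-year change in proficiency rates.

        prof_* are counts of proficient students; n_* are students tested.
        Returns the z statistic and a two-sided p-value (normal approximation).
        """
        p1, p2 = prof_before / n_before, prof_after / n_after
        pooled = (prof_before + prof_after) / (n_before + n_after)
        se = math.sqrt(pooled * (1 - pooled) * (1 / n_before + 1 / n_after))
        z = (p2 - p1) / se
        # Two-sided p-value via the standard normal CDF.
        p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
        return z, p_value

    # Hypothetical school: proficiency rises from 46/120 (38%) to 52/120 (43%).
    z, p = proficiency_change_z(46, 120, 52, 120)
    print(f"z = {z:.2f}, p = {p:.2f}")  # p is about 0.43 -- a five-point "gain" noise could explain

At typical school sizes, in other words, many of the single-digit swings ED is reporting could be statistically indistinguishable from no change at all.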

Second, the Department doesn’t clarify whether it is using cross-sectional or longitudinal data. In other words, were the gains or declines based on individual student growth (i.e., whether a student who took the 3rd grade math test improved upon taking the 4th grade math test the next year), or were they based on comparing this year’s crop of 3rd graders in math to last year’s 3rd graders? My money is on the latter, which limits how we can interpret the data, since the results aren’t fully comparable from the pre-SIG year to year one of the turnaround program.
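The distinction is easy to see in code. A toy sketch, with hypothetical students and scores (not ED’s data):

    # Cross-sectional: compare this year's 3rd graders to last year's 3rd graders
    # (two different groups of children). Longitudinal: follow the same students
    # from 3rd grade into 4th grade.
    scores_2010 = {"3rd": {"Ana": 210, "Ben": 195, "Cal": 200}}   # pre-SIG year
    scores_2011 = {
        "3rd": {"Dee": 220, "Eli": 205, "Fay": 215},              # an entirely new cohort
        "4th": {"Ana": 225, "Ben": 208, "Cal": 212},              # last year's 3rd graders
    }

    def mean(values):
        values = list(values)
        return sum(values) / len(values)

    # Cross-sectional "gain": cohort composition drives the number.
    cross_sectional = mean(scores_2011["3rd"].values()) - mean(scores_2010["3rd"].values())

    # Longitudinal growth: average change for students observed in both years.
    longitudinal = mean(scores_2011["4th"][s] - scores_2010["3rd"][s]
                        for s in scores_2010["3rd"])

    print(cross_sectional, longitudinal)  # two different numbers answering two different questions

A cross-sectional “gain” can reflect nothing more than a stronger incoming class, which is exactly why it tells us so little about whether the turnaround itself is working.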

Third, the Department doesn’t explain whether or how the researchers took into account non-school factors that could affect student achievement. Without at least addressing these issues, it is impossible to know whether changes in student performance were attributable to changes in school leadership or culture (i.e., the SIG program) rather than to conditions in the economy or in students’ home lives. Nor does the Department explain how it controlled for other school-level policies that could influence test scores. Because these are chronically low-performing schools, the SIG interventions are unlikely to be the only improvement strategy or program at work in them.
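For what it’s worth, one standard (if imperfect) way to begin separating the program’s effect from everything else is a difference-in-differences comparison against similar non-SIG schools. A minimal sketch with invented data and the statsmodels library – this illustrates the technique, not anything ED actually did:

    import pandas as pd
    import statsmodels.formula.api as smf

    # Invented school-level panel: four SIG and four comparison observations,
    # before and after the grant year.
    df = pd.DataFrame({
        "score": [200, 203, 198, 210, 201, 202, 199, 204],
        "sig":   [1, 1, 1, 1, 0, 0, 0, 0],   # 1 = SIG school
        "post":  [0, 0, 1, 1, 0, 0, 1, 1],   # 1 = year after the grant
    })

    # The sig:post interaction is the difference-in-differences estimate:
    # the SIG schools' change net of the change in comparison schools.
    model = smf.ols("score ~ sig + post + sig:post", data=df).fit()
    print(model.params["sig:post"])

Even this design rests on a strong assumption – that SIG and comparison schools would have trended alike absent the grant – but it at least confronts the attribution problem the Department’s release ignores.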

These are huge caveats to the SIG data, but that’s not to say ED’s findings aren’t important. They are. But more detail is sorely needed before anyone can make an accurate assessment of the program.

To begin with, the Department of Education must disaggregate the data by the four turnaround models. More significantly, changes in student proficiency rates on standardized tests are only one possible outcome of the SIG program – and perhaps not the most important one to track. The Department plans to release student and teacher attendance data, enrollment in advanced courses, and other “leading indicators” for the SIG schools next year, but what about data on school leadership, school culture, and parent involvement?

While more difficult to quantify, these areas are also essential components of school turnarounds. Secretary Duncan alluded to this in releasing the early results: “What’s clear already is that almost without exception, schools moving in the right direction have two things in common: a dynamic principal with a clear vision for establishing a culture of high expectations, and talented teachers who share that vision, with a relentless commitment to improving instruction.” However, the data attached to Duncan’s statement failed to mention the effects of leadership or teaching in SIG schools.

Predictably, analysts – notably Bellwether’s Andy Smarick – have already interpreted the early results as a failure of the entire SIG effort. But without more convincing and complete data, it really is too early to make definitive judgments about the program. This nuanced, wait-and-see approach may not be as satisfying, but in an effort as important as improving our nation’s worst schools, it is the right approach to take.



[1] To learn more about the SIG schools, including where they are located, how much money they received, and which improvement model – transformation, turnaround, restart, or closure – they selected, check out this handy-dandy map from Education Sector.
