The public school advocacy group Center on Education Policy released a new report today, titled “Has Student Achievement Increased Since 2002?” Its answer is “yes,” based on relatively worthless high-stakes state-level testing data and on the more esteemed National Assessment of Educational Progress (NAEP). For reasons known only to the report’s authors, they make no use of the available U.S. trend data from either the PISA or the PIRLS international tests (though the CEP study mentions PISA results for a single point in time, it ignores the changes in that test’s scores over time).


As it happens, U.S. scores have declined on both PISA and PIRLS in every subject and at both grades tested since they were first administered in 2000/2001. In the PISA mathematics and science tests, the declines are large enough to be statistically significant; that is, we can be confident (and disappointed) that they reveal real deterioration in U.S. student performance. In mathematics, our score has dropped from 493 to 474, causing us to slip from 18th out of 27 participating countries down to 25th out of 30. In science, our score fell from 499 to 489, dropping us from 14th out of 27 countries to 21st out of 30.


It is reckless and misleading to form judgments about trends in U.S. student performance without taking into account the declines on these respected international tests. And, as Neal McCluskey and I pointed out last year, the improving trends that do exist on some NAEP tests predate the passage of the No Child Left Behind Act, and have in some cases actually slowed since the law took effect.


It is this rather discouraging reality that should guide policymakers in the coming year, as they debate the future of NCLB.