Why the U.S. Results on PISA Matter

January 07, 2014

By Eric A. Hanushek

In 2012, 65 nations and education systems participated in the Program for International Student Assessment (PISA). These tests, covering mathematics, science, and reading, provide direct international comparisons of skills. Sadly for our nation, the recently released results are sobering.

According to PISA, the United States placed significantly below the average for member nations of the Organization for Economic Cooperation and Development (OECD) in mathematics—and significantly worse than the OECD distribution at both ends of the assessment spectrum, with more low performers and fewer high performers.

The U.S. math performance is not statistically different from that of Norway, Portugal, Italy, Spain, the Russian Federation, the Slovak Republic, Lithuania, Sweden, and Hungary—not the most sought-after group of countries for comparison's sake.

More disturbing, U.S. students' scores have been stagnant for the past decade. Since 2003, the United States has made virtually no gains, even as a range of countries made substantial ones.

The most rapid PISA gains were made in very low-performing countries, such as Qatar and Kazakhstan. Yet some higher-performing nations also made substantial advances: Israel, Singapore, Italy, Poland, and Germany. Poland, for example, steadily improved over the past decade and now ranks eighth within the OECD (14th among all 65 participating countries or education systems).

In the simplest terms, even among high-performing countries, change for the better is possible.

A number of commentators have counseled ignoring the results, and their misleading arguments—that these test scores really do not matter—warrant correction.

...