IU School of Education researchers caution U.S. schools about using international test

Monday, December 8, 2014

Though a test based on the highly respected Programme for International Student Assessment is now available for schools in the U.S. to administer to their own students, two assessment researchers at the Indiana University School of Education advise school administrators to weigh the merits and disadvantages before deciding whether to participate.

David Rutkowski, assistant professor of educational leadership and policy studies, and Leslie Rutkowski, assistant professor of inquiry methodology, along with Jonathan Plucker, professor at the Neag School of Education at the University of Connecticut, outline concerns in the December 2014/January 2015 issue of the Phi Delta Kappan.

The article "Should individual U.S. schools participate in PISA?" highlights several areas of concern. PISA is an international test developed by the Paris-based Organisation for Economic Cooperation and Development to compare the proficiency of 15-year-olds in math, science and reading in 60-plus countries. The test, administered every three years, emphasizes one of these subject areas in each cycle.

Policymakers, education advocates and journalists frequently cite the performance of particular countries on the PISA exam compared to that of U.S. students collectively. Last year, the OECD introduced the OECD Test for Schools, a version of the PISA test that individual U.S. schools can administer. In the last school year, 285 U.S. schools participated.

"PISA is a high-quality, well-developed instrument whose design and administration make it a good assessment for lots of different settings,” Leslie Rutkowski said. ”But it’s probably not great for any particular country.”

The design of PISA presents a problem, the researchers say. PISA isn’t focused on measuring mastery of a particular curriculum, but rather what a 15-year-old knows and how the student can use that knowledge. Because the testing is linked to age, not grade, the U.S. participants often range from Grade 9 to Grade 11. Nor are the tests linked to the Common Core or to similar state standards, such as Indiana’s.

"Given this aspect of the PISA design, there is no linkage between participating students and their teachers,” David Rutkowski said. “Such a disconnect likely limits the usefulness of PISA results for understanding the relationship between teaching and learning.”

The researchers also caution that PISA measures what the organization considers important for students of its 34 member countries to know in order to operate in a global economy and society. Although those sorts of skills are, of course, necessary, the test may not adequately capture important learning outcomes for U.S. students. The researchers question whether the OECD Test for Schools would be relevant to what U.S. schools actually emphasize.

"Given that the creation of large-scale assessments for the Common Core would charitably be described as complex and difficult, how effective with the OECD [Test for Schools] be in providing information about student progress relative to the Common Core?” David Rutkowski said.

There are better alternatives, they conclude. The National Assessment of Educational Progress is already administered across all states and focuses on specific grades and a common curriculum across an array of subjects. It would allow a school to compare its students with similar students in other states while avoiding other pitfalls of adapting PISA.

The Trends in International Mathematics and Science Study measures fourth- and eighth-grade math and science achievement every four years and includes a test for 12th-graders. It is also grade-based, and its measures are built around an internationally agreed-upon curriculum. The authors note, however, that it lacks an explicit link to workforce knowledge.

With state budgets shrinking, and with the 2013-14 PISA-based test costing participating U.S. schools roughly $8,000 to $11,500 each in fees paid to McGraw Hill, according to the education advocacy group America Achieves, educators might consider lower-cost alternatives. The researchers offer one option that uses existing resources: U.S. educators could assemble an assessment from items drawn from the National Assessment of Educational Progress, for comparisons at the national level, and from the Trends in International Mathematics and Science Study, for comparisons at the international level. Past questions are available from the National Center for Education Statistics. These free alternatives have some drawbacks, but they allow schools to shape tests to better measure the subject areas they are most interested in examining.