Study: Students from high schools with improving ISTEP scores perform no better on ACT exams
A study published this week in the Journal of Research in Science Teaching, covering thousands of Indiana high school seniors from three graduating classes, finds that students at schools showing consistent improvement on the Indiana Statewide Testing for Educational Progress (ISTEP) exam performed no better on the ACT science and math college entrance exams than classmates from declining schools.
"The Consequences of 'School Improvement': Examining the Association Between Two Standardized Assessments Measuring School Improvement and Student Science Achievement" is the work of Adam Maltese, assistant professor of science education in the Indiana University School of Education, and Craig Hochbein, assistant professor of leadership, foundations and human resource education in the College of Education and Human Development at the University of Louisville.
Maltese and Hochbein studied between 3,000 and 5,000 Indiana 12th-graders for each of the years 2008, 2009 and 2010 from 87 to 114 schools in each year. They examined how the students from schools classified as improving on ISTEP English and math scores (scoring higher continuously through the three years) performed on the science section of the ACT.
"We see that students from those schools that are improving based on Indiana tests have generally lower scores than students from Maltese
those declining schools on the science and math portions of the ACT," Maltese said. While in most years the data indicated improving school students performed at least no better on the ACT, the study found that in 2010, students from schools identified as improving in English scored nearly half a point lower than peers from declining schools on both the ACT science and math exams.
"When we saw these declines in math and science, we sat back and said, 'well, you'd think if a school has improved over a couple years in English, those kids' ACT scores in English would be better than the kids from the declining schools," Maltese said. "We don't see that. We don't see it in math, either."
"We don't want to take anything away from teachers working to improve student literacy and numeracy," Hochbein said. "The idea is deciphering what these test scores actually mean for students, parents, teachers and principals. With a heightened sense of accountability, we want to be able to address what these results actually mean in the grand scheme of learning, knowledge and global competitiveness."
Last week, the state of Indiana reported across-the-board gains in ISTEP scores for 2012. Among all students, 71 percent passed the English and math portions, an increase over last year and the highest mark yet for Indiana students.
The Maltese and Hochbein study sought to determine whether science education, a subject that is tested but usually not part of school accountability metrics, might suffer from a school's emphasis on English and math.
"If the school is performing poorly on these tests by which they're evaluated, then they're going to want to move assets around to best position themselves to do well," Maltese said. "If they're shifting those resources to English and math, they're probably pulling them away from something else. We can't definitively say that's what these results indicate, but it gives us a hint at the high school level that there may be some 'narrowing' of the curriculum."
Further analysis of the data found that students from improving ISTEP schools not only failed to perform better in science, but also did not outperform their peers in the other subjects.
"Not that the ISTEP and the ACT line up perfectly, but there is a decent amount of overlap in the material, especially in math and English," Maltese said. "That is one of our concerns, that these gains on state exams aren't playing out in these other standardized tests that are widely used, at least for this group of students who are likely college-bound."
The authors conclude that the discrepancy between ISTEP improvement and ACT improvement may be an indication that more should be done to make sure all students are making gains in skills to be ready for college and the workforce.
"I think we need to have more honest conversations about what the numbers say, what they mean," Hochbein said. "Because just improving your test scores doesn't necessarily mean that the school is doing all it should or all it could to excel at the highest level."
One of the recommendations they make is exposing students to a more integrated curriculum. "Science teachers care about critical reading of content," Maltese said. "Why can't those teachers work together?"
While Maltese said such collaboration certainly does happen in some schools, the Indiana ISTEP and ACT results indicate a need for a more formal effort. "This should be a goal for districts, to allow teachers time across disciplines to do this integration so that a student's not just hearing about constructing an argument in English class, but they're hearing about how that argument's constructed in the science literature and history literature," he said. "The same can be done with numeracy skills."
They also recommend that state policymakers begin to more seriously consider the potential negative consequences of emphasizing certain subjects in standardized tests.
While this research focused on Indiana, the authors said the situation is likely similar across the country.
"There's no reason to think that Indiana is dramatically different than other states," Hochbein said. "There will be some differences because Indiana students elect to take these tests, whereas in Kentucky, all juniors must take the ACT. But there's no reason to think that these results don't have some association with some other states."
The full study is available at the Wiley Online Library.