Biennial National Tests Aren't Worth It

Tennessee Gov. Bill Haslam compliments a group of 6th graders at John P. Freeman Optional School in Memphis, Tenn. Thursday, Nov. 7, 2013, after the State of Tennessee scored high on National Assessment of Educational Progress Tests, marking the best educational gains in any state. The students were at work learning about electrical current. (AP Photo/The Commercial Appeal, Kyle Kurlick)

RCEd Commentary

What new information do biennial test scores from American students really tell us? Not that much.

The math and reading results of the 2015 National Assessment of Educational Progress for 12th grade students were released Wednesday. Math scores were down slightly from 2013, while reading scores remained flat. The response has been similar to what happened last October, when NAEP released the 2015 results for 4th and 8th graders, whose performance mostly declined.

But comparing these scores every two years doesn’t provide much insight. As an example, the graph below shows NAEP math and reading results over the last 12 years for America’s 4th graders. As you can see, the year-to-year change is so slight that it’s often not statistically significant; in some cases, the scores are exactly the same as in the prior year.

If we didn’t have the data above from 2005, 2009, and 2013 – meaning these NAEP tests were only administered every four years – would we really be missing out on much? As the graph below shows, we’d still have the same trend lines and the same idea of how math and reading performance looked over the past decade. The fact is, taking these tests every two years just isn’t very productive.

Not only that, but administering the math and reading NAEP assessments so frequently distorts the education policy conversation. In 2013, when NAEP scores ticked up slightly, many – including former Secretary of Education Arne Duncan – were quick to imply that successful Common Core implementation deserved the credit. Predictably, when scores ticked down last fall, people linked the bad news to Common Core too.

Both conclusions are misguided. As Steven Glazerman, a Senior Fellow at Mathematica Policy Research and coiner of the term “misNAEPery,” wrote in 2013, “Beware of arguments that use NAEP to defend or attack policies . . . The reality is that NAEP is not meant for this purpose. You will not find typical peer-reviewed research drawing such conclusions from NAEP data, because it's a fairly well known error that's been widely discredited.”

The frequency of the math and reading assessments also limits what else NAEP can do with its finite resources. Unlike statewide annual testing, which is used to measure how districts and states are performing for accountability purposes, the goal of NAEP assessments is to inform policymakers and the general public on various educational topics. But from 2017 to 2021, 13 national NAEP tests will be administered – six of which are the biennial math and reading assessments to which people pay so much attention. This means that the National Assessment Governing Board – which sets the NAEP assessment schedule – has to cut corners on other tests.

For example, the next NAEP Long-Term Trend Assessment – last administered in 2012 – won’t be taken by students again until 2024. It was initially scheduled for 2020, but was pushed back four years due to budget constraints. The assessment is called a “long-term trend” for a reason: it provides the longest-running data we have, going back to the 1970s. That’s why the National Center for Education Statistics – which administers all NAEP tests – describes these tests as “the most reliable instruments of change over time.”

NAEP’s focus on math and reading also pushes other subject areas to the margins, a phenomenon known as “curriculum narrowing.” One example is civics education. Twelfth graders – Americans on the verge of voting – won’t be tested by NAEP in civics again until 2022. When they were last tested in 2010, their results were pretty terrible. But can we expect this issue to get much attention when it’s only assessed every 12 years?

The answer is no. If NAEP wants to live up to its claim of being “the nation’s report card,” it can’t limit certain assessments to being administered once a decade. If NAEP tested students in math and reading every four years – instead of every two – we would still have largely the same data and the same level of understanding. More importantly, it would also enable NAEP to provide a better picture of America’s students with regular assessments in a wider array of subjects. As our world becomes more connected and our economy becomes more complex, we need to ensure that students are developing 21st century skills that go beyond just math and reading.

 
