One would hope that journalists would interrogate the official narrative and bring in outside knowledge. But that’s not what happened. 
By Alexander Russo

On Monday, new NAEP scores came out — for the first time in three years — and they were just about as awful as everyone expected. 

Math scores fell to the lowest they’ve been in nearly 20 years. Reading scores fell, too — to 1992 levels. Nearly four in 10 eighth graders failed to grasp basic math concepts. Not a single state saw a notable improvement in average test scores. And the 26 big-city districts that participated in the assessment didn’t fare much better.

It was a bloodbath, pretty much everywhere, confirming the many challenges that schools and students face in the years ahead. 

But what does the coverage tell us about the state of K-12 education news? 

Based on the stories I’ve read, it tells us that education journalism is alive and well, capable of doing clear and careful work — and that education journalists still believe that test score results are important and useful indicators.

However, it also highlights several long-established problems, among them an over-reliance on official narratives and a failure to bring in broader knowledge that would help inform readers.

The overall result is careful, cautious coverage that conforms to expert and official views but doesn’t make obvious connections. 

As one might hope, the report was covered by most of the big national outlets like NPR, US News & World Report, the New York Times, Wall Street Journal, Washington Post, USA Today, and AP, along with scads of trades. The PBS NewsHour ran a segment on the results. There was even a Jake Tapper mention during a Jeb Bush segment on CNN. 

Local and regional coverage made a strong showing as well. The Dallas Morning News version of the story describes the results as “a sobering look at pandemic recovery.” AL.com focused on the news that the state’s schools aren’t ranked last in the nation as they have been in the past, and why. Chalkbeat Tennessee noted that Memphis was the only one of the 26 big-city school districts in the nation to record top-five declines on every NAEP test.

There were some creative elements to the coverage, too. Chalkbeat came up with some vivid graphics showing learning gaps among various demographic groups. USA Today allowed readers to look up their state. I very much enjoyed Axios’s NAEP-inspired GIF, a calculator with all the arrows flashing down. (It’s cringe to some of you, I’m sure, but I still like it.) 

Above: A nice GIF from Axios

And the coverage wasn’t all doom and gloom, either. The Wall Street Journal ran a story, “How Los Angeles Avoided National Academic Plunge During Pandemic,” crediting connectivity and teacher training for the district’s relative success. Chalkbeat Chicago noted that Chicago’s English language learners now outperform peers in other urban districts and nationally, and that Latino 8th grade reading scores ticked up. Wyoming — yes, Wyoming! — overtook Massachusetts for the top spot in 4th grade math. 

The coverage was generally careful and clear, helping readers understand what the results actually mean. And, given the amount of coverage of the report, it’s apparent that education journalists and newsrooms still take high-quality test results seriously.  

However, several aspects of the coverage are less confidence-inspiring.

For example, some of the most interesting storylines from the NAEP results — the relative success of Catholic schools or districts like Los Angeles that fared better than others, or the devastating results for disabled children — didn’t get nearly as much attention as they deserved. 

We don’t know much of anything about whether outside groups were involved behind the scenes in developing or approving the NAEP messaging, which downplays the relationship between the results and policy responses to the pandemic. 

And to some observers — I’m not among them — the coverage suggests that education news is overly obsessed with test results.  

But these are all relatively minor concerns.

My biggest concern is that, taken together, there is an extreme carefulness to the NAEP coverage — a seeming reluctance to say too much or draw outside the lines in any way, especially when it comes to linking results to specific pandemic policies. 

“States that returned to in-person learning relatively quickly, such as Arizona, saw declines along with those that stayed remote longer, such as California,” notes a Wall Street Journal writeup that’s pretty typical. “Experts are divided on the degree to which policies such as remote learning affected student achievement.” 

Looking just at NAEP scores, the cautiousness is understandable. Texas’ math scores were about the same as other states that stayed closed longer. California’s declines were slightly less than the national average in several categories, comparable to Florida. 

This carefulness is also likely fueled by the exhortations of journalists and education experts not to make too much of the test score-remote learning relationship, and to avoid conflating causal effects with correlations.  

The cautious approach to interpreting the results hews very closely to the US Department of Education’s position, which is that there is “no straight line” between the test results and the amount or type of disruption different students experienced in different places.

But there’s something curious about this assertion. Nobody seems to mind making over-arching correlations between the pandemic and test score declines. And there are other results and research that support the connection between remote instruction and learning loss.

Several mainstream news organizations “took pains to say that the latest NAEP study offered only murky evidence that school closures were the biggest culprit” despite other studies that support the connection, notes The Atlantic’s Derek Thompson.

One would hope that journalists would interrogate the official narrative and bring in outside knowledge directly relevant to the question at hand. But that’s not what’s happening except in a few instances such as the Associated Press story.*

As a result, the debate rages interminably on Twitter, but not in the education stories that most readers see.  It’s settled knowledge in The Atlantic but not on many education pages. 

Perhaps it’s unfair to critique media coverage for what it doesn’t include. 

I certainly remember that argument being made in response to the New York Times’ review of Anya Kamenetz’s recent book, which dinged the book for its gymnastic attempts to avoid assigning responsibility for specific actions. 

But figuring out why scores are down so dramatically in so many places is a core part of the NAEP results story.

And from where I sit, reporters would have done well to weave in opposing voices and previous studies that support the connection between school shutdowns and student learning. 

We need experienced, fearless journalists covering education, a lot more willingness to ask hard questions, to write what they know — not just what they’re told — and to break away from the pack.

*The story has been updated to credit the Associated Press.

ABOUT THE AUTHOR

Alexander Russo

Alexander Russo is founder and editor of The Grade, an award-winning effort to help improve media coverage of education issues. He’s also a Spencer Education Journalism Fellowship winner and a book author. You can reach him at @alexanderrusso.

Visit their website at: https://the-grade.org/
