Political journalists have been under pressure to bolster their reporting with poll results for quite some time now. Reliance on telling anecdotes and insider speculation seems increasingly thin in the age of Nate Silver and FiveThirtyEight (among others).

There’s been relatively little ongoing pressure on education journalists, however — and little training and encouragement from inside the field. The result, all too often, is that everyday education reporting doesn’t include available public opinion research.

So, for example, an education reporter might have covered the just-released August Gallup report. But, based on past experience, this kind of information is unlikely to make it into many 2016-2017 stories over the next two months — even when those stories cover topics that have been polled.

“The day any poll comes out there are a lot of stories written about it,” said the Center for Public Integrity’s Ben Wieder in a recent phone interview. “I think it is pretty rare for people to use it beyond that.”

Stories that lack available polling information put readers (and reporters) in something of a vacuum, focused narrowly on the latest events, most colorful opinions, and speculation about the future.

Perhaps the most obvious example of this disconnect from the past year was a Spring 2016 New York Times piece on attitudes toward testing, which focused on racial differences but didn’t include available PDK data broken out by race.

This kind of omission is unfortunate but so common that it usually goes unremarked. Education journalism is generally thought to make inadequate use of research, and limited use of education polling fits the pattern. The issue came up in this space last year and seems no less pressing 12 months later.

“The bulk of the coverage of our poll comes in the first couple of weeks after its release,” noted Education Next’s Martin West in a recent email. “The incorporation of poll results into coverage of ongoing issues is a bit off.” Education Next, a policy journal, recently released its 10th annual large national poll of public opinion on education.

There were some exceptions in 2015-2016, according to West, including this opt-out story from The Seventy Four and a July Washington Examiner story about the DNC email leak and Common Core.

But there were lots of missed opportunities for reporters to do more, according to West, particularly in coverage of the Supreme Court agency fee case known as Friedrichs and the teacher tenure lawsuit known as Vergara.

Coverage of the poll’s release is generally strong, according to Joshua Starr, head of PDK International, which released its long-running education poll last week. There are always those who wish that more or different questions had been asked, he said in a recent phone interview. (This year, for example, questions about school integration were cut due to time constraints.) After the initial release, however, there’s a big dropoff.

One reason education reporters make relatively little use of poll results is that, despite recent increases in polling, there’s still relatively little research on public opinion about education, noted Wieder, a former Stateline education reporter. “My expectation was that there would be more polling data,” said Wieder. But there wasn’t, forcing him to rely on the Gallup/PDK results. “It’s hard to find that much data out there.” With the exception of the occasional top-tier issue like Common Core, education questions aren’t often included in general-interest national polls.

In general, poll results are best used in the context of others — previous results from the same question, or results from other polls covering the same topic at roughly the same time. Most observers agree that comparing poll results and tracking them over time is best. But this is not always possible with education poll results, which forces reporters either to vet one-time results — which can be misleading in any number of ways — or to leave poll data out of their stories.

Some polls appear to contradict each other, or are funded or conducted by organizations with an ideological bent, making reporters skittish about using the information and getting burned. If a question hasn’t been asked before, it’s hard to interpret the initial results until additional data comes in. And because polls ask questions differently, their results aren’t necessarily comparable.

There’s also the issue that advocates push back hard against poll results they disagree with, on technical or other grounds, as happened in April 2015 in response to a USC Dornsife/Los Angeles Times poll on teachers. That pushback creates pressures that can be uncomfortable for education reporters.

National data may not seem relevant to local reporters, notes West. Reporters may assume that public opinion is more volatile than it actually is, and that months-old results are therefore outdated. It can also be hard for reporters to figure out which findings are the most important, resulting in a cacophony of headlines.

Simple convenience may also play a big role. For rushed reporters, there’s no central place to look or person to talk to about education polling. In his EWA piece, Wieder recommended long-running efforts like the PDK/Gallup poll, which goes back several decades, and a site called pollingreport.com, “a free archive of publicly released polls.” There’s also the Roper Center for Public Opinion Research, which is “expensive, but more comprehensive.”

Looking for quick help, reporters often reach out to USC’s Morgan Polikoff, who is active on this issue. He has called for the creation of a central repository of education polling data, so that researchers and reporters can better compare and make use of the available information. (Starr has also talked about the need for a central repository.)

“These results could be really useful in a lot of stories through the year,” Polikoff said in a recent phone interview. “We’re asking about all the major education policy issues.”

EdNext has tried to make it easier to track which questions have been asked on which topics with this interactive infographic. “It is a lot to ask of reporters to expect them to remember which poll has asked what and when,” notes West. (Education Next will also hold a DC forum on Sept. 16 about its latest poll results.) But the EdNext graphic isn’t particularly easy to use or widely publicized — and it’s just one source of information.

Concerns about the PDK and EdNext polls may be exaggerated, however. To be sure, the polls “may be constructed from a particular perspective in terms of the topics they choose and the ways that they ask about them,” said Polikoff. Still, “both polls have always been thoughtfully done and well-constructed.”

There have been some steps taken to support careful use of polling in education reporting. The Education Writers Association published a piece a couple of years ago, Using Polls in Education Reporting. And a writeup of a 2013 EWA conference panel, penned by EdWeek contributor Mark Walsh (Tapping Public Opinion Polls to Strengthen Stories), encouraged reporters to reach out to polling organizations. Still, it seems like a lot more could be done.

Some of the responsibility lies at the feet of the polling publishers, note Starr and others. They push hard to get attention for their annual polls and to make each poll better than the previous one, but they neglect to remind reporters about the data year-round, according to Starr. “We focus our energies on the release and the event, it may fall off the radar screen of reporters,” said Starr in a recent email. “We’re discussing how to keep it relevant and present throughout the year.”

Some journalists, like the Washington Post’s Jay Mathews, have decided that poll questions asking parents to grade “schools they have not picked and don’t know much about” produce dumb results — in part because journalists like him usually write about troubled schools and leave a biased negative impression with the public [another problem, about which more some other time]. Because parents are likely uninformed about schools outside their direct experience, the reasoning goes, their views are deemed unimportant.

The latest PDK poll result showing widespread opposition to school closings may be a prime example of this, according to Polikoff. It’s an interesting, eye-catching finding on a controversial topic, one more likely than others to come up in followup reporting. But it’s a single data point, and those who responded may not know much about what school closings actually look like.

Or, notes Mathews, the problem isn’t with the poll questions but with how they’re covered — credulously, or without context. Results that aren’t new are treated as if they differ from previous results. Results that do differ from past results (or other polls) are presented without additional information.

Mathews has a point, at least when it comes to how poll results are described. There’s the ever-present danger that reporters (and readers) respond to poll results too strongly. “People have a tendency to assume that numbers are facts,” noted Wieder, and to forget that polls differ in terms of methods, questions, sampling, weighting, and reporting.

But ignoring poll results all year isn’t the answer, either. Without polling data, advocates and educators never have to confront public opinion that may contradict their experiences and positions, and readers are understandably confused — if they even know what folks are talking about, that is.

Education advocates and reporters working in the trenches can easily forget what everyday people think about the issues. Oftentimes, Wieder says, non-educators and even some parents don’t really know what reporters are talking about.

Related posts:
NYT Race & Testing Piece… Ignores Polling Data From Parents Of Color
PDK Poll Coverage Reveals Hunger For Reliable Information, Need For More Regular Use Of Polling Data
Morgan Polikoff: Education Researcher, Expert Source, Media Critic
A Nagging Disconnect Between Vivid Anecdotes & Underlying Data

ABOUT THE AUTHOR

Alexander Russo

Alexander Russo is founder and editor of The Grade, an award-winning effort to help improve media coverage of education issues. He’s also a Spencer Education Journalism Fellowship winner and a book author. You can reach him at @alexanderrusso.

Visit his website at: https://the-grade.org/