|Community scientists process dragonfly larvae samples at Squannacook Wild and Scenic River in northern Massachusetts in 2022. Photo: NPS/Jackie Dias (Public Domain).|
Feature: Expert Advice Helps Journalists Navigate Unfamiliar Scientific Seas
By Rick Weiss
Pop quiz: Let’s say a new study has found that areas of the world with intensive agricultural development and especially hotter temperatures due to global warming have far fewer songbirds than places with just agricultural development or higher temperatures alone.
Your editor wants a story about how the double whammy of rising temperatures and intensive agriculture together is taking a toll on songbird populations. What do you say?
Pop quiz No. 2: You’ve been wanting to write a story about the connection between adult exposure to wildfire smoke and increased odds of dementia later in life, but the studies you’ve seen have been small and inconclusive. Now a more powerful study is out — a meta-analysis that combines data from many other studies — and the correlation between smoke exposure and dementia is now statistically significant.
With that peer-reviewed evidence in hand, do you finally pitch this story to your editor?
These are the kinds of coverage decisions that can make or break your reputation as a reporter on the environment beat.
They are also representative of the kinds of potential pitfalls that my SciLine colleague, Tori Espensen, and I warn reporters about when we co-teach “Science Essentials for Local Reporters,” an hour-long “crash course” we offer to reporters — as we did for a cohort of early-career journalism fellows at the Society of Environmental Journalists’ annual meeting in Boise this year.
Tori — a neuroscientist now designing trainings for reporters and scientists — and I, a former longtime Washington Post science reporter, developed the crash course to help reporters who cover science-related topics but who don’t have deep backgrounds in science.
It’s one of several free services offered by SciLine, whose mission is to make it easier for reporters to include scientist sources and research-backed evidence skillfully in their stories. The Science Essentials webinar is offered via Zoom every quarter, and also offered remotely or in person to newsrooms and journalism organizations by request.
Causality or association?
So what about that study that found far fewer songbirds in places facing the double insult of intensive agriculture and steeply warming temperatures?
The thing to note is that such a study is merely “observational.” It looked at songbird numbers in various places and noted correlations with agricultural practices and temperatures.
It may make perfect sense that these two environmental stressors together would cause greater bird declines than either alone.
But by definition, observational studies cannot conclude that something “caused” something else. They can only point to associations.
In this case, maybe songbirds were already rare in places that later experienced agriculture or rising temperatures. All we have is a snapshot in time, and nothing was manipulated or held steady as a control, as would be the case in an experimental study — the kind of study that can make a case for causality.
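For readers who like to see this concretely, here is a small, purely hypothetical Python simulation (every variable and number is invented for illustration) in which a hidden confounder, habitat quality, drives both where farms end up and how many birds there are, while farming itself does nothing:

```python
# Purely hypothetical simulation: a hidden confounder can produce a
# correlation without any causal link. Here poor habitat both attracts
# intensive agriculture AND reduces songbirds; farming itself does nothing.
import random
import statistics

random.seed(1)
n = 5000
habitat = [random.random() for _ in range(n)]            # 0 = poor, 1 = rich
farmed = [h < 0.4 for h in habitat]                      # farms end up on poor land
birds = [100 * h + random.gauss(0, 5) for h in habitat]  # birds track habitat only

farm_birds = statistics.mean(b for f, b in zip(farmed, birds) if f)
wild_birds = statistics.mean(b for f, b in zip(farmed, birds) if not f)
print(f"Mean songbirds on farmed land: {farm_birds:.0f}")
print(f"Mean songbirds elsewhere:      {wild_birds:.0f}")
```

An observational snapshot of this simulated world shows far fewer birds on farmed land even though farming has no effect at all; only an experiment, or a carefully designed causal study, could tell the two stories apart.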
So sure, you can write the story. But don’t make the mistake of claiming the study shows that agriculture and high temperatures are “causing” songbird declines.
Consider saying the new evidence “is consistent with the idea” or “lends support to growing concerns” that these stressors are combining to take a toll on songbirds.
The second pop quiz has a different problem — what scientists call the “file drawer problem.”
Many scientific studies, it turns out, never get published, because the results did not come out as anticipated or were unclear or uninteresting.
So while meta-analyses can indeed be very powerful studies — they combine data from multiple other studies to gain statistical power and reduce uncertainty — they also have a key weakness: They draw only upon studies that were actually published.
That means all those studies that found no correlation and never got published have been left out of the mix, making meta-analyses susceptible to overstating some findings.
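A quick, purely hypothetical simulation (invented numbers and a deliberately crude publication rule) shows why this matters: even when the true effect is zero, averaging only the “published” studies suggests a substantial effect.

```python
# Hypothetical simulation of the file drawer problem: if only studies
# with large observed effects get published, pooling the published
# results overstates the true effect.
import random
import statistics

random.seed(42)
TRUE_EFFECT = 0.0  # assume there is no real effect at all

# Each simulated study observes the true effect plus sampling noise
studies = [random.gauss(TRUE_EFFECT, 1.0) for _ in range(1000)]

# Crude publication filter: only clearly "positive" results see print
published = [s for s in studies if s > 1.0]

print(f"Mean effect, all studies:       {statistics.mean(studies):+.2f}")
print(f"Mean effect, published studies: {statistics.mean(published):+.2f}")
```

A meta-analysis that can see only the second number would report a strong effect where, by construction, none exists.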
As with the first example, it doesn’t mean these studies can’t be trusted or written about. But it does mean you should probably mention that these kinds of studies can at times offer an exaggerated sense of the strength of their results.
It may also be worth mentioning that just because a result is “statistically significant” does not mean it is necessarily of practical significance.
As an example: If, month after month, frogs exposed to a certain pollutant consistently and without exception catch two fewer insects than unexposed frogs, that difference may well be statistically significant precisely because it is so unvarying; it is almost certainly not due to chance but to the pollutant. Yet it may still have no practical or functional significance if it amounts to just a couple of insects out of the many hundreds consumed each month.
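The frog example can be made concrete in a few lines of Python (the monthly catch numbers below are invented for illustration): the gap never varies, so a statistical test would flag it, yet it is a tiny fraction of the monthly diet.

```python
# Hypothetical illustration: a perfectly consistent but tiny difference.
# The gap never varies, so a statistical test would call it significant,
# yet it is a negligible share of what the frogs actually eat.
import statistics

# Invented monthly insect catches for unexposed frogs over 12 months
control = [800, 805, 798, 802, 801, 799, 803, 800, 797, 804, 801, 800]
exposed = [c - 2 for c in control]  # exposed frogs catch exactly 2 fewer

diff = statistics.mean(control) - statistics.mean(exposed)
relative = diff / statistics.mean(control)

print(f"Mean difference:  {diff:.1f} insects per month")  # prints 2.0
print(f"Relative effect:  {relative:.2%} of the diet")    # prints 0.25%
```

Statistical significance answers “is this difference real?”; practical significance asks “does a quarter of a percent of the diet actually matter?” — and the second question is the one readers care about.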
Ready to help
Besides getting into different kinds of studies and what can and cannot be deduced from them, our crash course covers such topics as how to find and vet scientist sources and how to avoid classic errors when reporting on percentages.
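One widely cited example of such a percentage error (our illustration here, with invented numbers, not necessarily the one featured in the course) is conflating a percentage-point change with a percent change:

```python
# Invented illustration: percent change vs. percentage-point change.
# If a risk rises from 4% to 6%, that is a 2 percentage-point increase
# but a 50% relative increase; conflating the two is a classic error.
old_rate = 0.04
new_rate = 0.06

point_change = (new_rate - old_rate) * 100                 # percentage points
relative_change = (new_rate - old_rate) / old_rate * 100   # percent change

print(f"{point_change:.0f} percentage points")      # prints 2
print(f"{relative_change:.0f}% relative increase")  # prints 50
```

A headline reading “risk jumps 50%” and one reading “risk rises 2 points” can describe the same finding, which is exactly why the framing deserves a second look before publication.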
We also take a quick walk through a scientific paper — not because we think a general assignment reporter should be expected to quickly grasp all the details within those jargon-filled reports, but because the most important thing a reporter really needs to extract from a scientific paper is a handful of smart, on-the-mark questions to ask an expert.
And finally, if like any good journalist you are asking yourself: “All this and a bunch of other helpful services, all for free? What’s the catch? Who are you anyway?” … we would expect nothing less from skeptical reporters!
We are a dozen people — about half of us former journalists and the other half PhD- and master’s-level scientists. We’re based at the nonprofit American Association for the Advancement of Science and are completely funded by philanthropies, so everything we do is free.
Hundreds of fellow journalists and editors have participated in our trainings (“succinct and full of information,” “the advice given today will really help,” “really good, focused, practical application, fine speakers, perf!”) and we stand ready to help you, your newsroom or news collaborative.
We so appreciate the work that environmental journalists and others are doing to inform news consumers about the growing number of pressing air, water, land-use and climate-related issues, and we look forward to helping you achieve that shared goal.
Rick Weiss is director of SciLine, a philanthropically funded free service for reporters and scientists, based at the nonprofit American Association for the Advancement of Science. During his 15 years as a science reporter at The Washington Post, he wrote more than 1,000 news and feature articles about the economic, societal and ethical implications of advances in science and technology. In addition to Crash Courses (including a version for editors), SciLine’s services include Expert Matching and Experts on Camera. SciLine also regularly hosts Zoom-based media briefings with live Q&A. Sign up here to be kept apprised of upcoming offerings and events, and follow SciLine on Twitter @RealSciLine.
* From the weekly news magazine SEJournal Online, Vol. 8, No. 28. Content from each new issue of SEJournal Online is available to the public via the SEJournal Online main page. Subscribe to the e-newsletter here. And see past issues of the SEJournal archived here.