“It was recently discovered that research causes cancer in rats.”  —  Unknown

“To steal ideas from one person is plagiarism; to steal from many is research” — attributed to comedian Steven Wright, a familiar sentiment that seemingly goes back to at least 1820.

Let’s first get the “anecdotal” conclusion out of the way.

Journalists publish many stories based on bad data, which in turn is often based on poor methodology. Editors of academic refereed journals sometimes do the same. So do research outfits, nonprofits, and businesses. Who have I left out?

The grander question is this: if research is so poor, where does knowledge come from?1 Plenty of treatises are available for those who wish to delve into this philosophy of knowledge. My aim here is simpler: to offer a few practical methods that can help journalists sift the useful information from the, more likely, useless.

Part II – also posted in this issue – will address six basic research components to review, whether the research comes from outside organizations or from your own company or publication.

In his eBook, Ethics in Science Journalism2, George Claassen, a University of Stellenbosch lecturer in science and technology journalism and media ethics, listed numerous ways journalists misinform readers:

10 deadly sins of unethical science reporting

  1. Accentuating the positive and ignoring the negative
  2. Generalizing from anecdotes
  3. Ignoring the holistic picture and failing to recognize the conclusions and weaknesses of scientific studies
  4. Wrong or insufficient interpretation of numbers
  5. Ignoring conflicts of interest
  6. Confusing an intermediate outcome with a health outcome
  7. Offering misleading or harmful tips
  8. Not asking for the evidence
  9. Ignoring the certainty of uncertainty in science
  10. Ignoring the scientific ways and method

The result? Businesses, consumers, and nonprofits are making decisions based on faulty or incomplete research. Some of these decisions may even affect your life. Witness drug recalls.

The academicians can worry about their publications. We business-to-business editors and writers must ask why we publish articles about data from poor research.3 Some of the reasons include:

  • The rush of approaching deadlines.
  • Time constraints on overworked staffs.
  • Fear that competitors will publish first.
  • The need to fill space.
  • Laziness.
  • The assumption that the researcher must have already vetted and fact-checked the data.
  • Lack of knowledge about research methodologies.

The last two points are the ones that directly pertain to this article.

We B2B editors and writers must vet or fact-check all research before we publish it. The ethics guide of the American Society of Business Publication Editors (ASBPE)4 calls for this practice through examination of the research methodology, which typically concerns itself with statistical measures:

“In any editorial content, a clear and complete discussion of the methodology, including methodological and analytical limitations, should be published to allow the reader to make informed judgments about the value of the content.”

“Information graphics should include an explanation of research methodology and give the source.”

The ASBPE directive echoes the advice of Victor Cohn, a former Washington Post medical writer and editor who authored one of the standard references for reporters, News & Numbers — A Guide to Reporting Statistical Claims and Controversies in Health and Other Fields.5 He said:

“Scientists who do poor studies or who overstate their results deserve part of the blame. But bad science is no excuse for bad journalism. [boldface mine] We reporters tend to rely most on ‘authorities’ who are either most colorfully quotable or quickly quotable, and these authorities often tend to be those who get most carried away or who have the biggest axes to grind…. Without being cynical and believing nothing — an automatic disqualification for any journalist — a reporter should be equally skeptical and greet every claim by saying ‘show me.’ ”

The imperative is not just for science writing, however. It is true for all research that we might publish, especially for research by companies whose agenda is to promote their product or service. And yes, it is true for the research that your own publication might conduct.

References

1) Knowledge From What?, by Derek L. Phillips, 1971 (Rand McNally & Company)
2) Ethics in Science Journalism, by George Claassen, University of Stellenbosch, www.pdfio.net/k-61498623.html
3) News & Numbers — A Writer’s Guide to Statistics, 3rd edition, by Victor Cohn and Lewis Cope, 2012 (Wiley-Blackwell).
4) B2B Journalist Ethics: An ASBPE Guide to Best Practices, http://www.asbpe.org/guide-to-preferred-editorial-practices/
5) News & Numbers — A Guide to Reporting Statistical Claims and Controversies in Health and Other Fields, by Victor Cohn, 1988 (Iowa State Press).

(Editor’s note: This article is the first in a series. Read Part II here.)

About the author: Robin Sherman is a consultant specializing in editorial development and publication design as well as a freelance editor and layout artist in the business-to-business and nonprofit publishing markets. He is a long-time member of the ethics and research committees of the American Society of Business Publication Editors, and is a former corporate director of editorial development for a large business-to-business publisher. http://www.linkedin.com/in/robinshermaneditdesign