

Experts offering scientific advice will serve policymakers best by communicating uncertainties and by breaking with the current practice of giving overly simplified messages about risk, says Andy Stirling.

Scientists called on to offer policy guidance tend to provide definitive answers: a single interpretation of the available evidence, which is considered best practice. The uncertainties end up hidden away in quantified measures of risk, leaving no room for different interpretations.

This does not go far enough to ensure a complete understanding of scientific information, says Stirling, and leaves science advice open to political manipulation by special-interest groups.

There is a need to move from a narrow focus on risk to broader and deeper understanding of incomplete knowledge, he argues. Accepting that knowledge is complex by nature promises to deliver guidance that is more rigorous and robust, leaving policymakers accountable for their decisions.

The scientific community should also learn from past mistakes, adds Stirling. Early warnings about risks to the ozone layer posed by the release of halogenated hydrocarbons came from scientists opposed to the mainstream view. And most experts initially discounted the idea that spongiform encephalopathies — diseases such as mad cow disease — could spread in ways not seen before.

Stirling backs thorough and explicit assessments of incomplete knowledge that focus on neglected uncertainties, tackle disagreements that stem from ambiguity, and resist 'one-track' views of progress that can stifle innovation.

Experts should adopt methods that highlight alternative interpretations, and document the regulatory questions to be answered as well as the assumptions, value judgments or intentions that accompany each of them. The few examples of where this approach has been used suggest that it can work, says Stirling, creating a more open process for making policy decisions.

Link to full article in Nature