
Scientific expertise is increasingly prominent in international policymaking but little effort is made to judge how effective this input is, according to a report from the Organisation for Economic Co-operation and Development (OECD).

Simply providing information is not enough — advisory groups and researchers must evaluate and monitor the impacts of advice to improve their credibility and effectiveness, it says.

The authors of the report, published last month (20 April), tell SciDev.Net that their call to action is a response to what they see as a lack of accountability among scientists.

Science advisory bodies “usually consider that their role ends when the advice is provided, do not comment on policy decisions and only intervene and communicate when the advice is misinterpreted”, the report says.

Carthage Smith, the OECD’s lead analyst for the report, says advisory bodies must become more self-critical and routinely evaluate their own effectiveness.

It’s not difficult, he says, as the methods needed are in any standard impact-assessment toolkit. Reflection should begin by examining the quality of the evidence being provided. But it is equally important to judge how the information is taken up by stakeholders and whether it helped to achieve any policy goals in the long term, Smith adds.

In the report, a German research body, the Commission of Experts for Research and Innovation, is held up as a shining example of what can be achieved. Despite lacking any role in implementing its advice, it still routinely conducts follow-up surveys and tracks the political uptake of its work, the report finds.

Unfortunately, this type of introspection is rare, it says. Scientific advice given on high-profile issues — like the Ebola crisis or the Fukushima nuclear disaster — does get dissected. But more routine consultations usually end the moment the report is handed over. At this point, ad hoc groups formed to answer specific policy questions disband, leaving no one accountable for the information provided.

Smith believes this is particularly prevalent in developing countries. Without stable institutions with a defined and long-term advisory remit, there is rarely the capacity to look beyond the report itself.

Furthermore, the majority of scientific assessments relating to developing countries are conducted by a handful of institutions in richer nations, says Frank Biermann from VU University Amsterdam, the Netherlands. The political scientist says decision-makers are less likely to take assessments on board if they come from external sources — especially those on social and political issues.

But James Wilsdon, professor of science and democracy at the University of Sussex, United Kingdom, says the development sector offers a tried and tested template for conducting the kind of real-time project assessments that science advisors sorely lack.

One central pillar of this approach is moving out of the laboratories and meeting rooms to discuss impacts with the people on the ground who are affected most. “If development studies have taught us one thing, it’s that you need to get out there and get your hands dirty,” he says.