Counting publications is only one way to evaluate science, and not necessarily the best.
Traditionally, science is evaluated according to researchers' outputs — the number of articles published in science journals or patents registered, for example.
But the real value is how those products benefit society or the environment. The speed with which new knowledge reaches citizens matters too.
Evaluating the impact of science may sound simple, but it is not. As experience has shown in Latin America, for example, even scientists don't agree on how to do it, and countries measure output in different ways.
What should be measured?
Colombia evaluates its science community using information about the new knowledge produced in a given year — the number of publications, books or software, for example. This web-based system was developed in Brazil.
Scientists in both countries agree that this approach provides useful insight into national trends, and also that it has weaknesses. Yet they have different ideas about how to improve it.
Brazilian researchers have suggested (see Brazil scientists criticise evaluation criteria, in Portuguese) that publishing in national journals and presenting work at conferences should count for more than they do at present.
But in Colombia (see Colombia: controversy over evaluation of science groups, in Spanish), scientists feel an article published in a national journal should not carry as much weight as one published in a well-respected international publication such as Science or Nature.
Some researchers in Mexico have also objected to how their Sistema Nacional de Investigadores (National Research System) measures the scientific output of individuals. They say patents should be judged not only on the number registered but also on how they are used to boost the economy. That, they point out, would make Mexican scientists more focused on protecting the rights to the knowledge they produce.
The criteria used to evaluate science become even more problematic as countries in Latin America shift from seeing science and technology as an end in itself, to seeing it as the way to achieve the innovation necessary for a knowledge-based economy and society.
Innovation cannot be measured simply by counting articles in science journals. Nor is the number of patents registered the only indicator of innovation. And, even if it were, does a patent carry the same weight as a published article? Are they even comparable?
To take another example: if a research group seeking solutions to an industrial problem signs a million-dollar contract with a company, is that an indicator of success? And, if such a group comes up with an idea that leads to a profitable spin-off, how should its impact be measured?
As we enter what some are calling the century of innovation, our evaluation systems need to change. They need criteria that reflect the objectives of S&T and innovation: to generate products for national markets and economies, to create practical solutions to social challenges, and to add value to existing knowledge.
And they must provide incentives for a country's own researchers to be innovative. In most developing countries, fewer patent applications come from national researchers than from international companies.
Evaluation systems for science and technology, and now for innovation, are themselves constantly being evaluated in Latin America.
One criticism of our current evaluation systems is that, because the criteria were largely designed by academics, they show little understanding of the productive sector and overlook the value of work produced by groups not dedicated to basic research.
Evaluation criteria should be flexible enough to differ from one discipline to another — as already happens in Mexico.
And what about publications produced for citizens rather than the scientific community — for those who participated in research as patients or survey respondents, for example, or community members affected by a contaminated river? In general, such 'social products' are not valued highly, and scientists do not make much effort to produce them.
But the impact of a well-targeted communication strategy can be far larger than that of a publication in a journal.
An epidemiologist once pointed out to me that sharing results with the people affected by the research will do more to save lives than accumulating international publications, however valuable those may be. By that reasoning, researchers should be evaluated according to the social impact of their work.
This view suits public health research perfectly. But it would not be adequate for other disciplines. It is time to think of different types of evaluation for different types of research. And I would go so far as to say that these should be organised by results, not scientific fields.
A good start would be to set up a state-of-the-art, interdisciplinary advisory group to stimulate a broad and innovative discussion.
Lisbeth Fog is a regional consultant for SciDev.Net, based in Colombia.
Roy H W Johnston ( Techne Associates | Ireland )
17 January 2011
I have worked for decades at the research-development-innovation interfaces and have been aware of this problem.
For some thoughts on this, prompted by a recent conference in memory of J D Bernal FRS, see the paper I wrote afterwards, available via my website at http://www.iol.ie/~rjtechne/scihist/bernalim.htm
This was directed at the Irish context but applies in general to the peripheral post-colonial nation-building problem, where the brain-drain problem is acute, and science policy ill-understood.
Héctor Ginzo ( Argentina )
17 January 2011
The way Colombian researchers regard articles published in their domestic journals is similar to the view held by most Argentinian researchers. In Argentina the situation is particularly ludicrous because domestic scientific journals are financed with part of the public funds allocated to the national scientific system. This means that everyone's money is spent keeping scientific rubbish (as it were) afloat instead of getting rid of it for good.
Tariq ( Pakistan )
20 January 2011
An important item to consider is the application of knowledge in economic terms and its impact on society, which should be visible.
Dr.A.Jagadeesh ( Nayudamma Centre for Development Alternatives | India )
12 February 2011
Science to serve society - society to support science.
Jack ( Royal Netherlands Academy of Arts and Sciences | Netherlands )
25 March 2011
I work for the Dutch Academy of Arts and Sciences, and have coordinated a European project on social impact assessment (www.siampi.eu) and participated in a national project called Evaluating Research in Context - ERiC (www.eric-project.nl). On these sites you will find several publications and also a library on social impact assessment. My address is firstname.lastname@example.org