Publishing online means scientific research is both more accessible and more difficult to evaluate
To make science truly useful to development, we need a new, inclusive system of tracking publications, says S&T policy expert Caroline Wagner.
Global comparisons of scientific output are commonplace. As non-experts, policymakers and administrators must rely on indexes of impact and recognition — counts of published papers and citations, and the prestige of source journals — to assess the impact of public spending and to allocate research funds.
The gold standard is the Science Citation Index Expanded (SCIE) — formerly produced by the Institute for Scientific Information (ISI), now owned by Thomson Reuters. SCIE is an excellent abstracting service, but it covers only a small percentage of all scientific literature.
And although the information revolution is making it easier to publish online, and therefore to access the results of scientific research, it also confounds efforts to monitor and compare outputs as material proliferates across many new venues.
The result is a rapidly growing, open system that is harder to evaluate than ever before.
Extent of the 'unseen'
In a recent study we counted more than 15,000 scientific periodicals among the 'BRIC' countries (Brazil, Russia, India and China), of which just 495 — about 3 per cent — are listed in SCIE. 
Amazingly, this is not an anomaly: we found that SCIE lists only about 3 per cent of journals for most scientifically advanced countries.
This means that decision makers anywhere in the world relying on SCIE (or its cousins, such as Scopus or Google Scholar) do not account for, access or compare as much as 90 per cent of scientific output — works we call 'unseen science'.
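A back-of-the-envelope check of the figures cited above (495 SCIE-listed journals out of the more than 15,000 BRIC periodicals counted in the study) can be sketched in a few lines. The variable names here are illustrative, not part of any published dataset:

```python
# Coverage figures reported in the article for the four BRIC countries.
bric_periodicals = 15_000   # "more than 15,000" scientific periodicals counted
scie_listed = 495           # BRIC journals listed in SCIE

coverage = scie_listed / bric_periodicals * 100
unseen = 100 - coverage

print(f"SCIE coverage: {coverage:.1f}%")    # about 3 per cent
print(f"'Unseen' share: {unseen:.1f}%")     # well over 90 per cent
```

On these numbers the 'unseen' share is in fact closer to 97 per cent, so the article's "as much as 90 per cent" is a conservative statement of the gap.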
For scientifically advanced countries such as the United States, other abstracting services such as Index Medicus and the Chemical Abstracts Service provide additional access. But no single source provides common access or enables comparability.
For developing countries, the challenge of 'unseen' science is compounded by the language barrier. China publishes 6,596 scientific journals, of which only a handful are abstracted in English. Similarly, Russia and Brazil each have close to 2,000 journals in national languages that are not indexed in SCIE.
India is better represented in English and in SCIE, but unlike the other three BRICs, its national publications (about 550) are scattered among a number of databases that are difficult to track down. 
How would a researcher go about finding these works? Right now, there is no way to do this.
Quantity and quality
That science is growing worldwide is widely celebrated. It enriches knowledge, but it also adds to the challenges of assessing and comparing global scientific output.
Many more countries fund research and development (R&D) than at any time in history. In 1990, six countries were responsible for 90 per cent of R&D spending; by 2008 this elite group had grown to more than 13 countries. Since the beginning of this century, developing countries have more than doubled their R&D spending.
Growing numbers of journals in national languages (paper and electronic), open licensing avenues for publication such as Creative Commons, and global conferences (physical and online) are all signs of healthy science. Combined with the new publication possibilities opened up by the Internet, they mean scientific findings multiply daily.
It is certainly a good thing that new communications tools enable a vast new group to participate in the global network of science. But the trend also confounds assessments and raises questions about what is being measured in global comparative studies.
The proliferation of sources also raises questions about quality. Online archives such as arXiv and ResearchGate are growing in popularity, and they include pre-publication versions of articles that have not been reviewed, yet are often read and cited by others. But no clear standard is emerging to account for a change in status (such as pre- or post-review), let alone for comparing citations over time.
Electronic journals, newsletters and bulletins do not always use high standards of quality assessment (such as peer review and editing). It is impossible to sort out which ones are presenting quality data.
This does not discount the usefulness of these contributions to the stock of global knowledge. Indeed, the seeds of future breakthroughs may very well be contained there. But sifting through expanding volumes of material to find them is becoming increasingly difficult.
Key step: taking stock
Recent calls for global standards for scientific assessment are well-intentioned, but the proliferation of scientific communications and sources makes them difficult to put into practice.
The first step towards any global assessment should be an inventory of the various types of scientific outputs and their sources. There are many possible configurations — for example, electronic only or electronic and paper; frequency of publication; how many places the same article is published (pre- and post-publication); links to supporting data; open source or subscription; editor or peer reviewed.
A move towards standard terms for types of outputs would help analysts make accurate counts, and help policymakers use all available information in decision making.
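As a sketch of what such standard terms might look like, the configurations listed above could be captured in a minimal record per publication outlet. The field names here are assumptions for illustration, not an existing standard:

```python
from dataclasses import dataclass

# Illustrative record for one publication outlet, covering the
# configurations named in the article: medium, frequency, versioning,
# data links, access model and review model.
@dataclass
class OutletRecord:
    name: str
    medium: str             # "electronic", "paper" or "both"
    issues_per_year: int    # frequency of publication
    versions_tracked: bool  # same article pre- and post-publication?
    links_to_data: bool     # links to supporting data
    access: str             # "open" or "subscription"
    review: str             # "peer-reviewed", "editor-reviewed" or "none"

# A purely hypothetical example entry:
example = OutletRecord(
    name="Example National Journal",
    medium="electronic",
    issues_per_year=4,
    versions_tracked=False,
    links_to_data=False,
    access="open",
    review="peer-reviewed",
)
```

Even a flat schema like this would let analysts count and compare outlets on the same axes, which is the prerequisite for the inventory proposed above.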
Inclusion may require that regional or national governments, or perhaps academies of science, invest in good accounting and a national library of all scientific periodicals — which must be open access — such as Russia is attempting with elibrary.ru.
An inclusive approach to ensuring that all good science is 'seen' at the global level is a lofty goal that presents significant challenges to the scientific community. Global transparency can be achieved with new protocols for viewing, cataloguing and understanding the swath of activities that represent science in this collaborative era. But it requires commitment from governments and scientific societies.
As things stand, the vast majority of scientific publications remain unseen by most potential users, and this lack of access makes all of global science poorer.
Caroline Wagner is Wolf Chair in International Affairs and director at the Battelle Center for Science & Technology Policy, The Ohio State University, Columbus, Ohio. Caroline can be contacted at email@example.com.
Wagner, C.S. & Wong, S.K. Unseen science? Representation of the BRICs in global science. Scientometrics 90, 1001–1013 (2012)
Public reports: science and technology (UNESCO Institute for Statistics)
John Daly ( United States of America )
20 July 2012
This would seem to be a great area for UNESCO, the UN decentralized agency that supports science, communications, and indeed libraries.
Terry ( Red Plough International | Thailand )
23 July 2012
I have your book, The New Invisible College. Loved it. Loved this article. You might like mine: The Myth of Insufficient Information http://www.iupac.org/publications/ci/2010/3201/1_clayton.html
Naiyyum Choudhury ( Bangladesh )
23 July 2012
The Bangladesh Academy of Sciences, in collaboration with INASP, UK, has undertaken responsibility for providing online access to almost all the journals published in Bangladesh. INASP, a UK-based charity, has arranged online availability of journals published in this region, including Bangladesh, through BanglaJOL, and has asked the Academy to take responsibility for uploading all Bangladeshi journals to BanglaJOL. Ms Souix Cumming, head of publications at INASP, recently conducted a training workshop for journal editors and two Academy officials on uploading journals to BanglaJOL. It is hoped that within a short period most of the journals published in Bangladesh will appear online in BanglaJOL. BRAC University provides open access to all the research carried out at BRACU, some of which has not been published anywhere else. While SCIE and other sophisticated systems for reporting standard publications are useful for evaluating the quality of research, less-cited publications should not be overlooked altogether. Some of these may make good sense.
Jan Velterop ( United Kingdom )
3 August 2012
90% of periodicals does not equate to 90% of scientific output. The ones not covered are more than likely small or extremely small journals, which publish only a fraction of the articles published by the large journals in the remaining 10%.
cswagner ( United States of America )
9 August 2012
@ jan velterop - do you have data that shows this? my data suggests otherwise... C. Wagner (happy to discuss this @osusiaprof)
Rob Terry ( World Health Organization | Switzerland )
4 February 2013
Jan - did you have an answer for C. Wagner re the volume of visible publications vs the number of registered journals? I have an interest in light of recent WHO discussions on the potential to establish a health R&D Observatory to monitor and track research in neglected areas. See: http://apps.who.int/gb/CEWG/e/E_doc1.html
All SciDev.Net material is free to reproduce providing that the source and author are appropriately credited. For further details see Creative Commons.