Certain charts could give readers an inaccurate picture of the scientific output of some African countries. The findings raise questions about the inner workings of the observatory — established by the African Union and based in Malabo, Equatorial Guinea — which has failed to respond to our queries.*
In May, the observatory released a report, Assessment of Scientific Production in the African Union, 2005–2010. Alongside this, it published a supplement of graphs intended to “facilitate the interpretation” of the relative strengths and weaknesses of African countries and regions in various scientific fields.
Yet experts in scholarly publishing metrics point out that some of the graphs may be misleading. Graphs for four nations — Angola, the Democratic Republic of Congo, the Republic of the Seychelles and Sierra Leone — indicate that these countries produced scientific output in only one field: ‘biology’ for the Seychelles and ‘clinical medicine’ for the others.
“There is no way that these countries produce [research output] in only one field,” says Anastassios Pouris of the University of Pretoria in South Africa.
Edward Byers, scientific officer at the Planet Earth Institute, a UK charity working for scientific independence for Africa, also tells SciDev.Net that the apparently narrow range of scientific work conducted in the four countries struck him as odd.
Byers says that, although the methodology described in the report “seems robust”, it is possible that there “may have been an error” when the graphs were created.
He adds that his own searches on Scopus, a database of research articles that the observatory report cites as its data source, returned significant numbers of results listed under scientific disciplines not shown in the graphs for the four nations.
SciDev.Net confirmed this by repeating the Scopus searches. For example, a search for authors affiliated to Angola publishing between 2005 and 2010 returned 91 results filed under ‘medicine’ (which the country’s graph represents as a large bubble). But it also returned 41 in ‘agricultural and biological sciences’, 22 in ‘engineering’ and 17 in ‘social sciences’. None of these fields are displayed on the graph. Similar results were obtained for the other three nations.
SciDev.Net’s attempts to contact the observatory using email and Twitter received no response. No phone number is listed on its website.
The report notes that a Canadian firm, Science-Metrix, was contracted to conduct data analysis for the report.
Éric Archambault, Science-Metrix’s CEO, tells SciDev.Net that, although his company did this analysis, the observatory produced the final documents, including the supplementary graphs.
He also says that graphs sometimes have a cut-off point, below which data is not included, and this might be why fields with small numbers of publications were not displayed.
Byers agrees that this could be a plausible explanation in principle. Yet the supplement does not state that any such cut-off was applied. All it says, in its short preamble, is that the size of data points in each graph is “proportional to the number of publications produced by the country/region in the corresponding scientific fields”.
He also notes that the main report itself gives Angola reasonably high ratings in the field of natural sciences, and that it would make sense for this to be represented in some way in the supplement’s graph for the country.
> Link to the report
> Link to the graphical supplement
*After this story was published, AOSTI notified SciDev.Net on 27 August that it had published a response.