How do journalists know if a scientist's claim is true? Julie Clayton helps reporters check the quality of claims, and spot the fraudsters.
Most scientists are honest, but some commit scientific fraud, deliberately deceiving colleagues or the public with a false claim. They may report experiments that never took place, describe patients who do not exist, or distort data and illustrations to appear more convincing.
Norwegian physician Jon Sudbø invented some 900 patients in a study published in The Lancet in 2005, claiming that common painkillers help protect against oral cancer. [1] German physicist Jan Hendrik Schön falsified data in multiple papers, including 15 publications in the top-ranking journals Nature and Science. Most recently, South Korean scientist Hwang Woo Suk fabricated data published in Science, claiming to have grown stem cells from human embryos. [2,3] In all three cases, retractions occurred only after the fraud came to light.
It is important that the media report on scientific fraud in order to hold the scientific community accountable for maintaining standards in research — which is often funded with public money. The scientific community should not only act swiftly to punish fraud, but also ask why co-authors did not know enough about the work being published to prevent the fraud from occurring.
By publicising fraud, the media can also help to protect the public against fraudsters who, for example, cause patients to delay appropriate treatment in favour of unproven medications — as happened recently with AIDS patients in South Africa. [4] Furthermore, the media's own reputation is at stake if a fraudulent claim has had prior publicity.
Why is fraud so difficult to detect?
Scientists, as a rule, follow an accepted code of conduct. They begin with experiments designed to answer a scientific question or create a new product. They present their results to colleagues and then publish them in a scientific journal. A good quality journal requires independent experts to certify that a paper's results are valid — a process known as peer review.
The process of peer review means journalists can usually assume that published work is of a high standard and worth reporting. And this is usually true. But peer review is not designed to detect fraud, and peer reviewers and journalists alike can be fooled by fraud that is well disguised.
After all, reviewers do not witness the experiments, so they must trust that the study is honest, and may not notice if data are fabricated or altered. The fraud often only comes to light when other scientists are unable to replicate results.
Non-expert journalists have little chance of uncovering such deception.
Sometimes, however, fraudsters have so obviously flouted the normal standards of scientific conduct that well-informed journalists are as capable as scientists of raising the alarm. For example, they may omit scientific evidence altogether and rely on anecdotal observation — even in a published report. In clinical studies, they may fail to register details of their experiment with regulatory authorities or refuse to make test results available for independent analysis.
How can you get better at detecting fraud?
The following tips are intended to make journalists better equipped for judging the quality of scientific claims and detecting fraud:
Get to know a field of research
Attend scientific conferences or visit research institutes and meet scientists in your area of interest to find out their goals, methods and progress, as well as the criticisms they may have of each other's work.
Visit university libraries, or use Internet databases such as PubMed to find publications on a particular topic or by a certain author.
This will provide more insight into individual studies. Although primary research papers may be too full of jargon and technical detail to make much sense to a non-specialist, review articles, which explore ideas and hypotheses, may be easier to follow and present a more general view of a field's progress.
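For journalists comfortable with a little scripting, PubMed searches can also be automated through the US National Library of Medicine's public E-utilities interface. The sketch below, in Python, only builds a search URL (it does not issue the request); the topic and author shown are illustrative, and the field tags and parameters are assumptions best checked against the official NCBI E-utilities documentation.

```python
from urllib.parse import urlencode

# Public entry point for PubMed searches via NCBI E-utilities.
EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def pubmed_search_url(topic=None, author=None, max_results=20):
    """Build a PubMed esearch URL for a topic and/or author.

    Combines the terms with AND, so results must match every field given.
    """
    terms = []
    if topic:
        terms.append(f"{topic}[Title/Abstract]")
    if author:
        terms.append(f"{author}[Author]")
    query = {"db": "pubmed", "term": " AND ".join(terms), "retmax": max_results}
    return f"{EUTILS}?{urlencode(query)}"

# Illustrative query: publications on oral cancer by a named author.
print(pubmed_search_url(topic="oral cancer", author="Sudbo J"))
```

Pasting the printed URL into a browser returns an XML list of matching article IDs, which is a quick way to see how much (or how little) an author has actually published on a topic.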
Check the quality of peer review
Ask the scientist whether their claim is published in a peer-reviewed journal. Even if the answer is yes, do not assume this to be a mark of quality — different journals have different criteria and practices, and the quality of their peer review varies accordingly. It is therefore important, if possible, to find out the quality of the journal in question. To do so, consult scientists directly, or check with university librarians that the journal is held in high regard.
High quality journals tend to be more widely read and more frequently cited in academic papers. Journalists may also wish to try Google Scholar, a free internet search engine that ranks results partly by the number of times a paper is cited by others, a rough indicator of its standing in the scientific community.
If you are uncertain about the journal's quality, try to find out the limitations of the study. Was it too preliminary, or too small a sample size to be accepted in a higher quality journal? An honest scientist will readily admit to the weakness of a study, and the need for further research — a less scrupulous one may instead exaggerate the importance and significance of the results, and deny that any data are lacking.
If you discover that a study has been refused publication, find out why. It may be honest work, but poorly designed, or insufficient in some way.
Alternatively, it may simply have been submitted to an inappropriate journal — good science, but too narrow in scope for a broad-interest journal such as Nature or Science, for example. Then again, the authors may have refused to redesign or expand their study, for fear that their assertions will be proved wrong.
Question the numbers
Are the numbers involved in a study appropriate and sufficient for the kind of investigation involved? Clinical trials, for example, proceed through three recognised phases, from initial safety trials of just a handful of individuals to larger trials of effectiveness involving hundreds and then thousands of people. Larger numbers make it possible to judge whether a result could have arisen by chance (its statistical significance), so conclusions can be drawn with greater certainty. Even if the statistics appear to back the claim, they are still worth checking with an independent expert, as mistakes can and do occur, including in the top journals.
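To see why sample size matters, consider a toy calculation — a minimal Python sketch using a standard two-proportion z-test, with invented trial numbers, and no substitute for expert statistical review. The same apparent effect that is only borderline with 50 patients per arm becomes far more convincing with 500 per arm:

```python
import math

def two_proportion_z_test(successes_a, n_a, successes_b, n_b):
    """Two-sided z-test for a difference between two proportions.

    Returns the z statistic and the p-value: the probability of seeing
    a difference at least this large if there were no real effect.
    """
    p_a, p_b = successes_a / n_a, successes_b / n_b
    pooled = (successes_a + successes_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal distribution.
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return z, p_value

# Hypothetical trial: 30/50 patients improve on the drug vs 20/50 on placebo.
z_small, p_small = two_proportion_z_test(30, 50, 20, 50)
# The same 60%-vs-40% split, but with 500 patients in each arm.
z_large, p_large = two_proportion_z_test(300, 500, 200, 500)
print(f"n=50 per arm:  p = {p_small:.3f}")
print(f"n=500 per arm: p = {p_large:.2e}")
```

With 50 patients per arm the p-value hovers near the conventional 0.05 threshold; with 500 per arm the same proportions give a vanishingly small p-value. A small study making a dramatic claim therefore deserves extra scrutiny.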
Be critical if the claim is made in a public statement
A journalist hearing an unpublished claim during an interview, press conference or seminar should dig deeper to investigate how the study has been conducted. Ask the following questions (which can also be applied to a published study):
How credible is the scientist among his/her scientific peers? Asking other scientists directly can be a quick indication. Otherwise, checking through an Internet database such as PubMed may indicate how often others cite the person's work;
Is the scientist based at a recognised scientific institution?
How is the study funded? A publicly funded study, for example, has had its protocol scrutinised by experts in order to compete against others for funding; and
Is the author likely to profit from the sale of products relating to the work? Although many journals require authors to declare any competing financial interests, some scientists fail to do so.
Find experts for advice and comment
Finding an independent expert to comment is the most reliable way to judge the validity of a study. When interviewing a scientist, ask them for the contact details of other scientists doing similar work. Alternatively, identify a relevant expert by checking the editorial board of a journal — as long as it is a reputable one.
Use the PubMed database to see who has published on the topic. Or go through the list of speakers at a relevant conference, which you may find advertised in a journal, or on the website of a scientific society. Local universities, research centres, funding agencies or government departments may also provide a list of academics willing to talk to the media.
Check for ethical and regulatory approval
If the study is a clinical trial, and claims to provide evidence for a treatment, vaccine or cure for a disease, check that details concerning the drug or vaccine composition, and any toxic side effects, are publicly available. Make sure that the investigators are officially registered medical practitioners and that the trial or product has both ethical and regulatory approval — either for experimentation or for sale.
There are now public registries, such as ClinicalTrials.gov, a service of the US National Institutes of Health, where clinical trials can be registered; all top quality journals now insist that registration details be referred to in published papers.
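Registry checks can also be scripted. The sketch below, in Python, builds a query URL against the ClinicalTrials.gov API; the endpoint and parameter names reflect the site's v2 API as I understand it, so treat them as assumptions to verify against the official API documentation. The condition searched for is illustrative.

```python
from urllib.parse import urlencode

# Assumed ClinicalTrials.gov v2 API endpoint for listing studies.
API = "https://clinicaltrials.gov/api/v2/studies"

def registry_search_url(condition, max_results=10):
    """Build a ClinicalTrials.gov search URL for trials on a given condition."""
    params = {"query.cond": condition, "pageSize": max_results}
    return f"{API}?{urlencode(params)}"

# Illustrative query: registered trials mentioning oral cancer.
print(registry_search_url("oral cancer"))
```

Opening the printed URL returns machine-readable study records; the absence of any registration for a loudly promoted "trial" is itself a warning sign worth reporting.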
Be sure of the facts
Journalists must be certain of their evidence, as an accusation of fraud could leave someone's career in ruins. They should check their facts with more than one source, and also anticipate that they may have difficulty in persuading some researchers to speak out against a colleague. An accused scientist may threaten to sue a journalist or their paper for libel, in which case it may be wise to seek the advice of a lawyer before publication.
In conclusion, it is worth remembering that most science is honest, and fraud is difficult to detect. By following the steps above, however, a journalist can certainly enhance their skills and reputation for accurate, good quality reporting of scientific studies, and maybe catch a fraudster in the act.
Julie Clayton is a science journalist and a consultant for SciDev.Net. This article was previously part of SciDev.Net's e-guide to science communication and has been reformatted to become this practical guide.
[1] Sudbø, J., Lee, J.J., Lippman, S.M. et al. Non-steroidal anti-inflammatory drugs and the risk of oral cancer: a nested case-control study. The Lancet 366, 1359-1366 (2005)
[2] Hwang, W.S., Ryu, Y.J., Park, J.H. et al. Evidence of a pluripotent human embryonic stem cell line derived from a cloned blastocyst. Science 303, 1669-1674 (2004)
[3] Hwang, W.S., Roh, S., Lee, B.C. et al. Patient-specific embryonic stem cells derived from human SCNT blastocysts. Science 308, 1777-1783 (2005)
[4] Bolognesi, N. Bad Medicine. Nature Medicine 12, 723-724 (2006)