How to target a journal that’s right for your research
- Predatory publishers abound: watch out for high fees, poor standards and bogus claims
- High impact factors aren’t always relevant — and some are fakes
- Aim to communicate, not just publish, your research
Some researchers are under pressure to publish anywhere, while others are lured by prestigious but often unattainable journals. Either case can lead researchers away from journals that might give them the audience and impact they need. Here I outline how to target a truly appropriate journal for your research.
Academics involved in research are often evaluated based on their research output or publications. Whether they get a degree, get hired, get promoted or get tenure is often tied to the quantity and quality of the publications they have recently authored. And in some countries quantity takes precedence over quality and becomes a defining factor in career progression.
Researchers working in such environments may be tempted to publish more and faster. Thus the demand for publication outlets increases, and so does the supply — in the form of more academic publishers and journals.
In scholarly publishing, no overall body sets standards and processes. Anyone can buy a domain name and set up a journal with a name of their choice. The sole motive may be making money by charging authors for publishing articles. These publishers may have an editorial board, but its members may be complicit. Such publishers tend to name their journals in a grand way, with meaningless words such as ‘global’, ‘international’ or ‘advanced’. They may also have an overly broad title or scope that includes many areas of research (to attract more papers).
Jeffrey Beall, a librarian at the University of Colorado in Denver, United States, maintains a list of such 'predatory' journals and publishers. 
These journals may publish papers after cursory or no peer review, despite claiming otherwise. Researchers may send their papers to predatory journals either knowingly or because they naively buy into the journals' false claims.
And although poor peer review suits authors whose work is not of high enough quality to be published in established journals, many more authors are victims. Researchers in developing countries often receive little research guidance early in their careers, work in resource-poor environments and may lack research writing skills. Yet they face the same pressure to ‘publish or perish’ as their counterparts in developed countries.
“When you choose a journal, don’t stop there. Keep asking yourself, ‘How can I best communicate my work today?’”
However, publications in such journals eventually lose value and may even bring harm. Some researchers may be able to temporarily advance their careers on the strength of their publication count, but they may be shamed later on in front of their colleagues and students as awareness of predatory publishers increases.
Others may face disciplinary action by promotion or tenure committees that are already aware of predatory publishers. And, of course, serious researchers are likely to ignore papers published in suspicious journals, so these papers may not be read or cited.
Don’t be swayed by grand claims made on a journal's website or in calls for papers unless those claims can be verified. Being ‘under the indexing process’ with ISI, Scopus and so on is an example of an unverifiable claim that often appears in calls for papers from suspicious journals. In fact, receiving a call for papers out of the blue is a warning sign. Unless you receive the call in a discussion list you are a member of, or from a journal you have submitted papers to or published in, or from another trustworthy source, you should be wary. I regularly receive calls for papers from random journals because they have harvested my email address online and have added me to their bulk mailing list without my permission.
Some claims can be verified: for example, a journal's membership in the Directory of Open Access Journals (DOAJ), Open Access Scholarly Publishers' Association (OASPA), and INASP Journals Online (JOLs). These are notable collections of open access journals, and the DOAJ is putting in place more rigorous criteria for membership.
A journal's membership in publishing societies such as the Committee on Publication Ethics (COPE) is also a good sign.
However, newly established journals may not be quickly indexed in academic databases or may only slowly become part of publishing societies. New journals are set up all the time to address new, neglected, regional and other kinds of research that are not well served by existing journals. You should definitely consider new journals that are relevant for your work but first evaluate the editors who run them.
Look at the editor’s profile on a university website, links to their online profiles (for example, on ResearchGate, Google Scholar or LinkedIn) or evidence of their dedication to the profession of journal editing, for example, membership of organisations such as the Council of Science Editors (CSE), European Association of Science Editors (EASE) and World Association of Medical Editors (WAME).
Academic publishers typically operate their journals using either the traditional subscription model or an open access model. In subscription journals, readers pay to access papers. In the open access model readers are not charged — but somebody has to pay to keep the journal running. So open access journals either ask authors to pay article processing charges (APCs) or are supported by higher education institutions or funding bodies. Some journals use a ‘hybrid’ open access model: authors can choose to make their work freely available by paying an APC; otherwise, their article is available only to subscribers.
The open access movement aims to make research accessible to anyone who needs it. This is a noble mission but is misused by predatory publishers. They ask authors to pay article processing fees, knowing that they have very few readers who would pay for their journals under a subscription model.
But remember that the open access model is not necessarily predatory. Far from it. There are many excellent open access journals that charge APCs, such as those in the PLOS family and BioMed Central. Authors who wish to publish in open access journals should try to include APCs in their research budgets and should check to see if fees are waived for authors in developing countries.
A journal’s impact factor is widely treated as a measure of its quality or prestige. This metric, owned by Thomson Reuters, is commonly used by researchers to identify appropriate target journals, but this approach can be problematic.
The impact factor is calculated from citation data, as described on the Thomson Reuters website: a journal’s impact factor for a given year is the average number of citations received that year by the papers it published in the two preceding years. Journals with a high impact factor are readily perceived as reputable or prestigious.
There are other metrics that measure journal quality, such as the eigenfactor score and SCImago journal rank.
The impact factor has become the hallmark of journal prestige — so much so that it has even spawned misleading ‘fake’ metrics in which high ranks can be bought by unethical publishers. There's even one called the ‘journal impact factor’, which is easily confused with the Thomson Reuters impact factor.
And the impact factor itself has been criticised by Nobel laureate Randy Schekman and in articles published in a number of leading journals, including some with high impact factors. [5,6,7,8]
Be aware that impact factors are not comparable across fields. The journal Applied Physics Letters is the highest ranking journal in its field, but still has a much lower impact factor than the highest ranking journal in microbiology, Nature Reviews Microbiology.
You also need to be aware that much of the information you need to interpret a journal’s impact factor may be missing. Journals with high impact factors often promote them on their websites but this doesn’t convey the full picture. For example, you may need to know what other journals have similar impact factors. But if you want to know about all the impact factors of journals in your field, you'll need access to the Thomson Reuters’ Journal Citation Reports, which are not free.
In some niche fields, reputable journals may not even have an impact factor. This can be because their topics interest only a small community. But they might still be the best place to publish if you are going to reach the right audience.
The impact factor is a complex metric that should be used for specific purposes and by people who are fully aware of its intricacies. If you use it to evaluate journals, there are many caveats to take into account. Certainly, a journal’s impact factor is inappropriate for evaluating individual articles or authors. And when authors treat it as the most important criterion for selecting a journal, they have not fully understood the point of research communication.
Ultimately, the most important thing you can do is to know your audience.
When considering a journal, be prepared to ask yourself some questions. Who are its readers? Are they part of your research community? Would they be interested in your paper? Would they be able to build on your findings or implement any recommendations?
If you don't have answers to these questions, speak with your senior colleagues, look for advice on online networks such as AuthorAID and ResearchGate and join scientific societies to learn from researchers in your field.
Remember that with so many papers published every day, keeping track of the relevant literature has become a big challenge. It would be naive to think that a paper, even one in a ‘big-name’ high impact factor journal, will automatically attract widespread interest. (And be aware that predatory journals may have few or no serious readers.)
“Ultimately, the most important thing you can do is to know your audience.”
These days you should think beyond conventional publications, for example by promoting papers on social media and archiving them and their data.
You may be able to upload full texts of your papers on a digital repository at your institution or on portals like ResearchGate, as long as you follow self-archiving rules set by your publishers. SHERPA RoMEO offers an online tool for finding these. Once archived, your full texts may become discoverable on academic databases such as Google Scholar, potentially becoming more accessible to scholars who don't have access to subscription journals.
Portals such as figshare and the Dryad Digital Repository make it easy to share data as well as polished publications, and this is being increasingly encouraged and even mandated by journals and funders. Sharing your detailed research data makes your work more usable and may even attract more citations. 
If your research paper addresses a development issue, you might need to think about reaching policymakers. The main messages may also need to be put in different words for non-academic audiences. Some advice on this can be found in an AuthorAID presentation on making research relevant for policymakers and SciDev.Net has a practical guide on How to tell policymakers about scientific uncertainty.
This might seem like a lot of work, and it is — if you are concerned only about publication and not communication. But research should be communicated, and publishing is only a means to that end. So when you choose a journal, don’t stop there. Keep asking yourself: “How can I best communicate my work today?”
Ravi Murugesan works for AuthorAID (authoraid.info), which supports developing country researchers in publishing their work. Twitter: @RaviMurugesan
References
1. Jeffrey Beall, Potential, possible, or probable predatory scholarly open-access journals and Potential, possible, or probable predatory scholarly open-access publishers (Scholarly Open Access, accessed 12 December 2014)
2. Jeffrey Beall, Sudanese researcher falls victim to questionable publisher (Scholarly Open Access, 23 September 2014)
3. Ravi Murugesan, For open access. Against deception (23 September 2014)
4. Jeffrey Beall, Misleading metrics (Scholarly Open Access, updated regularly)
5. The impact factor game (PLOS Medicine, 2006)
6. Kai Simons, The misused impact factor (Science, 2008)
7. Beware the impact factor (Nature Materials, 2013)
8. Arturo Casadevall and Ferric Fang, Causes for the persistence of impact factor mania (mBio, 2014)
9. Heather Piwowar and others, Sharing detailed research data is associated with increased citation rate (PLOS One, 2007)