Recognizing Predatory Journals and Conferences

A guide to educate JWU faculty, students, and staff about journals and conferences that employ deceptive practices for unmerited financial or reputational gain. Originally created by Pieta Eklund, and reproduced/edited with permission.

Journal ranking

Impact Factor

There are different ways of measuring research impact, but it is usually done through bibliometric analysis of journals and articles. One of the most common methods is to count citations. For journals, the Journal Impact Factor (JIF) is the gold standard. JIF is calculated and owned by the company Clarivate Analytics. Other companies calculate similar indicators: two examples are SNIP and SJR, which use the Elsevier-owned database Scopus, rather than Web of Science, as their source.

Journal Impact Factor

JIF is a measurement of the average number of citations that a journal's recently published articles received in a particular year. For example: The Journal of Palliative Care (ISSN: 0825-8597) has an Impact Factor of 0.931 for 2011. That means that articles published in the journal in 2009 and 2010 received an average of about 0.9 citations each during 2011.

The calculation is made like this:

A = the number of citations received in 2011 by items published in the journal in 2009 and 2010.

B = the number of "citable" items published in the journal in 2009 and 2010.
(Citable usually means articles, reviews, proceedings, or notes, but not editorials or letters to the editor.)

The Journal Impact Factor for 2011 is A/B.
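The calculation above can be sketched in a few lines of code. The citation and item counts below are hypothetical, chosen only to illustrate the arithmetic; they are not the real figures for any journal.

```python
def impact_factor(citations_in_year: int, citable_items_prior_two_years: int) -> float:
    """Journal Impact Factor: A / B, where A is the number of citations
    received in the target year by items published in the two preceding
    years, and B is the number of citable items published in those years."""
    return citations_in_year / citable_items_prior_two_years

# Hypothetical example: 84 citations in 2011 to items from 2009-2010,
# out of 90 citable items published in 2009-2010.
jif_2011 = impact_factor(84, 90)
print(round(jif_2011, 3))  # 0.933
```

Note that B counts only "citable" items, so a journal that publishes many editorials or letters (which attract citations but do not count toward B) can inflate its JIF.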

Every year, the JIF for all journals indexed in Web of Science is published in the Journal Citation Reports. Many ranking systems interpret a high JIF as a marker of high quality.

Predatory journals and bibliometric indicators

It can be challenging to tell if the metric a journal displays is based on reliable data or not. Here is a list of advice on how to spot misleading and fake metrics from Stop Predatory Journals:

  • The metric uses the term "impact factor". This term is trademarked by Clarivate Analytics, so the Journal Impact Factor is the only metric legally allowed to use it.
  • The website for the metric is nontransparent and provides little information about itself.
  • The company charges journals for inclusion in the list.
  • The values (scores) for most or all of the journals on the list increase each year.
  • The company uses Google Scholar as its database for calculating metrics (Google Scholar does not screen for quality and indexes predatory journals).
  • The methodology for calculating the value is contrived, unscientific, or unoriginal.
  • The metric is displayed by completely new journals, even though these cannot yet have accumulated any citations.

You can find more advice at Fake Metrics and How to Spot Them.