How to Rank Journals

Corey J A Bradshaw et al.

PLoS One. 2016 Mar 1;11(3):e0149852. doi: 10.1371/journal.pone.0149852. eCollection 2016.

Abstract

There are now many methods available to assess the relative citation performance of peer-reviewed journals. Regardless of their individual faults and advantages, citation-based metrics are used by researchers to maximize the citation potential of their articles, and by employers to rank academic track records. The absolute value of any particular index is arguably meaningless unless compared to other journals, and different metrics result in divergent rankings. To provide a simple yet more objective way to rank journals within and among disciplines, we developed a κ-resampled composite journal rank incorporating five popular citation indices: Impact Factor, Immediacy Index, Source-Normalized Impact Per Paper, SCImago Journal Rank and Google 5-year h-index; this approach provides an index of relative rank uncertainty. We applied the approach to six sample sets of scientific journals from Ecology (n = 100 journals), Medicine (n = 100), Multidisciplinary (n = 50), Ecology + Multidisciplinary (n = 25), Obstetrics & Gynaecology (n = 25) and Marine Biology & Fisheries (n = 25). We then cross-compared the κ-resampled ranking for the Ecology + Multidisciplinary journal set to the results of a survey of 188 publishing ecologists who were asked to rank the same journals, and found a Spearman's ρ correlation of 0.68-0.84 between the two ranking datasets. Our composite index approach therefore approximates relative journal reputation, at least for that discipline. Agglomerative and divisive clustering and multi-dimensional scaling techniques applied to the Ecology + Multidisciplinary journal set identified specific clusters of similarly ranked journals, with only Nature & Science separating out from the others. When comparing a selection of journals within or among disciplines, we recommend collecting multiple citation-based metrics for a sample of relevant and realistic journals to calculate the composite rankings and their relative uncertainty windows.
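As an editorial illustration of how such a composite rank might be computed, the following sketch (Python, placeholder data) ranks journals within each of five metrics, repeatedly averages the ranks over random subsets of the metrics, and reports each journal's mean rank with 95% uncertainty limits. The subset-resampling scheme, function name and inputs are assumptions made for illustration; the paper's exact κ-resampling procedure is described in its main text.

    import numpy as np

    def composite_rank(metrics, n_iter=10_000, seed=0):
        # metrics: array of shape (n_journals, n_metrics), larger = better
        # (e.g. IF, IM, SNIP, SJR, h5/log10(n)); values here are hypothetical.
        rng = np.random.default_rng(seed)
        n_journals, n_metrics = metrics.shape
        # Rank journals within each metric (1 = best).
        per_metric_rank = np.argsort(np.argsort(-metrics, axis=0), axis=0) + 1
        draws = np.empty((n_iter, n_journals))
        for i in range(n_iter):
            # Assumed scheme: average ranks over a random subset of the metrics.
            k = rng.integers(2, n_metrics + 1)
            cols = rng.choice(n_metrics, size=k, replace=False)
            draws[i] = per_metric_rank[:, cols].mean(axis=1)
        mean_rank = draws.mean(axis=0)
        lower, upper = np.percentile(draws, [2.5, 97.5], axis=0)
        return mean_rank, lower, upper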

Conflict of interest statement

Competing Interests: The authors have declared that no competing interests exist.

Figures

Fig 1
Fig 1. Mean rank (± 95% uncertainty limits) of the top 30 journals for two disparate biological disciplines: Ecology and Medicine, plus one Multidisciplinary theme.
Journals are ordered by mean rank of five metrics: IF, IM, SNIP, SJR and h5/log10(n); statistics were estimated using κ-resampling with 10,000 iterations, from a total sample of 100 journals each for Ecology and Medicine and 50 journals for Multidisciplinary (see main text for details). Journal abbreviations follow the Web of Science standard.
Fig 2
(A) Mean rank (± 95% uncertainty limits via κ-resampling with 10,000 iterations) of the top 25 journals within a combined Ecology and Multidisciplinary theme. Journals are ordered by mean rank of five metrics: IF, IM, SNIP, SJR and h5/log10(n) (see main text for details). (B) Mean rank (± 1σ) of the same journals assessed from a survey of 188 ecologists. Journals above the 1:1 correspondence (45° line) are rated higher by ecologists than their mean metric would indicate, and vice versa. (C) Overall, there was a Spearman’s rank correlation of 0.68–0.84 (median = 0.77; based on 1,000 random uniform resamples of the rank interval) between the two rankings. Journal abbreviations follow the Web of Science standard.
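The correlation range reported in panel (C) can be approximated with a short resampling routine. This is a minimal sketch under the assumption that, in each iteration, one rank per journal is drawn uniformly within its uncertainty interval and correlated with the survey ranking; the function name and inputs are hypothetical.

    import numpy as np
    from scipy.stats import spearmanr

    def rank_correlation_range(rank_lower, rank_upper, survey_rank,
                               n_iter=1_000, seed=0):
        # rank_lower / rank_upper: per-journal uncertainty limits of the
        # metric-based rank; survey_rank: the ecologists' ranking of the
        # same journals.
        rng = np.random.default_rng(seed)
        rhos = np.empty(n_iter)
        for i in range(n_iter):
            sampled = rng.uniform(rank_lower, rank_upper)  # one draw per journal
            rhos[i], _ = spearmanr(sampled, survey_rank)
        return np.percentile(rhos, [2.5, 50, 97.5])        # lower, median, upper rho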
Fig 3
Fig 3. Quality groupings derived from similar mean metrics.
Top panel: Agglomerative hierarchical clustering tree for 25 Ecology and Multidisciplinary journals based on the ranks of five metrics (IF, IM, SNIP, SJR and h5/log10(n)), calculated as Euclidean-distance, complete-linkage clustering of standardized metric values (see main text for details). The clustering revealed only two statistically supported groupings: (i) Nature and Science, and (ii) the remaining 23 journals. Bottom panel: Principal components analysis of the same journals based on their mean ranks from the same five metrics; 95.3% of the variance was explained by the first principal component axis, with only an additional 2.1% explained by the second. This is consistent with Nature and Science separating as outliers in both the agglomerative and divisive clustering.
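For readers wishing to reproduce this kind of grouping, the sketch below applies the generic steps described in the caption (standardize the five metrics, Euclidean-distance complete-linkage clustering, then principal components analysis) to placeholder data; it is not the authors' code, and the random input merely stands in for real journal metric values.

    import numpy as np
    from scipy.cluster.hierarchy import linkage
    from scipy.spatial.distance import pdist
    from sklearn.decomposition import PCA

    # Placeholder data: rows = 25 journals, columns = the five metrics
    # (IF, IM, SNIP, SJR, h5/log10(n)); replace with real values.
    X = np.random.default_rng(0).lognormal(size=(25, 5))
    Z = (X - X.mean(axis=0)) / X.std(axis=0)     # standardized metric values

    # Agglomerative clustering: Euclidean distance, complete linkage (top panel).
    tree = linkage(pdist(Z, metric="euclidean"), method="complete")

    # Principal components analysis of the same standardized values (bottom panel).
    pca = PCA(n_components=2).fit(Z)
    print(pca.explained_variance_ratio_)         # variance share of PC1 and PC2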
Fig 4
Fig 4. Median rank (± 95% uncertainty limits) of 25 journals within two specialist disciplines: Obstetrics & Gynaecology (top panel) and Marine Biology & Fisheries (bottom panel).
Journals are ordered by mean rank of five metrics: IF, IM, SNIP, SJR and h5/log10(n); statistics were estimated using κ-resampling with 10,000 iterations (see main text for details). Journal abbreviations follow the Web of Science standard.
