Article Text

Dissemination of research during the first year of the coronavirus disease 2019 pandemic
  1. Justin S Brandt1,
  2. Sonal Grover2,
  3. Cande V Ananth3,4,5,6
  1. 1 Division of Maternal-Fetal Medicine, Department of Obstetrics, Gynecology, and Reproductive Sciences, Rutgers Robert Wood Johnson Medical School, New Brunswick, NJ, USA
  2. 2 Department of Obstetrics, Gynecology, and Reproductive Sciences, Rutgers Robert Wood Johnson Medical School, New Brunswick, NJ, USA
  3. 3 Department of Biostatistics and Epidemiology, Rutgers School of Public Health, Piscataway, New Jersey, USA
  4. 4 Division of Epidemiology and Biostatistics, Department of Obstetrics, Gynecology, and Reproductive Sciences, Rutgers Robert Wood Johnson Medical School, New Brunswick, NJ, USA
  5. 5 Environmental and Occupational Health Sciences Institute, Rutgers Robert Wood Johnson Medical School, Piscataway, NJ, USA
  6. 6 Cardiovascular Institute of New Jersey, Rutgers Robert Wood Johnson Medical School, New Brunswick, NJ, USA
  1. Correspondence to Dr Justin S Brandt, Department of Obstetrics, Gynecology, and Reproductive Sciences, Rutgers Robert Wood Johnson Medical School, New Brunswick, New Jersey 08901, USA; jsb288{at}rwjms.rutgers.edu

The rapid and effective dissemination of research during the coronavirus disease 2019 (COVID-19) pandemic is critical if healthcare providers and public health officials are to remain aware of new developments. Several organizations have collected relevant COVID-19 articles to facilitate data sharing, including the World Health Organization1 and the National Institutes of Health (NIH).2 Yet, in this time of remarkable research productivity and social media influence, how peer-reviewed research disseminates to the global community remains poorly understood.

To ascertain how published research disseminated during the COVID-19 pandemic, we examined the world’s peer-reviewed literature on COVID-19 through the application of alternative metrics (altmetrics) based on social media engagement and evaluative bibliometrics based on citation rates. We utilized altmetrics and citation rates to analyze COVID-19 articles indexed in the NIH’s iSearch COVID-19 Portfolio2 and the Altmetric Explorer3 (study flow chart, figure 1). The iSearch COVID-19 Portfolio contains peer-reviewed COVID-19 articles from PubMed and preprints from several sources. We restricted our analysis to articles and used this dataset to ascertain citation rates. These data were merged by articles’ unique PubMed identification numbers with altmetrics from the Altmetric Explorer, from which we ascertained articles’ Altmetric Attention Scores (AAS; a composite score of social interest that includes mentions in news feeds, Twitter, Facebook, and Google, among other sources).4

Figure 1

Study flow chart. WoS, ISI Web of Science.
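The record linkage described above can be sketched as an inner join keyed on PubMed IDs. The IDs, citation counts, and scores below are invented for illustration; the actual iSearch and Altmetric Explorer export formats may differ.

```python
# Sketch of the merge step: join citation counts (iSearch) to
# Altmetric Attention Scores (Altmetric Explorer) by PubMed ID.
# All values below are made up for illustration.
isearch = {          # pmid -> citation count
    "32109013": 5120,
    "32191259": 2310,
    "32217556": 980,
}
altmetric = {        # pmid -> Altmetric Attention Score (AAS)
    "32109013": 12500.0,
    "32191259": 8300.0,
    "32999999": 42.0,
}

# Inner join: keep only articles indexed in both sources.
merged = {
    pmid: {"citations": isearch[pmid], "aas": altmetric[pmid]}
    for pmid in isearch.keys() & altmetric.keys()
}
```

Articles present in only one source drop out of the merged set, which is why the merged article count can be smaller than either source's total.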

We analyzed COVID-19 articles in the ISI Web of Science (WoS) Core Collection5 to evaluate article characteristics, including study type, authorship, and funding sources. We queried the ISI WoS Core Collection using the search terms “coronavirus disease 2019”, “COVID-19”, “severe acute respiratory syndrome coronavirus 2”, “SARS-CoV-2”, and “novel coronavirus”. The ISI WoS query was limited to publications from January 2020 to February 2021 (with no language or article type restrictions). Although there are several search engines that can be used for bibliometric analysis, including Scopus, Medline, and Google Scholar,6 7 we selected the ISI WoS Core Collection because this database was classically used by Eugene Garfield, the developer of the impact factor metric, to identify “citation classics”.8

Research data were downloaded from iSearch and merged with data from the Altmetric Explorer on February 26, 2021. The results of the ISI WoS query were also downloaded on February 26, 2021. This analysis was based on a total of 87,643 articles in iSearch that were merged with data from the Altmetric Explorer (75,960 (86.7%) published in 2020 and 11,682 (13.3%) published in 2021) and 90,609 articles in the ISI WoS query (82,008 (90.5%) published in 2020 and 8601 (9.5%) in 2021).

COVID-19 articles were published rapidly during the first months of the pandemic, peaking in April 2020, and then plateauing at persistently high rates (figure 2A). Trends in citation rates and AAS mirrored each other during the study period (figure 2B). There were 48 articles in iSearch with unique PubMed identification numbers that were retracted or were retraction notices, corresponding to 34 (0.04%) articles (geometric mean citation rate (95% CI) 7.7 (3.5 to 16.5) and arithmetic mean (SD) 30.3 (80.3); geometric mean AAS (95% CI) 26.9 (8.5 to 84.9) and arithmetic mean AAS (SD) 1243.3 (3784.0)). Some of these articles received substantial social media attention. For example, among the top 10 articles with the highest AAS, one article was retracted and another was the official retraction notice from the journal that published it. The top 10 articles with the highest citation rates and highest AAS, after excluding retracted articles, are described in table 1.
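The paired geometric and arithmetic means reported above reflect the heavy right skew of citation and attention data: geometric means (with CIs computed on the log scale and back-transformed) are far less sensitive to a few extreme articles. A minimal stdlib sketch, assuming strictly positive values (the log is undefined at zero, so zero-citation articles would need special handling):

```python
import math
import statistics

def geometric_mean_ci(values, z=1.96):
    """Geometric mean with an approximate 95% CI.

    Computed by averaging on the log scale and back-transforming,
    a common approach for right-skewed metrics such as citation
    counts and Altmetric Attention Scores. Requires values > 0.
    """
    logs = [math.log(v) for v in values]
    mean_log = statistics.mean(logs)
    se_log = statistics.stdev(logs) / math.sqrt(len(logs))
    gm = math.exp(mean_log)
    return gm, math.exp(mean_log - z * se_log), math.exp(mean_log + z * se_log)

# Illustrative citation counts (not the study's data). For powers
# of 2 the geometric mean is the middle value, 8, while the
# arithmetic mean (about 18.1) is pulled upward by the tail.
gm, lo, hi = geometric_mean_ci([1, 2, 4, 8, 16, 32, 64])
```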

Figure 2

Monthly distribution of citation rates and altmetrics for articles indexed in the National Institutes of Health’s iSearch Coronavirus Disease 2019 Portfolio, January 2020 to February 2021. (A) Distribution of citation rates by month of publication. (B) Distribution of monthly article publication, mean citation rates, and mean Altmetric Attention Scores (AAS).

Table 1

Top 10 articles with the highest citation rates and highest Altmetric Attention Scores for articles indexed in the National Institutes of Health’s iSearch Coronavirus Disease 2019 Portfolio, January 2020 to February 2021

The ISI WoS query revealed the most common document types were “articles” (47,717; 52.7%), “editorials” (14,491; 16.0%), and “letters” (14,073; 15.5%). The most frequent WoS categories were “Medicine, General & Internal” (11,111; 12.2%), “Public, Environmental & Occupational Health” (7281; 8.0%), and “Infectious Disease” (4790; 5.2%). Over half of all articles originated from four countries: the US (25,312; 27.9%), China (10,535; 11.6%), Italy (8899; 9.8%), and England (8759; 9.7%). The top funding agencies were the NIH, the National Natural Science Foundation of China, and the European Commission.

In this study of the world’s peer-reviewed COVID-19 literature in the first year of the pandemic, we observed a dramatic explosion of research output. With the rapid publication of approximately 90,000 peer-reviewed articles addressing all facets of COVID-19, the global community has been inundated with data. Notably, nearly 60% of the world’s research output originated from four countries that were hit first and hardest by COVID-19. Three of these countries are geographically distant from where the virus originated, highlighting the global impact of the disease.

We observed that mean citation rates and AAS mirrored each other. Although there is debate about whether altmetrics correlate with citation rates,9 10 this study suggests these metrics have correlated during the pandemic.

While citation rates and altmetrics reflect influence, they do not provide insight into research quality. The explosion of COVID-19 publications has raised legitimate concerns about research quality11 as well as misconduct.12 Many journals, particularly top tier journals, prioritized submissions of COVID-19-related articles, potentially at the expense of other topics, and expedited their peer review and publication. The rush to publish on the part of investigators and journals may have encouraged suboptimal research designs and methods as well as suboptimal peer review. Retracted articles perhaps reflect the most egregious examples of how the push to publish during the pandemic promoted poor-quality research. While the proportion of retracted articles in this study was small, these articles received substantial social media attention as well as high citation rates. This study underscores the need for a novel metric that prioritizes research quality rather than quantity.

Ethics statements

Patient consent for publication

Acknowledgments

Altmetric provided no-cost access to its data for this project but was not involved in the study design, analysis, or drafting of the manuscript, nor did it review or approve any version of the manuscript.

References

Footnotes

  • Twitter @DrJustinBrandt

  • Contributors JSB: conceptualization, methodology, formal analysis, investigation, data curation, writing - original draft, writing - review & editing. SG: writing - review & editing. CVA: writing - review & editing, supervision.

  • Funding The authors have not declared a specific grant for this research from any funding agency in the public, commercial or not-for-profit sectors.

  • Competing interests None declared.

  • Provenance and peer review Not commissioned; externally peer reviewed.
