F1000Res. 2019 Aug 5;8:1353.
doi: 10.12688/f1000research.19994.2. eCollection 2019.

On the evaluation of research software: the CDUR procedure



Teresa Gomez-Diaz et al. F1000Res. 2019.

Abstract

Background: Evaluation of the quality of research software is a challenging and relevant issue that has not yet been sufficiently addressed by the scientific community.

Methods: Our contribution begins by defining, precisely but broadly enough, the notions of research software and of its authors. This is followed by a study of the evaluation issues, which forms the basis for the proposal of a sound assessment protocol: the CDUR procedure.

Results: CDUR comprises four steps: Citation, to deal with correct research software identification; Dissemination, to measure good dissemination practices; Use, devoted to the evaluation of usability aspects; and Research, to assess the impact of the scientific work.

Conclusions: Some conclusions and recommendations are finally included. The evaluation of research is the keystone for advancing Open Science policies and practices. We also believe that research software evaluation is a fundamental step towards better research software practices and, thus, towards more efficient science.
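The four CDUR steps can be read as an evaluation checklist applied in sequence. As a purely illustrative sketch in Python (the step names come from the abstract above; the example criteria, the scoring rule, and all identifiers are assumptions for illustration, not part of the published procedure), a reviewer-side encoding might look like this:

# Hypothetical sketch of a CDUR-style checklist; the concrete criteria and
# scoring below are illustrative assumptions, not the published CDUR procedure.
from dataclasses import dataclass, field

@dataclass
class CDURStep:
    name: str                                        # Citation, Dissemination, Use, or Research
    criteria: list = field(default_factory=list)     # yes/no questions asked by the evaluator
    answers: dict = field(default_factory=dict)      # criterion -> bool, filled in during review

    def score(self) -> float:
        """Fraction of criteria satisfied (0.0 if no criteria are defined)."""
        if not self.criteria:
            return 0.0
        return sum(self.answers.get(c, False) for c in self.criteria) / len(self.criteria)

def cdur_checklist() -> list:
    """Build the four steps named in the abstract, with example (assumed) criteria."""
    return [
        CDURStep("Citation", ["Software has a citable identity (name, version, authors)?"]),
        CDURStep("Dissemination", ["Source code is publicly available under a clear licence?"]),
        CDURStep("Use", ["Documentation allows the software to be installed and run?"]),
        CDURStep("Research", ["Associated publications or results show scientific impact?"]),
    ]

# Example review session: answer one criterion and report per-step scores.
if __name__ == "__main__":
    steps = cdur_checklist()
    steps[0].answers[steps[0].criteria[0]] = True
    for step in steps:
        print(f"{step.name}: {step.score():.2f}")

Representing each step as its own object keeps the four aspects separate, which matches the abstract's point that identification, dissemination, usability, and scientific impact are assessed as distinct concerns.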

Keywords: Open Science; Research Software; Research Software Citation; Research Software Evaluation; Research evaluation; Scientific Software.


Conflict of interest statement

No competing interests were disclosed.

Figures

Figure 1. Interrelations between different software concepts appearing in this work.

Figure 2. Initial section of the TreeCloud reference card published on the PLUME project platform; see https://www.projet-plume.org/en/relier/treecloud.

Figure 3. Initial section of the Unitex validated software description card published on the French side of the PLUME project platform; see the complete publication at https://projet-plume.org/fiche/unitex.

