Eur J Plast Surg. 2018;41(2):207-216. doi: 10.1007/s00238-017-1378-8. Epub 2017 Dec 4.

Development and validation of a new assessment tool for suturing skills in medical students

Henriette Pisani Sundhagen et al. Eur J Plast Surg. 2018.

Abstract

Background: In recent years, emphasis has been placed on medical students demonstrating pre-practice/pre-registration core procedural skills to ensure patient safety. Nonetheless, the formal teaching and training of basic suturing skills to medical students have received relatively little attention, and there is no standard for what should be tested and how. The aim of this study was to develop and validate, using scientific methods, a tool for assessing medical students' suturing skills that measures both micro- and macrosurgical qualities.

Methods: A tool was constructed, and its content, construct, and concurrent validity, as well as its inter-rater, inter-item, and inter-test reliability, were tested. Three groups were included: students with no training in suturing skills, students who had received such training, and plastic surgeons.
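As a rough illustration of the reliability statistics named above, the sketch below computes Cronbach's alpha (inter-item reliability) and pairwise Pearson correlations between assessors (a simple stand-in for inter-rater agreement) in Python. The score arrays, variable names, and choice of statistics are hypothetical and are not taken from the study's actual analysis.

    # Minimal sketch with made-up data; not the study's analysis.
    import numpy as np

    # Hypothetical item scores: rows = subjects, columns = checklist items.
    item_scores = np.array([
        [3, 4, 2, 5],
        [2, 3, 3, 4],
        [4, 5, 4, 5],
        [1, 2, 2, 3],
    ], dtype=float)

    def cronbach_alpha(items):
        """Inter-item reliability: k/(k-1) * (1 - sum of item variances / variance of totals)."""
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1)
        total_var = items.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_vars.sum() / total_var)

    # Hypothetical total scores: rows = subjects, columns = the three assessors.
    rater_scores = np.array([
        [14, 15, 13],
        [12, 11, 12],
        [18, 17, 19],
        [ 8,  9,  8],
    ], dtype=float)

    print("Cronbach's alpha:", round(cronbach_alpha(item_scores), 2))
    print("Assessor-by-assessor correlations:")
    print(np.round(np.corrcoef(rater_scores, rowvar=False), 2))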

Results: The results show promising reliability and validity when assessing novice medical students' suturing skills.

Conclusions: Further studies are needed on implementation of the instrument. Moreover, how the instrument can be used to give formative feedback, to evaluate whether a required standard is met, and to inform curriculum development requires further investigation.

Level of Evidence: Not ratable.

Keywords: Assessment tool; Microsurgery; Plastic surgery; Surgical education; Suturing skills; Technical skills assessment; Undergraduate training.


Conflict of interest statement

Compliance with ethical standards

The research did not receive any specific grant from funding agencies in the public, commercial, or not-for-profit sectors.

Henriette Pisani Sundhagen, Stian Kreken Almeland, and Emma Hansson declare that they have no conflict of interest.

The study protocol was reviewed by the Regional Committee for Medical and Health Research Ethics, and it was concluded that Norwegian law on research ethics and medical research does not require an ethical permit for this type of study (2015/896, REK-vest). The Declaration of Helsinki was followed. All participants gave their written informed consent to participate.

Figures

Fig. 1 Data distribution of in-house scores by all assessors in the different groups

Fig. 2 Subjects' distribution of time to complete the task. The 67% cutoff is marked as the dashed line at 378 s

Fig. 3 Differences in in-house scores between the different groups. Individual scores from all three assessors are plotted

Fig. 4 Matched improvement of pre- and post-course performance as measured by the in-house scoring tool

Fig. 5 Correlation of subjects' in-house scores with their OSATS and UWOMSA scores. O = pre- and post-course subjects, X = expert controls (see the correlation sketch after the figure list)

Fig. 6 Relationship of in-house scores given by the three assessors plotted against the average score

Fig. 7 Repeatability of the in-house score. Single assessor scores are plotted against the average score of the three assessors. Arrows represent scores of the same subject by the same assessor at two different time points: the starting point of the arrow is the score given at the first assessment and the arrowhead represents the second assessment score. Large differences between the first and second scores by the same assessor appear as elongated arrows; only arrowheads are shown when the two scores are equal. The average score is indicated as a numeric value on the graph
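The concurrent validity shown in Fig. 5 is typically summarized as a correlation coefficient between instruments. The sketch below, using made-up paired scores and scipy.stats.pearsonr, shows how such a coefficient and its p value could be computed; the data and variable names are hypothetical.

    # Minimal sketch with made-up paired scores; not the study's data.
    from scipy.stats import pearsonr

    # Hypothetical scores for the same subjects on the in-house tool and UWOMSA.
    in_house = [12, 15, 9, 18, 14, 11, 16]
    uwomsa = [20, 24, 15, 29, 23, 18, 26]

    r, p = pearsonr(in_house, uwomsa)
    print("Pearson r = %.2f, p = %.3f" % (r, p))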

References

    1. Tan SS, Sarker SK. Simulation in surgery: a review. Scott Med J. 2011;56(2):104–109. doi: 10.1258/smj.2011.011098.
    2. Ahmed K, Miskovic D, Darzi A, Athanasiou T, Hanna GB. Observational tools for assessment of procedural skills: a systematic review. Am J Surg. 2011;202(4):469–80.e466. doi: 10.1016/j.amjsurg.2010.10.020.
    3. Morris M, Gallagher T, Ridgway P. Tools used to assess medical students' competence in procedural skills at the end of a primary medical degree: a systematic review. Med Educ Online. 2012;17(1):18398.
    4. Temple CL, Ross DC. A new, validated instrument to evaluate competency in microsurgery: the University of Western Ontario Microsurgical Skills Acquisition/Assessment instrument [outcomes article]. Plast Reconstr Surg. 2011;127(1):215–222. doi: 10.1097/PRS.0b013e3181f95adb.
    5. Grant AL, Temple-Oberle C. Utility of a validated rating scale for self-assessment in microsurgical training. J Surg Educ. 2017;74(2):360–364. doi: 10.1016/j.jsurg.2016.08.017.
