www.journalonsurgery.org

Research Article

Open Access, Volume 2

Development and Evaluation of Virtual Teaching of Surgical Skills: A Feasibility Study

Finch Enna LH4; Boal Matt WE1,2; Tandon Devansh2,3; Nagi Jasleen6; Ghamrawi Walaa1; Yassa Bishoy7; Gupta Tarush8; Pathmaraj Bavin2; Court Emma5; Eid Joana1,3; Francis Nader K1,2,5
Collaborative Authors
Anirudh Manivannan3; Anisha Talwar3; Anuhya Vusirikala1,2; Christopher Namgoong3; Costanza Girasole3; Deepali Patel3; Eliza Davison9; Harjeet Singh3; Hemina Shah3; Jayantika Uniyal3; Keegan Curlewis9; Keshav Krishnan3; Lesley Aju3; Manveer Singh3; Noorulanne Younis3; Rasha Rashid3; Riana Patel10; Vanessa Coles11

1The Griffin Institute (TGI), Northwick Park Institute for Medical Research, London.
2University College London (UCL), London.
3British Indian Surgical Association/British Indian Medical Association (BISA/BIMA), UK.
4Guy’s and St Thomas’ NHS Foundation Trust, London.
5Yeovil District Hospital NHS Foundation Trust, Yeovil.
6Imperial College Healthcare NHS Trust, London.
7Bart’s and The London School for Medicine and Dentistry, London.
8Brighton and Sussex Medical School, Brighton.
9Royal Free London NHS Foundation Trust, London.
10Barking, Havering, and Redbridge University Hospitals NHS Trust, London.
11Evelina Children’s Hospital, London.

Abstract

Background: The COVID-19 pandemic has created unprecedented challenges to health service delivery around the world, with an adverse impact on doctors-in-training. However, it has also created an opportunity to utilise and apply digital tools to replace most, if not all, face-to-face teaching. The aim of this study was to ascertain the feasibility and effectiveness of virtual training of basic surgical skills using the Kirkpatrick model of curriculum evaluation.

Methodology: Medical students were recruited for basic surgical skills courses. The virtual (intervention) group was taught over two weeks via Zoom and the in-person (control) group had traditional, face-to-face teaching, both for 6 hours in total. Groups were assessed using the validated OSATS tool at weeks one and two, and at four months for skill retention. Feedback was collected to ascertain participant reaction and behaviour change. Group progression was analysed by calculating mean scores; intra- and inter-group comparisons were made using paired and independent t-tests respectively.

Results: 48 participants were recruited, with 24 students in each group. Virtual mean OSATS scores at weeks one and two were 15.94 and 18.97 respectively (p=0.001), whilst in-person means were 17.67 and 18.44 (p=0.02). Virtual participants were outperformed by the in-person group in week one, with mean scores of 15.71 and 17.67 respectively (p=0.002). By week two, both groups had comparable scores, with virtual and in-person participants scoring 18.97 and 18.44 respectively (p=0.31). Both groups showed skill retention at four months, but the virtual group had a significantly lower mean score of 17.88 vs. 19.50 (p=0.04). Feedback was positive across the board, with students more likely to consider a career in surgery.

Conclusion: Virtual teaching of basic surgical skills can be a feasible and useful adjunct to traditional surgical teaching. Future curricula should consider the use of digital platforms, perhaps in a hybrid approach.

Keywords: Curriculum; Assessment; Evaluation; Kirkpatrick; Education; Virtual; Surgical skills.

Abbreviations: BISA/BIMA: British Indian Surgical Association/British Indian Medical Association; OSATS: Objective Structured Assessment of Technical Skill; GRS: Global Rating Scale.

Manuscript Information: Received: Nov 29, 2022; Accepted: Dec 23, 2022; Published: Dec 30, 2022

Correspondence: Nader Francis, The Griffin Institute, Northwick Park Institute for Medical Research, London. Email: n.francis@griffininstitute.org.uk

Citation: Finch Enna LH, Boal Matt WE, Devansh T, Jasleen N, Walaa G, et al. Development and Evaluation of Virtual Teaching of Surgical Skills: A Feasibility Study. J Surgery. 2022; 2(2): 1074.

Copyright: © Francis N 2022. Content published in the journal follows the Creative Commons Attribution license.

Introduction

The COVID-19 pandemic has created unprecedented challenges to health service delivery around the world, with an adverse impact on doctors-in-training. In the UK alone it is estimated that 1.5 million elective procedures were cancelled or postponed, resulting in a 33.6% drop in surgical activity in 2020 [1]. As well as a burden on patients, this has had a huge impact on trainees, with each of these procedures representing a lost teaching opportunity. 41% of surgical trainees were redeployed to help with staff shortages, 74% of them for longer than four weeks; in addition, over 65% of trainees experienced a complete loss of procedural training. Worryingly, only 9% reported that they would meet all required competencies expected that year [2]. A survey by Choi et al. [3] of 440 students from 32 UK medical schools revealed that 38.4% had their final OSCE cancelled and 43% had their student assistantships postponed, which had a significant effect on their confidence in entering first-year placements.

The pandemic has created an opportunity to utilise and apply digital tools to replace most, if not all, face-to-face teaching. Due to the nature of restrictions, many educational resources, including lectures and final examinations, had to move online [4,5]. Remote learning has subsequently shown benefits such as reduced costs in travel and resources [6], flexibility, and better accessibility [7], allowing students to continue receiving medical training despite social distancing. Studies have found that the utilisation of online resources in place of previously classroom-based teaching has been well received by trainees and faculty alike [8-10] and could therefore become essential to medical curricula.

There is limited evidence on the application of virtual training to practical tasks such as basic surgical skills, although students have reported positive responses [11-14]. Of these, three studies objectively evaluated a virtual teaching programme of surgical skills by formally assessing performance, but none went on to evaluate skill retention [12-14]. Whether virtual training is as effective as traditional face-to-face methods in teaching surgical skills therefore remains an important question.

The aim of this study was to ascertain the feasibility and effectiveness of virtual training of basic surgical skills using the Kirkpatrick model of curriculum evaluation [15].

Methodology

This was a prospective observational cohort study recruiting medical students, without prior suturing experience, who participated in a series of training sessions on basic suturing techniques over a total of 6 hours.

The virtual (intervention) group was recruited from undergraduate medical students through the British Indian Surgical Association/British Indian Medical Association (BISA/BIMA). A Basic Surgical Skills Teaching Series was advertised through the social media accounts of BISA/BIMA and was open to all undergraduates across the UK. Students were asked to complete an online application form, with places allocated on a first-come, first-served basis. The first 24 of the 150 students who registered for the virtual teaching series were selected for the course. Complete data were obtained from 17 participants in the virtual group, as 7 failed to return assessments to the research team.

Sessions were delivered online through Zoom (Zoom Video Communications, Inc.), with participants receiving a total of 6 hours of teaching. Prior to the first session, suturing kits funded by Sigma Lance and provided by BISA (Supplementary Material) were sent out, including a silicone suturing pad, 3-0 silk sutures, a needle driver, a pair of toothed and non-toothed forceps, suture scissors, and a sharps pad for disposal. Industry was not involved in designing the curriculum, running the teaching or analysing the results. Sessions were taught by specialist trainees and consultants, with a ratio of three to four participants per tutor in breakout rooms. The structure of each session was as follows: the lead tutor described the instruments and their use, followed by a demonstration of the suture and knot tie with commentary, then repeated without commentary. Participants were then split into their breakout rooms and given time to practise with live feedback from their tutor. Assessments were performed by the tutors at the end of the sessions in weeks one and two, using a validated Objective Structured Assessment of Technical Skill (OSATS) tool that combined a task-specific checklist and a Likert-format Global Rating Scale (GRS) [16] (Appendix 1), to give a final score out of 21.

The in-person (control) group was recruited from the cohort of intercalating medical students at University College London and attended conventional face-to-face surgical skills training sessions. These participants underwent two half-day workshops (a total of 6 hours of training) on basic suturing skills with the same equipment and session structure as the virtual group.

The Kirkpatrick model of evaluation, consisting of the four levels described below, was applied to this curriculum.

Appendix 1

1. OSATS tool

Objective Structured Assessment of Technical Skills (OSATS): Suturing

Candidate name:                  Self-assessment: yes/no
(select appropriate):            Tutor:            Date:

Checklist Yes No
Selects appropriate instruments
Needle loaded ½ to 2/3 from tip
Bite depth and distance from wound edge appropriate (0.5 cm-1 cm)
Needle enters tissue perpendicular (90 degrees)
Single attempt taking bites
Forceps used to hold skin
Supinates wrist
Approximates wound edges (appropriate eversion)
Secures square knot with hand or instrument tie - surgeon's knot first throw
Appropriate number of throws i.e. 3-4 braided, 6-7 monofilament
Sutures placed appropriate distance apart/equal bites each side
Cuts suture tail correct length
Avoids handling needle with fingers
Avoids torqueing skin i.e. sutures placed incorrectly
Avoids grasping needle tip
Avoids multiple forceps grasps on tissue/damaging tissue or foam pad
Score                        /16

Global rating scale (Please mark/highlight)        /5

1- Poor technique, poor manual dexterity/instrument handling
and unacceptable knot/closure

2- In between 1 and 3

3- Moderately good technique, moderate manual dexterity/instrument handling, acceptable knot

4- In between 3 and 5

5- Excellent Technique, excellent manual dexterity, and excellent knot

Total score: Checklist + global rating scale (       /21)

2. Example Session Feedback Form – In-person group


3. Example Session Feedback form - Virtual group



4. Overall feedback form – Virtual and in-person groups


Supplementary material

Sigma Lance © EDGE MK IV - Pre-Cut, Great Value System used for the virtual group


Level 1 – Participant reaction

Level 1 evaluates reaction towards the learning experience through standard course evaluation forms, assessing how participants perceive the quality and value of training. Feedback questionnaires were completed at the end of each session for both groups, via Google Forms for virtual participants and paper handouts for in-person participants (Appendix 1). Questions covered expectations of the course, usefulness of the sessions, satisfaction with the tutors' teaching, as well as areas for improvement.

Level 2 – Skill acquisition

Level 2 measures the degree to which training improves participant knowledge and skills. In both groups, the OSATS tool was used by the trainers to objectively assess participants' ability to perform basic suturing skills at weeks one and two of the course.

Level 3 and 4: Evaluation of skill retention and impact

Change in behaviour was assessed through Google Forms feedback (Appendix 1) at the end of each training period for both groups. Questions evaluated confidence in suturing before and after the teaching series, whether they were more likely to pursue a career in surgery or organise theatre experience, and if they would recommend the course.

In addition to this, we objectively measured skill retention in both groups at four months after completion of their respective courses. The virtual group was requested to submit self-recorded videos of basic suturing tasks, whilst the in-person group were invited back to a face-to-face session for reassessment of the same tasks. The skill retention assessments were performed by a single specialist-level assessor using the same OSATS tool; mean scores from weeks one and two were then compared with these retention scores.

Data collection and statistical analysis

We collected demographic information from all participants, which included year of university course and previous surgical experience. Mean OSATS scores were calculated for weeks one, two and the skill retention session in both groups. An independent samples t-test was used to compare the mean scores of the virtual versus in-person groups at each stage. Each group's progression from week one to week two was measured by calculating the mean scores and compared using a paired samples t-test. Finally, skill retention in each group was evaluated using a paired samples t-test to compare the skill retention mean score with those of weeks one and two. Two-tailed p values were considered significant if under 0.05. Analysis was performed using IBM SPSS Statistics version 28.0.1.1.
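The analyses described above were run in SPSS; purely as an illustrative sketch (not the authors' workflow), the same paired and independent samples t-tests could be reproduced in Python with SciPy as shown below. The score arrays are hypothetical placeholders, not the study data.

# Minimal sketch of the statistical comparisons used in this study (assumes NumPy/SciPy are installed)
import numpy as np
from scipy import stats

# Hypothetical OSATS scores out of 21 -- placeholders, NOT the study data
virtual_week1  = np.array([14, 16, 15, 17, 18, 15, 16, 17])
virtual_week2  = np.array([18, 19, 18, 20, 20, 18, 19, 19])
inperson_week1 = np.array([17, 18, 16, 18, 19, 17, 18, 17])

# Intra-group progression (same participants, week one vs week two): paired samples t-test
paired = stats.ttest_rel(virtual_week2, virtual_week1)

# Inter-group comparison at week one (independent groups): independent samples t-test
independent = stats.ttest_ind(virtual_week1, inperson_week1)

print(f"Paired t-test (virtual, week 1 vs 2): t = {paired.statistic:.2f}, p = {paired.pvalue:.3f}")
print(f"Independent t-test (week 1, virtual vs in-person): t = {independent.statistic:.2f}, p = {independent.pvalue:.3f}")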

Results

Demography

In total 48 participants underwent the training. The virtual group included 13 (54.2%) females and ranged from first- to fourth-year medical students from ten different universities across the UK. The in-person group were all intercalating medical students after their third year and consisted of 10 (41.7%) females. No participants had any prior exposure to or formal training in surgical skills.

Evaluation of Curriculum

Complete evaluation data were received from 17 (70.8%) virtual participants at week one and 16 (66.7%) after week two, while course feedback was collected from all 24 (100%) participants in the in-person group.

Level 1 – Participant reaction

Participant reaction from both groups was highly positive (Figures 1 and 2), with feedback from virtual sessions showing favourable opinions comparable to those from face-to-face sessions. Overall, virtual participants found their sessions to be highly useful, with an average score of 4.68/5, in comparison to 5/5 for the in-person group. The virtual group were overall thoroughly satisfied with the contents of the sessions, giving an average satisfaction score of 4.79/5.

Points praised in virtual teaching were the interactivity of the sessions, clarity of demonstrations, and quality of teaching from the tutors. Participants highlighted the individual feedback provided, and described the tutors as “helpful”, “engaging” and “friendly”. Similarly, the in-person group complimented the equipment and the tutors, who provided “really helpful advice” in “well taught” sessions.

Issues highlighted in the virtual training were time restrictions and technical limitations. These included problems with camera focus and angle, with screen limitations sometimes preventing a clear view of demonstrations. Problems with internet connectivity were also noted by some participants, causing interruptions to the sessions.

Figure 1: Virtual group feedback word cloud.

Figure 2: In-person group feedback word cloud.

Level 2 - Skill acquisition: Objective evaluation

Both virtual and in-person groups showed progression (Figure 3), with the mean week one and two scores of the virtual participants (n=16) being 15.94 (SD 2.05) and 18.97 (SD 2.00) respectively (mean difference 3.03, t=4.07, SD 2.98, 95% CI 1.44-4.62, p=0.001), whilst for the in-person group (n=24) mean scores were 17.67 (SD 1.59) and 18.44 (SD 1.36) respectively (mean difference 0.77, t=2.47, SD 1.52, 95% CI 0.13-1.41, p=0.02).
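For the interested reader, the paired comparison for the virtual group quoted above can be reproduced from the reported summary statistics (mean difference $\bar{d}$, standard deviation of the differences $s_d$ and sample size $n$) using the standard paired t-statistic and its 95% confidence interval:

\[ t = \frac{\bar{d}}{s_d/\sqrt{n}} = \frac{3.03}{2.98/\sqrt{16}} \approx 4.07, \qquad \bar{d} \pm t_{0.975,\,15}\,\frac{s_d}{\sqrt{n}} = 3.03 \pm 2.131 \times 0.745 \approx (1.44,\ 4.62). \]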

Comparing group performance at each week demonstrated that the virtual group was outperformed by the in-person group in week one, with mean scores of 15.71 (SD 2.20) (n=17) and 17.67 (SD 1.59) (n=24) respectively (mean difference 1.96, t=3.31, SE 0.59, p=0.002). However, by week two, both groups had comparable scores, with virtual and in-person participants scoring 18.97 (SD 1.94) and 18.44 (SD 1.36) respectively (mean difference 0.53, t=1.04, SE 0.52, 95% CI -0.51-1.57, p=0.31).

Figure 3: Mean OSATS scores of each group at each session.

Level 2 - Skill acquisition: Self-evaluation

From 10 (41.7%) virtual group responses to the final feedback survey, participants clearly felt that they had improved in the specific skills set out by the course: their confidence in instrument tying improved from an average score of 1.6/5 before the course to 4.4/5 after (a 175% improvement in self-rated score), and in suturing from 1.4/5 to 4.2/5 (a 200% improvement). From 9 (37.5%) in-person responses, confidence in instrument tying improved from 2.56/5 before the course to 4.56/5 after the course (a 78% increase), and from 2.44/5 to 4.44/5 (an 82% increase) for suturing.
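For clarity, the percentage improvements quoted above are relative changes in the mean self-rated confidence scores, i.e.

\[ \text{relative improvement} = \frac{\text{post} - \text{pre}}{\text{pre}} \times 100\%, \qquad \text{e.g. } \frac{4.4 - 1.6}{1.6} \times 100\% = 175\%. \]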

Level 3 and 4 evaluation of skill retention and impact: Objective evaluation

Upon completion of the training, self-recorded skill retention videos were submitted at four months by 13 (76.5%) of the 17 virtual participants who had returned assessments, whilst 8 (33.3%) from the in-person group returned for reassessment. Skill retention was analysed for these participants, within and between groups. Comparison between groups revealed that the virtual group had a significantly lower mean score of 17.88 (SD 1.92) than the in-person group with 19.50 (SD 0.75) (mean difference 1.62, t=2.26, SE 0.72, 95% CI 0.12-3.11, p=0.04).

However, when looking independently at each group's scores, both were able to maintain or improve their performance at four months. In the virtual group, the week one and skill retention mean scores (n=13) of 16.08 (SD 2.18) and 17.88 (SD 1.92) respectively showed a significant improvement of 1.80 (t=2.23, SD 2.93, 95% CI 0.04-3.58, p=0.046). Their week two and skill retention mean scores (n=13), 19.27 (SD 1.67) and 17.88 (SD 1.92) respectively, had a mean difference of 1.39 (t=1.62, SD 3.09, 95% CI -0.48-3.25, p=0.132), which was not statistically significant.

The in-person group also showed an improvement from week one to skill retention (n=8), with respective mean scores of 18.75 (SD 1.41) and 19.50 (SD 0.75) (mean difference 0.75, t=1.60, SD 1.60, 95% CI -2.09-0.59, p=0.23). Similarly, comparison of their week two mean score of 18.88 (SD 1.06) with the skill retention score showed an improvement (mean difference 0.62, t=1.53, SD 1.53, 95% CI -1.90-0.65, p=0.29), which again was not significant, likely due to the small sample size.

Level 3 and 4 evaluation of skill retention and impact: Self-evaluation

Further ‘Overall Feedback’ responses were submitted by 10 virtual and 9 in-person participants, assessing the course's impact on participants' behaviour. 100% of participants in both groups would recommend their respective teaching series and were more likely to organise theatre experience following their course. 100% of virtual participants responded that they were more likely to pursue a career in surgery after their course, in contrast to 88.9% of in-person participants.

Discussion

Our study aimed to establish whether virtual teaching is a feasible and effective means of providing basic suturing training when compared to the traditional face-to-face approach. This curriculum was pragmatically applied to medical students and evaluated comprehensively by applying the Kirkpatrick model, using objective assessments. Our results have demonstrated that participants were able to achieve similar progression in basic suturing skills after two weeks of virtual teaching, compared to traditional training. Furthermore, participants were able to retain skills learnt remotely, and apply them after four months to a similar level of competency as achieved by the end of their training. Lastly, feedback from the participants was equally positive in both groups and enthused many to potentially pursue a career in surgery.

Recent years have seen rapid development in online technology, and the effects of the pandemic have accelerated the transition to remote working and learning. This is particularly pertinent now given the significant disruption to surgical training caused by the pandemic, highlighted by the annual GMC report in 2021 [17]. This revealed that 40% of trainees felt they were not on course to undertake the expected number of operative and practical procedures due to a reduction in opportunities to gain the required curriculum competencies. Furthermore, 40% of trainees felt that they were not provided with alternatives to compensate for these lost opportunities. As medical education has shifted towards distance learning, it is important that surgical training can also adopt effective virtual practices so that technical skills can continue to be taught at the same frequency and to the same standard.

The application of virtual teaching to basic surgical skills is also an appropriate setting in which to test and evaluate the impact of teaching on those skills. Basic surgical skills have been highlighted by the General Medical Council (GMC) as a technical procedure competency expected of newly qualified doctors [18], yet despite this, basic suturing is only present in a minority of medical school curricula [19]. This, combined with the challenges posed to medical education by the pandemic, will undoubtedly lead to reduced student exposure to surgical training. In attempts to address this, there have been innovative examples of teaching methods employed [14,20].

In this study we applied a robust evaluation methodology through Kirkpatrick's model. Firstly, we demonstrated positive participant reaction that correlated with the existing literature [9,11], with overall high satisfaction with the course, including positive feedback on the level of interaction with the faculty. This, however, required a high faculty-to-participant ratio, which can be afforded virtually, with training sessions easily attended by surgical tutors without impacting on their clinical work.

Secondly, we applied OSATS, a validated and reliable tool to objectively evaluate suturing skills [21-23]. The OSATS tool used consists of two components: a task-specific checklist, which breaks down the surgical task into discrete segments, and a GRS, which gives a generic Likert-scale score of overall technical ability. A modified version of this tool, specific to basic surgical suturing, was applied as a formative and summative assessment in our study, allowing us to track participant progression from week one to week two as well as providing valuable feedback.

Interestingly, performance was evidently weaker in virtual delegates after the first teaching session than in those taught face-to-face, suggesting that skill acquisition may be slower when basic suturing skills are taught virtually. Potential explanations may be that visuospatial concepts, such as knot-tying, are more difficult to grasp through a two-dimensional screen, and that technical glitches such as Wi-Fi interruptions may cause lagging of video demonstrations. Issues noted in the feedback evaluation accordingly highlighted the necessity of ensuring a good internet connection, camera resolution and view, which should be taken into account to improve future courses. Additional limitations exist within a virtual classroom which impede the flow of teaching, such as the tutor having to switch from screen to screen to assess and provide feedback to participants individually.

Our study showed that at four months after the end of training both groups exhibited skill retention, with higher mean OSATS scores when compared to the first week and mean scores comparable to week two, when they had had the most teaching and practice. The in-person group had a significantly higher mean score than the virtual group at the skill retention session; however, we were only able to assess a small sample of the control group at four months, which may have introduced a selection bias towards more committed participants who continued to practise, potentially skewing the results. On the other hand, this subset of the in-person group did not have a kit at home to practise with, yet still demonstrated their best scores at skill retention. Final feedback responses also indicated behavioural change in participants, whereby students felt more encouraged to involve themselves in theatre opportunities, as well as to potentially pursue a career in the surgical field.

Limitations

This study has some limitations, including a small sample size and a lack of randomisation between the groups. Additionally, data collection was incomplete in both groups, for different reasons. The virtual assessments relied on tutors and participants emailing forms back to the researchers, and as there were multiple tutors in each session it was difficult to retrieve a full data set. This led to missing data, which necessitated the exclusion of those participants from analysis. Meanwhile, the in-person group had mandatory attendance for face-to-face training as part of their intercalated iBSc year; therefore, there was full attendance at both training sessions with complete collection of data. However, when evaluating skill retention, fewer students returned for the face-to-face reassessment session. This lack of attendance was due to conflicting medical school commitments as they entered their clinical years. In contrast, the majority of virtual students submitted skill retention videos after four months, highlighting the convenience of virtual teaching and assessment, as these participants were able to submit videos from their own home at a time of their convenience. This was also easier for the tutor, who could assess the videos at a time of their choosing. Taking this into consideration for reassessment methods in future training curricula, and where financially feasible, suturing kits could be provided for students taught face-to-face, allowing not only at-home practice but also easier reassessment for both tutor and student.

Conclusion

This study demonstrated that virtual teaching of basic surgical skills can be a feasible and useful adjunct to traditional surgical teaching. Future curricula should consider the use of digital platforms, perhaps in a hybrid approach, to ensure that missed training opportunities can be compensated for.

Declarations

Funding: Sigma Lance sponsored the use of suturing kits for BISA tutors and provided BISA students with a discount code for suturing kits on their site.

Conflicts of interest: The authors declare no conflicts of interest.

References

  1. Dobbs TD, Gibson JAG, Fowler AJ, Abbott TE, Shahid T, et al. Surgical activity in England and Wales during the COVID-19 pandemic: a nationwide observational cohort study. Br J Anaesth. 2021; 127: 196-204.
  2. COVID-STAR Collaborative Study Group. COVID-19 impact on Surgical Training and Recovery Planning (COVID-STAR) - A cross-sectional observational study. Int J Surg. 2021; 88: 105903.
  3. Choi B, Jegatheeswaran L, Minocha A, Alhilani M, Nakhoul M, et al. The impact of the COVID-19 pandemic on final year medical students in the United Kingdom: A national survey. BMC Med Educ. 2020; 20: 1-11.
  4. Papapanou M, Routsi E, Tsamakis K, et al. Medical education challenges and innovations during COVID-19 pandemic. Postgrad Med J. 2022; 98: 321-327.
  5. The Guardian. Medical students take final exams online for first time, despite student concern. 2020.
  6. Mukhtar K, Javed K, Arooj M, Sethi A. Advantages, Limitations and Recommendations for online learning during COVID-19 pandemic era. Pak J Med Sci. 2020; 36: S27.
  7. Papapanou M, Routsi E, Tsamakis K, et al. Medical education challenges and innovations during COVID-19 pandemic. Postgrad Med J. 2022; 98: 321-327.
  8. Brown R, Humphreys A, Bamford R, Mutimer J, Coulston J. Mapping out a virtual surgical curriculum: opinions on a core surgical training programme with technology-enhanced learning.
  9. Figueroa F, Figueroa D, Calvo-Mena R, Narvaez F, Medina N, et al. Orthopedic surgery residents’ perception of online education in their programs during the COVID-19 pandemic: should it be maintained after the crisis? Acta Orthop. 2020; 91: 543-546.
  10. Kam CT, Rait J, Brooke-Ball H, Ojofeitimi O. Virtual surgical education for foundation doctors in the United Kingdom during COVID-19 pandemic: A qualitative study. Annals of Medicine and Surgery. 2022; 80: 104192.
  11. Co M, Chu KM, Kent-Man Chu C. Distant surgical teaching during COVID-19-A pilot study on final year medical students. 2020.
  12. McGann KC, Melnyk R, Saba P, Joseph J, Glocker RJ, et al. Implementation of an E-Learning Academic Elective for Hands-On Basic Surgical Skills to Supplement Medical School Surgical Education. J Surg Educ. 2021; 78: 1164-1174.
  13. Zaghal A, Marley C, Rahhal S, Hassanieh J, Saadeh R, et al. Face-to-face versus distance learning of basic suturing skills in novice learners: a quantitative prospective randomized trial. BMC Med Educ. 2022; 22.
  14. Co M, Chung PHY, Chu KM. Online teaching of basic surgical skills to medical students during the COVID-19 pandemic: a case–control study. Surg Today. 2021; 51: 1404-1409.
  15. Kirkpatrick DL. The Four Levels of Evaluation. Evaluating Corporate Training: Models and Issues. 1998; 95-112.
  16. Asif H, McInnis C, Dang F, Ajzenberg H, Wang PL, et al. Objective Structured Assessment of technical skill (OSATS) in the Surgical Skills and Technology Elective Program (SSTEP): Comparison of peer and expert raters. Am J Surg. 2022; 223: 276-279.
  17. General Medical Council (GMC). A State of Medical Education and Practice in the UK 2021 - Annual Report. 2021.
  18. General Medical Council (GMC). Practical Skills and Procedures. 2019.
  19. Davis CR, Toll EC, Bates AS, Cole MD, Smith FCT. Surgical and procedural skills training at medical school -a national review. International Journal of Surgery. 2014; 12: 877-882.
  20. Kang CM. Non-face-to-face basic surgical skill education in the novel coronavirus disease 2019 (COVID-19) outbreak: obstacle vs. opportunity? Ann Surg Treat Res. 2020; 99: 247.
  21. Reznick R, Regehr G, MacRae H, Martin J, McCulloch W. Testing technical skill via an innovative “bench station” examination. Am J Surg. 1997; 173: 226-230.
  22. Martin JA, Regehr G, Reznick R, MacRae H, Murnaghan J, et al. Objective structured assessment of technical skill (OSATS) for surgical residents. British Journal of Surgery. 1997; 84: 273-278.
  23. Vaidya A, Aydin A, Ridgley J, Raison N, Dasgupta P, et al. Current Status of Technical Skills Assessment Tools in Surgery: A Systematic Review. Journal of Surgical Research. 2020; 246: 342-378.